Optimization theory and methods profoundly impact numerous engineering designs and applications. The gradient descent method is simpler and more widely used than other search methods for solving optimization problems. However, gradient descent is easily trapped in local minima and converges slowly. This work presents a Gradient Forecasting Search Method (GFSM) that enhances the performance of the gradient descent method for solving optimization problems.
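To make the local-minimum problem concrete, the following minimal sketch (not from the paper; the test function and step size are chosen purely for illustration) shows plain gradient descent stalling in a local minimum of a one-dimensional quartic whose global minimum lies elsewhere:

```python
def gradient_descent(grad, x0, lr=0.01, steps=500):
    """Plain gradient descent: x_{k+1} = x_k - lr * grad(x_k)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# f(x) = x^4 - 3x^2 + x has a local minimum near x ~ 1.13 and a lower
# global minimum near x ~ -1.30; f'(x) = 4x^3 - 6x + 1.
grad = lambda x: 4 * x**3 - 6 * x + 1

# Started at x0 = 2 with a small step size, descent settles into the
# local minimum near 1.13 and never reaches the global one.
x_local = gradient_descent(grad, x0=2.0)
```

With a small fixed step size the iterate cannot climb out of the basin it starts in, which is exactly the behavior GFSM is designed to mitigate.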
GFSM is based on the gradient descent method and on a universal Discrete Difference Equation Prediction Model (DDEPM) proposed herein. The concept of the universal DDEPM is derived from the grey prediction model. The original grey prediction model relies on a mathematical hypothesis and approximation to transform a continuous differential equation into a discrete difference equation. This is not a logical approach, because the forecast sequence data are invariably discrete. To construct a more precise prediction model, this work therefore adopts a discrete difference equation directly. The proposed GFSM accurately predicts the search direction and trend of the gradient descent method via the universal DDEPM, and dynamically adjusts the prediction steps using the golden section search algorithm.
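The core idea of modeling a sequence with a discrete difference equation (rather than discretizing a differential equation) can be illustrated with a first-order linear model fitted by least squares. This is only a sketch of the general approach; the model form, `fit_difference_model`, and `forecast` are illustrative assumptions, not the paper's universal DDEPM:

```python
import numpy as np

def fit_difference_model(seq):
    """Least-squares fit of x_{k+1} = a * x_k + b to an observed sequence.
    Illustrative first-order discrete difference model only."""
    x = np.asarray(seq, dtype=float)
    A = np.column_stack([x[:-1], np.ones(len(x) - 1)])
    (a, b), *_ = np.linalg.lstsq(A, x[1:], rcond=None)
    return a, b

def forecast(seq, steps):
    """Iterate the fitted difference equation to predict future values."""
    a, b = fit_difference_model(seq)
    x = seq[-1]
    out = []
    for _ in range(steps):
        x = a * x + b
        out.append(x)
    return out

# History generated exactly by x_{k+1} = 0.5 * x_k + 1 (fixed point 2.0):
hist = [8.0, 5.0, 3.5, 2.75, 2.375]
pred = forecast(hist, 3)  # continues the sequence toward 2.0
```

Because the model is fitted to discrete observations directly, no continuous-to-discrete approximation step is needed, which is the motivation the abstract gives for preferring a difference equation.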
Experimental results indicate that the proposed method accelerates the search speed of the gradient descent method and helps it escape from local minima. The results further demonstrate that applying the golden section search method to determine dynamic prediction steps for the DDEPM is an efficient approach for this search algorithm.
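For reference, the golden section search invoked above is the standard derivative-free line search for a unimodal function on an interval; a minimal self-contained sketch (how it is wired into the DDEPM's prediction steps is specific to the paper and not shown here):

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by golden section search.
    Each iteration shrinks the bracket by the factor 1/phi ~ 0.618
    while reusing one of the two interior evaluation points."""
    inv_phi = (math.sqrt(5) - 1) / 2
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):      # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Example: the minimizer of (x - 2)^2 on [0, 5] is recovered as ~2.0.
x_star = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Its appeal as a step-size controller is that it needs only function evaluations, no derivatives, and converges at a guaranteed linear rate.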