• Title/Summary/Keyword: unconstrained global optimization


Polynomial-Filled Function Algorithm for Unconstrained Global Optimization Problems

  • Salmah;Ridwan Pandiya
    • Kyungpook Mathematical Journal / v.64 no.1 / pp.95-111 / 2024
  • The filled function method is useful for solving unconstrained global optimization problems. However, depending on the type of function and the parameters used, there are limitations that cause difficulties in implementation. Exponential and logarithmic functions lead to the overflow effect, requiring iterative adjustment of the parameters. This paper proposes a polynomial filled function that has a general form and is non-exponential, non-logarithmic, non-parametric, and continuously differentiable. With this newly proposed filled function, the aforementioned shortcomings of the filled function method can be overcome. To confirm the superiority of the proposed filled function algorithm, we apply it to a set of unconstrained global optimization problems. The data derived from numerical implementation show that the proposed filled function can be used as an alternative algorithm when solving unconstrained global optimization problems.
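
The two-phase structure common to filled-function methods can be sketched as follows. The auxiliary function used here is only an illustrative parameter-free polynomial-style construction, not the polynomial filled function proposed in the paper; the box bounds, the trial directions, the L-BFGS-B local solver, and the Rastrigin-like test objective are likewise assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import minimize

def filled_function_sketch(f, x0, bounds, n_cycles=20, step=1.0):
    """Generic two-phase filled-function loop over a box (filled-function test
    problems are usually posed on a box).  Phase 1 minimizes f locally; Phase 2
    minimizes an auxiliary 'filled' function built at the current minimizer.
    The auxiliary function below is illustrative only, not the paper's."""
    local = lambda fun, x: minimize(fun, x, method="L-BFGS-B", bounds=bounds).x
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    x_star = local(f, x0)
    n = len(x_star)
    for _ in range(n_cycles):
        f_star = f(x_star)

        def filled(x, x_star=x_star, f_star=f_star):
            # Illustrative construction: move away from x_star, and reward
            # regions where f falls below the current best value f_star.
            d2 = float(np.sum((x - x_star) ** 2))
            drop = min(f(x) - f_star, 0.0)
            return -d2 + drop ** 3

        improved = False
        for direction in np.vstack([np.eye(n), -np.eye(n)]):
            y = local(filled, np.clip(x_star + step * direction, lo, hi))  # Phase 2
            x_new = local(f, y)                        # re-enter Phase 1 from the escape point
            if f(x_new) < f_star - 1e-8:
                x_star, improved = x_new, True
                break
        if not improved:
            break                                      # no lower basin found; stop
    return x_star

# Usage on an assumed multimodal test function (2-D Rastrigin)
f = lambda x: float(np.sum(x ** 2 - 10.0 * np.cos(2 * np.pi * x) + 10.0))
print(filled_function_sketch(f, np.array([3.1, -2.7]), bounds=[(-5.12, 5.12)] * 2))
```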

AN ADAPTIVE APPROACH OF CONIC TRUST-REGION METHOD FOR UNCONSTRAINED OPTIMIZATION PROBLEMS

  • FU JINHUA;SUN WENYU;SAMPAIO RAIMUNDO J. B. DE
    • Journal of applied mathematics & informatics / v.19 no.1_2 / pp.165-177 / 2005
  • In this paper, an adaptive trust-region method based on the conic model for unconstrained optimization problems is proposed and analyzed. We establish global and superlinear convergence results for the method. Numerical tests are reported that confirm the efficiency of the new method.
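
For reference, the conic local model that replaces the usual quadratic model in conic trust-region methods is commonly written as below, where $g_k$ is the gradient, $B_k$ a Hessian approximation, and $a_k$ the horizon (conic axis) vector at the iterate $x_k$; the paper's adaptive radius update is not reproduced here.

```latex
\[
\min_{s}\; \phi_k(s) \;=\; f(x_k) \;+\; \frac{g_k^{\top} s}{1 - a_k^{\top} s}
\;+\; \frac{1}{2}\,\frac{s^{\top} B_k s}{\left(1 - a_k^{\top} s\right)^{2}},
\qquad \text{subject to } \|s\| \le \Delta_k .
\]
```

When $a_k = 0$, the model reduces to the standard quadratic trust-region subproblem.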

A NOVEL FILLED FUNCTION METHOD FOR GLOBAL OPTIMIZATION

  • Lin, Youjiang;Yang, Yongjian;Zhang, Liansheng
    • Journal of the Korean Mathematical Society / v.47 no.6 / pp.1253-1267 / 2010
  • This paper considers unconstrained global optimization with revised filled function methods. The minimization sequence can move from a local minimizer to a better minimizer of the objective function by minimizing an auxiliary function constructed at the current local minimizer. Some promising numerical results are also included.
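
For context, a function $P(\cdot, x_1^*)$ is usually called a filled function of $f$ at a local minimizer $x_1^*$ when it satisfies conditions of roughly the following type; this is one common formulation from the filled-function literature, and the paper's revised definition may differ in detail.

```latex
% Generic filled-function requirements at a local minimizer $x_1^*$ of $f$:
\begin{itemize}
  \item $x_1^*$ is a strict local maximizer of $P(\cdot, x_1^*)$;
  \item $P(\cdot, x_1^*)$ has no stationary point in
        $\{\, x : f(x) \ge f(x_1^*),\ x \ne x_1^* \,\}$;
  \item if $f$ has a basin with values lower than $f(x_1^*)$, then
        $P(\cdot, x_1^*)$ attains a minimizer in that basin.
\end{itemize}
```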

GLOBAL CONVERGENCE OF AN EFFICIENT HYBRID CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION

  • Liu, Jinkui;Du, Xianglin
    • Bulletin of the Korean Mathematical Society / v.50 no.1 / pp.73-81 / 2013
  • In this paper, an efficient hybrid nonlinear conjugate gradient method is proposed to solve general unconstrained optimization problems on the basis of the CD method [2] and the DY method [5]; it possesses the following property: the sufficient descent property holds without any line search. Under the Wolfe line search conditions, we prove the global convergence of the hybrid method for general nonconvex functions. The numerical results show that the hybrid method is especially efficient for the given test problems and can be widely used in scientific and engineering computation.
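
A minimal sketch of a conjugate gradient iteration built from the CD and DY update parameters is given below. The particular hybridization shown (the smaller of the two values, floored at zero) is only an assumption for illustration; the paper's actual rule, which guarantees sufficient descent without any line search, is not reproduced here. The Wolfe step is delegated to `scipy.optimize.line_search`, and the Rosenbrock test function is assumed.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with an illustrative hybrid of the CD and DY beta formulas.
    The hybrid rule below is an assumption, not the one proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Wolfe line search along d (scipy's implementation of the Wolfe conditions)
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4                      # fall back to a small step if the search fails
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_cd = np.dot(g_new, g_new) / max(-np.dot(d, g), 1e-16)  # CD formula
        beta_dy = np.dot(g_new, g_new) / max(np.dot(d, y), 1e-16)   # DY formula
        beta = max(0.0, min(beta_cd, beta_dy))                      # illustrative hybrid
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on an assumed smooth test function (2-D Rosenbrock)
f = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                           200.0 * (x[1] - x[0] ** 2)])
print(hybrid_cg(f, grad, np.array([-1.2, 1.0])))
```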

MODIFIED LIMITED MEMORY BFGS METHOD WITH NONMONOTONE LINE SEARCH FOR UNCONSTRAINED OPTIMIZATION

  • Yuan, Gonglin;Wei, Zengxin;Wu, Yanlin
    • Journal of the Korean Mathematical Society / v.47 no.4 / pp.767-788 / 2010
  • In this paper, we propose two limited memory BFGS algorithms with a nonmonotone line search technique for unconstrained optimization problems. The global convergence of the given methods is established under suitable conditions. Numerical results show that the presented algorithms are more competitive than the standard BFGS method.
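
The two ingredients the abstract combines can be sketched as follows: the standard L-BFGS two-loop recursion for the search direction, and a Grippo-Lampariello-Lucidi-style nonmonotone Armijo test that compares against the maximum of the last M objective values. The acceptance rule and the parameter values shown are generic assumptions, not the specific modifications analyzed in the paper.

```python
import numpy as np
from collections import deque

def two_loop(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns an approximation of -H*g."""
    q = g.copy()
    alphas = []
    for s, y in reversed(list(zip(s_list, y_list))):
        a = np.dot(s, q) / np.dot(y, s)
        q -= a * y
        alphas.append(a)
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)       # initial scaling H0 = gamma * I
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = np.dot(y, q) / np.dot(y, s)
        q += (a - b) * s
    return -q

def lbfgs_nonmonotone(f, grad, x0, m=7, M=8, sigma=1e-4, tol=1e-6, max_iter=1000):
    """L-BFGS direction + nonmonotone (GLL-type) Armijo backtracking (illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s_list, y_list = deque(maxlen=m), deque(maxlen=m)
    recent_f = deque([f(x)], maxlen=M)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = two_loop(g, list(s_list), list(y_list))
        f_ref = max(recent_f)                  # nonmonotone reference value
        t, gTd = 1.0, np.dot(g, d)
        while f(x + t * d) > f_ref + sigma * t * gTd and t > 1e-12:
            t *= 0.5                           # backtrack
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if np.dot(s, y) > 1e-12:               # keep the pair only if curvature is positive
            s_list.append(s); y_list.append(y)
        recent_f.append(f(x_new))
        x, g = x_new, g_new
    return x
```

Replacing `max(recent_f)` with `f(x)` recovers the ordinary monotone Armijo rule; the nonmonotone reference value is what allows the method to accept occasional increases of the objective.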

GLOBAL CONVERGENCE OF A NEW SPECTRAL PRP CONJUGATE GRADIENT METHOD

  • Liu, Jinkui
    • Journal of applied mathematics & informatics / v.29 no.5_6 / pp.1303-1309 / 2011
  • Based on the PRP method, a new spectral PRP conjugate gradient method is proposed to solve general unconstrained optimization problems; it produces a sufficient descent search direction at every iteration without any line search. Under the Wolfe line search, we prove the global convergence of the new method for general nonconvex functions. The numerical results show that the new method is efficient for the given test problems.
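
For reference, the PRP parameter and the general shape of a spectral PRP direction are shown below; the specific spectral coefficient $\theta_{k+1}$ that yields sufficient descent independently of the line search is the contribution of the paper and is not reproduced here.

```latex
\[
\beta_{k+1}^{\mathrm{PRP}} \;=\; \frac{g_{k+1}^{\top}\,(g_{k+1}-g_k)}{\|g_k\|^{2}},
\qquad
d_{k+1} \;=\; -\,\theta_{k+1}\, g_{k+1} \;+\; \beta_{k+1}^{\mathrm{PRP}}\, d_k ,
\qquad d_0 = -g_0 .
\]
```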

CONVERGENCE OF SUPERMEMORY GRADIENT METHOD

  • Shi, Zhen-Jun;Shen, Jie
    • Journal of applied mathematics & informatics / v.24 no.1_2 / pp.367-376 / 2007
  • In this paper we consider the global convergence of a new supermemory gradient method for unconstrained optimization problems. A new trust-region radius is proposed to make the method converge stably, which makes it suitable for solving large-scale minimization problems. Some global convergence results are obtained under mild conditions. Numerical results show that the new method is effective and stable in practical computation.
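
For context, a (super)memory gradient direction combines the current negative gradient with several previous directions, as in the generic form below; the coefficients $\beta_{k,i}$ and the trust-region-style radius rule that the paper adds on top of this are not reproduced here.

```latex
\[
d_k \;=\; -\,g_k \;+\; \sum_{i=1}^{m} \beta_{k,i}\, d_{k-i},
\qquad d_k = -g_k \ \ \text{for } k < m,
\]
```

where the $\beta_{k,i}$ are chosen so that $d_k$ remains a descent direction of $f$ at $x_k$.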

CONVERGENCE PROPERTIES OF A CORRELATIVE POLAK-RIBIERE CONJUGATE GRADIENT METHOD

  • Hu Guofang;Qu Biao
    • Journal of applied mathematics & informatics / v.22 no.1_2 / pp.461-466 / 2006
  • In this paper, an algorithm with a new Armijo-type line search is proposed that ensures global convergence of a correlative Polak-Ribiere conjugate gradient method for the unconstrained minimization of non-convex differentiable functions.
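
A standard Armijo backtracking line search, of the kind that the proposed "Armijo-type" rule modifies, can be written in a few lines; the constants and the backtracking factor below are conventional choices, not the paper's.

```python
import numpy as np

def armijo_backtracking(f, x, d, g, sigma=1e-4, rho=0.5, t0=1.0, max_backtracks=50):
    """Return a step length t satisfying the Armijo condition
       f(x + t d) <= f(x) + sigma * t * g^T d,
    where d is a descent direction and g = grad f(x)."""
    fx, gTd = f(x), float(np.dot(g, d))
    t = t0
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * gTd:
            return t
        t *= rho                      # shrink the step and try again
    return t                          # give up after max_backtracks reductions
```

With the steepest-descent direction d = -g, an accepted step t guarantees a decrease of at least sigma * t * ||g||^2.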

CONVERGENCE OF THE NONMONOTONE PERRY-SHANNO METHOD FOR UNCONSTRAINED OPTIMIZATION

  • Ou, Yigui;Ma, Wei
    • Journal of applied mathematics & informatics / v.30 no.5_6 / pp.971-980 / 2012
  • In this paper, a method equipped with a new form of nonmonotone line search technique is proposed, which can be regarded as a generalization of the Perry-Shanno memoryless quasi-Newton type method. Under some reasonable conditions, the global convergence of the proposed method is proven. Numerical tests show its efficiency.
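
For context, a memoryless quasi-Newton direction of the kind the Perry-Shanno method is built on is obtained by applying the BFGS inverse update to a scaled identity instead of a stored matrix; the self-scaling factor $\gamma_k$ shown is a common choice, and the paper's nonmonotone generalization is not reproduced here.

```latex
\[
d_{k+1} = -H_{k+1} g_{k+1}, \qquad
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right)\gamma_k I\left(I - \rho_k y_k s_k^{\top}\right) + \rho_k s_k s_k^{\top},
\]
\[
\rho_k = \frac{1}{y_k^{\top} s_k}, \qquad
\gamma_k = \frac{s_k^{\top} y_k}{\|y_k\|^{2}}, \qquad
s_k = x_{k+1}-x_k,\quad y_k = g_{k+1}-g_k .
\]
```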

A CLASS OF NONMONOTONE SPECTRAL MEMORY GRADIENT METHOD

  • Yu, Zhensheng;Zang, Jinsong;Liu, Jingzhao
    • Journal of the Korean Mathematical Society / v.47 no.1 / pp.63-70 / 2010
  • In this paper, we develop a nonmonotone spectral memory gradient method for unconstrained optimization, in which the spectral stepsize and a class of memory gradient directions are combined efficiently. Global convergence is obtained by using a nonmonotone line search strategy, and numerical tests are given to show the efficiency of the proposed algorithm.
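
For reference, the spectral (Barzilai-Borwein) stepsizes that spectral methods of this type typically draw on are given below; how the paper combines such a stepsize with its memory gradient direction and nonmonotone rule is not reproduced here.

```latex
\[
\alpha_k^{\mathrm{BB1}} \;=\; \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{\mathrm{BB2}} \;=\; \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}},
\qquad
s_{k-1}=x_k-x_{k-1},\quad y_{k-1}=g_k-g_{k-1}.
\]
```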