Funding: Supported by the NNSF of China (10231060) and the Soft Science Foundation of Henan Province (082400430820).
Abstract: In this paper, a new SQP feasible descent algorithm for nonlinear constrained optimization problems is presented, and under relatively weaker conditions we prove that the new method still possesses global convergence and strong convergence. Numerical results illustrate that the new method is valid.
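The abstract does not give the details of the new algorithm, but the class of problems it targets can be illustrated with an off-the-shelf SQP-type solver. The minimal sketch below uses SciPy's SLSQP method on a small inequality-constrained problem; the objective, constraint, and starting point are illustrative assumptions, and this is not the paper's feasible descent algorithm.

```python
import numpy as np
from scipy.optimize import minimize

# Minimize f(x) = (x0 - 1)^2 + (x1 - 2)^2 subject to x0 + x1 <= 2 and x >= 0,
# using SciPy's SLSQP solver, a standard SQP-type method.
objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
constraints = [{'type': 'ineq', 'fun': lambda x: 2.0 - x[0] - x[1]}]  # g(x) >= 0
result = minimize(objective, x0=np.array([0.0, 0.0]), method='SLSQP',
                  bounds=[(0, None), (0, None)], constraints=constraints)
print(result.x)  # approximately [0.5, 1.5]
```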
Abstract: In this paper, we present a new successive approximation Broyden-like algorithm for the nonlinear complementarity problem, based on its equivalent system of nonsmooth equations. Under suitable conditions, we establish the global convergence of the algorithm. Some numerical results are also reported.
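As a rough illustration of the approach described above, the sketch below applies the classical Broyden rank-one update to the componentwise-minimum reformulation H(x) = min(x, F(x)) of the complementarity problem x >= 0, F(x) >= 0, x'F(x) = 0, with a simple backtracking safeguard on ||H||. The linear test problem, tolerances, and line search are illustrative assumptions; the paper's successive approximation scheme is not reproduced here.

```python
import numpy as np

def broyden_ncp(F, x0, tol=1e-10, max_iter=100):
    """Broyden-type iteration on H(x) = min(x, F(x)), a nonsmooth
    reformulation of the complementarity problem (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    H = lambda z: np.minimum(z, F(z))
    B = np.eye(len(x))                      # initial Broyden approximation
    h = H(x)
    for _ in range(max_iter):
        if np.linalg.norm(h) < tol:
            break
        d = np.linalg.solve(B, -h)          # quasi-Newton direction
        t = 1.0                             # backtracking on ||H||
        while np.linalg.norm(H(x + t * d)) >= np.linalg.norm(h) and t > 1e-12:
            t *= 0.5
        s = t * d
        x_new = x + s
        h_new = H(x_new)
        y = h_new - h
        B += np.outer(y - B @ s, s) / (s @ s)   # Broyden rank-one update
        x, h = x_new, h_new
    return x

# Linear complementarity example: F(x) = M x + q
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
print(broyden_ncp(lambda z: M @ z + q, x0=np.zeros(2)))  # approximately [1/3, 1/3]
```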
Abstract: Conjugate gradient optimization algorithms depend on their search directions, which vary with different choices of the parameters in the search directions. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by Hu and Storey (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method with two kinds of common line searches is proved. First, using the reverse modulus of continuity function and a forcing function, it is shown that the new method for solving unconstrained optimization works for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, by using a comparison technique, some general convergence properties of the new method with another kind of step size rule are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
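Restarting conjugate gradient methods of the kind described above combine a PR/HS-type parameter with a restart mechanism that safeguards descent. The sketch below implements one simple representative of this family, a PRP parameter truncated at zero with a steepest-descent safeguard and Armijo backtracking (not the Curry-Altman rule analysed in the paper); the test function, starting point, and constants are illustrative assumptions rather than the authors' method.

```python
import numpy as np

def cg_prp_restart(f, grad, x0, tol=1e-6, max_iter=2000):
    """Nonlinear CG with the PRP parameter, restarted with the steepest
    descent direction whenever beta < 0 or descent fails (illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx, slope = 1.0, f(x), g @ d          # Armijo backtracking
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (g @ g)     # PRP parameter
        d = -g_new + max(beta, 0.0) * d          # restart if beta < 0
        if g_new @ d >= 0:                       # safeguard: force descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Rosenbrock test function
f = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
grad = lambda z: np.array([
    -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0] ** 2),
    200 * (z[1] - z[0] ** 2),
])
print(cg_prp_restart(f, grad, np.array([-1.2, 1.0])))  # approximately [1, 1]
```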