Funding: Supported in part by the National Outstanding Youth Foundation of P.R. China (60525303), the National Natural Science Foundation of P.R. China (60404022, 60604004), the Natural Science Foundation of Hebei Province (102160), the special projects in mathematics funded by the Natural Science Foundation of Hebei Province (07M005), and the Natural Science Foundation of the Education Office of Hebei Province (2004123).
Abstract: The Newton-like algorithm with price estimation error in optimization flow control in networks is analyzed. The estimation error is treated as an inexactness of the gradient, and the resulting inexact descent direction is analyzed. Based on optimization theory, a sufficient condition for convergence of the algorithm with bounded price estimation error is obtained. Furthermore, even when this sufficient condition does not hold, the algorithm can still converge provided the step size is suitably modified, and an attraction region is obtained. By applying LaSalle's invariance principle to a suitable Lyapunov function, the dynamic system described by the algorithm is proved to be globally stable when the error is zero, and the Newton-like algorithm with bounded price estimation error is likewise globally stable when the error satisfies the sufficient condition for convergence. All trajectories ultimately converge to the equilibrium point.
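To make the setting concrete, the following is a minimal sketch of a Newton-like dual price update in flow control with a bounded price estimation error injected into the gradient. It assumes a standard network-utility-maximization model with logarithmic utilities; the routing matrix, capacities, weights, error bound delta, and step size gamma are illustrative choices, not values from the paper.

```python
import numpy as np

# Sketch only (assumed model, not the paper's exact formulation):
# links run a Newton-like ascent on the dual prices p, where the
# gradient is inexact because sources react to noisy price estimates.

rng = np.random.default_rng(0)

R = np.array([[1, 1, 0],             # routing matrix: R[l, s] = 1 if source s uses link l
              [0, 1, 1]], float)
c = np.array([1.0, 1.5])             # link capacities
w = np.array([1.0, 2.0, 1.0])        # weights of log-utilities U_s(x) = w_s * log(x)
delta = 0.02                         # bound on the price estimation error
gamma = 0.8                          # step size (shrunk further when the
                                     # sufficient condition fails, per the paper)

p = np.ones(2)                       # link prices
for k in range(200):
    q = R.T @ p                      # true path prices
    q_est = q + rng.uniform(-delta, delta, size=q.shape)  # bounded estimation error
    x = w / np.maximum(q_est, 1e-8)  # sources maximize w_s*log(x) - q_s*x
    g = R @ x - c                    # inexact dual gradient: excess rate per link
    H = R @ np.diag(w / q**2) @ R.T  # Hessian of the dual for log utilities
    step = np.linalg.solve(H + 1e-8 * np.eye(2), g)
    p = np.maximum(p + gamma * step, 1e-8)   # Newton-like update, prices kept positive

print("prices:", p, " rates:", w / (R.T @ p))
```

With delta = 0, the iterates converge to the equilibrium prices; with a small positive delta, they settle into a neighborhood of the equilibrium, which is the behavior the attraction-region analysis in the abstract quantifies.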
Funding: Supported by the National Natural Science Foundation of China (60574075, 60705004).
Abstract: The coordinate descent method is an unconstrained optimization technique. When it is applied to the support vector machine (SVM), at each step the method updates one component of w by solving a one-variable subproblem while fixing the other components; after all components of w have been updated, the next iteration begins. Although the method converges, and converges quickly at the beginning, its final convergence is slow. To speed up final convergence, the Hooke and Jeeves algorithm, which adds a pattern search after every iteration of coordinate descent, was applied to the SVM, and a global Newton algorithm was used to solve the one-variable subproblems. We prove the convergence of the algorithm. Experimental results show that the Hooke and Jeeves method does accelerate convergence, especially final convergence, and reaches higher testing accuracy more quickly in classification.
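The following is a minimal sketch of the scheme the abstract describes, under simplifying assumptions: the SVM is taken to be the primal L2-loss (squared-hinge) linear SVM, the one-variable subproblem is solved with a damped Newton step as a simplified stand-in for the paper's global Newton algorithm, and the data are synthetic. All names and parameters are illustrative.

```python
import numpy as np

def f(w, X, y, C):
    """Primal L2-SVM objective: 0.5*||w||^2 + C * sum of squared hinge losses."""
    margin = np.maximum(0.0, 1.0 - y * (X @ w))
    return 0.5 * (w @ w) + C * np.sum(margin ** 2)

def coord_update(w, j, X, y, C):
    """Damped Newton step on coordinate j (stand-in for the global Newton solver)."""
    z = y * (X @ w)
    active = z < 1.0                                  # margin-violating examples
    g = w[j] - 2.0 * C * np.dot(y[active] * X[active, j], 1.0 - z[active])
    h = 1.0 + 2.0 * C * np.sum(X[active, j] ** 2)     # generalized 2nd derivative
    d = -g / h
    f0 = f(w, X, y, C)
    while abs(d) > 1e-12:                             # halve the step until f decreases
        w_try = w.copy()
        w_try[j] += d
        if f(w_try, X, y, C) <= f0:
            return w_try
        d *= 0.5
    return w

def train(X, y, C=1.0, sweeps=50):
    w = np.zeros(X.shape[1])
    for _ in range(sweeps):
        w_old = w.copy()
        for j in range(len(w)):                       # one coordinate-descent sweep
            w = coord_update(w, j, X, y, C)
        d = w - w_old                                 # Hooke-Jeeves pattern direction
        if f(w + d, X, y, C) < f(w, X, y, C):         # pattern move: keep it if it helps
            w = w + d
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]))
w = train(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

The pattern move extrapolates along the net displacement of the last sweep, which is exactly where plain coordinate descent slows down near the solution; accepting it only when the objective decreases preserves the convergence guarantee.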