Funding: Supported by the National Natural Science Foundation of China (60870004)
Abstract: A novel heuristic search algorithm called seeker optimization algorithm (SOA) is proposed for real-parameter optimization. The proposed SOA is based on simulating the act of human searching. In the SOA, the search direction is based on empirical gradients obtained by evaluating the response to position changes, while the step length is based on uncertainty reasoning using a simple fuzzy rule. The effectiveness of the SOA is evaluated on a challenging set of typical complex functions in comparison with differential evolution (DE) and three modified particle swarm optimization (PSO) algorithms. The simulation results show that the performance of the SOA is superior or comparable to that of the other algorithms.
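To make the search mechanism concrete, the following minimal Python sketch follows the spirit of the abstract: the direction comes from an empirical gradient (comparing the objective before and after a small positional change), and the step length from a simple Gaussian-membership fuzzy rule tied to each seeker's fitness rank. The function names, population size, membership bounds, and greedy acceptance are illustrative assumptions, not the authors' exact SOA.

    import numpy as np

    def sphere(x):
        # Example objective (minimization); any real-parameter function can be used.
        return float(np.sum(x ** 2))

    def soa_like_search(f, dim=10, pop=20, iters=200, lo=-5.0, hi=5.0, seed=0):
        # Simplified seeker-style search: empirical-gradient direction,
        # Gaussian-membership (fuzzy) step length keyed to fitness rank.
        rng = np.random.default_rng(seed)
        X = rng.uniform(lo, hi, size=(pop, dim))
        fit = np.array([f(x) for x in X])
        best = X[fit.argmin()].copy()
        for _ in range(iters):
            ranks = fit.argsort().argsort()              # 0 = best seeker
            for i in range(pop):
                # Empirical gradient: response of f to a small positional change.
                delta = 1e-2 * (hi - lo) * rng.standard_normal(dim)
                worse = f(X[i] + delta) > fit[i]
                direction = -np.sign(delta) if worse else np.sign(delta)
                # Fuzzy rule: better-ranked seekers take smaller, finer steps.
                mu = 0.95 - 0.45 * ranks[i] / max(pop - 1, 1)
                step = np.abs(best - X[i]) * np.sqrt(-2.0 * np.log(mu))
                cand = np.clip(X[i] + direction * step, lo, hi)
                fc = f(cand)
                if fc < fit[i]:                          # greedy acceptance (a simplification)
                    X[i], fit[i] = cand, fc
            best = X[fit.argmin()].copy()
        return best, float(fit.min())

    # Example: best_x, best_f = soa_like_search(sphere)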
Abstract: A modification of evolutionary programming or evolution strategies for n-dimensional global optimization is proposed. The new algorithm consists of two phases and exploits the ergodicity and inherent randomness of chaos: in Phase I, chaotic behavior is used to conduct a rough search of the problem space in order to find promising individuals; in Phase II, a step-length adjustment strategy and intensive local searches are employed. The population sequences generated by the algorithm asymptotically converge to global optimal solutions with probability one. The proposed algorithm is applied to several typical test problems. Numerical results illustrate that this algorithm can solve complex global optimization problems more efficiently than evolutionary programming and evolution strategies in most cases.
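A rough illustration of the two-phase idea, assuming a logistic map as the chaotic generator (the abstract does not commit to a specific map), is sketched below: Phase I sweeps the box through chaotic samples, and Phase II refines the best point with a shrinking-step local search. All constants are placeholders, not the paper's settings.

    import numpy as np

    def chaotic_two_phase(f, dim=10, iters1=2000, iters2=500, lo=-5.0, hi=5.0, seed=1):
        # Phase I: ergodic rough search driven by a logistic chaotic map.
        # Phase II: intensive local search with a shrinking step length.
        rng = np.random.default_rng(seed)
        z = rng.uniform(0.1, 0.9, size=dim)              # chaotic state in (0, 1)
        best_x, best_f = None, np.inf
        for _ in range(iters1):
            z = 4.0 * z * (1.0 - z)                      # logistic map, chaotic at r = 4
            x = lo + (hi - lo) * z                       # map the state into the search box
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        step = 0.1 * (hi - lo)
        for _ in range(iters2):
            cand = np.clip(best_x + step * rng.standard_normal(dim), lo, hi)
            fc = f(cand)
            if fc < best_f:
                best_x, best_f = cand, fc
            else:
                step *= 0.99                             # adjust (shrink) the step length
        return best_x, float(best_f)

    # Example: x_star, f_star = chaotic_two_phase(lambda x: float(np.sum(x ** 2)))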
Funding: Supported by the National Natural Science Foundation of China (70871081) and the Shanghai Leading Academic Discipline Project of China (S1205YLXK)
Abstract: This paper presents an improved gravitational search algorithm (IGSA) as a hybridization of a relatively recent evolutionary algorithm, the gravitational search algorithm (GSA), with free search differential evolution (FSDE). This combination incorporates FSDE into the optimization process of GSA in an attempt to avoid the premature convergence of GSA. The strategy makes full use of the exploration ability of GSA and the exploitation ability of FSDE. IGSA is tested on a suite of benchmark functions. The experimental results demonstrate the good performance of IGSA.
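The sketch below shows one way a GSA-style gravitational update can be interleaved with a DE/rand/1 exploitation step, roughly in the spirit of the GSA-FSDE hybridization described above. It is not the authors' IGSA: G0, the decay rate, F, and CR are assumed values, and the improvement-only DE trial stands in for the FSDE component.

    import numpy as np

    def igsa_like(f, dim=10, pop=20, iters=100, lo=-5.0, hi=5.0, G0=100.0, seed=2):
        # One possible GSA + DE hybrid loop: a gravitational move followed by a
        # DE/rand/1 trial step that keeps only improvements.
        rng = np.random.default_rng(seed)
        X = rng.uniform(lo, hi, size=(pop, dim))
        V = np.zeros((pop, dim))
        fit = np.array([f(x) for x in X])
        F, CR = 0.5, 0.9                                 # assumed DE parameters
        for t in range(iters):
            best, worst = fit.min(), fit.max()
            m = (worst - fit) / (worst - best + 1e-12)   # smaller fitness -> heavier mass
            M = m / (m.sum() + 1e-12)
            G = G0 * np.exp(-20.0 * t / iters)           # gravitational constant decays over time
            A = np.zeros((pop, dim))
            for i in range(pop):
                for j in range(pop):
                    if i != j:
                        diff = X[j] - X[i]
                        dist = np.linalg.norm(diff) + 1e-12
                        A[i] += rng.random() * G * M[j] * diff / dist
            V = rng.random((pop, dim)) * V + A           # stochastic velocity update
            X = np.clip(X + V, lo, hi)
            fit = np.array([f(x) for x in X])
            for i in range(pop):                         # DE-style exploitation step
                idx = [k for k in range(pop) if k != i]
                a, b, c = rng.choice(idx, size=3, replace=False)
                trial = np.where(rng.random(dim) < CR, X[a] + F * (X[b] - X[c]), X[i])
                trial = np.clip(trial, lo, hi)
                ft = f(trial)
                if ft < fit[i]:
                    X[i], fit[i] = trial, ft
        return X[fit.argmin()], float(fit.min())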
Funding: Supported by the National Natural Science Foundation of China (60574075, 60705004)
Abstract: The coordinate descent method is an unconstrained optimization technique. When it is applied to the support vector machine (SVM), at each step the method updates one component of w by solving a one-variable subproblem while fixing the other components; after all components of w have been updated, the next iteration begins. Although the method converges, and converges quickly in the early iterations, its final convergence is slow. To speed up final convergence, the Hooke and Jeeves algorithm, which adds a pattern-search step after every coordinate descent iteration, was applied to SVM, and a global Newton algorithm was used to solve the one-variable subproblems. The convergence of the algorithm is proved. Experimental results show that the Hooke and Jeeves method does accelerate convergence, especially final convergence, and achieves higher testing accuracy more quickly in classification.
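A compact sketch of the overall scheme on a generic smooth objective: a coordinate descent sweep whose one-variable subproblems are solved by a Newton step with backtracking (standing in for the paper's global Newton solver), followed by a Hooke-Jeeves pattern move along the direction accumulated over the sweep. Applying this to the SVM primal objective would follow the same structure; all names and tolerances here are illustrative.

    import numpy as np

    def _bump(w, j, d):
        v = w.copy()
        v[j] += d
        return v

    def _partials(f, w, j, eps=1e-5):
        # Central-difference first and second derivatives along coordinate j.
        fp, f0, fm = f(_bump(w, j, eps)), f(w), f(_bump(w, j, -eps))
        return (fp - fm) / (2 * eps), (fp - 2 * f0 + fm) / eps ** 2

    def cd_with_pattern_move(f, w0, sweeps=100, tol=1e-8):
        # Coordinate descent with a Hooke-Jeeves pattern move after each sweep.
        w = np.asarray(w0, dtype=float).copy()
        for _ in range(sweeps):
            w_prev = w.copy()
            for j in range(w.size):                      # exploratory phase: one CD sweep
                g, h = _partials(f, w, j)
                step = -g / h if h > 0 else -np.sign(g) * 1e-2
                while abs(step) > 1e-12 and f(_bump(w, j, step)) > f(w):
                    step *= 0.5                          # backtrack so f never increases
                w = _bump(w, j, step)
            pattern = 2.0 * w - w_prev                   # pattern move along the sweep direction
            if f(pattern) < f(w):
                w = pattern
            if np.linalg.norm(w - w_prev) < tol:         # final-convergence stopping test
                break
        return w

    # Example: cd_with_pattern_move(lambda w: float(np.sum((w - 1.0) ** 2)), np.zeros(5))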