Funding: Supported by the National Natural Science Foundation of China (12061048) and the NSF of Jiangxi Province (20232BAB201026, 20232BAB201018).
Abstract: In this paper, a new technique is introduced for constructing higher-order iterative methods for solving nonlinear systems. The order of convergence of some iterative methods can be improved by three at the cost of only one additional function evaluation per step. Furthermore, some new efficient methods of higher order of convergence are obtained using only a single matrix inversion per iteration. Analyses of the convergence properties and computational efficiency of the new methods are presented and verified on several numerical problems. The comparisons show that the new schemes are more efficient than the corresponding existing ones, particularly for large problem sizes.
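As a hedged illustration of the general idea behind such schemes (not the paper's specific construction), the sketch below implements the classical two-step Newton-type method with a frozen Jacobian: one LU factorization per iteration serves both correction steps, so a single extra function evaluation raises the local convergence order from two to three. The test system and all names here are hypothetical.

```python
# Minimal sketch, assuming a frozen-Jacobian two-step Newton method;
# this is NOT the paper's scheme, only the generic idea it builds on.
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def F(x):
    # Hypothetical test system: x0^2 + x1^2 = 1, x0 = x1
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def J(x):
    # Analytic Jacobian of F
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0,        -1.0      ]])

def frozen_jacobian_newton(x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        lu_piv = lu_factor(J(x))             # single factorization per iteration
        y = x - lu_solve(lu_piv, F(x))       # Newton predictor step
        x_new = y - lu_solve(lu_piv, F(y))   # corrector reuses the same factorization
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(frozen_jacobian_newton(np.array([0.8, 0.3])))  # approaches (1/sqrt(2), 1/sqrt(2))
```

Reusing the factorization is what makes multistep schemes of this kind attractive for large systems: each extra substep costs one function evaluation and two triangular solves rather than a fresh O(n^3) factorization.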
Abstract: A Newton-based learning algorithm for neural networks (NN) is presented and implemented. In theory, a learning algorithm based on Newton's method must converge faster than backpropagation (BP) and other gradient-based learning algorithms, because the gradient method converges only linearly while Newton's method has a second-order convergence rate. A fast algorithm for computing the Hessian matrix of the NN cost function is proposed, which forms the theoretical basis for the improved Newton learning algorithm. Simulation results show that the Newton learning algorithm converges markedly faster than the traditional BP method, and its robustness is also better than that of BP.
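As a hedged illustration (not the paper's fast Hessian algorithm), the sketch below applies Newton's update to a single sigmoid neuron with a mean-squared-error cost. The Hessian is formed by finite differences of the gradient purely for demonstration, and all data, names, and parameters are hypothetical.

```python
# Minimal sketch, assuming a one-neuron network with MSE cost; the
# finite-difference Hessian stands in for the paper's fast algorithm.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                  # hypothetical toy inputs
w_true = np.array([1.0, -2.0, 0.5])
t = 1.0 / (1.0 + np.exp(-X @ w_true))          # hypothetical toy targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(w):
    # Gradient of E(w) = 0.5 * mean((sigmoid(Xw) - t)^2)
    y = sigmoid(X @ w)
    return X.T @ ((y - t) * y * (1.0 - y)) / len(t)

def hessian_fd(w, eps=1e-6):
    # Central finite differences of the gradient (illustrative only)
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        H[:, i] = (grad(w + e) - grad(w - e)) / (2.0 * eps)
    return 0.5 * (H + H.T)                     # symmetrize

# Plain Newton is only locally convergent, so start near the solution.
w = np.array([0.8, -1.5, 0.3])
for _ in range(20):
    H = hessian_fd(w) + 1e-8 * np.eye(3)       # small damping keeps H invertible
    w = w - np.linalg.solve(H, grad(w))        # Newton step
print(w)                                       # should approach w_true
```

Near a minimum the quadratic model is accurate and each Newton step roughly squares the error, which is the second-order convergence the abstract contrasts with BP's linear rate; the practical cost is forming and solving with the Hessian, which is why a fast Hessian computation matters.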