Funding: Supported by the National Natural Science Foundation of China (61074127)
Abstract: Because the solutions of the least squares support vector regression machine (LS-SVRM) are not sparse, prediction is slow, which limits its applications. The existing adaptive pruning algorithm for LS-SVRM suffers from slow training and unsatisfactory generalization performance, especially on large-scale problems. Hence, an improved algorithm is proposed. To accelerate training, the pruned data point and a fast leave-one-out error are employed to validate the temporary model obtained after decremental learning. To improve generalization, a novel objective function in the termination condition, which involves the constraints generated by all training data points, is combined with three pruning strategies. The effectiveness of the proposed algorithm is tested on six benchmark datasets; the resulting sparse LS-SVRM model trains faster and generalizes better.
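The fast leave-one-out error mentioned above has a well-known closed form for LS-SVM-type models: once the dual linear system has been solved, the leave-one-out residual of sample i can be read off as alpha_i divided by the corresponding diagonal entry of the inverse system matrix, without any retraining. The sketch below illustrates this for a plain RBF-kernel LS-SVR; the kernel width, regularization constant, and function names are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample arrays X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_lssvr(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVR dual system [[K + I/gamma, 1], [1^T, 0]] [alpha; b] = [y; 0]
    and return (alpha, b, H_inv) so later steps can reuse the inverse."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    H = np.zeros((n + 1, n + 1))
    H[:n, :n] = K + np.eye(n) / gamma
    H[:n, n] = 1.0
    H[n, :n] = 1.0
    H_inv = np.linalg.inv(H)
    sol = H_inv @ np.concatenate([y, [0.0]])
    return sol[:n], sol[n], H_inv

def fast_loo_residuals(alpha, H_inv):
    """Closed-form leave-one-out residuals e_i = alpha_i / [H^{-1}]_{ii},
    avoiding n retrainings (PRESS-style formula for LS-SVM models)."""
    n = len(alpha)
    return alpha / np.diag(H_inv)[:n]

# toy usage
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
alpha, b, H_inv = train_lssvr(X, y)
print("mean |LOO residual|:", np.abs(fast_loo_residuals(alpha, H_inv)).mean())
```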
Funding: Project (50675186) supported by the National Natural Science Foundation of China
Abstract: To overcome the drawback that the standard least squares support vector regression (LS-SVR) algorithm cannot model multiple-input multiple-output (MIMO) systems directly, an improved LS-SVR algorithm, termed multi-output least squares support vector regression (MLSSVR), was developed by adding the samples' absolute errors to the objective function and was applied to intelligent flatness control. To address the poor precision of the effective-matrix-based flatness control scheme, predictive control was introduced into the control system, and an effective matrix-predictive flatness control method combining the merits of both approaches was proposed. Simulation experiments were conducted on a 900HC reversible cold rolling mill. The effective matrix method and the effective matrix-predictive control method were compared, and the results demonstrate the validity of the latter.
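As a point of reference for what a multi-output LS-SVR means computationally, the sketch below fits one shared kernel matrix and one (alpha, b) pair per output column by solving a single linear system with a multi-column right-hand side. This is only a baseline under assumed hyperparameters; the MLSSVR of this paper additionally modifies the objective with the samples' absolute errors, which is not reproduced here.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

class MultiOutputLSSVR:
    """Baseline multi-output LS-SVR: one shared kernel matrix, one
    (alpha, b) pair per output column.  A reference point only; the
    paper's MLSSVR couples the outputs through a modified objective."""

    def fit(self, X, Y, gamma=10.0, sigma=1.0):
        n, m = Y.shape
        self.X, self.sigma = X, sigma
        K = rbf_kernel(X, X, sigma)
        H = np.block([[K + np.eye(n) / gamma, np.ones((n, 1))],
                      [np.ones((1, n)),       np.zeros((1, 1))]])
        rhs = np.vstack([Y, np.zeros((1, m))])   # solve all outputs at once
        sol = np.linalg.solve(H, rhs)
        self.alpha, self.b = sol[:n, :], sol[n, :]
        return self

    def predict(self, Xq):
        Kq = rbf_kernel(Xq, self.X, self.sigma)
        return Kq @ self.alpha + self.b          # (n_query, n_outputs)

# toy MIMO usage: 3 inputs -> 2 outputs
rng = np.random.default_rng(1)
X = rng.standard_normal((80, 3))
Y = np.column_stack([X[:, 0] * X[:, 1], np.sin(X[:, 2])]) + 0.05 * rng.standard_normal((80, 2))
model = MultiOutputLSSVR().fit(X, Y)
print(model.predict(X[:5]).shape)
```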
Funding: Supported by the National Natural Science Foundation of China (50576033)
Abstract: Pruning algorithms are common and easily comprehensible methods for obtaining a sparse least squares support vector regression machine, but the computational burden of the training phase is heavy because the pruning process requires repeated retraining, which hinders their application. To this end, an improved scheme is proposed to accelerate the sparse least squares support vector regression machine. The key advantage of the new scheme is its iterative methodology, which reuses previous training results instead of retraining; its feasibility is rigorously verified in theory. Finally, experiments on benchmark data sets confirm a significant saving in training time with the same number of support vectors and the same predictive accuracy as the original pruning algorithms, and the speedup scheme is also extended to classification problems.
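Reusing previous training results during pruning typically comes down to updating the inverse of the LS-SVR system matrix when a sample's row and column are deleted, rather than refactorizing. The sketch below shows the generic block-inverse downdate that turns this into an O(n^2) operation; it illustrates the idea, not the paper's exact iterative scheme.

```python
import numpy as np

def downdate_inverse(H_inv, k):
    """Given H_inv = inv(H), return inv(H with row/column k removed) using the
    block-inverse identity, in O(n^2) instead of a fresh O(n^3) factorization.
    This kind of reuse lets pruning avoid retraining from scratch."""
    idx = np.delete(np.arange(H_inv.shape[0]), k)
    M = H_inv[np.ix_(idx, idx)]
    u = H_inv[idx, k]
    return M - np.outer(u, H_inv[k, idx]) / H_inv[k, k]

# sanity check against direct inversion
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 6))
H = A @ A.T + 6 * np.eye(6)          # well-conditioned symmetric matrix
H_inv = np.linalg.inv(H)
k = 2
direct = np.linalg.inv(np.delete(np.delete(H, k, 0), k, 1))
print(np.allclose(downdate_inverse(H_inv, k), direct))   # True
```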
Funding: Supported by the National Natural Science Foundation of China (61172127) and the Natural Science Foundation of Anhui Province (1408085MF121)
Abstract: Removing cloud cover from satellite remote sensing images can effectively improve their usability. For thin cloud cover, the support vector value contourlet transform is used to perform a multi-scale decomposition of the thin-cloud regions; enhancing the high-frequency coefficients and suppressing the low-frequency coefficients removes the thin cloud. For thick cloud cover, if the thick-cloud regions in multi-source or multi-temporal remote sensing images do not overlap, a multi-output support vector regression learning method is used to remove them. If the thick-cloud regions do overlap, the surface features under the overlapped regions are predicted by multi-output learning from the surrounding areas, and the thick cloud is removed accordingly. Experimental results show that the proposed method effectively handles cloud overlap and the radiation differences among multi-source images, and the resulting cloud-free image is clear and smooth.
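To make the thick-cloud case concrete, the sketch below trains a multi-output support vector regression on cloud-free pixels, mapping a co-registered auxiliary image to the target image's bands, and then predicts the bands under the cloud mask. The image shapes, the choice of scikit-learn's SVR wrapped in MultiOutputRegressor, and all hyperparameters are illustrative assumptions; the paper's support vector value contourlet transform and its exact multi-output learner are not reproduced.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor

def fill_thick_cloud(reference_img, target_img, cloud_mask, max_train=2000, seed=0):
    """Illustrative pixel-wise infill: learn a multi-output regression from a
    co-registered cloud-free reference image (e.g. another date or sensor) to the
    target image's bands on cloud-free pixels, then predict under the cloud mask.
    Shapes: (H, W, B) images, (H, W) boolean mask."""
    H, W, B = target_img.shape
    X = reference_img.reshape(-1, reference_img.shape[-1])
    Y = target_img.reshape(-1, B)
    m = cloud_mask.reshape(-1)
    clear = np.flatnonzero(~m)
    rng = np.random.default_rng(seed)
    train = rng.choice(clear, size=min(max_train, clear.size), replace=False)
    model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0)).fit(X[train], Y[train])
    filled = target_img.copy().reshape(-1, B)
    filled[m] = model.predict(X[m])
    return filled.reshape(H, W, B)

# tiny synthetic example
rng = np.random.default_rng(3)
ref = rng.random((32, 32, 3))
tgt = 0.8 * ref + 0.1                      # target correlated with reference
mask = np.zeros((32, 32), dtype=bool); mask[10:20, 10:20] = True
print(np.abs(fill_thick_cloud(ref, tgt, mask) - tgt)[mask].mean())
```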
Funding: Supported by the Science and Technology on Space Intelligent Control Laboratory for National Defense (KGJZDSYS-2018-08)
Abstract: Least squares support vector regression (LSSVR) is a function approximation method whose solutions are typically non-sparse, which limits its application, especially where fast prediction is required. In this paper, a sparse adaptive pruning LSSVR algorithm based on global representative point ranking (GRPR-AP-LSSVR) is proposed. First, the global representative point ranking (GRPR) algorithm is presented, and data analysis experiments illustrate the resulting importance ranking of data points. Furthermore, a pruning strategy that removes two samples per decremental learning step is designed to accelerate training and ensure sparsity, and the removed data points are used to test the temporary learning model so that regression accuracy is maintained. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets; experimental results indicate that, compared with several benchmark algorithms, GRPR-AP-LSSVR achieves excellent sparsity and prediction speed without impairing generalization performance.
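The decremental loop described above, removing two samples at a time and validating the temporary model on exactly the points just removed, can be sketched as follows. Kernel ridge regression stands in for the LS-SVR model, the importance score is a hypothetical placeholder for GRPR, and the termination budget is an assumed constant; only the control flow mirrors the paper.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge   # stand-in for an LS-SVR-style model
from sklearn.metrics import mean_squared_error

def prune_two_at_a_time(X, y, importance, target_size, rmse_budget=0.05):
    """Walk up a precomputed importance ranking, drop the two least important
    remaining samples per step, refit a temporary model, and validate it on the
    samples just removed; stop if their error exceeds a budget.  The ranking
    itself (GRPR) and the exact termination rule are not reproduced here."""
    order = np.argsort(importance)            # least important first
    keep = np.ones(len(y), dtype=bool)
    model = KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0).fit(X, y)
    for i in range(0, len(order) - target_size, 2):
        drop = order[i:i + 2]
        keep[drop] = False
        candidate = KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0).fit(X[keep], y[keep])
        rmse_on_removed = mean_squared_error(y[drop], candidate.predict(X[drop])) ** 0.5
        if rmse_on_removed > rmse_budget:     # temporary model got too inaccurate
            keep[drop] = True                 # undo the last removal and stop
            break
        model = candidate
    return model, keep

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, (200, 1))
y = np.sinc(X[:, 0]) + 0.02 * rng.standard_normal(200)
importance = np.abs(y - y.mean())             # hypothetical stand-in ranking score
model, keep = prune_two_at_a_time(X, y, importance, target_size=40)
print("support vectors kept:", keep.sum())
```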
Abstract: Power flow calculation and its sensitivity analysis are the foundation of steady-state analysis and control of power systems. Traditional model-driven power flow calculation, which assumes complete knowledge of the grid topology and model parameters, constructs nonlinear nodal power equations and solves them iteratively, while sensitivities are obtained by inverting the power flow Jacobian matrix. The accuracy of the model and its parameters and the time cost of the iterative solution are key factors affecting the precision and speed of power flow calculation. This paper proposes a data-driven method for nonlinear power flow regression and analytic sensitivity calculation, enabling fast power flow computation and analysis without relying on a physical grid model. First, using power flow measurement data, an explicit power flow regression model is built based on an improved multi-output least-squares support vector regression (MLSSVR). Second, an online MLSSVR learning method based on fast recursive matrix inversion is proposed to enhance adaptability to changes in grid operating scenarios. Finally, a Taylor expansion of the power flow regression model yields an analytic method for computing power flow sensitivities. Simulations on several IEEE standard test systems and an actual provincial power grid verify that the proposed method effectively obtains highly accurate power flow solutions and their sensitivities.
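The last step, obtaining sensitivities by a Taylor expansion of the learned regression model, amounts to differentiating the kernel expansion analytically: for an RBF kernel, d k(x, x_i)/dx = -(x - x_i)/sigma^2 * k(x, x_i), so the first-order sensitivity (Jacobian) at an operating point is available in closed form. The sketch below shows this for a kernel ridge-style multi-output fit standing in for MLSSVR; the data, hyperparameters, and function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_kernel_regressor(X, Y, gamma=100.0, sigma=1.0):
    """Multi-output kernel ridge-style fit, standing in for the MLSSVR
    power-flow regression model (the paper's formulation is not reproduced)."""
    K = rbf_kernel(X, X, sigma)
    A = np.linalg.solve(K + np.eye(len(X)) / gamma, Y)   # dual coefficients, one column per output
    return A, X, sigma

def analytic_sensitivity(x, A, Xtr, sigma):
    """First-order sensitivity d y_hat / d x at operating point x, obtained by
    differentiating the kernel expansion (the Taylor expansion of the model)."""
    k = rbf_kernel(x[None, :], Xtr, sigma)[0]               # (n_train,)
    dk_dx = -(x[None, :] - Xtr) / sigma ** 2 * k[:, None]   # (n_train, n_inputs)
    return dk_dx.T @ A                                      # (n_inputs, n_outputs) Jacobian

# toy check: injections -> "flows" with a known linear part
rng = np.random.default_rng(5)
X = rng.standard_normal((300, 4))                           # e.g. nodal power injections
W = rng.standard_normal((4, 2))
Y = X @ W + 0.1 * np.tanh(X[:, :2])                         # mildly nonlinear mapping
A, Xtr, sigma = fit_kernel_regressor(X, Y, sigma=2.0)
J = analytic_sensitivity(np.zeros(4), A, Xtr, sigma)
print(J.shape)                                              # (4, 2) sensitivity matrix
```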
Abstract: To address the modeling errors of the on-board self-tuning model in aero-engine fault diagnosis, a new mechanism for compensating the model outputs is proposed. A discrete series predictor based on multi-output least squares support vector regression (LSSVR) is applied to compensate the on-board self-tuning model, and particle swarm optimization (PSO) is used to select the kernel parameters of the multi-output LSSVR. The method does not require rebuilding the aero-engine model to account for individual differences among engines of the same type or for engine degradation after use. The concrete steps for applying the method are given, and simulation results show the effectiveness of the algorithm.
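For the kernel selection step, a plain global-best PSO over the kernel hyperparameters, scored by cross-validated error of a multi-output kernel model, looks roughly as follows. KernelRidge stands in for the multi-output LSSVR, and the swarm settings, search bounds, and synthetic data are assumptions rather than the paper's configuration.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

def cv_loss(params, X, Y):
    """Cross-validated MSE of a multi-output kernel model for given
    (log10 alpha, log10 gamma); KernelRidge stands in for multi-output LSSVR."""
    alpha, gamma = 10.0 ** params[0], 10.0 ** params[1]
    model = KernelRidge(kernel="rbf", alpha=alpha, gamma=gamma)
    return -cross_val_score(model, X, Y, cv=3,
                            scoring="neg_mean_squared_error").mean()

def pso(loss, bounds, n_particles=12, n_iter=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization over a box
    (a generic PSO, not the paper's specific variant or settings)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pos = rng.uniform(lo, hi, (n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

rng = np.random.default_rng(6)
X = rng.standard_normal((150, 5))                 # e.g. measurable engine parameters
Y = np.column_stack([np.sin(X[:, 0]), X[:, 1] * X[:, 2]]) + 0.05 * rng.standard_normal((150, 2))
best, val = pso(lambda p: cv_loss(p, X, Y), bounds=np.array([[-4.0, 1.0], [-3.0, 1.0]]))
print("best log10(alpha), log10(gamma):", best, "cv mse:", val)
```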