Funding: supported by the National Natural Science Foundation of China (61074127)
Abstract: Because the solutions of the least squares support vector regression machine (LS-SVRM) are not sparse, prediction is slow, which limits its applications. The defects of the existing adaptive pruning algorithm for LS-SVRM are that training is slow and the generalization performance is unsatisfactory, especially for large-scale problems. Hence, an improved algorithm is proposed. To accelerate training, the pruned data point and the fast leave-one-out error are employed to validate the temporary model obtained after decremental learning. A novel objective function in the termination condition, which involves the whole set of constraints generated by all training data points, and three pruning strategies are employed to improve the generalization performance. The effectiveness of the proposed algorithm is tested on six benchmark datasets. The sparse LS-SVRM model achieves a faster training speed and better generalization performance.
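The non-sparsity this abstract starts from follows directly from the LS-SVR dual formulation: training reduces to one dense linear system in which every training point generically receives a nonzero coefficient. A minimal NumPy sketch of that standard formulation, assuming an RBF kernel and illustrative parameter values (this is the textbook solver, not the paper's pruning algorithm):

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvr_fit(X, y, C=10.0, gamma=1.0):
    """Solve the LS-SVR dual system A z = r, where
    A = [[0, 1^T], [1, K + I/C]], z = [b; alpha], r = [0; y].
    Generically every alpha_i is nonzero, so ALL training points act
    as support vectors -- the non-sparsity the abstract refers to."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, coefficient vector alpha

def lssvr_predict(X_train, b, alpha, X_test, gamma=1.0):
    """Prediction is a kernel expansion over ALL training points."""
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b
```

Because prediction sums over every training point, its cost grows linearly with the training-set size, which is exactly why pruning for sparsity speeds up prediction.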
Funding: Project (50675186) supported by the National Natural Science Foundation of China
Abstract: To overcome the disadvantage that the standard least squares support vector regression (LS-SVR) algorithm is not directly suitable for multiple-input multiple-output (MIMO) system modelling, an improved LS-SVR algorithm, defined as multi-output least squares support vector regression (MLSSVR), was put forward by adding the samples' absolute errors to the objective function, and it was applied to intelligent flatness control. To address the poor precision of the control scheme based on the effective matrix in flatness control, predictive control was introduced into the control system, and the effective matrix-predictive flatness control method was proposed by combining the merits of the two methods. A simulation experiment was conducted on a 900HC reversible cold rolling mill. The performance of the effective matrix method and the effective matrix-predictive control method was compared, and the results demonstrate the validity of the effective matrix-predictive control method.
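To see why the standard algorithm does not handle MIMO modelling directly: the usual dual system has a single right-hand side, one output. A hedged sketch of the simplest workaround, reusing one shared kernel system with a matrix right-hand side, one column per output. The MLSSVR of the abstract additionally couples outputs by penalizing samples' absolute errors in the objective; that coupling is not modelled here, so this is only the naive baseline:

```python
import numpy as np

def mimo_lssvr_fit(K, Y, C=10.0):
    """Naive multi-output LS-SVR: solve the usual dual system once
    with a matrix right-hand side (one column per output channel).
    K is the n x n kernel matrix, Y is n x m targets. Unlike the
    MLSSVR of the abstract, the outputs share nothing but the kernel."""
    n, m = Y.shape
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.vstack([np.zeros((1, m)), Y])
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # biases (m,), coefficients (n, m)
```

One factorization of the (n+1)-square matrix serves all m outputs, so the marginal cost of extra outputs is only an additional back-substitution.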
Funding: supported by the National Natural Science Foundation of China (50576033)
Abstract: Pruning algorithms for sparse least squares support vector regression machines are common and easily comprehensible, but the computational burden in the training phase is heavy because of the retraining performed during the pruning process, which is unfavorable for their applications. To this end, an improved scheme is proposed to accelerate the sparse least squares support vector regression machine. A major advantage of this new scheme is its iterative methodology, which reuses previous training results instead of retraining, and its feasibility is strictly verified theoretically. Finally, experiments on benchmark data sets corroborate a significant saving of training time with the same number of support vectors and the same predictive accuracy as the original pruning algorithms, and the speedup scheme is also extended to classification problems.
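The "previous training results instead of retraining" idea can be illustrated with the standard block-inverse downdate: once inv(A) for the full system is known, the inverse after deleting one training point can be formed in O(n^2) instead of an O(n^3) re-factorization. A generic sketch (the paper's own update rules may differ in detail):

```python
import numpy as np

def downdate_inverse(P, k):
    """Given P = inv(A) for a square matrix A, return the inverse of
    A with row and column k deleted, without re-inverting.
    Uses the Schur-complement identity:
        inv(A_kk-deleted) = P11 - p p^T / P[k, k],
    where P11 is P without row/column k and p is column k of P
    (row k removed). Cost: O(n^2) instead of O(n^3)."""
    idx = np.arange(P.shape[0]) != k
    P11 = P[np.ix_(idx, idx)]
    p = P[idx, k]
    return P11 - np.outer(p, p) / P[k, k]
```

Repeating this downdate at each pruning step is what turns a sequence of expensive retrainings into a sequence of cheap updates.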
Funding: supported by the Science and Technology on Space Intelligent Control Laboratory for National Defense (KGJZDSYS-2018-08)
Abstract: Least squares support vector regression (LSSVR) is a method for function approximation whose solutions are typically non-sparse, which limits its application, especially where fast prediction is required. In this paper, a sparse adaptive pruning LSSVR algorithm based on global representative point ranking (GRPR-AP-LSSVR) is proposed. First, the global representative point ranking (GRPR) algorithm is given, and a data analysis experiment is carried out that depicts the importance ranking of data points. Furthermore, a pruning strategy that removes two samples per decremental learning step is designed to accelerate training and ensure sparsity. The removed data points are used to test the temporary learning model, which preserves the regression accuracy. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets, and the experimental results indicate that, compared with several benchmark algorithms, GRPR-AP-LSSVR achieves excellent sparsity and prediction speed without impairing the generalization performance.
Abstract: Based on an analysis of the filtering characteristics of least squares support vector regression (LS-SVR), a method for constructing LS-SVR convolution templates for image filtering is given, which removes the need to solve the LS-SVR problem at application time. On this basis, a switching filter for salt-and-pepper noise based on LS-SVR is proposed. The algorithm uses a maximum-minimum operator as the salt-and-pepper noise detector, takes the non-noise pixels within the filtering window as the LS-SVR input data, and applies the precomputed LS-SVR filtering operator to the window through a simple convolution, effectively restoring the pixels corrupted by salt-and-pepper noise. Experiments show that the proposed method preserves image detail well and has strong noise-removal capability.
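The switching structure can be sketched in a few lines: an extreme-value detector flags candidate salt-and-pepper pixels, and only flagged pixels are reconstructed from the unflagged neighbours in the window. In this hedged sketch the reconstruction is a plain median of the clean neighbours; the paper instead applies its precomputed LS-SVR convolution template, which is not reproduced here:

```python
import numpy as np

def denoise_salt_pepper(img):
    """Switching filter sketch for 8-bit grayscale images.
    Detector: a pixel at an intensity extreme (0 or 255) is treated
    as salt/pepper noise. Reconstruction: median of the non-extreme
    pixels in the 3x3 window (stand-in for the LS-SVR operator).
    Clean pixels pass through unchanged, which preserves detail."""
    H, W = img.shape
    pad = np.pad(img, 1, mode='edge')
    out = img.astype(float)
    for i in range(H):
        for j in range(W):
            if img[i, j] in (0, 255):            # max-min style detector
                win = pad[i:i + 3, j:j + 3]
                clean = win[(win > 0) & (win < 255)]
                if clean.size:                   # restore from clean neighbours
                    out[i, j] = np.median(clean)
    return out
```

The "switching" aspect is the key to detail preservation: unlike a global median filter, uncorrupted pixels are never modified.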
Abstract: The least squares support vector regression (LS-SVR) algorithm is widely used in many fields because of its high goodness of fit. This work studies the visual deviation between an image and the target object caused by the different colour values an object exhibits under different light sources. Taking a Pantone colour chart as the reference, the LS-SVR algorithm is combined with a conversion model from the RGB colour space to the sRGB colour space to correct the test images. Experimental results show that, compared with polynomial regression, the LS-SVR algorithm achieves a smaller colour difference, and the corrected image is closer to the target image.
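For context on the comparison in this abstract, the polynomial/linear baseline amounts to fitting a colour-correction matrix by ordinary least squares against the chart's reference values. A hedged sketch of that baseline (an affine 3x4 map; the paper's LS-SVR model is kernelized and is not reproduced here):

```python
import numpy as np

def fit_color_correction(measured, reference):
    """Fit an affine colour-correction matrix by least squares:
    measured RGB (n x 3) -> reference RGB (n x 3), with the colour
    chart supplying the (measured, reference) pairs. Returns a
    4 x 3 matrix (3x3 linear part plus offset row)."""
    X = np.hstack([measured, np.ones((len(measured), 1))])  # affine term
    M, *_ = np.linalg.lstsq(X, reference, rcond=None)
    return M

def apply_correction(M, rgb):
    """Apply the fitted correction to an n x 3 block of RGB values."""
    X = np.hstack([rgb, np.ones((len(rgb), 1))])
    return X @ M
```

Higher-order polynomial terms (e.g. r*g, r**2) can be appended as extra columns of X; the kernelized LS-SVR generalizes this further, which is why it can reach a smaller colour difference.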
Abstract: To address the limited positioning accuracy of least squares support vector regression (LS-SVR), a hybrid localization algorithm based on LS-SVR is proposed that makes full use of the corrective effect of the distance information between unknown nodes during localization. The LS-SVR algorithm provides the initial value, which improves the convergence speed of the multivariate Taylor series expansion method; the Taylor series expansion method in turn fully exploits the distance information between unknown nodes to reduce the localization error caused by ranging errors. Simulation results show that, compared with the traditional LS-SVR localization algorithm, the hybrid algorithm achieves higher accuracy and is less sensitive to the choice of the regularization and kernel parameters.
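The Taylor-series refinement step is a Gauss-Newton iteration on the range equations: linearize the distances around the current estimate, solve the linearized least-squares problem, and repeat. A minimal 2-D sketch, assuming exact range measurements and an externally supplied initial guess (in the abstract the guess comes from LS-SVR):

```python
import numpy as np

def taylor_refine(anchors, dists, x0, iters=10):
    """Gauss-Newton / Taylor-series position refinement in 2-D.
    anchors: n x 2 known positions; dists: n measured ranges;
    x0: initial estimate (must not coincide with an anchor).
    Each step linearizes r_i(x) = ||x - a_i|| around x and solves
    J dx = dists - r for the correction dx."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - anchors                      # (n, 2) offsets to anchors
        r = np.linalg.norm(diff, axis=1)        # predicted ranges
        J = diff / r[:, None]                   # Jacobian of the ranges
        dx, *_ = np.linalg.lstsq(J, dists - r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-9:           # converged
            break
    return x
```

A good initial value matters because Gauss-Newton converges only locally; this is precisely the role the LS-SVR estimate plays in the hybrid algorithm.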