Building on the recursive least squares (RLS) method, a lattice recursive least squares (LRLS) algorithm is proposed for adaptive estimation of AR model parameters. The algorithm has a modular, multistage lattice structure, which reduces the computational complexity of the standard RLS algorithm. An adaptive AR forecasting model is built from measured dynamic data combined with the AIC criterion, and the model is applied to ship motion prediction. Simulation results show that, compared with the least squares algorithm, the AR forecasting model based on the LRLS algorithm effectively improves the accuracy of ship motion prediction.
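The pipeline this abstract describes (fit an AR model to measured data, select the model order with AIC, then predict ahead) can be sketched with an ordinary batch least-squares fit. This is a simplified illustration only: the lattice RLS recursion itself is not reproduced, and the function names and AIC form below are assumptions, not the authors' implementation.

```python
import numpy as np

def fit_ar_ls(x, p):
    """Least-squares fit of AR(p): x[t] = a_1*x[t-1] + ... + a_p*x[t-p] + e[t]."""
    N = len(x)
    # Column k holds the lag-(k+1) regressor x[t-k-1] for t = p..N-1.
    X = np.column_stack([x[p - k - 1: N - k - 1] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.var(y - X @ a)  # residual variance
    return a, sigma2

def aic_order(x, max_p):
    """Pick the AR order minimizing AIC = N*log(sigma^2) + 2p."""
    N = len(x)
    return min(range(1, max_p + 1),
               key=lambda p: N * np.log(fit_ar_ls(x, p)[1]) + 2 * p)

def predict_one_step(x, a):
    """One-step-ahead forecast from the most recent p samples."""
    p = len(a)
    return float(np.dot(a, x[-1:-p - 1:-1]))
```

In practice the lattice form updates reflection coefficients recursively per sample, whereas this batch fit re-solves the normal equations; the batch version is easier to read but scales worse for long records.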
Under additive white Gaussian noise, for cubic polynomial phase signals (CPS), the dictionaries learned by classical dictionary learning algorithms such as K-means Singular Value Decomposition (K-SVD), the recursive least squares dictionary learning algorithm (RLS-DLA), and K-means Singular Value Decomposition Denoising (K-SVDD) cannot effectively remove signal noise through sparse decomposition. This paper therefore proposes a dictionary learning algorithm for CPS denoising. The algorithm first learns the dictionary with RLS-DLA; it then modifies the dictionary-update step of that algorithm using nonlinear least squares (NLLS); finally, the reconstructed signal is obtained from the sparse representation of the signal over the trained dictionary. Compared with other dictionary learning algorithms, the proposed algorithm achieves a markedly higher signal-to-noise ratio (SNR) and a markedly lower mean squared error (MSE), showing a clear denoising effect. Experimental results show that, with the dictionary obtained by this algorithm, the average SNR of the sparsely decomposed signal is 9.55 dB, 13.94 dB, and 9.76 dB higher than with K-SVD, RLS-DLA, and K-SVDD, respectively.
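The final stage of the pipeline above, reconstructing a signal from its sparse representation over a dictionary, can be illustrated with a minimal Orthogonal Matching Pursuit (OMP) sparse coder. This sketch covers only the sparse-coding/reconstruction step under a given dictionary; the RLS-DLA training and the NLLS dictionary update described in the abstract are not reproduced, and the toy dictionary in the test is an assumption.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily select k atoms of D to represent y.

    D : (m, n) dictionary with (approximately) unit-norm columns.
    y : (m,) signal to code.
    k : sparsity level (number of atoms to select).
    Returns the sparse coefficient vector x with D @ x ~= y.
    """
    resid = y.astype(float).copy()
    idx = []
    coef = np.zeros(0)
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ resid)))
        idx.append(j)
        # Re-fit coefficients of all selected atoms by least squares.
        sub = D[:, idx]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        resid = y - sub @ coef
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x
```

Denoising then amounts to coding the noisy signal with a small k and taking `D @ x` as the reconstruction: the sparse model keeps the structured signal components and discards most of the broadband noise.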
In this paper, we present a least squares version of support vector machine (SVM) classifiers and function estimation. Due to equality-type constraints in the formulation, the solution follows from solving a set of linear equations, instead of the quadratic programming required for classical SVMs. The approach is illustrated on a two-spiral benchmark classification problem. The results show that the LS-SVM is an efficient method for solving pattern recognition problems.
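The linear-system formulation mentioned in this abstract can be sketched as follows: with equality constraints, the LS-SVM dual reduces to a single (n+1)-by-(n+1) linear solve for the bias b and the support values alpha. The RBF kernel and the values of gamma and sigma below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM classifier by solving one linear system.

    The KKT conditions of the equality-constrained problem give:
        [ 0      1^T        ] [  b  ]   [ 0 ]
        [ 1   K + I/gamma   ] [alpha] = [ y ]
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    # Decision value: sum_i alpha_i * K(x_i, x) + b; classify by its sign.
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

The contrast with a classical SVM is that every training point gets a (possibly nonzero) alpha and no QP solver is needed; the price is the loss of sparseness in the support values.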