Journal articles: 5,811 results found
1. A simulation model for estimating train and passenger delays in large-scale rail transit networks (Cited by 5)
Authors: 江志彬, 李锋, 徐瑞华, 高鹏. Journal of Central South University, SCIE EI CAS, 2012, Issue 12, pp. 3603-3613 (11 pages)
A simulation model was proposed to investigate the relationship between train delays and passenger delays and to predict the dynamic passenger distribution in a large-scale rail transit network. The time-varying origin-destination demand and passenger path-choice probabilities were assumed to be given, and passengers were assumed not to change their destinations or travel paths after a delay occurs. Train capacity constraints and the queueing rules for alighting and boarding were taken into account. Using time-driven simulation, the states of passengers, trains and other facilities in the network were updated at every time step. The proposed methodology was tested on a real network for demonstration. The results reveal that a short train delay does not necessarily cause passenger delays; on the contrary, some passengers may even benefit from it. A large initial train delay, however, may cause not only knock-on train and passenger delays along the same line but also passenger delays across the entire rail transit network.
Keywords: delay simulation; passenger delay; train delay; rail transit network; timetable
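The abstract above centres on a time-driven update loop: every time step, passenger demand accumulates and train states are refreshed subject to operating rules. The sketch below illustrates only that loop pattern under heavily simplified assumptions (a single reference queue, a fixed minimum-gap rule, invented constants); it is not the authors' model and omits their boarding, alighting and path-choice logic.

```python
import random

# Minimal time-driven delay-propagation loop (illustrative only, not the paper's model).
# Each step: passenger demand accumulates, and each train either advances or is held,
# by its own initial delay or by a minimum-gap rule behind the train ahead,
# so knock-on delays build up behind a delayed leader.
STEP = 10          # s per simulation step (assumed)
SPEED = 15.0       # m/s nominal train speed (assumed)
MIN_GAP = 1800.0   # m, minimum spacing between successive trains (assumed)

def simulate(horizon=3600, n_trains=5, initial_hold=300):
    waiting = 0                                        # queued passengers at a reference station
    x = [-2000.0 * i for i in range(n_trains)]         # train positions, 2 km apart
    hold = [initial_hold] + [0] * (n_trains - 1)       # lead train starts delayed
    delay = [0] * n_trains
    for _ in range(0, horizon, STEP):
        waiting += random.randint(0, 4)                # time-varying passenger arrivals
        for i in range(n_trains):
            blocked = i > 0 and x[i - 1] - x[i] < MIN_GAP
            if hold[i] > 0 or blocked:
                hold[i] = max(0, hold[i] - STEP)
                delay[i] += STEP                       # delay accumulates while held
            else:
                x[i] += SPEED * STEP
    return delay, waiting

print(simulate())   # trains close behind the delayed leader pick up knock-on delay
```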
2. Design for the simulation of space based information network (Cited by 1)
Authors: Zeng Bin, Li Zitang, Wang Wei. Journal of Systems Engineering and Electronics, SCIE EI CSCD, 2006, Issue 2, pp. 443-449 (7 pages)
Ongoing research is described that focuses on modelling the space-based information network and simulating its behaviour: the simulation of space-based communications and networking project. Its objective is to demonstrate the feasibility of producing a tool that can evaluate the performance of various constellation access techniques and routing policies. The architecture and design of the simulation system are explored, and the data-routing and instrument-scheduling algorithms used in the project are described. The key methodologies for simulating inter-satellite link features during data transmission are also discussed. The performance of both the instrument-scheduling algorithm and the routing schemes is evaluated and analysed through extensive simulations under a typical scenario.
Keywords: space based information network; network simulation; inter-satellite link; routing; scheduling; simulation
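As a rough illustration of the kind of evaluation such a tool performs, the sketch below runs a plain Dijkstra search over one assumed topology snapshot with propagation-delay link costs; the project's actual constellation model, access techniques and routing policies are not reproduced.

```python
import heapq

# Shortest-delay routing on a single topology snapshot of a toy space network.
# The graph, node names and millisecond link costs are assumptions; Dijkstra
# stands in for whatever routing policy is being evaluated.
def dijkstra(graph, src):
    dist, heap = {src: 0.0}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

snapshot = {"SAT1": [("SAT2", 14.0), ("SAT3", 20.0)],   # inter-satellite links (ms)
            "SAT2": [("SAT4", 14.0)],
            "SAT3": [("SAT4", 9.0)],
            "SAT4": [("GS1", 5.0)]}                     # downlink to a ground station
print(dijkstra(snapshot, "SAT1"))                       # end-to-end delay per reachable node
```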
3. Hardware-in-loop simulation on hydrostatic thrust bearing worktable pose (Cited by 1)
Authors: 韩桂华, 邵俊鹏, 秦柏, 董玉红. Journal of Central South University, SCIE EI CAS, 2008, Issue S2, pp. 250-256 (7 pages)
A controllable hydrostatic thrust bearing was presented to improve rigidity. The worktable pose was controlled by coupling the oil-film thicknesses of four controllable chambers. The chamber flow can be regulated by an electro-hydraulic servo valve-controlled variable pump according to the surface roughness, load, cutting force and thermal effects of the worktable. Mathematical models of the controllable chamber flow, the servo variable mechanism and the controller were built. A pose-control model was established, comprising the forward and inverse kinematic solutions and a control strategy of feedforward combined with hydraulic-cylinder position feedback. A hardware-in-loop simulation experiment was carried out on an electro-hydraulic servo test bench using the nonlinear relation between film thickness and hydraulic-cylinder displacement. The results show that the controllable bearing exhibits high oil-film rigidity, with a rise time of 0.24 s and a maximum overshoot of 2.23%, and can be applied in high-precision heavy machine tools.
Keywords: hydrostatic thrust bearing; hardware-in-loop simulation; worktable pose; controllable chamber
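For readers unfamiliar with the two metrics quoted above, the toy loop below steps a feedforward-plus-position-feedback law against an arbitrary second-order plant and reads off rise time and overshoot from the step response. The gains, plant constants and sampling step are invented; this is not the paper's electro-hydraulic model and will not reproduce the reported 0.24 s / 2.23% figures.

```python
import numpy as np

# Toy discrete-time closed loop: feedforward of the reference plus proportional
# position feedback driving an assumed second-order plant, then step-response
# metrics are extracted. Purely illustrative, not the electro-hydraulic system.
dt, wn, zeta, kp = 1e-3, 30.0, 0.4, 2.0
x, v, ref, trace = 0.0, 0.0, 1.0, []
for _ in range(3000):                              # 3 s of simulated time
    u = ref + kp * (ref - x)                       # feedforward + position feedback
    a = wn ** 2 * (u - x) - 2.0 * zeta * wn * v    # plant acceleration
    v += a * dt
    x += v * dt
    trace.append(x)

trace = np.array(trace)
final = trace[-1]
overshoot = 100.0 * (trace.max() - final) / final
rise_time = dt * int(np.argmax(trace >= 0.9 * final))
print(f"rise time ~ {rise_time:.3f} s, overshoot ~ {overshoot:.1f}%")
```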
4. 3-D fracture network dynamic simulation based on error analysis in rock mass of dam foundation (Cited by 5)
Authors: ZHONG Deng-hua, WU Han, WU Bin-ping, ZHANG Yi-chi, YUE Pan. Journal of Central South University, SCIE EI CAS CSCD, 2018, Issue 4, pp. 919-935 (17 pages)
An accurate 3-D fracture network model of the rock mass in a dam foundation is of vital importance for stability, grouting and seepage analysis. To reduce the deviation between the fracture network model and measured data, a 3-D fracture network dynamic modeling method based on error analysis was proposed. Firstly, the errors of four fracture volume density estimation methods (proposed by ODA, KULATILAKE, MAULDON, and SONG) and of four fracture size estimation methods (proposed by EINSTEIN, SONG and TONON) were compared, and the optimal methods were determined. Additionally, an error index representing the deviation between the fracture network model and measured data was established by combining the fractal dimension with the relative absolute error (RAE). On this basis, the downhill simplex method was used to build the dynamic modeling procedure, which takes the minimum of the error index as the objective function and dynamically adjusts the fracture density and size parameters to reduce it, yielding a 3-D fracture network model that meets the requirements. The proposed method was applied to 3-D fracture simulation in the Miao Wei hydropower project in China for feasibility verification, and the error index was reduced from 2.618 to 0.337.
Keywords: rock mass of dam foundation; 3-D fracture network; dynamic simulation; fractal dimension; error analysis; relative absolute error (RAE); downhill simplex method
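The optimisation pattern described above, a downhill simplex search driving an error index toward its minimum, can be sketched with SciPy's Nelder-Mead implementation. The toy "simulated" statistics and measured targets below are placeholders; only the calibrate-by-minimising-an-error-index structure mirrors the paper.

```python
import numpy as np
from scipy.optimize import minimize

measured = np.array([2.45, 0.80])        # assumed targets: fractal-dimension-like statistic, density

def error_index(params):
    """Stand-in error index: relative absolute error of toy simulated statistics."""
    density, size = params
    simulated = np.array([2.0 + 0.5 * np.tanh(density * size),
                          density * (1.0 - np.exp(-size))])
    return np.sum(np.abs(simulated - measured) / np.abs(measured))

# downhill simplex (Nelder-Mead) adjusts density/size parameters to shrink the index
result = minimize(error_index, x0=[1.0, 1.0], method="Nelder-Mead",
                  options={"xatol": 1e-5, "fatol": 1e-8})
print(result.x, result.fun)              # calibrated parameters and final error index
```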
5. State simulation of water distribution networks based on DFP algorithm
Authors: 张卉, 黄廷林, 何文杰. Journal of Central South University, SCIE EI CAS, 2009, Issue S1, pp. 298-303 (6 pages)
An improved weighted-least-squares model was used for state simulation of water distribution networks, and the DFP algorithm was applied to solve it. To fit the DFP algorithm, the initial model was transformed into an unconstrained optimization problem using mass conservation. Then, through one-dimensional optimization and the establishment of a scale matrix, the feasible iteration direction was obtained and the values of the state variables were calculated. After several iterations, the optimal estimates of the state variables were obtained, achieving state simulation of the water distribution network. A DFP program was developed in Delphi 7 for verification. Running it on a designed network of 55 nodes, 94 pipes and 40 loops shows that the DFP algorithm converges quickly: after 36 iterations, the root mean square of all nodal head errors is reduced by 90.84%, from 5.57 m to 0.51 m, and the maximum error is only 1.30 m. Compared with the Marquardt algorithm, the DFP procedure is more stable, and the initial values have less influence on calculation accuracy. Therefore, the DFP algorithm can be used for real-time simulation of water distribution networks.
Keywords: water distribution network; state simulation; state estimation; DFP algorithm
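The DFP (Davidon-Fletcher-Powell) update at the core of that scheme can be written down compactly. The sketch below applies it to a small weighted-least-squares toy problem with a basic Armijo line search; the paper's mass-conservation transformation and scale-matrix construction are not reproduced.

```python
import numpy as np

def armijo(f, g, x, d, t=1.0, beta=0.5, c=1e-4):
    """Backtracking line search (the one-dimensional optimisation step)."""
    while f(x + t * d) > f(x) + c * t * g(x) @ d:
        t *= beta
    return t

def dfp(f, grad, x0, iters=100, tol=1e-8):
    x, H = np.asarray(x0, float), np.eye(len(x0))       # H approximates the inverse Hessian
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                                      # descent direction
        s = armijo(f, grad, x, d) * d
        x_new = x + s
        y = grad(x_new) - g
        if s @ y > 1e-12:                               # DFP inverse-Hessian update
            H += np.outer(s, s) / (s @ y) - (H @ np.outer(y, y) @ H) / (y @ H @ y)
        x = x_new
    return x

# toy weighted-least-squares objective standing in for nodal head/flow residuals
A = np.array([[2.0, 1.0], [1.0, 3.0], [1.0, -1.0]])
b = np.array([1.0, 2.0, 0.5])
W = np.diag([1.0, 2.0, 1.0])
f = lambda x: 0.5 * (A @ x - b) @ W @ (A @ x - b)
grad = lambda x: A.T @ W @ (A @ x - b)
print(dfp(f, grad, np.zeros(2)))
```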
6. Design and simulation of a Torus topology for network on chip
Authors: Wu Chang, Li Yubai, Chai Song. Journal of Systems Engineering and Electronics, SCIE EI CSCD, 2008, Issue 4, pp. 694-701 (8 pages)
Aiming at the application of NoC (network on chip) technology to on-chip systems of rising scale and complexity, a Torus structure and a corresponding routing algorithm for NoC are proposed. The structure improves on the traditional Torus topology by redefining the router denotations; by redefining these denotations and changing the original router locations, a Torus structure suited to NoC applications is reconstructed. On this basis, a deadlock-free and livelock-free routing algorithm is designed according to dimension increase. SystemC is used to implement the structure and simulate the routing algorithm. Under four traffic patterns (average, 13% hotspot, 67% hotspot and transpose), the average delay and normalized throughput of the Torus structure are evaluated, and its delay and throughput are compared with those of a Mesh structure. The results indicate that this Torus structure is more suitable for NoC applications.
Keywords: network on chip; Torus; routing; SystemC; simulation
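As background, the sketch below shows plain dimension-ordered routing on a k x k torus, always taking the shorter wrap-around direction; the paper's redefined router numbering and its specific deadlock- and livelock-avoidance scheme are not reproduced here.

```python
# Dimension-ordered routing on a k x k torus with wrap-around links
# (generic textbook version, not the modified structure proposed in the paper).
def next_hop(cur, dst, k):
    x, y = cur
    if x != dst[0]:                                        # resolve the X dimension first
        step = 1 if (dst[0] - x) % k <= k // 2 else -1     # shorter ring direction
        return ((x + step) % k, y)
    if y != dst[1]:                                        # then the Y dimension
        step = 1 if (dst[1] - y) % k <= k // 2 else -1
        return (x, (y + step) % k)
    return cur

node, dst, k = (0, 0), (3, 1), 4
path = [node]
while node != dst:
    node = next_hop(node, dst, k)
    path.append(node)
print(path)        # [(0, 0), (3, 0), (3, 1)]: one wrap-around hop in X, then one hop in Y
```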
7. A Feasible Partial Train Traffic Simulation Using Diagram Expressed in Network
Author: Cheng Yu (Railway Technical Research Institute, Kokubunji-shi, Tokyo 185, Japan). Journal of Systems Engineering and Electronics, SCIE EI CSCD, 1994, Issue 3, pp. 57-63 (7 pages)
Simulating large-scale and complex systems is commonly considered a difficult and time-consuming task. In this paper, we propose a partial simulation approach to speed up simulation under real-time demands. It is based on the idea that a train traffic diagram can be expressed as a network; the simulation is carried out by calculating the maximal (longest) path in the network, but only within a particular partial area. This makes it a problem-oriented simulation: it can be started at any time, from any train or at any station, and stopped in the same way according to the problem of concern. This kind of simulation can be used to analyse or confirm the correctness of a traffic schedule at high speed so as to meet real-time demands.
Keywords: train traffic simulation; traffic schedule; network
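The core computation the abstract relies on, a longest (critical) path through a network built from the train diagram, is easy to illustrate. The toy event graph below is an assumption; the edge weights play the role of run and dwell times in minutes.

```python
from functools import lru_cache

# Longest path from an event node in an acyclic train-diagram network.
graph = {                                    # event -> [(successor event, duration in minutes)]
    "dep_A": [("arr_B", 12)],
    "arr_B": [("dep_B", 2)],
    "dep_B": [("arr_C", 15), ("arr_D", 20)],
    "arr_C": [],
    "arr_D": [],
}

@lru_cache(maxsize=None)
def longest_from(event):
    return max((w + longest_from(nxt) for nxt, w in graph[event]), default=0)

print(longest_from("dep_A"))                 # 34 minutes along dep_A -> arr_B -> dep_B -> arr_D
```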
8. Simsync: A Time Synchronization Simulator for Sensor Networks (Cited by 8)
Authors: XU Chao-Nong, ZHAO Lei, XU Yong-Jun, LI Xiao-Wei. Acta Automatica Sinica (《自动化学报》), EI CSCD, PKU Core, 2006, Issue 6, pp. 1008-1014 (7 pages)
Time synchronization is a critical middleware service of wireless sensor networks. Researchers have already proposed various time synchronization algorithms; however, because applications demand different synchronization precision, existing algorithms often need to be adapted, and the adapted algorithms should be evaluated before use. Software simulation is a valid and quick way to do this. In this paper, we present a time synchronization simulator for wireless sensor networks, Simsync. We decompose the packet delay into six delay components and model them separately, and the frequency of the crystal oscillator is modeled as Gaussian. To demonstrate its effectiveness, we simulate the reference broadcast synchronization algorithm (RBS) and the timing-sync protocol for sensor networks (TPSN) on Simsync. Simulation results are presented and analyzed.
Keywords: time synchronization; sensor networks; simulator
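A reduced version of what such a simulator must model is sketched below: local clocks with Gaussian-distributed oscillator frequency, a packet delay assembled from random components, and an RBS-style comparison in which both receivers timestamp the same broadcast so the sender-side components cancel. The component names, magnitudes and the two-node setup are assumptions, not Simsync's actual parameters.

```python
import random

random.seed(1)
# mean and standard deviation of some delay components, in microseconds (assumed values)
SENDER_SIDE = {"send": (100.0, 10.0), "access": (500.0, 80.0)}
RECEIVER_SIDE = {"propagation": (1.0, 0.1), "receive": (120.0, 15.0)}

def draw(components):
    return sum(max(0.0, random.gauss(m, s)) for m, s in components.values())

def make_clock():
    skew = random.gauss(1.0, 20e-6)                 # Gaussian oscillator frequency (ppm scale)
    offset = random.uniform(-5000.0, 5000.0)
    return lambda t: skew * t + offset              # local reading of true time t (us)

clock_a, clock_b = make_clock(), make_clock()
t_broadcast = 1_000_000.0
shared = draw(SENDER_SIDE)                          # identical for both receivers of one broadcast
ta = clock_a(t_broadcast + shared + draw(RECEIVER_SIDE))
tb = clock_b(t_broadcast + shared + draw(RECEIVER_SIDE))
true_offset = clock_a(t_broadcast) - clock_b(t_broadcast)
print(f"RBS-style offset estimate: {ta - tb:.1f} us, true offset: {true_offset:.1f} us")
```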
9. Applications of Wavelets in 3-D Audio Simulation
Authors: Zhu Xiaoguang, Hong Bingrong, Wang Dongmu. Journal of Systems Engineering and Electronics, SCIE EI CSCD, 2000, Issue 3, pp. 74-81 (8 pages)
Wavelets have recently been used as a powerful tool in signal processing and function approximation. This paper presents applications of wavelets to two key problems in 3-D audio simulation. First, we employ the discrete wavelet transform (DWT) combined with vector quantization (VQ) to compress audio data, reducing redundant data storage and transmission time. Second, we use wavelets as the activation functions of feed-forward wavelet neural networks to approximate auditory localization cues (head-related transfer functions, HRTFs, are used here). The experimental results demonstrate that the application of wavelets is efficient and useful in 3-D audio simulation.
Keywords: approximation theory; computer simulation; data structures; feedforward neural networks; wavelet transforms
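The compression half of that pipeline can be approximated with PyWavelets, with the vector-quantization stage replaced here by plain uniform quantization for brevity; the synthetic tone, wavelet choice and quantization step are arbitrary assumptions, not the paper's settings.

```python
import numpy as np
import pywt

fs = 8000
t = np.arange(0, 0.5, 1.0 / fs)
audio = 0.6 * np.sin(2 * np.pi * 440 * t) + 0.2 * np.sin(2 * np.pi * 880 * t)

coeffs = pywt.wavedec(audio, "db4", level=4)            # multi-level DWT
q = 0.02                                                # quantization step (stand-in for VQ)
quantized = [np.round(c / q).astype(np.int16) for c in coeffs]
restored = pywt.waverec([c.astype(float) * q for c in quantized], "db4")

print(f"max reconstruction error: {np.max(np.abs(restored[:len(audio)] - audio)):.4f}")
```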
10. Haptic Modeling and Rendering Based on Neurofuzzy Rules for Surgical Cutting Simulation
Authors: SONG Wei-Guo, YUAN Kui, FU Yu-Jin. Acta Automatica Sinica (《自动化学报》), EI CSCD, PKU Core, 2006, Issue 2, pp. 193-199 (7 pages)
This paper combines image processing with a 3D magnetic tracking method to develop a scalpel for haptic simulation in surgical cutting. First, a cutting-parameter acquisition setup is presented and its performance is validated on soft-tissue cutting. Then, based on the acquired input-output data pairs, a method for fuzzy system modeling is presented: after partitioning each input space equally and fixing the premises and the total number of fuzzy rules, the consequent parameters and the fuzzy membership functions (MFs) of the input variables are learned and optimized via a neurofuzzy modeling technique. Finally, a haptic scalpel implemented with the established cutting model is described. Preliminary results show the feasibility of the haptic display system for real-time interaction.
Keywords: image processing; fuzzy neural network; cutting simulation; haptic display
11. Wellbore-reservoir coupling and particle-tracking simulation of the leaching range in in-situ leach uranium mining with horizontal well networks (Cited by 1)
Authors: 杨蕴, 左海啸, 李召坤, 张宇, 常勇, 祝晓彬, 吴剑锋, 吴吉春. 《同济大学学报(自然科学版)》 (Journal of Tongji University, Natural Science), PKU Core, 2025, Issue 3, pp. 503-512 (10 pages)
To explore the feasibility of horizontal-well technology for in-situ leaching (ISL) of sandstone-type uranium deposits in China, resolving the hydrodynamic process of horizontal-well ISL and quantitatively delineating the leaching range are the basis and prerequisite for scientific evaluation. Under the horizontal-well ISL mode, the groundwater dynamic process involves the coupling of wellbore flow with Darcy flow in the reservoir (wellbore-reservoir coupling). A wellbore-reservoir coupled numerical simulation technique was developed to build a groundwater dynamics model of horizontal-well ISL uranium mining, and an automatic extraction technique for the leaching range was developed based on MODPATH particle tracking and the Alpha-shape algorithm. Verified on both an idealized model and a real site model, the approach simulates and characterizes the flow state of the leaching solution and the leaching range during ISL mining. The results show that, compared with the traditional MODFLOW model, the wellbore-reservoir coupled model can describe the wellbore-reservoir exchange flow and its variation during injection through horizontal wells, and the exchange flow is positively correlated with the reservoir hydraulic conductivity K. On the basis of the coupled seepage simulation, the envelope of the particle trajectories effectively captured by the pumping wells within the simulation period can be extracted automatically, and the leaching ranges corresponding to different captured-flow contribution rates of the pumping wells can be identified: the leaching range corresponding to a 95% flow contribution rate is 40% of the volume of the range corresponding to a 100% contribution rate. This indicates that, as the leaching solution travels from the horizontal injection wells through the reservoir until the pregnant solution is pumped out through vertical wells, the flow exchange in the early stage of ISL mining is concentrated mainly inside the leaching range, whereas the outer trajectories have low velocities, contribute little to the pumped flow, and yield low leaching rates.
Keywords: in-situ leach uranium mining; horizontal well network; hydrodynamics; leaching range; numerical simulation
12. A machine learning based method for predicting hydrogen sulfide content in underground gas storage built in sour gas reservoirs (Cited by 1)
Authors: 冯国庆, 杜勤锟, 周道勇, 蔡家兰, 程希, 莫海帅. 《天然气工业》 (Natural Gas Industry), PKU Core, 2025, Issue 2, pp. 159-169 (11 pages)
Underground gas storage (UGS) facilities contain harmful gases such as hydrogen sulfide, which not only affect safe operation but also cause serious environmental pollution, so accurately predicting the H2S content in the produced gas is of great significance. At present, compositional reservoir simulation models are commonly used to predict H2S content, but the calculation is complex and time-consuming and cannot be applied conveniently and quickly to single-well H2S prediction. Taking the HCX gas storage facility as the study object, a mechanistic model of the facility was built and numerically simulated, and the multi-cycle H2S predictions computed by the mechanistic model were used as the sample set. Three machine learning algorithms, multi-output support vector regression (MSVR), long short-term memory networks (LSTM) and artificial neural networks (ANN), were applied to build intelligent surrogate models of hydrogen sulfide content, and the prediction accuracies of the three models were compared. The results show that: (1) the LSTM model has moderate training time and good prediction accuracy and can serve as the H2S prediction surrogate model for the HCX facility; (2) after further optimization of the LSTM training data and of overfitting, the best training set size is 1,500 samples, the best dropout rate is 0.2, the number of hidden layers can be kept to 1-2, and the number of nodes to 30-60; (3) application to the HCX facility shows that the LSTM surrogate model can accurately predict the H2S content of the produced gas. It is concluded that the optimized LSTM surrogate model extrapolates well, and the results can provide technical support for the construction and safe, efficient operation of H2S-bearing gas storage facilities.
Keywords: sour underground gas storage; numerical simulation; compositional simulation; hydrogen sulfide content prediction; machine learning; long short-term memory network model; machine learning model optimization
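A bare-bones PyTorch version of the kind of LSTM surrogate described above is sketched below. The layer count, hidden size and dropout follow the ranges reported in the abstract (1-2 layers, 30-60 nodes, dropout 0.2), but the input features, tensor shapes and random training data are placeholders rather than the paper's simulator-generated samples.

```python
import torch
import torch.nn as nn

class H2SSurrogate(nn.Module):
    def __init__(self, n_features=4, hidden=48, layers=2, dropout=0.2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=layers,
                            batch_first=True, dropout=dropout)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, time_steps, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # predict H2S content at the last step

model = H2SSurrogate()
x = torch.randn(16, 24, 4)                 # 16 wells, 24 time steps, 4 assumed features
y = torch.rand(16, 1)                      # placeholder H2S contents
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.MSELoss()(model(x), y)           # one illustrative training step
loss.backward()
opt.step()
print(float(loss))
```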
13. Research on teaching of the computer network technology major based on virtual simulation technology (Cited by 4)
Authors: 张建珍, 李晋超. 《教育理论与实践》 (Theory and Practice of Education), PKU Core, 2025, Issue 9, pp. 56-59 (4 pages)
Computer network technology programs require students not only to master solid theoretical knowledge but also to develop practical skills and sound professional competence. With its realistic simulation, interactive learning experience and freedom from physical space constraints, virtual simulation technology offers vocational education a new direction for teaching. Virtual simulation should be applied to rework the program's teaching objectives, making knowledge objectives scenario-based, ability objectives practice-based and competence objectives project-based; to expand the teaching content by integrating job requirements, skills competitions and professional certification; to innovate teaching methods, using immersive teaching to enhance the learning experience, project-based teaching to strengthen practical ability and level-clearing (gamified) teaching to stimulate interest; and to optimize teaching evaluation by increasing the weight of enterprise mentors' evaluations and refining the granularity of on-campus mentors' evaluations.
Keywords: virtual simulation technology; computer network technology major; teaching objectives; job requirements; skills competitions; professional certification; enterprise mentor evaluation; on-campus mentor evaluation
14. A knowledge-and-data dual-driven intelligent hydrodynamic simulation method for tidal river networks
Authors: 袁赛瑜, 陈逸鸿, 罗霄, 张汇明, 唐洪武. 《水科学进展》 (Advances in Water Science), PKU Core, 2025, Issue 1, pp. 28-38 (11 pages)
Smart and efficient joint scheduling of the many sluice gates and pumping stations in tidal river network regions is an important guarantee for improving water circulation and quality, but previous intelligent simulation methods lack physical interpretability and struggle to describe the complex hydrodynamics of tidal river networks accurately. This paper proposes a knowledge-and-data dual-driven intelligent hydrodynamic simulation method for tidal river networks and applies it to a generalized tidal river network and to the Wennan tidal river network in Shanghai. The results show that, by using an artificial neural network as the backbone and the river-network flow governing equations as physical constraints, a loss function containing the residuals of the governing equations is constructed, and the network weights are iteratively optimized until the loss meets the requirements, thereby achieving intelligent hydrodynamic simulation of tidal river networks that is both physically interpretable and computationally efficient. Unlike conventional artificial neural networks, the method requires far less training data and can also recover the hydrodynamic processes at cross-sections with no training data. It exhibits good simulation accuracy, computational efficiency and robustness.
Keywords: hydrodynamic simulation; tidal river network; intelligent simulation; knowledge-driven; data-driven
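The loss construction described above, a neural network constrained by the residuals of governing equations, is the physics-informed neural network pattern. The sketch below uses a deliberately simplified stand-in PDE, 1-D linear advection, instead of the river-network flow equations, so it only illustrates how a data term and a physics-residual term are combined in one loss.

```python
import math
import torch
import torch.nn as nn

torch.manual_seed(0)
c = 1.0                                               # advection speed of the stand-in PDE
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))

def pde_residual(xt):
    """Residual of u_t + c*u_x = 0, the 'knowledge' term (stand-in for the river-network equations)."""
    xt = xt.clone().requires_grad_(True)
    u = net(xt)
    grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = grads[:, :1], grads[:, 1:]
    return u_t + c * u_x

x_data = torch.rand(64, 2)                            # a few (x, t) points with "observations"
u_data = torch.sin(2 * math.pi * (x_data[:, :1] - c * x_data[:, 1:]))   # exact solution used as data
x_phys = torch.rand(256, 2)                           # unlabelled collocation points

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(1000):
    opt.zero_grad()
    loss = ((net(x_data) - u_data) ** 2).mean() + (pde_residual(x_phys) ** 2).mean()
    loss.backward()
    opt.step()
print(float(loss))                                    # combined data + physics loss
```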
15. A review of wave equation based numerical simulation of seismic waves (Cited by 1)
Authors: 李航, 孙宇航, 李佳慧, 李学贵, 董宏丽. 《吉林大学学报(地球科学版)》 (Journal of Jilin University, Earth Science Edition), PKU Core, 2025, Issue 2, pp. 627-645 (19 pages)
Numerical simulation of seismic wavefields plays an important role in seismic exploration, seismic data processing and studies of the Earth's structure. Wave-equation-based numerical simulation fully accounts for both the dynamic and the geometric characteristics of seismic wave propagation, provides strong theoretical support for studying propagation mechanisms and interpreting complex strata, and is one of the most widely used wavefield simulation approaches. This paper surveys five wave-equation-based numerical methods: the finite-difference method is easy to understand but suffers from pronounced numerical dispersion; the pseudospectral method is accurate but computationally inefficient; the finite-element method suits complex models but consumes large computing resources; the spectral-element method suits high-accuracy problems but demands large memory; and deep learning based on physics-informed neural networks is highly adaptable but expensive to train. The theoretical foundations, applicability conditions and latest progress of the five methods are described. In the future, seismic wavefield simulation should incorporate emerging techniques such as deep learning and optimize boundary conditions to reproduce realistic boundary reflections, improving both accuracy and efficiency.
Keywords: wavefield simulation; finite-difference method; pseudospectral method; finite-element method; spectral-element method; physics-informed neural network
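As a concrete illustration of the first method in the survey, the snippet below advances the 1-D acoustic wave equation with second-order central differences in space and time; the grid, velocity and source pulse are arbitrary choices, and the dispersion and boundary issues the review discusses are ignored.

```python
import numpy as np

nx, nt, dx, dt, c = 401, 900, 5.0, 1e-3, 2000.0      # grid points, steps, m, s, m/s
r2 = (c * dt / dx) ** 2                              # squared CFL number (0.16 here, stable)
u_prev, u, src_pos = np.zeros(nx), np.zeros(nx), nx // 2

for it in range(nt):
    u_next = np.zeros(nx)
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))   # central differences
    t = it * dt
    u_next[src_pos] += np.exp(-2000.0 * (t - 0.05) ** 2) * dt ** 2   # smooth source term
    u_prev, u = u, u_next                            # advance one time level

print(f"peak amplitude after {nt * dt:.2f} s: {np.abs(u).max():.3e}")
```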
16. Simulation and analysis of pressure boosting in the middle and late stages of tight gas surface gathering systems
Authors: 李超, 何泉, 李怡超, 练兴元, 李俊妤, 周军, 梁光川. 《现代化工》 (Modern Chemical Industry), PKU Core, 2025, Issue 6, pp. 230-235 (6 pages)
The problems faced by tight gas gathering systems in the middle and late stages of pressure boosting are analyzed. Based on the boosting location, six boosting schemes used in tight gas gathering systems are examined. Taking the pipeline network of a well block in a tight gas field as an example, TGNET simulation modeling is used to back-calculate the boosting timing of each platform and the minimum pressure for intermittent production, and to simulate the compressor configuration and operation of 20 boosting scenarios under different production periods and station inlet pressures, as well as the influence of the intermittent production stage on compressor operating parameters, providing a research approach for determining middle-to-late-stage boosting schemes for tight gas surface gathering systems.
Keywords: tight gas; gathering and transportation system; pressure boosting scheme; pipeline network simulation
17. A dynamic groundwater level simulation model based on graph neural networks
Authors: 许明家, 孙龙, 李爽, 鲁程鹏. 《水文》 (Hydrology), PKU Core, 2025, Issue 1, pp. 30-36 (7 pages)
The accuracy of groundwater level simulation plays an important role in sustainable groundwater utilization and management. Machine learning methods can capture the nonlinear relationships between input variables and target variables and have been widely used for groundwater level simulation, but conventional machine learning methods do not consider the spatial relationships between monitoring stations. This paper uses a graph neural network (GNN) to simulate groundwater level dynamics, taking groundwater level monitoring stations as nodes and connecting them through an adjacency matrix; monitoring data from a typical cone-of-depression area in Hebei Province are used to apply and evaluate the model. Compared with three baseline models, random forest (RF), support vector regression (SVR) and multilayer perceptron (MLP), the proposed model performs better on all the defined evaluation metrics. In addition, it can simulate the groundwater level changes of all monitoring stations in the modeled system simultaneously, achieving higher data utilization than single-station models.
Keywords: groundwater level simulation; graph neural network; non-stationary; time series
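The spatial idea behind that model, mixing each monitoring well's features with its neighbours' through a normalised adjacency matrix before a learned projection, is sketched below in plain NumPy with random, untrained weights and a toy four-well adjacency; it is the generic graph-convolution propagation rule, not the paper's trained architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],                         # 4 wells, edges between nearby wells (assumed)
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)
A_hat = A + np.eye(4)                               # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt            # symmetric normalisation

X = rng.normal(size=(4, 3))                         # per-well features (e.g. rain, pumping, level)
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 1))
H = np.tanh(A_norm @ X @ W1)                        # one graph-convolution layer
pred = A_norm @ H @ W2                              # next-step water level per well (untrained)
print(pred.ravel())
```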
18. A transient simulation algorithm for hydrogen-blended natural gas pipeline networks combining compressed sparse row storage with a decoupling method
Authors: 李玉星, 陈若飞, 朱建鲁, 仇柏林, 吕浩, 陈凤, 张双蕾, 杨浩, 陈俊文, 何佳薪. 《天然气工业》 (Natural Gas Industry), PKU Core, 2025, Issue 5, pp. 188-200 (13 pages)
With the rapid development of pure-hydrogen and hydrogen-blended natural gas pipeline networks, conventional numerical solution algorithms face challenges in numerical stability and solution efficiency, so new efficient and stable numerical methods are urgently needed to optimize the design of hydrogen-blended networks and improve their operating efficiency. Building on a study of how hydrogen blending affects the matrix condition number in transient pipeline-network simulation, a numerical algorithm combining the compressed sparse row (CSR) storage format with a hydraulic-thermal decoupling strategy is proposed, and its suitability for transient hydrogen pipeline-network simulation is evaluated. The results show that: (1) hydrogen blending increases the condition number of the matrices in transient simulation by tens to hundreds of times, affecting solution stability; (2) compared with a conventional two-dimensional (dense) matrix algorithm, the CSR-based algorithm improves computational efficiency by at least a factor of 10, and the improvement grows exponentially as the matrix size increases; (3) sparse LU decomposition is recommended as the first choice in network simulation, with sparse QR decomposition as a backup when the results are unstable; (4) under the decoupling strategy, the condition number of the system matrix is significantly reduced and the stability of the algorithm improves, with simulation performance at least 10 times that of the coupled formulation, plus a further performance gain of more than 50% within the CSR-based algorithm. The results provide a useful reference for improving the efficiency of transient simulation of pure-hydrogen and hydrogen-blended natural gas pipeline networks and have important theoretical and practical value.
Keywords: hydrogen-blended natural gas; transient; pipeline network simulation; numerical solution; sparse matrix; CSR; decoupling; simulation efficiency
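The storage-and-solver combination recommended in point (3) can be illustrated with SciPy: hold the system matrix in CSR form and factorise it with a sparse LU decomposition. The 5 x 5 matrix below is a toy stand-in for a pipeline-network Jacobian, not the paper's hydrogen-blended model.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import splu

# assemble a small sparse system in CSR form (row, column, value triplets)
rows = [0, 0, 1, 1, 2, 2, 3, 3, 4]
cols = [0, 1, 1, 2, 2, 3, 3, 4, 4]
vals = [4.0, -1.0, 4.0, -1.0, 4.0, -1.0, 4.0, -1.0, 4.0]
A = csr_matrix((vals, (rows, cols)), shape=(5, 5))
b = np.ones(5)

lu = splu(A.tocsc())                 # sparse LU factorisation (works on the CSC view)
x = lu.solve(b)
print(x, "condition number:", np.linalg.cond(A.toarray()))
```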
19. An optimized scheduling method for ship formation networks based on a simulated annealing genetic algorithm
Authors: 陆青梅, 赵山林, 高媛. 《舰船科学技术》 (Ship Science and Technology), PKU Core, 2025, Issue 10, pp. 155-160 (6 pages)
A ship formation network is a complex communication system. To reduce communication delay, ensure timely information delivery, and improve the reaction speed and combat effectiveness of the whole formation, an optimized scheduling method for ship formation networks based on a simulated annealing genetic algorithm is proposed. Taking the minimum total communication delay and total energy consumption as the objective function and imposing constraint conditions, an optimized scheduling model of the ship formation network is established. The simulated annealing genetic algorithm is then used to solve the scheduling model, achieving optimized scheduling with minimum total communication delay and energy consumption. Experimental results show that with the proposed method the total communication delay of the ship formation network stays within 0-80 ms and the energy consumption remains below 580 kWh, indicating that the method can effectively improve the stability and efficiency of formation communications, significantly enhance the formation's operational adaptability and responsiveness, and provide more reliable support for naval operations and maritime security.
Keywords: simulated annealing; genetic algorithm; ship formation network; optimized scheduling; adaptability; responsiveness
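The acceptance rule at the heart of a simulated-annealing search is easy to show in isolation. The sketch below anneals a random task order against a toy delay-plus-energy objective; the genetic crossover/mutation half of the hybrid algorithm and the real formation-network constraints are omitted, and all costs are invented.

```python
import math
import random

random.seed(0)
n = 10
delay = [random.uniform(1.0, 8.0) for _ in range(n)]     # per-task transmission delay (ms)
energy = [random.uniform(2.0, 6.0) for _ in range(n)]    # per-task energy cost (arbitrary units)

def cost(order):
    """Weighted sum of cumulative delay and energy, both to be minimised."""
    finish = total = 0.0
    for task in order:
        finish += delay[task]
        total += finish + 0.5 * energy[task]
    return total

order = list(range(n))
cur = best = cost(order)
T = 10.0
while T > 1e-3:
    i, j = random.sample(range(n), 2)
    order[i], order[j] = order[j], order[i]              # neighbour: swap two tasks
    c = cost(order)
    if c < cur or random.random() < math.exp((cur - c) / T):
        cur, best = c, min(best, c)                      # accept (possibly worse) solution
    else:
        order[i], order[j] = order[j], order[i]          # reject: undo the swap
    T *= 0.995                                           # geometric cooling schedule
print(best, order)
```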
20. The mechanism and effectiveness of the chain leader system in improving industrial chain resilience: the case of the artificial intelligence industrial chain
Authors: 李红锦, 戎芳毅, 李胜会. 《统计研究》 (Statistical Research), PKU Core, 2025, Issue 4, pp. 63-73 (11 pages)
The chain leader system offers a new path for industrial chain governance. Using complex network analysis, this paper builds a complex network model of the artificial intelligence industrial chain, characterizes the evolution of its cooperative innovation network, incorporates the chain leader system as a policy factor into a game model, and verifies the policy's effectiveness empirically. The findings show that cross-domain cooperation channels in the AI industrial chain are relatively smooth, but compared with the technology and application layers, the foundation layer has not yet developed a pattern of clustered innovation actors and concentrated technological emergence. The key links for improving chain resilience are the foundation layer and the foundation-technology linkage, and high-density cooperation may cause shocks to produce a butterfly effect. Implementing the chain leader system is not only an emergency mechanism but also a long-term mechanism for improving chain resilience, and its effect exhibits a Matthew effect. The study provides an empirical reference for strengthening the government's capacity to govern industrial chain resilience and for raising the resilience of the AI industrial chain.
Keywords: chain leader system; artificial intelligence industrial chain; cooperative innovation network; simulation