Cable-stayed bridges have been widely used in high-speed railway infrastructure. Accurate determination of the cables' representative temperatures is vital during the intricate processes of design, construction, and maintenance of cable-stayed bridges. However, the representative temperatures of stay cables are not specified in existing design codes. To address this issue, this study investigates the distribution of cable temperature and determines its representative value. First, an experimental investigation spanning one year was carried out near the bridge site to obtain temperature data. Statistical analysis of the measured data reveals that the temperature distribution is generally uniform over the cable cross-section, with no significant temperature gradient. Then, based on the limited data, Monte Carlo simulation, gradient boosted regression trees (GBRT), and univariate linear regression (ULR) are employed to predict the cable's representative temperature throughout the service life. These methods effectively overcome the limitations of insufficient monitoring data and accurately predict the representative temperature of the cables. However, each method has its own advantages and limitations in terms of applicability and accuracy. A comprehensive evaluation of the performance of these methods is conducted, and practical recommendations are provided for their application. The proposed methods and representative temperatures provide a sound basis for the operation and maintenance of in-service long-span cable-stayed bridges. Funding: Project (2017G006-N), supported by the Science and Technology Research and Development Program of China Railway Corporation.
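As a hedged illustration of the prediction step, the sketch below fits a GBRT model and a univariate linear regression to synthetic ambient-temperature data and queries both for a new condition; the data, column layout, and hyperparameters are hypothetical, not the paper's.

```python
# Hedged sketch: predicting a cable's representative temperature from
# ambient air temperature, in the spirit of the GBRT and ULR methods
# described above. The data and hyperparameters are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
air_temp = rng.uniform(-10, 40, size=(500, 1))                      # ambient temperature, degC
cable_temp = 1.1 * air_temp[:, 0] + 2.0 + rng.normal(0, 1.5, 500)   # synthetic target

gbrt = GradientBoostingRegressor(n_estimators=200, max_depth=3).fit(air_temp, cable_temp)
ulr = LinearRegression().fit(air_temp, cable_temp)                  # univariate linear regression

new_air = np.array([[35.0]])
print("GBRT prediction:", gbrt.predict(new_air)[0])
print("ULR prediction :", ulr.predict(new_air)[0])
```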
In e-commerce, multidimensional data analysis based on Web data requires integrating various data sources, such as XML data and relational data, at the conceptual level. A conceptual data-description approach to the multidimensional data model, the UML galaxy diagram, is presented in order to conduct multidimensional data analysis for multiple subjects. The approach is illustrated with a case of a 2_roots UML galaxy diagram covering the marketing analysis of TV products involving one retailer and several suppliers. Funding: supported by the China Postdoctoral Science Foundation (2005037506) and the National Natural Science Foundation of China (70472029).
This study examines the usability of an environmentally harmless vegetable oil in a minimum quantity lubrication (MQL) system for face milling of AISI O2 steel and optimizes the cutting parameters with several statistical methods. Vegetable oil was used as the cutting fluid, and the Taguchi method was used to design the test plan. With this test plan, the MQL system improved cutting performance for all parameter combinations relative to dry conditions. The highest tool life was obtained at a cutting length of 7.5 m, a cutting speed of 100 m/min, an MQL flow rate of 100 mL/h, and a feed rate of 0.1 mm/tooth. Optimum cutting parameters were determined by Taguchi analysis and confirmed with verification tests. The optimum parameter combination was also identified by grey relational analysis. ANOVA on the measured surface roughness and cutting force values showed that the most influential cutting parameter was the feed rate. In addition, models for surface roughness and cutting force were obtained with precisions of 99.63% and 99.68%, respectively. The dominant wear mechanisms were found to be abrasion and adhesion.
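As a minimal sketch of the Taguchi step, the following computes the standard signal-to-noise ratios used to rank parameter levels; the response values are invented, and the paper's exact S/N variants may differ.

```python
# Hedged sketch: Taguchi signal-to-noise (S/N) ratios of the kind used to
# rank cutting-parameter levels. The response values are hypothetical.
import numpy as np

def sn_smaller_is_better(y):
    """S/N = -10*log10(mean(y^2)); suits surface roughness / cutting force."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

def sn_larger_is_better(y):
    """S/N = -10*log10(mean(1/y^2)); suits tool life."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

roughness_replicates = [0.42, 0.45, 0.40]   # um, hypothetical repeats at one setting
print(sn_smaller_is_better(roughness_replicates))
```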
For the case of unknown weights, multi-attribute decision-making theory based on interval numbers and grey relational analysis was used to optimize mining methods. Taking low correlation between indicators as a proxy for indicator independence, the smaller the correlation of an indicator with the others, the greater its weight; accordingly, the interval-number weights of the indicators were determined from correlation coefficients. Relative closeness based on positive and negative ideal solutions was calculated by introducing a distance between interval numbers, making the decision more rational and comprehensive. A new method of ranking interval numbers based on the normal distribution was proposed for the optimization of mining methods, and its basic properties were discussed. Finally, the feasibility and effectiveness of the method were verified in theory and practice. Funding: Project (50774095), supported by the National Natural Science Foundation of China; Project (200449), supported by the National Outstanding Doctoral Dissertations Special Funds of China.
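A minimal sketch of the correlation-based weighting idea, assuming intervals are reduced to their midpoints and that weight varies inversely with an indicator's mean absolute correlation with the others; this is one plausible reading of the scheme, not the paper's exact formula.

```python
# Hedged sketch: deriving indicator weights from correlation coefficients so
# that less-correlated (more independent) indicators get larger weights.
# Interval data are reduced to midpoints here; all numbers are hypothetical.
import numpy as np

intervals = np.array([  # shape (alternatives, indicators, 2): [lower, upper]
    [[60, 70], [0.2, 0.3], [5, 6]],
    [[55, 65], [0.4, 0.5], [7, 8]],
    [[70, 80], [0.1, 0.2], [4, 5]],
], dtype=float)

mid = intervals.mean(axis=2)                   # midpoint of each interval
corr = np.abs(np.corrcoef(mid, rowvar=False))  # indicator-by-indicator correlation
independence = 1.0 - (corr.sum(axis=0) - 1.0) / (corr.shape[0] - 1)
weights = independence / independence.sum()
print(weights)
```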
Conventional data envelopment analysis (DEA) measures the relative efficiencies of a set of decision making units (DMUs) with exact values of inputs and outputs. In real-world problems, however, inputs and outputs typically have some level of fuzziness. To analyze a DMU with fuzzy input/output data, previous studies provided fuzzy DEA models and associated evaluation approaches. Nonetheless, numerous deficiencies remain to be addressed, including the α-cut approaches, the types of fuzzy numbers, and the ranking techniques; moreover, a fuzzy sample DMU cannot be evaluated with existing fuzzy DEA models. This paper therefore proposes a fuzzy DEA model based on sample decision making units (FSDEA). Five evaluation approaches, together with the related algorithm and ranking methods, are provided to test the fuzzy sample DMU of the FSDEA model. A numerical experiment is used to demonstrate the approach and to compare its results with those of alternative approaches. Funding: supported by the National Natural Science Foundation of China (70961005), the 211 Project for Postgraduate Student Program of Inner Mongolia University, and the Natural Science Foundation of Inner Mongolia (2010Zd34, 2011MS1002).
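For orientation, the sketch below solves the crisp input-oriented CCR model, the deterministic core that fuzzy DEA extensions such as FSDEA generalize; the three-DMU data set is hypothetical.

```python
# Hedged sketch: a crisp input-oriented CCR efficiency score via linear
# programming. Data are hypothetical (3 DMUs, 2 inputs, 1 output).
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0]])  # inputs, shape (n_dmu, n_in)
Y = np.array([[1.0], [2.0], [1.5]])                  # outputs, shape (n_dmu, n_out)

def ccr_efficiency(k):
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta; vars = [theta, lambda]
    A_in = np.c_[-X[k], X.T]                     # sum_j lam_j x_ij - theta*x_ik <= 0
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]    # -sum_j lam_j y_rj <= -y_rk
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[k]]
    bounds = [(None, None)] + [(0, None)] * n    # theta free, lambda >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return float(res.fun)

for k in range(3):
    print(f"DMU {k}: theta = {ccr_efficiency(k):.3f}")
```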
The application of data envelopment analysis (DEA) as a multiple criteria decision making (MCDM) technique has been gaining increasing attention in recent research. In practice, uncertainty in the input and output data of decision making units (DMUs) may render the nominal solution infeasible and make the efficiency scores meaningless from a practical point of view. This paper analyzes the impact of data uncertainty on DEA evaluation results and proposes several robust DEA models, based on recently developed robust optimization approaches, that are immune to input and output data uncertainty. The robust DEA models are built on the input-oriented and output-oriented CCR models for uncertainty in output data and input data, respectively. Furthermore, they can handle both random symmetric uncertainty and unknown-but-bounded uncertainty, in both of which the distributions of the random data entries may be unknown. The robust DEA models are applied to a numerical example, and the resulting efficiency scores and rankings are compared. The results indicate that the robust DEA approach can be a more reliable method for efficiency evaluation and ranking in MCDM problems.
The classic data envelopment analysis (DEA) model evaluates the efficiency of decision-making units (DMUs) under the assumption that all DMUs are assessed with the same criteria setting. Recently, research has begun to focus on the efficiency analysis of non-homogeneous DMUs arising in real practice, such as the evaluation of departments in a university, where departments argue for the adoption of different criteria based on their disciplinary characteristics. This paper proposes a DEA procedure for the efficiency analysis of two non-homogeneous DMU groups. First, an analytical framework is established to reconcile the diversified input and output (IO) criteria of the two non-homogeneous groups. Then, a criteria-fusion operation is designed to obtain different DEA analysis strategies. Meanwhile, the Friedman test is introduced to analyze the consistency of the efficiency results produced by the different strategies. Next, ordered weighted averaging (OWA) operators are applied to integrate the different pieces of information into final conclusions. Finally, a numerical example illustrates the proposed method. The results indicate that the proposed method relaxes the restriction of the classical DEA model and provides more analytical flexibility for the different decision-analysis scenarios arising in practical applications. Funding: supported by the National Natural Science Foundation of China (71471087).
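A short sketch of the OWA aggregation step, which sorts the values before applying the weights; the scores and weight vector are hypothetical.

```python
# Hedged sketch: ordered weighted averaging (OWA) of the kind used above to
# merge efficiency results from different criteria-fusion strategies.
import numpy as np

def owa(values, weights):
    """OWA: sort values in descending order, then take the weighted sum."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0) and len(v) == len(w)
    return float(v @ w)

scores = [0.82, 0.91, 0.77, 0.88]   # efficiency scores from four strategies
weights = [0.4, 0.3, 0.2, 0.1]      # emphasis on the larger scores
print(owa(scores, weights))
```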
Traditional data envelopment analysis (DEA) theory assumes that each decision variable is either an input or an output, and that no variable can play both roles at the same time. In fact, some variables work as inputs and outputs simultaneously; these are called dual-role variables. Traditional DEA models cannot appraise the performance of decision making units containing dual-role variables. This paper analyzes the structure and properties of production systems comprising dual-role variables and proposes a DEA model that integrates them. Finally, the proposed model is used to evaluate the efficiency of university departments. Funding: supported by the National Natural Science Foundation of China (70821001, 70801056).
It is difficult to detect anomalies whose matching relationships among certain data attributes differ markedly from those of the other samples in a dataset. To address this problem, an approach based on wavelet analysis for detecting and amending anomalous samples was proposed. Taking full advantage of the multi-resolution and local-analysis properties of wavelets, the approach detects and amends anomalous samples effectively. To enable rapid numerical computation of the wavelet transform of a discrete sequence, a modified algorithm based on the Newton-Cotes formula was also proposed. Experimental results show that the approach is feasible, effective, and practical. Funding: Project (50374079), supported by the National Natural Science Foundation of China.
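A hedged sketch of the detection idea using PyWavelets: decompose the sequence, estimate the noise scale robustly from the detail coefficients, and flag coefficients that exceed a multiple of it. The wavelet, threshold, and index mapping are illustrative choices, not the paper's algorithm.

```python
# Hedged sketch: flagging anomalous samples whose local behaviour deviates
# from the rest of a sequence, using wavelet detail coefficients.
import numpy as np
import pywt

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 8 * np.pi, 512)) + rng.normal(0, 0.05, 512)
x[300] += 2.0                            # inject an anomalous sample

cA, cD = pywt.dwt(x, "db4")              # single-level decomposition
sigma = np.median(np.abs(cD)) / 0.6745   # robust (MAD) noise-scale estimate
idx = np.where(np.abs(cD) > 5 * sigma)[0]
print("anomaly near sample(s):", idx * 2)  # detail coeffs sit at half rate
```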
This paper presents a new method for passive underwater target motion analysis (TMA) using data fusion. The method assumes a powerful sonar suite consisting of several sonar types on a single own-ship, so that measurements of different target parameters can be obtained simultaneously. Three kinds of measurements are used: passive bearing, elevation, and multipath time delay. They are divided into two groups, and preliminary target-parameter estimates are obtained by processing each group independently; the correlated estimates are then sent to a fusion center, where the correlation between the two groups is taken into account, realizing passive underwater TMA. Simulation results show that the parameter-estimation error curves obtained with data fusion converge quickly and that estimation accuracy is noticeably improved. The TMA algorithm presented is verified and of practical significance, since it is easy to realize on a single ship.
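As a toy version of the fusion-center step, the sketch below combines two correlated scalar estimates with the standard minimum-variance (BLUE) weights; the covariance values are hypothetical.

```python
# Hedged sketch: fusing two preliminary target-parameter estimates while
# accounting for their correlation, as in the fusion-center step above.
import numpy as np

est = np.array([10.2, 9.6])     # estimates of the same parameter from two groups
P = np.array([[0.50, 0.10],     # covariance of the two estimation errors,
              [0.10, 0.30]])    # including their cross-correlation
ones = np.ones(2)
w = np.linalg.solve(P, ones)
w /= ones @ w                   # BLUE weights: w = P^{-1} 1 / (1^T P^{-1} 1)
fused = w @ est
fused_var = 1.0 / (ones @ np.linalg.solve(P, ones))
print(fused, fused_var)
```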
Magnetic resonance spectroscopy (MRS) results are strongly influenced by the reconstruction of the spectrum and by quantitative analysis. For this reason, a number of programs dedicated to MRS data analysis have been developed. The selection and use of appropriate software is crucial not only in clinical procedures but also in scientific research, and the choice of software to suit the user's needs should be based on an analysis of the program's functionality. From the user's viewpoint, it is particularly important to identify what data can be loaded and processed in each program. The programs also allow the user differing degrees of control over analysis parameters and differ in terms of the applied signal-processing algorithms. The aim of this work, therefore, is to review available packages designed for MRS data analysis, taking into account their capabilities and limitations.
This paper studies non-zero slacks in data envelopment analysis. A procedure is developed for the treatment of non-zero slacks, so that DEA projections can be carried out in a single step.
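For reference, the standard one-step CCR projection onto the efficient frontier uses the optimal radial factor and slacks (the paper's procedure may differ in detail):

```latex
\hat{x}_{io} = \theta^{*}\, x_{io} - s_i^{-*}, \qquad
\hat{y}_{ro} = y_{ro} + s_r^{+*},
```

where $\theta^{*}$ is the optimal radial contraction factor and $s_i^{-*}$, $s_r^{+*}$ are the optimal input and output slacks of DMU $o$.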
Mendelian randomization (MR) is widely used in causal mediation analysis to control for unmeasured confounding, and it is valid under some strong assumptions. It is thus of great interest to assess the impact of violations of these MR assumptions through sensitivity analysis. Sensitivity analyses have been conducted for simple MR-based causal average-effect analyses, but they are not available for MR-based mediation analysis; this paper aims to fill that gap. We propose two sensitivity parameters to quantify the effect of deviations from the instrumental variable (IV) assumptions. With these two sensitivity parameters, we derive consistent indirect-causal-effect estimators and establish their asymptotic properties. Our theoretical results can be used in MR-based mediation analysis to study the impact of violations of the MR assumptions. The finite-sample performance of the proposed method is illustrated through simulation studies, sensitivity analysis, and application to a real genome-wide association study. Funding: supported by the National Natural Science Foundation of China (12171451, 72091212).
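As a much-simplified illustration of MR sensitivity analysis (one sensitivity parameter rather than the paper's two, and a single-instrument Wald ratio rather than the mediation estimators), the sketch below shows how an assumed direct instrument effect delta shifts the causal-effect estimate; all data are simulated.

```python
# Hedged sketch: a toy Wald-ratio MR estimate with a single sensitivity
# parameter delta for a direct (pleiotropic) instrument effect -- a much
# simpler device than the estimators derived in the paper.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
G = rng.binomial(2, 0.3, n)                 # instrument (genotype)
U = rng.normal(0, 1, n)                     # unmeasured confounder
X = 0.5 * G + U + rng.normal(0, 1, n)       # exposure
Y = 0.8 * X + U + rng.normal(0, 1, n)       # outcome; true causal effect = 0.8

b_gx = np.cov(G, X)[0, 1] / np.var(G, ddof=1)
b_gy = np.cov(G, Y)[0, 1] / np.var(G, ddof=1)
for delta in (0.0, 0.05, -0.05):            # assumed direct effect of G on Y
    print(delta, (b_gy - delta) / b_gx)     # sensitivity-adjusted Wald ratio
```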
Copper-based azide (Cu(N3)2 or CuN3, CA) chips synthesized by in-situ azide reaction and used in miniaturized explosive systems have become a hot research topic in recent years. However, the advantages of the in-situ synthesis method, namely small size and low dosage, bring difficulties in quantitative analysis and cause differences in the ignition capabilities of CA chips. The aim of the present work is to develop a simplified quantitative analysis method for the accurate and safe analysis of the components in CA chips, in order to evaluate and investigate their ignition ability. In this work, the Cu(N3)2 and CuN3 components of CA chips were separated through dissolution and distillation by exploiting their difference in solubility, and the corresponding contents were obtained by measuring the N3^- concentration spectrophotometrically. The spectrophotometric method was optimized by studying the influencing factors, and the recovery rates of different separation methods were studied to ensure the accuracy and reproducibility of the test results. The optimized method is linear in the range of 1.0-25.0 mg/L, with a correlation coefficient R^2 = 0.9998, which meets the requirements of the milligram-level content test for CA chips. Compared with the existing ICP method, the component-analysis results obtained by spectrophotometry are closer to the real component contents of the samples and have satisfactory accuracy. Moreover, with a view to application in miniaturized explosive systems, the ignition ability of CA chips with different component contents for direct-ink-writing CL-20, and the corresponding mechanism, were studied. This study provides a basis and ideas for the design and performance evaluation of CA chips in miniaturized explosive systems. Funding: supported by the National Natural Science Foundation of China (Grant No. 11872013).
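A hedged sketch of the spectrophotometric quantification step: fit a linear calibration curve over the stated 1.0-25.0 mg/L range and invert it for an unknown sample; the absorbance values are invented, not measured data.

```python
# Hedged sketch: a spectrophotometric calibration curve of the kind used to
# quantify azide (N3-) concentration. All numbers are hypothetical.
import numpy as np

conc = np.array([1.0, 5.0, 10.0, 15.0, 20.0, 25.0])   # mg/L standards
absorb = np.array([0.021, 0.103, 0.205, 0.309, 0.412, 0.514])

slope, intercept = np.polyfit(conc, absorb, 1)         # linear fit
pred = slope * conc + intercept
r2 = 1 - np.sum((absorb - pred) ** 2) / np.sum((absorb - absorb.mean()) ** 2)

unknown_abs = 0.250                                    # absorbance of an unknown
print("concentration:", (unknown_abs - intercept) / slope, "mg/L; R^2 =", round(r2, 4))
```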
The life distributions of many products exhibit increasing failure rate in average (IFRA). For such distributions, using the properties of the IFRA class, this paper gives a non-parametric method for processing zero-failure data. Estimates of the reliability at any time are first obtained, and, based on a regression model of the failure rates, estimates of the reliability indexes are given. Finally, a practical example is processed with this method.
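For comparison, the classical binomial lower confidence bound on reliability from zero-failure data, which the IFRA-based non-parametric method above refines, is a one-line computation:

```python
# Hedged sketch: classical binomial lower confidence bound on reliability
# from zero-failure data -- a simpler baseline than the IFRA-based method.
n = 30            # units tested to time t0, all surviving
C = 0.90          # confidence level
R_lower = (1 - C) ** (1 / n)   # from P(all n survive) = R(t0)^n
print(f"R(t0) >= {R_lower:.4f} with {C:.0%} confidence")
```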
Owing to rainfall infiltration, groundwater activity, geological processes, and natural erosion, soil often exhibits heterogeneity and unsaturation, and seismic events can further compromise slope stability. Existing analytical solutions typically consider a single failure mode, leading to inaccurate slope-stability assessments. This study analyzes the impact of matric suction through three nonlinear shear-strength models and adopts a heterogeneous soil model in which cohesion increases linearly with depth. An improved pseudo-dynamic method is used to account for seismic effects. Based on a three-dimensional (3D) trumpet-shaped rotational failure mechanism, a new framework is established to analyze the stability of 3D two-bench slopes in heterogeneous unsaturated soil under seismic effects. The internal energy dissipation rate and the external power at failure are calculated, and the gravity increase method is introduced to derive an explicit expression for the safety factor (F_s). The results are compared with previously published results, demonstrating the effectiveness of the proposed method. Sensitivity analyses are conducted to discuss the influence of various parameters on F_s. This study proposes a new formula for calculating the F_s of 3D two-bench slopes in heterogeneous unsaturated soil under seismic effects, providing a practical tool for slope engineering. Funding: Project (51378510), supported by the National Natural Science Foundation of China.
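As a schematic of the gravity increase method, the sketch below finds the gravity multiplier at which a stand-in stability margin vanishes; the margin function is hypothetical, standing in for the paper's 3D energy-balance equations.

```python
# Hedged sketch: the gravity increase method reduces to a root search for
# the load multiplier at which a slope reaches limit state. The stability
# function here is a hypothetical stand-in, not the paper's formulation.
from scipy.optimize import brentq

def stability_margin(eta, c=25.0, phi_term=0.9, gamma_h=18.0):
    # >0 stable, <0 unstable; a toy resisting-minus-driving balance
    return c + phi_term * gamma_h - eta * gamma_h

eta_crit = brentq(stability_margin, 0.1, 10.0)   # critical gravity multiplier
print("safety factor F_s =", round(eta_crit, 3))
```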
The Insight Hard X-ray Modulation Telescope (Insight-HXMT, or HXMT), launched on 15 June 2017, marked the birth of China's first independently developed observatory-class X-ray telescope. With its combined advantages of large effective area, wide energy band, high timing resolution, and high energy resolution, Insight-HXMT has opened a new window for studying the rapid hard X-ray variability and broadband spectra of black-hole and neutron-star systems. Having exceeded its design lifetime, the satellite has operated stably in orbit for more than 8 years, remains in good condition, and is expected to extend its in-orbit service further. As of October 2024, Insight-HXMT has issued seven open calls for observing proposals to the worldwide scientific community, receiving 334 valid proposals from which 2368 observation plans were scheduled. In addition, 13 batches of data, totalling 40 TB, have been released to the public, a public-release ratio of 94%. Users are also provided with several versions of the data-analysis software and a calibration database; the in-orbit calibration accuracy is about 2%, meeting the requirements of scientific analysis. Researchers from 17 international and 36 domestic institutions have carried out scientific research with Insight-HXMT data, publishing more than 300 high-quality papers with about 7300 citations in total.