This paper proposes a comprehensive data-driven prediction framework based on machine learning to investigate lag synchronization in coupled chaotic systems, particularly in cases where accurate mathematical models are difficult to establish or where the system equations are unknown. A Long Short-Term Memory (LSTM) neural network is trained on time series acquired from the desynchronized system states and subsequently predicts the transition to lag synchronization. In the experiments, we focus on Lorenz systems with time-varying delayed coupling, studying the effects of the coupling coefficient and the time delay on lag synchronization, respectively. The results indicate that, with appropriate training, the machine learning model can accurately predict the occurrence of lag synchronization and its transition. This study not only enhances our understanding of synchronization behaviors in complex networks but also underscores the potential and practical applications of machine learning in exploring nonlinear dynamical systems.
Funding: Supported by the National Natural Science Foundation of China (No. 52174184).
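The abstract describes the pipeline only at a high level; the following is a minimal, self-contained sketch (Python with NumPy and TensorFlow/Keras) of one way such a data-driven setup could look. It is not the authors' implementation: the coupling scheme (unidirectional diffusive coupling with a constant delay rather than the time-varying delay studied in the paper), the Euler integrator, and all parameter values (coupling strength k, delay tau, window length, network size, epochs) are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not the authors' code): simulate a drive Lorenz
# system and a delay-coupled response Lorenz system, then train a small LSTM
# on windows of the response trajectory to predict the instantaneous
# lag-synchronization error. All parameter values below are assumptions.
import numpy as np
import tensorflow as tf

# Standard Lorenz parameters; dt, tau and k are illustrative choices.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 0.01, 20000
tau, k = 0.5, 6.0                      # assumed delay and coupling strength
lag = int(tau / dt)                    # delay expressed in integration steps

def lorenz(state):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# Euler integration of the drive (x1) and the delay-coupled response (x2).
x1 = np.zeros((steps, 3)); x2 = np.zeros((steps, 3))
x1[0] = [1.0, 1.0, 1.0]; x2[0] = [-5.0, 2.0, 10.0]
for t in range(steps - 1):
    drive_delayed = x1[max(t - lag, 0)]            # x1(t - tau)
    x1[t + 1] = x1[t] + dt * lorenz(x1[t])
    coupling = k * (drive_delayed - x2[t])         # diffusive delayed coupling
    x2[t + 1] = x2[t] + dt * (lorenz(x2[t]) + coupling)

# Lag-synchronization error e(t) = || x1(t - tau) - x2(t) ||.
err = np.linalg.norm(x1[:steps - lag] - x2[lag:], axis=1)

# Sliding windows over the response trajectory as LSTM inputs,
# with the next-step error as the prediction target.
window = 50
X = np.stack([x2[lag + i : lag + i + window] for i in range(len(err) - window)])
y = err[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 3)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)
```

A constant delay keeps the history bookkeeping to a single index offset; handling the time-varying delay considered in the paper would instead require indexing the drive history with a time-dependent lag, and the network would be trained on data from the desynchronized regime before being asked to predict the synchronization transition.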