Funding: Supported by the Short-wave Infrared Camera Systems (B025F40622024).
Abstract: The accuracy of spot centroid positioning has a significant impact on the tracking accuracy of the system and the stability of laser link construction. In satellite laser communication systems, the use of short-wave infrared wavelengths as the beacon light can reduce atmospheric absorption and signal attenuation. However, strong non-uniformity and blind pixels in short-wave infrared images distort the image and degrade spot centroid positioning accuracy. Therefore, high-precision localization of the spot centroid in short-wave infrared images is of great research significance. A high-precision spot centroid positioning model for short-wave infrared is proposed to correct non-uniformity and blind pixels in short-wave infrared images and to quantify the localization errors caused by both; model-based localization error simulations are then performed, and a novel spot centroid positioning payload for satellite laser communication has been designed using the latest 640×512 planar-array InGaAs short-wave infrared detector. The experimental results show that the non-uniformity of the corrected image is reduced from 7% to 0.6%, the blind pixel rejection rate reaches 100%, the frame rate can reach 2000 Hz, and the spot centroid localization accuracy is as high as 0.1 pixel, realizing high-precision spot centroid localization for high-frame-rate short-wave infrared images.
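As a rough illustration of the processing chain this abstract describes (non-uniformity correction, blind-pixel rejection, intensity-weighted centroid), the following minimal NumPy sketch may help; the two-point correction, the 3×3 median replacement, and the threshold ratio are common textbook choices assumed here, not the paper's calibrated pipeline.

```python
import numpy as np

def two_point_correct(raw, dark, flat):
    """Two-point non-uniformity correction: per-pixel gain from averaged
    dark-field and uniform-illumination (flat) calibration frames."""
    gain = (flat.mean() - dark.mean()) / np.clip(flat - dark, 1e-6, None)
    return gain * (raw - dark)

def replace_blind_pixels(img, blind_mask):
    """Replace blind (dead/hot) pixels with the median of their 3x3 neighbourhood."""
    out = img.copy()
    padded = np.pad(img, 1, mode="edge")
    for y, x in zip(*np.nonzero(blind_mask)):
        out[y, x] = np.median(padded[y:y + 3, x:x + 3])
    return out

def spot_centroid(img, threshold_ratio=0.2):
    """Intensity-weighted centroid of the spot above a relative threshold."""
    img = img.astype(float)
    img = np.where(img >= threshold_ratio * img.max(), img, 0.0)
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total
```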
Funding: Supported in part by the Sub Project of the National Key Research and Development Plan in 2020 (No. 2020YFC1511704), Beijing Information Science and Technology University (No. 2020KYNH212, No. 2021CGZH302), the Beijing Science and Technology Project (Grant No. Z211100004421009), and in part by the National Natural Science Foundation of China (Grant No. 62301058).
Abstract: Low Earth orbit (LEO) satellite networks have the advantages of low transmission delay and low deployment cost, playing an important role in providing reliable services to ground users. This paper studies an efficient inter-satellite cooperative computation offloading (ICCO) algorithm for LEO satellite networks. Specifically, an ICCO system model is constructed, which considers using neighboring satellites in the LEO satellite network to collaboratively process tasks generated by ground user terminals, effectively improving resource utilization efficiency. Additionally, the optimization objective of minimizing the system task computation offloading delay and energy consumption is established and decoupled into two sub-problems. For computational resource allocation, the convexity of the problem is proved through theoretical derivation, and the Lagrange multiplier method is used to obtain the optimal allocation of computational resources. To handle the task offloading decision, a dynamic sticky binary particle swarm optimization algorithm is designed to obtain the offloading decision iteratively. Simulation results show that the ICCO algorithm can effectively reduce delay and energy consumption.
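For the offloading-decision step, a generic binary particle swarm optimization sketch is shown below; the sigmoid transfer rule and the toy cost function are illustrative assumptions, and the paper's "dynamic sticky" variant is not reproduced here.

```python
import numpy as np

def binary_pso(cost, n_tasks, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic binary PSO: each particle is a 0/1 vector (process locally vs. offload
    to a neighbouring satellite). `cost(x)` returns a weighted delay-energy objective."""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, size=(n_particles, n_tasks))
    v = rng.normal(0.0, 1.0, size=(n_particles, n_tasks))
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = (rng.random(x.shape) < 1.0 / (1.0 + np.exp(-v))).astype(int)  # sigmoid transfer
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Toy objective: offloading a task costs 1.0, local processing costs 1.5
best, best_cost = binary_pso(lambda x: 1.0 * x.sum() + 1.5 * (len(x) - x.sum()), n_tasks=8)
```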
Funding: Supported by the National Key R&D Program of China (Grant No. 2022YFA1005000) and the National Natural Science Foundation of China (Grant Nos. 62025110 and 62101308).
Abstract: Satellite Internet (SI) provides broadband access as a critical information infrastructure in 6G. However, with the integration of the terrestrial Internet, the influx of massive terrestrial traffic will bring significant threats to SI, among which DDoS attacks will intensify the erosion of limited bandwidth resources. Therefore, this paper proposes a DDoS attack tracking scheme using a multi-round iterative Viterbi algorithm to achieve high-accuracy attack path reconstruction and fast internal source locking, protecting SI from the source. Firstly, to reduce communication overhead, the logarithmic representation of the traffic volume is added to the digests after modeling SI, generating a lightweight deviation degree to construct the observation probability matrix for the Viterbi algorithm. Secondly, the path node matrix is expanded to multi-index matrices in the Viterbi algorithm to store index information for all probability values, deriving the non-repeating path with maximum probability. Finally, multiple rounds of iterative Viterbi tracking are performed locally to track DDoS attacks based on trimming of the tracking results. Simulation and experimental results show that the scheme can achieve 96.8% tracking accuracy for external and internal DDoS attacks within 2.5 seconds, with a communication overhead of 268 KB/s, effectively protecting the limited bandwidth resources of SI.
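The core path-reconstruction step relies on the Viterbi algorithm; a standard log-domain sketch is given below, where the observation matrix built from the deviation degrees is assumed to be supplied by the caller, and the multi-round, multi-index extensions of the paper are not reproduced.

```python
import numpy as np

def viterbi(log_trans, log_emit, log_init):
    """Standard Viterbi in the log domain.
    log_trans[i, j]: log P(next node j | current node i)
    log_emit[t, j] : log observation probability of node j at step t
                     (here it would come from the traffic-deviation degrees)
    Returns the maximum-probability node sequence."""
    T, N = log_emit.shape
    delta = log_init + log_emit[0]            # best log-probability ending in each node
    back = np.zeros((T, N), dtype=int)        # back-pointers
    for t in range(1, T):
        scores = delta[:, None] + log_trans   # shape (N, N): previous node -> next node
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```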
Funding: Supported by the National Natural Science Foundation of China under Grant 62371098, the National Key Laboratory of Wireless Communications Foundation under Grant IFN20230203, and the National Key Research and Development Program of China under Grant 2021YFB2900404.
Abstract: The low-earth-orbit (LEO) satellite network has become a critical component of the satellite-terrestrial integrated network (STIN) due to its superior signal quality and minimal communication latency. However, the highly dynamic nature of LEO satellites leads to limited and rapidly varying contact time between them and Earth stations (ESs), making it difficult to download massive communication and remote sensing data within the limited time window. To address this challenge in heterogeneous satellite networks with coexisting geostationary-earth-orbit (GEO) and LEO satellites, this paper proposes a dynamic collaborative inter-satellite data download strategy to optimize the long-term weighted energy consumption and data downloads within the constraints of on-board power, backlog stability, and time-varying contact. Specifically, Lyapunov optimization theory is applied to transform the long-term stochastic optimization problem, subject to time-varying contact time and on-board power constraints, into multiple deterministic single-time-slot problems, based on which online distributed algorithms are developed to enable each satellite to independently obtain the transmit power allocation and data processing decisions in closed form. Finally, the simulation results demonstrate the superiority of the proposed scheme over benchmarks, e.g., achieving asymptotic optimality of the weighted energy consumption and data downloads while maintaining stability of the on-board backlog.
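The transformation into per-slot problems follows the usual drift-plus-penalty argument of Lyapunov optimization; in generic notation (the symbols below are illustrative, not the paper's exact ones), with Q_i(t) the on-board backlog of satellite i, L(t) = (1/2)Σ_i Q_i²(t) the Lyapunov function, p(t) the per-slot objective (weighted energy minus data downloads), and V > 0 a trade-off weight:

```latex
\Delta(t) + V\,\mathbb{E}\{p(t)\mid \mathbf{Q}(t)\}
  \;\le\; B \;+\; V\,\mathbb{E}\{p(t)\mid \mathbf{Q}(t)\}
  \;+\; \sum_i Q_i(t)\,\mathbb{E}\{a_i(t) - b_i(t)\mid \mathbf{Q}(t)\},
```

where a_i(t) and b_i(t) denote per-slot data arrivals and downloads and B is a bounded constant. Minimizing the right-hand side in each slot, given the observed backlogs Q_i(t), yields the deterministic single-time-slot problems mentioned above.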
Abstract: Low Earth orbit (LEO) satellite networks have outstanding advantages such as wide coverage and independence from the geographic environment; they can provide a broader range of communication services and have become an essential supplement to the terrestrial network. However, the dynamic changes and uneven distribution of satellite network traffic inevitably bring challenges to multipath routing. Even worse, the harsh space environment often leads to incomplete collection of the network state data used for routing decisions, which further complicates this challenge. To address this problem, this paper proposes a state-incomplete intelligent dynamic multipath routing algorithm (SIDMRA) to maximize network efficiency even with incomplete state data as input. Specifically, we model the multipath routing problem as a Markov decision process (MDP) and then combine the deep deterministic policy gradient (DDPG) and the K shortest paths (KSP) algorithms to solve for the optimal multipath routing policy. We use the temporal correlation of the satellite network state to fit the incomplete state data and then use a message passing neural network (MPNN) for data enhancement. Simulation results show that the proposed algorithm outperforms baseline algorithms regarding average end-to-end delay and packet loss rate and performs stably under certain missing rates of state data.
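The KSP component that supplies candidate paths to the learning agent can be sketched with networkx's loop-free shortest-path enumeration; the toy topology, delay values, and node names below are illustrative assumptions, and the DDPG/MPNN parts are not shown.

```python
from itertools import islice
import networkx as nx

def k_shortest_paths(graph, src, dst, k, weight="delay"):
    """First k loop-free shortest paths (Yen-style enumeration via networkx)."""
    return list(islice(nx.shortest_simple_paths(graph, src, dst, weight=weight), k))

# Toy satellite topology: nodes are satellites, edge weight is link delay (ms)
G = nx.Graph()
G.add_weighted_edges_from(
    [("S1", "S2", 5), ("S2", "S4", 7), ("S1", "S3", 6), ("S3", "S4", 4), ("S2", "S3", 2)],
    weight="delay",
)
candidates = k_shortest_paths(G, "S1", "S4", k=3)  # candidate paths fed to the routing agent
```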
Funding: Supported by the National Natural Science Foundation of China (No. 62231012), the Natural Science Foundation for Outstanding Young Scholars of Heilongjiang Province under Grant YQ2020F001, and the Heilongjiang Province Postdoctoral General Foundation under Grant AUGA4110004923.
Abstract: Low earth orbit (LEO) satellites with wide coverage can carry mobile edge computing (MEC) servers with powerful computing capabilities to form the LEO satellite edge computing system, providing computing services for global ground users. In this paper, the computation offloading problem and the resource allocation problem are formulated as a mixed integer nonlinear program (MINLP) problem. This paper proposes a computation offloading algorithm based on the deep deterministic policy gradient (DDPG) to obtain the user offloading decisions and user uplink transmission power. This paper uses a convex optimization algorithm based on the Lagrange multiplier method to obtain the optimal MEC server resource allocation scheme. In addition, the expression of the suboptimal user local CPU cycles is derived by a relaxation method. Simulation results show that the proposed algorithm achieves excellent convergence and significantly reduces the system utility values at considerable time cost compared with other algorithms.
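To illustrate the Lagrange-multiplier step, consider a simplified server-side allocation that minimizes the total processing delay Σ_k c_k/f_k subject to Σ_k f_k = F; the KKT conditions give a closed form with f_k proportional to √c_k. This toy objective is an assumption for illustration, not the paper's exact utility.

```python
import numpy as np

def allocate_cpu(cycles, total_freq):
    """Lagrangian/KKT allocation for  min sum_k cycles_k / f_k  s.t.  sum_k f_k = total_freq.
    Stationarity (-cycles_k / f_k^2 + lambda = 0) gives f_k proportional to sqrt(cycles_k)."""
    cycles = np.asarray(cycles, dtype=float)
    return total_freq * np.sqrt(cycles) / np.sqrt(cycles).sum()

# Three offloaded tasks with different CPU-cycle demands, one MEC server of 10 GHz
f = allocate_cpu([2e9, 8e9, 18e9], total_freq=10e9)  # -> allocation proportional to sqrt of demand
```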
Funding: Supported by the National Key R&D Program of China under Grant 2020YFB1807900, the National Natural Science Foundation of China (NSFC) under Grant 61931005, and the Beijing University of Posts and Telecommunications-China Mobile Research Institute Joint Innovation Center.
Abstract: The low Earth orbit (LEO) satellite has become an important complement to terrestrial communication due to its lower orbital altitude and smaller propagation delay than the geostationary satellite. However, the LEO satellite communication system cannot meet user requirements when the satellite-terrestrial link is blocked by obstacles. To solve this problem, we introduce an intelligent reflecting surface (IRS) to improve the achievable rate of terrestrial users in LEO satellite communication. We investigate a joint IRS scheduling, user scheduling, power and bandwidth allocation (JIRPB) optimization algorithm to improve LEO satellite system throughput. The problem of joint user scheduling and resource allocation is formulated as a non-convex optimization problem. To cope with this, the non-convex optimization problem is first divided into a resource allocation optimization sub-problem and a scheduling optimization sub-problem. Second, we optimize the resource allocation sub-problem via the alternating direction method of multipliers (ADMM) and the scheduling sub-problem via the Lagrangian dual method, repeatedly. Third, we prove theoretically that the proposed ADMM-based resource allocation algorithm approaches sublinear convergence. Finally, we demonstrate that the proposed JIRPB optimization algorithm improves the LEO satellite communication system throughput.
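The ADMM machinery used for the resource-allocation sub-problem follows the usual three-step pattern of a primal update, a proximal update, and a dual update; the sketch below runs it on a toy l1-regularized quadratic chosen only so that both sub-problems have closed forms, and is not the JIRPB problem itself.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1(a, lam=0.5, rho=1.0, n_iters=100):
    """Toy ADMM: minimize 0.5*||x - a||^2 + lam*||z||_1  subject to  x = z."""
    x = np.zeros_like(a)
    z = np.zeros_like(a)
    u = np.zeros_like(a)  # scaled dual variable
    for _ in range(n_iters):
        x = (a + rho * (z - u)) / (1.0 + rho)   # quadratic sub-problem (closed form)
        z = soft_threshold(x + u, lam / rho)    # l1 sub-problem (proximal step)
        u = u + x - z                           # dual ascent on the consensus constraint
    return z

print(admm_l1(np.array([2.0, 0.3, -1.2, 0.05])))  # -> approximately soft_threshold(a, lam)
```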
Funding: This work was supported by the Beijing Natural Science Foundation (L202003).
Abstract: Satellite communication systems are facing serious electromagnetic interference, and interference signal recognition is a crucial foundation for targeted anti-interference. In this paper, we propose a novel interference recognition algorithm called HDCGD-CBAM, which adopts the time-frequency images (TFIs) of signals to effectively extract their temporal and spectral characteristics. In the proposed method, we improve the Convolutional Long Short-Term Memory Deep Neural Network (CLDNN) in two ways. First, the simpler Gate Recurrent Unit (GRU) is used instead of the Long Short-Term Memory (LSTM), reducing model parameters while maintaining recognition accuracy. Second, we replace convolutional layers with hybrid dilated convolution (HDC) to expand the receptive field of feature maps, which captures the correlation of time-frequency data on a larger spatial scale. Additionally, the Convolutional Block Attention Module (CBAM) is introduced before and after the HDC layers to strengthen the extraction of critical features and improve recognition performance. The experimental results show that the HDCGD-CBAM model significantly outperforms existing methods in terms of recognition accuracy and complexity. When the Jamming-to-Signal Ratio (JSR) varies from -30 dB to 10 dB, it achieves an average accuracy of 78.7% and outperforms the CLDNN by 7.29% while reducing the Floating Point Operations (FLOPs) by 79.8% to 114.75 M. Moreover, the proposed model has fewer parameters, with 301k, compared to several state-of-the-art methods.
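A minimal PyTorch sketch of the two modifications described above (GRU instead of LSTM, hybrid dilated convolutions over the TFI) is given below; the dilation rates, channel widths, and pooling choices are assumptions, and the CBAM attention blocks are omitted.

```python
import torch
import torch.nn as nn

class HDCGRUNet(nn.Module):
    """Minimal hybrid-dilated-convolution + GRU classifier over time-frequency images.
    Increasing dilation rates (1, 2, 5) enlarge the receptive field without extra
    parameters; the GRU then models the temporal axis of the TFI."""
    def __init__(self, n_classes, channels=32, hidden=64):
        super().__init__()
        self.hdc = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1, dilation=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=5, dilation=5), nn.ReLU(),
        )
        self.gru = nn.GRU(input_size=channels, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                      # x: (batch, 1, freq, time)
        feat = self.hdc(x).mean(dim=2)         # average over frequency -> (batch, C, time)
        feat = feat.permute(0, 2, 1)           # (batch, time, C) for the GRU
        _, h = self.gru(feat)                  # h: (1, batch, hidden)
        return self.fc(h.squeeze(0))           # class logits

logits = HDCGRUNet(n_classes=10)(torch.randn(4, 1, 64, 128))  # 4 TFIs of size 64x128
```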
Funding: Supported by the Key R&D Project of the Ministry of Science and Technology of China (2020YFB1808005).
Abstract: Low Earth Orbit (LEO) multibeam satellites will be widely used in the next generation of satellite communication systems, whose inter-beam interference will inevitably limit the performance of the whole system. Nonlinear precoding such as the Tomlinson-Harashima precoding (THP) algorithm has been proved to be a promising technology to solve this problem, as it has a smaller noise amplification effect compared with linear precoding. However, the similarity of different user channels (defined as channel correlation) will degrade the performance of the THP algorithm. In this paper, we qualitatively analyze the inter-beam interference over the whole pass of a LEO satellite across a specific coverage area, and the impact of channel correlation on the Signal-to-Noise Ratio (SNR) of receivers when THP is applied. A user grouping algorithm is proposed based on the analysis of channel correlation, which decreases the number of users with high channel correlation in each precoding group and thus improves the performance of THP. Furthermore, our algorithm is designed under the premise of co-frequency deployment and orthogonal frequency division multiplexing (OFDM), which leads to more users under severe inter-beam interference compared to the existing research on geostationary orbit satellite broadcasting systems. Simulation results show that the proposed user grouping algorithm achieves higher channel capacity and better bit error rate (BER) performance in high-SNR conditions relative to existing works.
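The grouping idea can be illustrated with a simple greedy rule over pairwise channel correlations: a user joins a precoding group only if its correlation with every current member stays below a threshold. The threshold, the random channels, and the greedy rule below are illustrative assumptions, not the paper's exact grouping criterion.

```python
import numpy as np

def channel_correlation(H):
    """Pairwise |h_i^H h_j| / (||h_i|| ||h_j||) for user channel vectors stacked as rows of H."""
    Hn = H / np.linalg.norm(H, axis=1, keepdims=True)
    return np.abs(Hn @ Hn.conj().T)

def group_users(H, max_corr=0.5):
    """Greedy grouping: a user joins the first group where its correlation with every
    member is below `max_corr`; otherwise a new precoding group is opened."""
    corr = channel_correlation(H)
    groups = []
    for u in range(H.shape[0]):
        for g in groups:
            if all(corr[u, v] < max_corr for v in g):
                g.append(u)
                break
        else:
            groups.append([u])
    return groups

# 6 users, 8 satellite antennas: random complex channels (illustrative only)
rng = np.random.default_rng(1)
H = rng.normal(size=(6, 8)) + 1j * rng.normal(size=(6, 8))
print(group_users(H, max_corr=0.5))
```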
Funding: Supported by the National Natural Science Foundation of China (62171390), Central Universities of Southwest Minzu University (ZYN2022032, 2023NYXXS034), and the State Scholarship Fund of the China Scholarship Council (No. 202008510081).
Abstract: In LEO satellite communication networks, the number of satellites has increased sharply and the relative velocity of satellites is very high, so electronic signal aliasing occurs from time to time. These aliasing signals degrade the receiving ability of the signal receiver, weaken the signal processing ability, and lower the anti-interference ability of the communication system. To address the above problems, save communication resources, and improve communication efficiency, and considering the irregularity of interference signals, underdetermined blind separation technology can effectively deal with interference sensing and signal reconstruction in this scenario. To improve the stability of source signal separation and the security of information transmission, a greedy optimization algorithm can be executed. At the same time, to improve network information transmission efficiency and prevent the algorithm from getting trapped in local optima, low-energy points are deleted during each iteration. Ultimately, simulation experiments validate that the algorithm presented in this paper enhances both the transmission efficiency of the network transmission system and the security of the communication system, achieving interference sensing and signal reconstruction in the LEO satellite communication system.
Abstract: As the demands for massive connections and vast coverage rapidly grow in next-generation wireless communication networks, rate splitting multiple access (RSMA) is considered a promising new access scheme since it can provide higher efficiency with limited spectrum resources. In this paper, combining spectrum splitting with rate splitting, we propose to allocate resources with traffic offloading in hybrid satellite-terrestrial networks. A novel deep reinforcement learning method is adopted to solve this challenging non-convex problem. However, the never-ending learning process could prohibit its practical implementation. Therefore, we introduce a switch mechanism to avoid unnecessary learning. Additionally, the QoS constraint in the scheme can rule out unsuccessful transmissions. The simulation results validate the energy efficiency performance and the convergence speed of the proposed algorithm.
Funding: Supported by the National Key Research and Development Program of China under Grant 2020YFB1807700, the National Natural Science Foundation of China (NSFC) under Grants No. 62201414 and 62201432, the Qinchuangyuan Project (OCYRCXM-2022-362), the Fundamental Research Funds for the Central Universities and the Innovation Fund of Xidian University under Grant YJSJ24017, and the Guangzhou Science and Technology Program under Grant 202201011732.
Abstract: With the explosive growth of high-definition video streaming data, a substantial increase in network traffic has ensued. The emergence of mobile edge caching (MEC) can not only alleviate the burden on the core network but also significantly improve user experience. By integrating MEC with satellite networks, the network is empowered to deliver popular content ubiquitously and seamlessly. Addressing the research gap between multilayer satellite networks and MEC, we study the caching placement problem in this paper. Initially, we introduce a three-layer distributed network caching management architecture designed for efficient and flexible handling of large-scale networks. Considering the constraints on satellite capacity and content propagation delay, the cache placement problem is then formulated and transformed into a Markov decision process (MDP), where a content coded caching mechanism is utilized to promote the efficiency of content delivery. Furthermore, a new generic metric, content delivery cost, is proposed to characterize the performance of caching decisions in large-scale networks. Then, we introduce a graph convolutional network (GCN)-based multi-agent advantage actor-critic (A2C) algorithm to optimize the caching decision. Finally, extensive simulations are conducted to evaluate the proposed algorithm in terms of content delivery cost and transferability.
Funding: Supported by the National Key Research and Development Program of China (Grant No. 2021YFA1402100).
Abstract: Realization of a high-performance satellite onboard clock is vital for various positioning, navigation, and timing applications. For further improvement of synchronization-based satellite time and frequency references, we propose a geosynchronous (GEO) satellite virtual clock concept based on ground-satellite synchronization and present a beacon transponder structure for its implementation (scheduled for launch in 2025), which does not require atomic clocks to be mounted on the satellite. Its high performance relies only on minor modifications to the existing transponder structure of GEO satellites. We carefully model the carrier phase link and analyze the factors causing link asymmetry within the framework of special relativity. Considering that the performance of such synchronization-based satellite clocks is primarily limited by the link's random phase noise, which cannot be adequately modeled, we design a closed-loop experiment based on commercial GEO satellites for pre-evaluation. This experiment aims at extracting the zero-mean random part of the ground-satellite Ku-band carrier phase via a feedback loop. Ultimately, we obtain a 1σ value of 0.633 ps (two-way link), following a Gaussian distribution. From this result, we conclude that the proposed real-time Einstein-synchronization-defined satellite virtual clock can achieve picosecond-level replication of onboard time and frequency.
Funding: Supported by the National Key Research and Development Program of China (No. 2020YFB1806000).
Abstract: The high-speed movement of satellites makes it infeasible to directly apply mature terrestrial routing schemes to the satellite network. DT-DVTR, a representative snapshot-based connection-oriented routing strategy, still has room for improvement in terms of routing stability. In this paper, we propose an improved connection-oriented routing strategy named Minimal Topology Change Routing based on Collaborative Rules (MTCR-CR). MTCR-CR uses continuous-time static topology snapshots based on satellite status to search for inter-satellite link (ISL) construction solutions that meet the minimum number of topology changes, thereby avoiding route oscillations. Simulation results on Beidou-3 show that, compared with DT-DVTR, MTCR-CR reduces the number of routing changes by about 92%, reduces the number of path changes caused by routing changes by about 38%, and reduces the rerouting time by approximately 47%. To demonstrate our algorithm more comprehensively, the same experimental indices were also tested on the Globalstar satellite constellation.
Funding: Supported by the National Natural Science Foundation of China (62032003).
Abstract: Recent advancements in satellite technologies and the declining cost of access to space have led to the emergence of large satellite constellations in Low Earth Orbit (LEO). However, these constellations often rely on a bent-pipe architecture, resulting in high communication costs. Existing onboard inference architectures suffer from low accuracy and inflexibility in the deployment and management of in-orbit applications. To address these challenges, we propose a cloud-native satellite design specifically tailored for Earth observation tasks, enabling diverse computing paradigms. In this work, we present a case study of a satellite-ground collaborative inference system deployed in the Tiansuan constellation, demonstrating a remarkable 50% accuracy improvement and a substantial 90% data reduction. Our work also sheds light on in-orbit energy consumption, where in-orbit computing accounts for 17% of the total onboard energy consumption. Our approach represents a significant advancement of the cloud-native satellite, aiming to enhance the accuracy of in-orbit computing while simultaneously reducing communication cost.
Abstract: Currently, China has 32 Earth observation satellites in orbit. The satellites can provide various data such as optical, multispectral, infrared, and radar. The spatial resolution of China's Earth observation satellites ranges from low to medium to high. The satellites possess the capability to observe across multiple spectral bands, under all weather conditions, and at all times. The data of China's Earth observation satellites have been widely used in fields such as natural resource detection, environmental monitoring and protection, disaster prevention and reduction, urban planning and mapping, agricultural and forestry surveys, land survey and geological prospecting, and ocean forecasting, achieving huge social benefits. This article introduces the recent progress of Earth observation satellites in China since 2022, especially satellite operation, data archiving, data distribution, and data coverage.
Funding: Supported by the National Key R&D Program of China under Grant 2023YFB2904703, the National Natural Science Foundation of China under Grants 62341110, 62371122, and 62322104, the Jiangsu Province Basic Research Project under Grant BK20192002, and the Fundamental Research Funds for the Central Universities under Grants 2242022k30005 and 2242023K5003.
Abstract: This paper investigates low earth orbit (LEO) satellite-enabled coded compressed sensing (CCS) unsourced random access (URA) in an orthogonal frequency division multiple access (OFDMA) framework, where a massive uniform planar array (UPA) is equipped on the satellite. In LEO satellite communications, unavoidable timing and frequency offsets cause phase shifts in the transmitted signals, substantially diminishing the decoding performance of the current terrestrial CCS URA receiver. To cope with this issue, we expand the inner codebook with predefined timing and frequency offsets and formulate the inner decoding as a tractable compressed sensing (CS) problem. Additionally, we leverage the inherent sparsity of the UPA-equipped LEO satellite angular-domain channels, thereby enabling the outer decoder to support more active devices. Furthermore, the outputs of the outer decoder are used to reduce the search space of the inner decoder, which cuts down the computational complexity and accelerates the convergence of the inner decoding. Simulation results verify the effectiveness of the proposed scheme.
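The inner decoding cast as a compressed sensing problem can be illustrated with plain orthogonal matching pursuit; in the paper's setting the sensing matrix would be the codebook expanded with timing/frequency-offset hypotheses, whereas the random matrix and sparsity level below are toy assumptions.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: recover a `sparsity`-sparse x from y = A x."""
    residual = y.astype(complex)
    support = []
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(A.conj().T @ residual))))  # most correlated atom
        sub = A[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)                 # re-fit on the support
        residual = y - sub @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(64, 256)) / np.sqrt(64)      # toy sensing matrix (codebook stand-in)
x_true = np.zeros(256)
x_true[[10, 87, 201]] = [1.0, -2.0, 0.5]
print(np.nonzero(omp(A, A @ x_true, sparsity=3))[0])  # typically recovers the true support
```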
Abstract: In this paper, the problem of abnormal spectrum usage between satellite spectrum sharing systems is investigated to support multi-satellite spectrum coexistence. Given the cost of monitoring, the mobility of low-orbit satellites, and the directional nature of their signals, traditional monitoring methods are no longer suitable, especially in the case of multiple power levels. Mobile crowdsensing (MCS), as a new technology, can make full use of idle resources to complete a variety of sensing tasks. However, traditional MCS heavily relies on a centralized server and is vulnerable to single-point-of-failure attacks. Therefore, we replace the original centralized server with a blockchain-based distributed service provider to ensure security. In this work, we propose a blockchain-based MCS framework and explain in detail how it can achieve abnormal frequency behavior monitoring in an inter-satellite spectrum sharing system. Then, under a given false alarm probability, we propose an abnormal spectrum detection algorithm based on a mixed hypothesis test to maximize the detection probability in single power level and multiple power level scenarios, respectively. Finally, a Bad out of Good (BooG) detector is proposed to ease the computational pressure on the blockchain nodes. Simulation results show the effectiveness of the proposed framework.
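For the single-power-level case, a Neyman-Pearson energy detector conveys the idea of fixing the decision threshold from the false-alarm probability; the Gaussian approximation and parameter values below are textbook assumptions, not the paper's mixed hypothesis test.

```python
import numpy as np
from scipy.stats import norm

def energy_detector_threshold(noise_power, n_samples, p_fa):
    """Threshold for an energy detector on complex baseband samples (Gaussian
    approximation, large N): choose lambda so that P(T > lambda | noise only) = p_fa,
    where T is the average of |x_n|^2 with mean sigma^2 and std sigma^2/sqrt(N)."""
    return noise_power * (1.0 + norm.isf(p_fa) / np.sqrt(n_samples))

def detect(samples, noise_power, p_fa=1e-3):
    """Flag abnormal spectrum usage when the averaged signal energy exceeds the threshold."""
    test_stat = np.mean(np.abs(samples) ** 2)
    return test_stat > energy_detector_threshold(noise_power, samples.size, p_fa)

rng = np.random.default_rng(0)
noise = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) / np.sqrt(2)  # unit-power noise
print(detect(noise, noise_power=1.0))         # False with probability ~(1 - p_fa)
print(detect(noise + 0.3, noise_power=1.0))   # added energy offset -> likely flagged
```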
Funding: Supported in part by the National Natural Science Foundation for Distinguished Young Scholars under Grant 61825104, in part by the National Natural Science Foundation of China under Grant 62201582, in part by the National Natural Science Foundation of China under Grant 62101450, in part by the Key R&D Plan of Shaanxi Province under Grant 2023YBGY037, in part by the National Key R&D Program of China (2022YFC3301300), in part by the Natural Science Basic Research Program of Shaanxi under Grant 2022JQ-632, and in part by the Innovative Cultivation Project of the School of Information and Communication of the National University of Defense Technology under Grant YJKT-ZD-2202.
Abstract: In this paper, we study the covert performance of downlink low earth orbit (LEO) satellite communication, where an unmanned aerial vehicle (UAV) is employed as a cooperative jammer. To maximize the covert rate of the LEO satellite transmission, a multi-objective problem is formulated to jointly optimize the UAV's jamming power and trajectory. For practical consideration, we assume that the UAV can only have partial environmental information and cannot know the detection threshold and exact location of the eavesdropper on the ground. To solve the multi-objective problem, we propose a data-driven generative adversarial network (DD-GAN)-based method to optimize the power and trajectory of the UAV, in which the sample data are collected by using a genetic algorithm (GA). Simulation results show that the jamming solution of the UAV generated by DD-GAN can achieve an effective trade-off between the covert rate and the probability of detection errors when only limited prior information is obtained.
Funding: Supported by the National Natural Science Foundation of China (Grant No. U21B2075).
Abstract: Dynamic analysis of the tethered satellite system (TSS) can provide a fundamental guideline for the performance evaluation and robust design of the system examined. Uncertainties inherent in the parameters would induce unexpected variation of the response and deteriorate the reliability of the system. In this work, the effect of uncertain satellite mass on the deployment and retrieval dynamics of the TSS is investigated. First, the interval model is employed to take the variation of satellite mass into account in the processes of deployment and retrieval. Then, the Chebyshev interval method is used to obtain the lower and upper response bounds of the TSS. To achieve smooth and reliable deployment and retrieval, nonlinear programming based on the Gauss pseudospectral method is adopted to obtain the optimal trajectory of tether velocity. Numerical results show that the uncertainties of satellite mass have a distinct influence on the response of tether tension in the processes of deployment and retrieval.
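The bounding step can be illustrated by fitting a Chebyshev surrogate of a scalar response over the uncertain mass interval and bounding its range; the toy response function and interval below are assumptions for illustration, not the TSS tether-tension dynamics.

```python
import numpy as np
from numpy.polynomial import Chebyshev

def interval_bounds(response, lo, hi, degree=6, n_eval=2001):
    """Chebyshev-surrogate bounds of a scalar response over an uncertain parameter in [lo, hi]:
    sample the response at Chebyshev-distributed nodes, fit a Chebyshev polynomial,
    then bound its range on a dense grid."""
    nodes = np.cos((2 * np.arange(degree + 1) + 1) * np.pi / (2 * (degree + 1)))  # nodes in [-1, 1]
    x = 0.5 * (hi + lo) + 0.5 * (hi - lo) * nodes
    surrogate = Chebyshev.fit(x, [response(v) for v in x], deg=degree, domain=[lo, hi])
    values = surrogate(np.linspace(lo, hi, n_eval))
    return values.min(), values.max()

# Toy response: a smooth function of the uncertain satellite mass m in [95, 105] kg (illustrative)
lower, upper = interval_bounds(lambda m: 12.0 + 0.05 * m + 0.3 * np.sin(0.2 * m), 95.0, 105.0)
```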