Cloud computing technology is changing the development and usage patterns of IT infrastructure and applications. Virtualized and distributed systems, together with unified management and scheduling, have greatly improved computing and storage. Management has become easier, and OAM costs have been significantly reduced. Cloud desktop technology is developing rapidly. With this technology, users can flexibly and dynamically use virtual machine resources, companies' efficiency in using and allocating resources is greatly improved, and information security is ensured. In most existing virtual cloud desktop solutions, computing and storage are bound together, and data is stored as image files. This limits the flexibility and expandability of systems and is insufficient for meeting customers' requirements in different scenarios.
Reversible data hiding techniques are capable of reconstructing the original cover image from stego-images. Recently, many researchers have focused on reversible data hiding to protect intellectual property rights. In this paper, we combine reversible data hiding with the chaotic Henon map as an encryption technique to achieve an acceptable level of confidentiality in cloud computing environments. Haar digital wavelet transformation (HDWT) is applied to convert an image from the spatial domain into the frequency domain, and the decimal parts of the coefficients and the integer parts of the high-frequency band are then modified to hide secret bits. Finally, the modified coefficients are inversely transformed to produce the stego-images.
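As a concrete illustration of the chaotic-encryption step, the following minimal sketch uses the Henon map as a keystream generator and XORs the stream with image bytes. The map parameters, the initial conditions serving as the key, and the byte-extraction rule are illustrative assumptions, not the construction from the paper:

```python
# A minimal sketch (not the paper's exact construction) of using the chaotic
# Henon map as a keystream generator to encrypt image bytes by XOR.
import numpy as np

def henon_keystream(n_bytes, x0=0.1, y0=0.3, a=1.4, b=0.3, burn_in=1000):
    """Iterate the Henon map and turn trajectory values into bytes."""
    x, y = x0, y0
    out = np.empty(n_bytes, dtype=np.uint8)
    for _ in range(burn_in):               # discard transient so output looks random
        x, y = 1.0 - a * x * x + y, b * x
    for i in range(n_bytes):
        x, y = 1.0 - a * x * x + y, b * x
        out[i] = int(abs(x) * 1e6) % 256   # extract one byte from the orbit
    return out

def xor_encrypt(data: bytes, key=(0.1, 0.3)) -> bytes:
    ks = henon_keystream(len(data), x0=key[0], y0=key[1])
    return bytes(np.frombuffer(data, dtype=np.uint8) ^ ks)

# Encryption and decryption are the same operation (XOR is an involution):
cipher = xor_encrypt(b"secret image bytes")
assert xor_encrypt(cipher) == b"secret image bytes"
```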
The Cloud is increasingly being used to store and process big data for its tenants, and classical security mechanisms based on encryption are neither sufficiently efficient nor well suited to protecting big data in the Cloud. In this paper, we present an alternative approach that divides big data into sequenced parts and stores them among multiple Cloud storage service providers. Instead of protecting the big data itself, the proposed scheme protects the mapping of the various data elements to each provider using a trapdoor function. Analysis, comparison, and simulation show that the proposed scheme is efficient and secure for the big data of Cloud tenants.
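The following toy sketch conveys the scattering idea: data is cut into sequenced parts, and what the secret protects is the part-to-provider mapping rather than the data itself. HMAC-SHA256 stands in here for the paper's trapdoor function, and the provider names and part size are made up for illustration:

```python
# A toy sketch of the dissemination idea: split the data into sequenced parts
# and scatter them over several providers, protecting only the part-to-provider
# mapping. HMAC-SHA256 stands in for the paper's trapdoor function.
import hmac, hashlib

PROVIDERS = ["cloudA", "cloudB", "cloudC"]   # hypothetical storage services

def place(key: bytes, part_index: int):
    """Derive (provider, opaque object name) for one part from the secret key."""
    tag = hmac.new(key, part_index.to_bytes(8, "big"), hashlib.sha256).digest()
    provider = PROVIDERS[tag[0] % len(PROVIDERS)]
    object_name = tag.hex()                  # unlinkable name seen by the provider
    return provider, object_name

def scatter(key: bytes, data: bytes, part_size: int = 4):
    parts = [data[i:i + part_size] for i in range(0, len(data), part_size)]
    return {place(key, i): p for i, p in enumerate(parts)}

def gather(key: bytes, stored: dict, n_parts: int) -> bytes:
    # Only the key holder can regenerate the mapping and restore the sequence.
    return b"".join(stored[place(key, i)] for i in range(n_parts))

blob = b"big data of a cloud tenant"
key = b"tenant-secret-key"
stored = scatter(key, blob)
assert gather(key, stored, len(stored)) == blob
```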
Cloud computing is becoming an important solution for providing scalable computing resources via the Internet. Because there are tens of thousands of nodes in a data center, the probability of server failures is nontrivial, so guaranteeing service reliability is a critical challenge. Fault-tolerance strategies, such as checkpointing, are commonly employed. However, if an edge switch fails, the checkpoint image may become inaccessible, so current checkpoint-based fault-tolerance methods cannot achieve the best effect. In this paper, we propose an optimal checkpoint method that is aware of edge-switch failures. The method comprises two algorithms. The first algorithm uses the data center topology and communication characteristics to select the checkpoint image storage server. The second algorithm uses the checkpoint image storage characteristics as well as the data center topology to select the recovery server. Simulation experiments demonstrate the effectiveness of the proposed method.
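A minimal sketch of the selection principle behind the first algorithm appears below: place the checkpoint image under a different edge switch than the protected VM, so that a single edge-switch failure cannot cut off both. The topology map, load metric, and tie-breaking rule are assumptions for illustration, not the paper's exact algorithm:

```python
# Illustrative rule: store the checkpoint image under a *different* edge switch
# than the protected VM, so one edge-switch failure cannot take down both the
# VM and its image. Topology and loads below are hypothetical.

EDGE_OF = {                     # server -> edge switch (hypothetical topology)
    "s1": "e1", "s2": "e1",
    "s3": "e2", "s4": "e2",
    "s5": "e3",
}
LOAD = {"s1": 0.7, "s2": 0.2, "s3": 0.4, "s4": 0.1, "s5": 0.9}

def pick_checkpoint_server(vm_host: str) -> str:
    """Choose the least-loaded storage server outside the VM's failure domain."""
    vm_edge = EDGE_OF[vm_host]
    candidates = [s for s, e in EDGE_OF.items() if e != vm_edge]
    if not candidates:
        raise RuntimeError("no server outside the edge failure domain")
    return min(candidates, key=LOAD.__getitem__)

print(pick_checkpoint_server("s1"))   # -> "s4": different edge switch, lowest load
```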
Cloud computing can significantly improve efficiency in Internet utilization and data management. Several cloud applications (file sharing, backup, data upload/download, etc.) imply transfers of large amounts of data without real-time requirements. In several use cases, cloud computing solutions reduce operational costs and guarantee target QoS. These solutions become critical when satellite systems are utilized, since resources are limited, network latency is huge, and bandwidth costs are high. Using satellite capacity for bulk cloud computing traffic, while keeping acceptable performance for interactive applications, is very important and can limit connectivity costs. This goal can be achieved by installing a proxy agent in the Set Top Box (STB) to differentiate traffic and assign bandwidth according to priority, leaving spare capacity to bulk cloud computing traffic. This aim is typically reached using a specific QoS architecture that adds functional blocks at the network or lower layers. We propose instead to manage such a process at the transport layer only. The endpoint proxy implements a new transport protocol called TCP Noordwijk+ (TCPN+), which introduces a flow-control differentiation capability: it efficiently transfers low-priority bulk data while handling interactive data, keeping a high degree of friendliness. The outcomes of ns-2 simulations confirm the applicability and good performance of the proposed solution.
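The allocation policy can be summarized in a few lines: interactive traffic is served first, and bulk cloud traffic receives only the spare capacity. The per-tick model and the numbers below are illustrative, not taken from the paper:

```python
# A back-of-the-envelope sketch of the proxy's policy: interactive traffic is
# served first, bulk cloud transfers get only the leftover satellite capacity.

def allocate(capacity_kbps: float, interactive_demand: float, bulk_demand: float):
    interactive = min(interactive_demand, capacity_kbps)
    bulk = min(bulk_demand, capacity_kbps - interactive)   # spare capacity only
    return interactive, bulk

# On a 2 Mbit/s satellite link, 600 kbit/s of interactive traffic leaves
# 1400 kbit/s for bulk transfers:
print(allocate(2000, 600, 5000))   # -> (600, 1400)
```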
In the analysis of big data, deep learning is a crucial technique. Big data analysis tasks are typically carried out in the cloud, since it offers strong computing capabilities and storage capacity. Nevertheless, there is a contradiction between the open nature of the cloud and data owners' demand to maintain their privacy. To use cloud resources for privacy-preserving data training, a viable method must be found. A privacy-preserving deep learning model (PPDLM) is suggested in this research to address this issue. To preserve data privacy, we first encrypt the data using a homomorphic encryption (HE) approach. Moreover, the deep learning algorithm's activation function, the sigmoid function, is approximated using the least-squares method so as to avoid operations other than addition and multiplication, which homomorphic encryption does not support. Finally, experimental results show that PPDLM is effective at protecting private data. Compared with a non-privacy-preserving deep learning model (NPPDLM), PPDLM has higher computational efficiency.
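The least-squares step can be reproduced in a few lines: fit a low-degree polynomial to the sigmoid so the activation needs only additions and multiplications, the operations an HE scheme supports. The degree and fitting interval below are illustrative choices, not necessarily the paper's:

```python
# A minimal sketch of the least-squares step: replace the sigmoid with a
# low-degree polynomial so the activation uses only additions and
# multiplications. Degree and fitting interval are assumed here.
import numpy as np

x = np.linspace(-8.0, 8.0, 1000)          # fitting interval (assumed)
sigmoid = 1.0 / (1.0 + np.exp(-x))

coeffs = np.polyfit(x, sigmoid, deg=3)    # least-squares cubic fit
poly = np.poly1d(coeffs)

err = np.max(np.abs(poly(x) - sigmoid))
print(f"max abs error of cubic fit on [-8, 8]: {err:.4f}")
# Under HE, poly(x) would be evaluated homomorphically on ciphertexts.
```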
In light of the coronavirus disease 2019 (COVID-19) outbreak caused by the novel coronavirus, companies and institutions have instructed their employees to work from home as a precautionary measure to reduce the risk of contagion. Employees, however, have been exposed to different security risks because of working from home. Moreover, the rapid global spread of COVID-19 has increased the volume of data generated from various sources. Working from home depends mainly on cloud computing (CC) applications that help employees to efficiently accomplish their tasks. The cloud computing environment (CCE) is an unsung hero in the COVID-19 pandemic crisis: its fast-paced service practices reflect the trend of rapidly deployable applications for maintaining data. Despite the increase in the use of CC applications, there is an ongoing research challenge in the CCE domain concerning data, guaranteeing security, and the availability of CC applications. This paper, to the best of our knowledge, is the first to thoroughly explain the impact of the COVID-19 pandemic on CCE. Additionally, it highlights the security risks of working from home during the COVID-19 pandemic.
Due to the recent explosion of big data, our society has been rapidly going through digital transformation and entering a new world with numerous eye-opening developments. These new trends impact society and future jobs, and thus student careers. At the heart of this digital transformation is data science, the discipline that makes sense of big data. With many rapidly emerging digital challenges ahead of us, this article discusses perspectives on iSchools' opportunities and offers suggestions for data science education. We argue that iSchools should empower their students with "information computing" disciplines, which we define as the ability to solve problems and create values, information, and knowledge using tools in application domains. As specific approaches to enforcing information computing disciplines in data science education, we suggest three foci: user-based, tool-based, and application-based. These three foci will serve to differentiate the data science education of iSchools from that of computer science or business schools. We present a layered Data Science Education Framework (DSEF) with building blocks that include the three pillars of data science (people, technology, and data), computational thinking, data-driven paradigms, and data science lifecycles. Data science courses built on top of this framework should thus be executed with user-based, tool-based, and application-based approaches. This framework will help our students think about data science problems from a big-picture perspective and foster appropriate problem-solving skills in conjunction with broad perspectives of data science lifecycles. We hope the DSEF discussed in this article will help fellow iSchools in their design of new data science curricula.
With the growth of distributed computing systems, modern Big Data analysis platform products often have diversified characteristics, which makes it hard for users to make decisions when they first come into contact with Big Data platforms. In this paper, we discuss the design principles and research directions of modern Big Data platforms by presenting research on modern Big Data products. We provide a detailed review and comparison of several state-of-the-art frameworks and distill them into a typical structure with five horizontal layers and one vertical layer. Following this structure, the paper presents the components and modern optimization technologies developed for Big Data, which helps readers choose the most suitable components and architecture from the various Big Data technologies according to their requirements.
Cloud storage is one of the main applications of cloud computing. With data services in the cloud, users are able to outsource their data to the cloud and to access and share the outsourced data from the cloud server anywhere and anytime. However, this new paradigm of data outsourcing also introduces new security challenges, among which is how to ensure the integrity of the outsourced data. Although cloud storage providers commit to a reliable and secure environment, the integrity of data can still be damaged by human carelessness, hardware/software failures, or attacks from external adversaries. Therefore, it is of great importance for users to audit the integrity of the data they outsource to the cloud. In this paper, we first design an auditing framework for cloud storage and propose an algebraic-signature-based remote data possession checking protocol, which allows a third party to audit the integrity of the outsourced data on behalf of the users and supports an unlimited number of verifications. We then extend our auditing protocol to support dynamic data operations, including data update, data insertion, and data deletion. The analysis and experimental results demonstrate that the proposed schemes are secure and efficient.
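The key property that makes algebraic signatures attractive for remote checking is their linearity over GF(2^8): the signature of the XOR of two blocks equals the XOR of their signatures, so a verifier can check a combined response against combined signatures. The sketch below demonstrates this property; the field polynomial and the choice of alpha are conventional, not taken from the paper:

```python
# Self-contained sketch of an algebraic signature over GF(2^8): the signature
# of a block is the XOR-sum of block[i] * alpha^i in the field, so signatures
# are linear: sig(B1 XOR B2) == sig(B1) XOR sig(B2).

def gf_mul(a: int, b: int) -> int:
    """Multiply in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1 (0x11B)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return r

def alg_sig(block: bytes, alpha: int = 0x03) -> int:
    """sig(block) = XOR-sum over i of block[i] * alpha^i in GF(2^8)."""
    sig, power = 0, 1
    for byte in block:
        sig ^= gf_mul(byte, power)
        power = gf_mul(power, alpha)
    return sig

b1, b2 = b"hello world!", b"cloud blocks"
combined = bytes(x ^ y for x, y in zip(b1, b2))
assert alg_sig(combined) == alg_sig(b1) ^ alg_sig(b2)   # linearity holds
```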
In a growing number of information processing applications, data takes the form of continuous data streams rather than traditional stored databases. Monitoring systems that seek to provide monitoring services in a cloud environment must be prepared to deal gracefully with huge data collections without compromising system performance. In this paper, we show that by using the concept of urgent data, our system can shorten the response time for most 'urgent' queries while guaranteeing lower bandwidth consumption. We argue that monitoring data can be treated differently: some data capture critical system events, and the arrival of such data, which we call urgent data, significantly influences the monitoring reaction speed. High-speed collection of urgent data can help the system react in real time when facing fatal errors. A production cloud environment, MagicCube, is used as a test bed. Extensive experiments over both real-world and synthetic traces show that when using urgent data, the monitoring system achieves lower response latency than existing monitoring approaches.
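A minimal sketch of the urgent-data idea follows: monitoring records carry a priority class, and urgent records preempt routine ones in the processing queue so fatal events are handled first. The two-class model and field names are illustrative assumptions:

```python
# Sketch: urgent monitoring records jump ahead of routine ones.
import heapq, itertools

URGENT, ROUTINE = 0, 1          # lower number = served earlier
_tie = itertools.count()        # FIFO order within the same class

queue = []

def publish(record, cls=ROUTINE):
    heapq.heappush(queue, (cls, next(_tie), record))

def consume():
    cls, _, record = heapq.heappop(queue)
    return record

publish({"metric": "cpu", "value": 0.42})
publish({"event": "disk failure on node-17"}, cls=URGENT)
publish({"metric": "mem", "value": 0.63})

print(consume())   # the urgent disk-failure event is processed first
```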
Cloud computing and storage services allow clients to move their data and applications to centralized large data centers and thus avoid the burden of local data storage and maintenance. However, this poses new challenges related to creating secure and reliable data storage over unreliable service providers. In this study, we address the problem of ensuring the integrity of data storage in cloud computing. In particular, we consider methods for reducing the burden of generating a constant amount of metadata at the client side. By exploiting some good attributes of the bilinear group, we devise a simple and efficient audit service for public verification of untrusted and outsourced storage, which can be important for achieving widespread deployment of cloud computing. Whereas many prior studies on ensuring remote data integrity did not consider the burden of generating verification metadata at the client side, the objective of this study is to resolve this issue. Moreover, our scheme also supports data dynamics and public verifiability. Extensive security and performance analysis shows that the proposed scheme is highly efficient and provably secure.
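The "good attribute" such schemes exploit is bilinearity. For reference, the block below states the bilinearity property and a representative aggregate verification equation from this family of pairing-based auditing schemes; it is not necessarily this paper's exact construction:

```latex
% Bilinearity, e(g^a, h^b) = e(g, h)^{ab}, is the property these audit schemes
% exploit. In a typical pairing-based check (representative of this family of
% schemes, not necessarily this paper's exact equation), the verifier
% challenges blocks i \in I with coefficients \nu_i and checks one aggregate
% response (\sigma, \mu) against public values:
\[
  e(\sigma,\, g) \;\stackrel{?}{=}\;
  e\!\Big(\prod_{i \in I} H(i)^{\nu_i} \cdot u^{\mu},\; v\Big),
  \qquad
  \sigma = \prod_{i \in I} \sigma_i^{\nu_i},
  \quad
  \mu = \sum_{i \in I} \nu_i\, m_i ,
\]
% where \sigma_i = (H(i) \cdot u^{m_i})^x is the tag of block m_i and v = g^x
% is the public key: one constant-size pairing check verifies many blocks.
```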
This paper describes the fundamentals of cloud computing and current key big-data technologies. We categorize big-data processing as batch-based, stream-based, graph-based, DAG-based, interactive-based, or visual-based according to the processing technique. We highlight the strengths and weaknesses of the various big-data cloud processing techniques in order to help the big-data community select the appropriate processing technique. We also present big-data research challenges and future directions with respect to transportation management systems.
Data sharing is a main application of cloud computing. Some existing solutions provide flexible access control for outsourced data in the cloud. However, little attention has been paid to group-oriented data sharing, where multiple data owners want to share their private data for cooperative purposes. In this paper, we put forward a new paradigm, referred to as secure, scalable and efficient multi-owner (SSEM) data sharing in clouds. SSEM integrates identity-based encryption and asymmetric group key agreement to enable group-oriented access control for data owners in a many-to-many sharing pattern. Moreover, with SSEM, users can join or leave the group conveniently while preserving the privacy of both group data and user data. We propose a key-ciphertext homomorphism technique to construct an SSEM scheme with short ciphertexts. The security analysis shows that our SSEM scheme achieves data security against unauthorized access and collusion attacks. Both theoretical and experimental results confirm that the proposed scheme imposes little cost on users to share and access outsourced data in a group manner.
Nowadays, an increasing number of people choose to outsource their computing and storage demands to the Cloud. To ensure the integrity of data in the untrusted Cloud, especially dynamic files that can be updated online, we propose an improved dynamic provable data possession model. We use homomorphic tags to verify the integrity of the file, and hash values generated from secret values and tags to prevent replay and forgery attacks. Compared with previous works, our proposal reduces the computational and communication complexity from O(log n) to O(1). We conducted experiments to validate this improvement and extended the model to the file-sharing situation.
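To see why homomorphic tags allow constant-size verification, consider the toy sketch below: tags of the form g^m compose multiplicatively, so one aggregate exponent check covers all challenged blocks. Real schemes, including the one proposed here, additionally bind tags to block indices and secret keys; the group, parameter sizes, and tag form below are deliberately simplified assumptions:

```python
# Toy sketch of homomorphic tags: with t_i = g^{m_i} mod p, the product of
# challenged tags raised to random coefficients must equal g raised to the
# combined message. Not secure parameter sizes; illustration only.
import random

p = 2**127 - 1        # a Mersenne prime used as a toy modulus
g = 3

blocks = [101, 202, 303, 404]             # file blocks, as integers
tags = [pow(g, m, p) for m in blocks]     # homomorphic tags stored with the file

# Verifier challenges a subset of indices with random coefficients:
challenge = {0: random.randrange(1, p - 1), 2: random.randrange(1, p - 1)}

# Prover combines the challenged blocks into one short response:
mu = sum(c * blocks[i] for i, c in challenge.items())

# Verifier checks the aggregate against the stored tags -- O(1) communication:
lhs = pow(g, mu, p)
rhs = 1
for i, c in challenge.items():
    rhs = rhs * pow(tags[i], c, p) % p
assert lhs == rhs
```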
In this paper, we survey data security and privacy problems created by cloud storage applications and propose a cloud storage security architecture. We discuss state-of-the-art techniques for ensuring the privacy and security of data stored in the cloud. We discuss policies for access control and data integrity, availability, and privacy. We also discuss several key solutions proposed in current literature and point out future research directions.
Despite the multifaceted advantages of cloud computing, concerns about data leakage or abuse impede its adoption for security-sensitive tasks. Recent investigations have revealed that the risk of unauthorized data access is one of the biggest concerns of users of cloud-based services. Transparency and accountability for data managed in the cloud are necessary. Specifically, when using a cloud-hosted service, a user typically has to trust both the cloud service provider and the cloud infrastructure provider to properly handle private data, making this a multi-party system. Three particular trust models can be used according to the credibility of these providers. This paper describes techniques for preventing data leakage that can be used with these different models.