Abstract: Cloud computing technology is changing how IT infrastructure and applications are developed and used. Virtualized and distributed systems, together with unified management and scheduling, have greatly improved computing and storage; management has become easier, and OAM costs have been significantly reduced. Cloud desktop technology is developing rapidly. With this technology, users can flexibly and dynamically use virtual machine resources, companies can use and allocate resources far more efficiently, and information security is ensured. In most existing virtual cloud desktop solutions, however, computing and storage are bound together and data is stored as image files. This limits the flexibility and expandability of such systems and is insufficient for meeting customers' requirements in different scenarios.
Abstract: The Cloud is increasingly being used to store and process big data for its tenants, and classical security mechanisms based on encryption are neither sufficiently efficient nor well suited to protecting big data in the Cloud. In this paper, we present an alternative approach that divides big data into sequenced parts and stores them among multiple Cloud storage service providers. Instead of protecting the big data itself, the proposed scheme protects the mapping of the various data elements to each provider using a trapdoor function. Analysis, comparison, and simulation show that the proposed scheme is efficient and secure for the big data of Cloud tenants.
Funding: Supported in part by the National Natural Science Foundation of China under Grant Nos. 61402413 and 61340058, the "Six Kinds Peak Talents Plan" project of Jiangsu Province under Grant No. ll-JY-009, the Natural Science Foundation of Zhejiang Province under Grant Nos. LY14F020019, Z14F020006 and Y1101183, the China Postdoctoral Science Foundation funded project under Grant No. 2012M511732, and the Jiangsu Province Postdoctoral Science Foundation funded project under Grant No. 1102014C.
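The abstract describes the scheme only at a high level, so the following is a minimal sketch of the idea rather than the paper's construction: data is split into sequenced parts, and a keyed one-way mapping decides which provider stores each part and under what unlinkable name, so that only the key holder can reassemble the sequence. HMAC-SHA256 stands in for the trapdoor function here, and all names and parameters are illustrative.

```python
# Sketch only: HMAC-SHA256 stands in for the paper's trapdoor function.
import hmac
import hashlib
from typing import Dict, List, Tuple

def split_into_parts(data: bytes, part_size: int) -> List[bytes]:
    """Divide the data into sequenced parts (the last part may be shorter)."""
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]

def placement(key: bytes, seq_no: int, n_providers: int) -> Tuple[int, str]:
    """Map a sequence number to (provider index, opaque object name).
    Without the key, a single provider cannot tell which part it holds or its order."""
    tag = hmac.new(key, seq_no.to_bytes(8, "big"), hashlib.sha256).digest()
    return tag[0] % n_providers, tag.hex()

def scatter(data: bytes, key: bytes, n_providers: int, part_size: int = 4096) -> Dict[int, List[Tuple[str, bytes]]]:
    """Plan the upload: provider index -> list of (object name, part)."""
    plans: Dict[int, List[Tuple[str, bytes]]] = {p: [] for p in range(n_providers)}
    for seq_no, part in enumerate(split_into_parts(data, part_size)):
        provider, name = placement(key, seq_no, n_providers)
        plans[provider].append((name, part))
    return plans

def gather(plans: Dict[int, List[Tuple[str, bytes]]], key: bytes, n_providers: int, n_parts: int) -> bytes:
    """The key holder recomputes the mapping and reassembles the original sequence."""
    lookup = {name: part for parts in plans.values() for name, part in parts}
    return b"".join(lookup[placement(key, s, n_providers)[1]] for s in range(n_parts))

if __name__ == "__main__":
    key, data = b"tenant-secret-key", b"big data of a Cloud tenant " * 100
    plans = scatter(data, key, n_providers=3)
    n_parts = sum(len(v) for v in plans.values())
    assert gather(plans, key, 3, n_parts) == data
```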
Abstract: With the growth of distributed computing systems, modern Big Data analysis platforms have become highly diversified, and it is hard for users who are new to Big Data platforms to choose among them. In this paper, we discuss the design principles and research directions of modern Big Data platforms by surveying research on modern Big Data products. We provide a detailed review and comparison of several state-of-the-art frameworks and distill them into a typical structure with five horizontal layers and one vertical layer. Based on this structure, the paper presents the components and modern optimization technologies developed for Big Data, which helps readers choose the most suitable components and architecture from the many available Big Data technologies according to their requirements.
Funding: Supported by the Research Fund of Tencent Computer System Co., Ltd. under Grant No. 170125.
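The abstract does not name the five horizontal layers or the vertical layer, so the sketch below only illustrates how such a "five plus one" structure might be encoded when comparing platforms; the layer names are editorial placeholders, not the paper's terminology.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class BigDataPlatform:
    """Hypothetical 'five horizontal plus one vertical' decomposition.
    Layer names are editorial placeholders, not the paper's terminology."""
    horizontal: Dict[str, List[str]] = field(default_factory=lambda: {
        "data_collection": [],      # ingestion connectors
        "data_storage": [],         # distributed file systems, key-value stores
        "resource_management": [],  # cluster schedulers
        "computation_engine": [],   # batch / stream / graph engines
        "data_analysis": [],        # SQL layers, ML libraries, BI tools
    })
    vertical: List[str] = field(default_factory=list)  # cross-cutting management and monitoring

    def choose(self, layer: str, component: str) -> None:
        """Record the component selected for one horizontal layer based on requirements."""
        self.horizontal[layer].append(component)
```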
Abstract: This paper describes the fundamentals of cloud computing and current key big-data technologies. We categorize big-data processing as batch-based, stream-based, graph-based, DAG-based, interactive-based, or visual-based according to the processing technique, and we highlight the strengths and weaknesses of the various big-data cloud processing techniques in order to help the big-data community select the appropriate one. We also discuss big-data research challenges and future directions with respect to transportation management systems.
Funding: Supported in part by the National Basic Research Program (973 Program, No. 2015CB352400), NSFC under Grant U1401258, and the U.S. NSF under Grant CCF-1016966.
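The six processing categories named in the abstract can be written down directly, as in the small sketch below; the example engines listed for each category are common illustrations added here for orientation and are not systems evaluated in the paper.

```python
from enum import Enum

class ProcessingCategory(Enum):
    BATCH = "batch-based"
    STREAM = "stream-based"
    GRAPH = "graph-based"
    DAG = "DAG-based"
    INTERACTIVE = "interactive-based"
    VISUAL = "visual-based"

# Representative engines per category: editorial examples, not the paper's evaluation set.
PROCESSING_EXAMPLES = {
    ProcessingCategory.BATCH: ["Hadoop MapReduce"],
    ProcessingCategory.STREAM: ["Storm", "Flink"],
    ProcessingCategory.GRAPH: ["Pregel", "GraphX"],
    ProcessingCategory.DAG: ["Spark", "Tez"],
    ProcessingCategory.INTERACTIVE: ["Drill", "Presto"],
    ProcessingCategory.VISUAL: ["notebook and dashboard front ends"],
}
```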
Abstract: Deep learning is a crucial technique in the analysis of big data. Big-data analysis tasks are typically carried out on the cloud, since it offers strong computing capabilities and ample storage. Nevertheless, there is a contradiction between the open nature of the cloud and data owners' demand for privacy, so a viable method must be found for using cloud resources for privacy-preserving model training. A privacy-preserving deep learning model (PPDLM) is proposed in this research to address this issue. To preserve data privacy, we first encrypt the data using a homomorphic encryption (HE) approach. Moreover, the deep learning algorithm's activation function, the sigmoid, is handled with the least-squares method so as to avoid operations other than addition and multiplication, which homomorphic encryption does not support. Finally, experimental results show that PPDLM is effective at protecting private data, and compared with a non-privacy-preserving deep learning model (NPPDLM), PPDLM has higher computational efficiency.
Funding: This work was partially supported by the Natural Science Foundation of Beijing Municipality (No. 4222038), the Open Research Project of the State Key Laboratory of Media Convergence and Communication (Communication University of China), the National Key R&D Program of China (No. 2021YFF0307600), and the Fundamental Research Funds for the Central Universities.
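The abstract states only that the sigmoid is handled with the least-squares method. The sketch below shows one plausible reading of that idea: fit a low-degree polynomial to the sigmoid by least squares, after which evaluation needs only additions and multiplications, the operations an HE scheme supports. The degree and fitting interval are assumptions, and the code runs on plaintext values for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fit_sigmoid_polynomial(degree: int = 3, lo: float = -8.0, hi: float = 8.0) -> np.ndarray:
    """Least-squares polynomial coefficients (highest power first) approximating the sigmoid.
    Degree 3 and the interval [-8, 8] are illustrative choices, not the paper's settings."""
    xs = np.linspace(lo, hi, 2001)
    return np.polyfit(xs, sigmoid(xs), degree)

def poly_eval_add_mul_only(coeffs: np.ndarray, x: float) -> float:
    """Horner evaluation using only additions and multiplications, i.e. the form
    that could be evaluated on ciphertexts under a homomorphic encryption scheme."""
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c
    return float(acc)

if __name__ == "__main__":
    coeffs = fit_sigmoid_polynomial()
    for x in (-4.0, -1.0, 0.0, 1.0, 4.0):
        print(f"x={x:+.1f}  sigmoid={sigmoid(x):.4f}  least-squares poly={poly_eval_add_mul_only(coeffs, x):.4f}")
```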
Abstract: In light of the coronavirus disease 2019 (COVID-19) outbreak caused by the novel coronavirus, companies and institutions have instructed their employees to work from home as a precautionary measure to reduce the risk of contagion. Employees, however, have been exposed to new security risks because of working from home. Moreover, the rapid global spread of COVID-19 has increased the volume of data generated from various sources. Working from home depends mainly on cloud computing (CC) applications that help employees accomplish their tasks efficiently. The cloud computing environment (CCE) has been an unsung hero of the COVID-19 pandemic crisis: it embodies the fast-paced practice of rapidly deploying applications and services for maintaining data. Despite the increased use of CC applications, there are ongoing research challenges in the CCE domain concerning data, security guarantees, and the availability of CC applications. This paper, to the best of our knowledge, is the first to thoroughly explain the impact of the COVID-19 pandemic on the CCE. Additionally, it highlights the security risks of working from home during the COVID-19 pandemic.
Abstract: Intellectualization has become a new trend in the telecom industry, driven by intelligent technologies including cloud computing, big data, and the Internet of Things. To satisfy the service demands of intelligent logistics, this paper designs an intelligent logistics platform whose main applications include e-commerce, self-service transceivers, big data analysis, path location, and distribution optimization. The intelligent logistics service platform is built on cloud computing to collect, store, and handle multi-source heterogeneous mass data from sensors, RFID electronic tags, vehicle terminals, and mobile apps, so that open-access cloud services including distribution, positioning, navigation, scheduling, and other data services can be provided for logistics distribution applications. The architecture of the intelligent logistics cloud platform, comprising a software layer (SaaS), a platform layer (PaaS), and an infrastructure layer (IaaS), is then constructed around core technologies such as high-concurrency processing, heterogeneous terminal data access, encapsulation, and data mining. The intelligent logistics cloud platform can therefore be delivered as a service, accelerating the construction of a symbiotic, win-win logistics ecosystem and the healthy development of the ICT industry amid the trend of intellectualization in China.
Funding: Supported in part by the National Key Research and Development Program under Grant No. 2016YFC0803206 and the China Postdoctoral Science Foundation under Grant No. 2016M600972.
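As a rough illustration of the three-layer architecture just described, the configuration sketch below groups the components named in the abstract under SaaS, PaaS, and IaaS; the exact assignment, and the IaaS entries in particular, are editorial assumptions rather than the paper's design.

```python
from typing import Dict, List

# Layer -> component groups. The grouping loosely follows the components named in the
# abstract; the assignment and the IaaS entries are editorial assumptions.
LOGISTICS_CLOUD_PLATFORM: Dict[str, Dict[str, List[str]]] = {
    "SaaS": {
        "applications": ["e-commerce", "self-service transceiver", "big data analysis",
                         "path location", "distribution optimization"],
    },
    "PaaS": {
        "data_access": ["sensors", "RFID electronic tags", "vehicle terminals", "mobile apps"],
        "open_cloud_services": ["distribution", "positioning", "navigation", "scheduling"],
        "core_techniques": ["high-concurrency processing", "encapsulation", "data mining"],
    },
    "IaaS": {
        "infrastructure": ["virtualized compute", "distributed storage", "networking"],
    },
}

def components(layer: str) -> List[str]:
    """Flatten every component registered under one layer."""
    return [c for group in LOGISTICS_CLOUD_PLATFORM[layer].values() for c in group]

if __name__ == "__main__":
    for layer in ("SaaS", "PaaS", "IaaS"):
        print(f"{layer}: {', '.join(components(layer))}")
```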