Relying on a single cloud as a storage service is not a proper solution for a number of reasons; for instance, the data could be captured while being uploaded to the cloud, and the data could be stolen from the cloud using a stolen ID. In this paper, we propose a solution that aims at offering secure data storage for mobile cloud computing based on a multi-cloud scheme. The proposed solution takes advantage of multiple clouds, data cryptography, and data compression to secure the distributed data: it splits the data into segments, encrypts the segments, compresses them, and distributes them across multiple clouds while keeping one segment in the mobile device's memory, which prevents the data from being extracted even if the distributed segments are intercepted.
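The split-encrypt-compress-distribute flow described above can be sketched in Python. This is a minimal illustration only: a toy XOR keystream stands in for a real cipher such as AES (the abstract does not specify one), and the "clouds" are just a list of byte strings.

```python
import hashlib
import zlib

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy keystream cipher (stand-in for AES): XOR with SHA-256-derived bytes.
    # XOR-ing twice with the same key decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def split_for_clouds(data: bytes, n_segments: int, key: bytes):
    """Split -> encrypt -> compress each segment, following the order
    stated in the abstract; the last segment stays on the device,
    the rest go to the clouds."""
    size = -(-len(data) // n_segments)  # ceiling division
    segments = [data[i:i + size] for i in range(0, len(data), size)]
    processed = [zlib.compress(xor_stream(s, key)) for s in segments]
    return processed[-1], processed[:-1]  # (local segment, cloud segments)

def reassemble(local_segment: bytes, cloud_segments, key: bytes) -> bytes:
    """Inverse pipeline: decompress, decrypt, and concatenate in order."""
    parts = cloud_segments + [local_segment]
    return b"".join(xor_stream(zlib.decompress(p), key) for p in parts)
```

Note that an interceptor holding only the cloud segments cannot reconstruct the file, since one segment never leaves the device; in practice one would also compress before encrypting, since ciphertext does not compress well.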
International Journal of Computational Engineering Research (IJCER) - ijceronline
International Journal of Computational Engineering Research (IJCER) is an international, monthly, English-language online journal. The journal publishes original research work that contributes significantly to furthering scientific knowledge in engineering and technology.
This document summarizes a research paper that proposes a framework called Cooperative Provable Data Possession (CPDP) to verify the integrity of data stored across multiple cloud storage providers. The framework uses two techniques: 1) a Hash Index Hierarchy that allows responses from different cloud providers to a client's challenge to be combined into a single response, and 2) Homomorphic Verifiable Responses that enable efficient verification of data stored on multiple cloud providers. The document outlines the security properties and performance benefits of the CPDP framework for verifying data integrity in a multi-cloud storage environment.
Research Inventy: International Journal of Engineering and Science is published by a group of young academic and industrial researchers, with 12 issues per year. It is an open-access journal, available online and in print, that provides rapid monthly publication of articles in all areas of the subject, such as civil, mechanical, chemical, electronic, and computer engineering, as well as production and information technology. The journal welcomes the submission of manuscripts that meet the general criteria of significance and scientific excellence. Papers are published within 20 days of acceptance, and the peer-review process takes only 7 days. All articles published in Research Inventy are peer-reviewed.
This document summarizes research on identity-based distributed provable data possession in multi-cloud storage. It discusses how current provable data possession protocols have limitations such as certificate management overhead and lack of flexibility. The proposed approach eliminates certificate management by using identity-based cryptography. It aims to provide a secure, efficient, and flexible protocol for integrity checking of outsourced data across multiple cloud servers.
This document discusses effective modular order preserving encryption on cloud using multivariate hypergeometric distribution (MHGD). It begins with an abstract that describes how order preserving encryption allows efficient range queries on encrypted data. It then provides background on cloud computing security concerns and discusses existing approaches to searchable encryption, including probabilistic encryption, deterministic encryption, homomorphic encryption, and order preserving encryption. The key proposed approach is to improve the security of existing modular order preserving encryption approaches by utilizing MHGD.
Improving Data Storage Security in Cloud using Hadoop - IJERA Editor
The rising abuse of information stored in large cloud data centres emphasizes the need to safeguard the data. Despite strict authentication policies, cloud users' data, even when transferred over a secure channel, is vulnerable to numerous attacks once it reaches the data centres. The most widely adopted methodology for safeguarding cloud data is encryption, yet encrypting the large volumes of data deployed in the cloud is a time-consuming process. For secure transmission, AES encryption is used, providing a secure way to transfer sensitive information from the sender to the intended receiver; its purpose is to make sensitive information unreadable to everyone except the receiver. The data is also compressed, making better use of storage space in the cloud environment. The scheme is augmented with Hadoop's map-reduce paradigm, which works in parallel. The experimental results clearly reflect the effectiveness of the methodology in improving the security of data in the cloud environment.
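The summary above combines encryption, compression, and a parallel map-reduce layout. A loose stand-in can be sketched with Python threads mimicking the parallel map phase; a SHA-256 keystream stands in for AES, and this is not the paper's actual Hadoop stack.

```python
import hashlib
import zlib
from concurrent.futures import ThreadPoolExecutor

def encrypt_chunk(chunk: bytes, key: bytes) -> bytes:
    # Stand-in for AES: XOR with a SHA-256-derived keystream.
    # Applying it twice with the same key decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(chunk):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(chunk, stream))

def map_encrypt(data: bytes, key: bytes, chunk_size: int = 64) -> list:
    """'Map' phase: compress then encrypt fixed-size chunks in parallel,
    mimicking the map-reduce layout the summary describes."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda c: encrypt_chunk(zlib.compress(c), key),
                             chunks))

def reduce_decrypt(blocks: list, key: bytes) -> bytes:
    """'Reduce' phase: decrypt and decompress each block, then concatenate."""
    return b"".join(zlib.decompress(encrypt_chunk(b, key)) for b in blocks)
```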
Guaranteed Availability of Cloud Data with Efficient Cost - IRJET Journal
This document discusses efficient and cost-effective methods for hosting data across multiple cloud storage providers (multi-cloud) to ensure high data availability and reduce costs. It proposes distributing data among different cloud providers using replication and erasure coding techniques. This approach guarantees data availability even if one cloud provider fails and minimizes monetary costs by taking advantage of varying cloud pricing models and data access patterns. The technique is shown to save around 20% of costs while providing high flexibility to handle data and pricing changes over time.
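Erasure coding for availability can be illustrated with the simplest possible code: n data shares plus one XOR parity share, which survives the loss of any single share (one failed cloud provider). Real deployments use Reed-Solomon codes with stronger guarantees; this sketch is illustrative and is not the paper's scheme.

```python
def encode_with_parity(data: bytes, n_data: int):
    """Split data into n_data equal shares plus one XOR parity share;
    any single lost share can be rebuilt (a toy RAID-5-style code)."""
    pad = (-len(data)) % n_data
    padded = data + b"\x00" * pad
    size = len(padded) // n_data
    shares = [padded[i * size:(i + 1) * size] for i in range(n_data)]
    parity = bytearray(size)
    for share in shares:
        for i, byte in enumerate(share):
            parity[i] ^= byte
    return shares + [bytes(parity)], pad

def recover(shares, lost_index, pad):
    """Rebuild the share at lost_index (set to None) by XOR-ing the
    survivors, then reassemble the original data."""
    size = len(next(s for s in shares if s is not None))
    rebuilt = bytearray(size)
    for s in shares:
        if s is not None:
            for i, byte in enumerate(s):
                rebuilt[i] ^= byte
    full = list(shares)
    full[lost_index] = bytes(rebuilt)
    data = b"".join(full[:-1])  # the last share is parity, not data
    return data[:-pad] if pad else data
```

Each provider stores one share; the storage overhead is 1/n of the data size, far cheaper than full replication, which is the cost trade-off the summary alludes to.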
This document presents a Cooperative Provable Data Possession (CPDP) scheme to ensure data integrity in a multicloud storage system. The CPDP scheme uses a trusted third party to generate secret keys, verification tags for data blocks, and store public parameters. It allows a client to issue challenges to verify the integrity of its data stored across multiple cloud service providers. The verification process involves the cloud providers proving possession of the original data file without retrieving the whole file. This scheme aims to efficiently verify data integrity in a multicloud system with support for data migration and scalability.
Efficient and Empiric Keyword Search Using Cloud - IRJET Journal
This document discusses efficient and empirical keyword search using cloud computing. It proposes a secure and reliable keyword search scheme across multiple clouds that allows users to search for files privately and reliably. The proposed system uses an iterative encryption approach to ensure file privacy even if multiple cloud servers collude. It implements a bloom filter tree index structure and integrity verification algorithm to securely search encrypted files across clouds and detect malicious servers. The system aims to provide an efficient and secure solution for keyword search on outsourced encrypted data stored in multiple cloud servers.
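The Bloom-filter index mentioned above can be sketched in a few lines. This toy version tests keyword membership against a fixed bit array with SHA-256-derived positions; the paper's tree-of-filters structure, encryption layers, and integrity verification are omitted.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter for keyword membership tests: false
    positives are possible, false negatives are not."""

    def __init__(self, size_bits: int = 1024, n_hashes: int = 3):
        self.size = size_bits
        self.n_hashes = n_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, word: str):
        # Derive n_hashes bit positions from salted SHA-256 digests.
        for i in range(self.n_hashes):
            digest = hashlib.sha256(f"{i}:{word}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, word: str):
        for p in self._positions(word):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, word: str) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(word))
```

A per-file filter lets a server answer "this file may contain the keyword" without revealing the file's contents, which is why Bloom filters are a common building block for searchable encryption indexes.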
The document discusses privacy and security issues related to cloud storage. It proposes a new privacy-preserving auditing scheme for cloud storage that uses an interactive challenge-response protocol and verification protocol. This allows a third party auditor to verify the integrity and identify corrupted data for a cloud storage user, while preserving data privacy. The scheme aims to be efficient, lightweight and privacy-preserving. Experimental results show the protocol is efficient and achieves its goals.
A scalable and cost-effective framework for privacy preservation over big d... - amna alhabib
This document proposes a scalable and cost-effective framework called SaC-FRAPP for preserving privacy over big data on the cloud. The key idea is to leverage cloud-based MapReduce to anonymize large datasets before releasing them to other parties. Anonymized datasets are then managed using HDFS to avoid re-computation costs. A prototype system is implemented to demonstrate that the framework can anonymize and manage anonymized big data sets in a highly scalable, efficient and cost-effective manner.
Secure distributed deduplication systems with improved reliability 2 - Rishikesh Pathak
1. The document proposes new distributed deduplication systems that improve reliability by distributing data chunks across multiple cloud servers. This addresses limitations of single-server deduplication systems where losing one server causes disproportionate data loss.
2. The systems introduce a deterministic secret sharing scheme to protect data confidentiality in distributed storage, instead of using convergent encryption. Secret shares of files are distributed across servers.
3. The distributed approach enhances reliability while supporting deduplication and ensuring data integrity and "tag consistency" to prevent replacement attacks. This represents the first work addressing reliability, confidentiality and consistency for distributed deduplication.
IRJET - Improving Data Availability by using VPC Strategy in Cloud Environ... - IRJET Journal
This document discusses improving data availability in cloud environments using virtual private cloud (VPC) strategies and data replication strategies (DRS). It proposes using VPC to define private networks in public clouds and deploying cloud resources into those private networks for improved security and control. It also proposes using DRS to store multiple copies of data across different nodes to increase data availability, reduce bandwidth usage, and provide fault tolerance. The proposed approach identifies popular data files for replication, selects the best storage sites based on factors like request frequency, failure probability, and storage usage, and decides when to replace replicas to optimize resource usage. A simulation showed this hybrid VPC and DRS approach improved performance metrics like response time, network usage, and load balancing compared to
The document proposes an approach to identify which intermediate datasets generated during cloud-based data processing need to be encrypted to preserve privacy, while avoiding encrypting all datasets which is inefficient and costly. It models the generation relationships between datasets and uses an upper-bound constraint on privacy leakage to determine which datasets exceed the threshold. This is formulated as an optimization problem to minimize privacy-preserving costs. Evaluation on real-world datasets shows the approach significantly reduces costs compared to fully encrypting all intermediate datasets.
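The selection problem above can be pictured with a small greedy heuristic: encrypt the datasets with the highest leakage per unit cost until the residual leakage of the plaintext datasets drops under the privacy budget. The field names and the greedy rule here are illustrative; the paper formulates this as a proper constrained optimization over the dataset generation graph.

```python
def select_for_encryption(datasets, epsilon):
    """datasets: list of {'name', 'leakage', 'cost'} records.
    Greedily encrypt the datasets with the best leakage-removed-per-cost
    ratio until the total leakage of what stays in plaintext is within
    the privacy budget epsilon. Returns (encrypted names, residual leakage)."""
    total = sum(d["leakage"] for d in datasets)
    encrypted = []
    # Best ratio first: each encryption removes d['leakage'] at d['cost'].
    for d in sorted(datasets, key=lambda d: d["leakage"] / d["cost"],
                    reverse=True):
        if total <= epsilon:
            break
        encrypted.append(d["name"])
        total -= d["leakage"]
    return encrypted, total
```

The point the paper makes is precisely that this set is usually much smaller than "encrypt everything", which is where the cost savings come from.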
Privacy-Preserving Public Auditing for Regenerating-Code-Based Cloud Storage - 1crore projects
A Survey Paper on Removal of Data Duplication in a Hybrid Cloud - IRJET Journal
This document summarizes a research paper on removing data duplication in a hybrid cloud. It discusses how data deduplication techniques like single-instance storage and block-level deduplication can reduce storage needs by eliminating duplicate data. It also describes the types of cloud storage (public, private, hybrid) and cloud services (SaaS, PaaS, IaaS). The document proposes encrypting files with differential privilege keys to improve security when checking for duplicate content in a hybrid cloud and prevent unauthorized access during deduplication.
A hybrid cloud approach for secure authorized - Ninad Samel
This document summarizes a research paper that proposes a hybrid cloud approach for secure authorized data deduplication. The paper presents a scheme that uses convergent encryption to encrypt files before uploading them to cloud storage. It also considers the differential privileges of users when performing duplicate checks, in addition to file content. A prototype is implemented to test the proposed authorized duplicate check scheme. Experimental results show the scheme incurs minimal overhead compared to normal cloud storage operations. The goal is to better protect data security while supporting deduplication in a hybrid cloud architecture.
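Convergent encryption, the core of the scheme above, derives the key from the file's own content, so identical files produce identical ciphertexts and can be deduplicated without the server seeing the plaintext. The sketch below uses a toy XOR keystream in place of a real block cipher.

```python
import hashlib

def xor_keystream(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR with SHA-256-derived keystream bytes.
    # Applying it twice with the same key decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def convergent_encrypt(plaintext: bytes):
    """Convergent encryption: key = hash(content), so two users holding
    the same file independently produce the same ciphertext and tag."""
    key = hashlib.sha256(plaintext).digest()
    ciphertext = xor_keystream(plaintext, key)
    tag = hashlib.sha256(ciphertext).hexdigest()  # duplicate-check tag
    return key, ciphertext, tag
```

Each user keeps the content-derived key locally; the cloud stores one ciphertext per distinct file and checks the tag to detect duplicates. (The paper's differential-privilege check is an extra layer on top of this, not shown here.)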
Dynamic Resource Allocation and Data Security for Cloud - AM Publications
Cloud computing is the next generation of IT organization. It moves software and databases to large data centres, where the management of services and data may not be fully trusted. In this system, we focus on cloud data storage security, which is an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective scheme based on the Advanced Encryption Standard and the MD5 algorithm. Extensive security and performance analysis shows that the proposed scheme is highly efficient. In the proposed work we have developed efficient parallel data processing in clouds and present our research project on parallel security. Parallel security is a data processing framework that explicitly exploits dynamic storage along with data security. We propose a strong, formal model for data security on the cloud and for corruption detection.
Effective & Flexible Cryptography Based Scheme for Ensuring User's Data Secur... - ijsrd.com
Cloud computing has been envisioned as the next-generation architecture of IT enterprise. In contrast to traditional solutions, where the IT services are under proper physical, logical and personnel controls, cloud computing moves the application software and databases to the large data centers, where the management of the data and services may not be fully trustworthy. This unique attribute, however, poses many new security challenges which have not been well understood. In this article, we focus on cloud data storage security, which has always been an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective and flexible cryptography based scheme. Extensive security and performance analysis shows that the proposed scheme is highly efficient and resilient against malicious data modification attack.
Cooperative Schedule Data Possession for Integrity Verification in Multi-Clou... - IJMER
International Journal of Modern Engineering Research (IJMER) is a peer-reviewed online journal. It serves as an international archival forum of scholarly research related to engineering and science education.
Improved deduplication with keys and chunks in HDFS storage providers - IRJET Journal
The document proposes a new deduplication scheme for HDFS storage providers that improves reliability. It uses MD5 hashing to generate unique tags for files and blocks, and stores the tags in a metadata file. Ownership verification is done during upload/download by checking these tags. Encrypted data is distributed across block servers for reliability. Convergent keys derived from hashes encrypt the data blocks using 3DES. This ensures security while allowing deduplication. The scheme achieves both file-level and block-level deduplication and uses distributed key servers for reliable key management at large scale.
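The tag-based flow just described can be sketched as an in-memory block store: each block is stored once, keyed by its hash tag, and a file is just an ordered list of tags. SHA-256 replaces the paper's MD5 here (MD5 is cryptographically weak for tagging), and the encryption and distributed key servers are omitted.

```python
import hashlib

class DedupStore:
    """Block-level deduplication: identical blocks are stored once,
    keyed by their hash tag; a file is recorded as a list of tags."""

    def __init__(self, block_size: int = 32):
        self.block_size = block_size
        self.blocks = {}   # tag -> block bytes (stored once)
        self.files = {}    # file name -> ordered list of tags

    def put(self, name: str, data: bytes):
        tags = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            tag = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(tag, block)  # skip already-stored blocks
            tags.append(tag)
        self.files[name] = tags

    def get(self, name: str) -> bytes:
        # Reassemble the file by looking each tag up in the block store.
        return b"".join(self.blocks[t] for t in self.files[name])
```

Ownership verification in the paper amounts to checking that an uploader can present the right tags before being linked to existing blocks; the store above shows only the storage-saving half of that protocol.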
IRJET - Auditing and Resisting Key Exposure on Cloud Storage - IRJET Journal
1. The document discusses auditing and resisting key exposure in cloud storage. It proposes a new framework called an auditing protocol with key-exposure resilience that allows integrity of stored data to still be verified even if the client's current secret key is exposed.
2. It formalizes the definition and security model for such a protocol and proposes an efficient practical construction. The security proof and asymptotic performance analysis show the proposed protocol is secure and efficient.
3. Key techniques used include periodic key updates, homomorphic linear authenticators, and a novel authenticator construction to boost forward security and provide proof of retrievability with the current design.
Efficient technique for privacy preserving publishing of set valued data on c... - ElavarasaN GanesaN
The document proposes a technique for privacy-preserving publishing of set-valued data on cloud computing. It extends the existing Extended Quasi Identifier Partitioning (EQI-partitioning) technique by incorporating l-diversity and k-anonymity to reduce information loss. A multi-level accessibility model is also developed to provide security based on user access levels. Identity-based proxy re-encryption is used to encrypt the data according to sensitivity values and provide access to different user levels. The proposed method aims to reduce information loss while improving security when outsourcing sensitive set-valued data to the cloud.
This presentation describes various studies on using the multi-cloud storage architecture to provide data security and consistency to cloud users.
A Secure Multi-Owner Data Sharing Scheme for Dynamic Group in Public Cloud - IJCERT JOURNAL
In cloud computing, outsourcing group resources among cloud users is a major challenge, and cloud computing provides a low-cost and well-organized solution. Due to frequent changes of membership, sharing data in a multi-owner manner with an untrusted cloud is still a challenging issue. In this paper we propose a secure multi-owner data sharing scheme for dynamic groups in the public cloud. By providing AES encryption with a convergent key while uploading the data, any cloud user can securely share data with others. Meanwhile, the storage overhead and encryption computation cost of the scheme are independent of the number of revoked users. In addition, we analyze the security of this scheme with rigorous proofs. A One-Time Password is one of the easiest and most popular forms of authentication for securing access to accounts, and One-Time Passwords are often regarded as a stronger form of authentication in the multi-owner setting. Extensive security and performance analysis shows that the proposed scheme is highly efficient and satisfies the security requirements for public-cloud-based secure group sharing.
This is my Capstone Project for my Masters in Computer Science 2023 at the Rochester Institute of Technology. I want to fully thank Dr. M. Mustafa Rafique and Dr. Hans-Peter Bischof for their guidance and support throughout this process.
This paper discusses how to improve and build upon existing data distribution algorithms for a fog computing environment. It implements AES and Reed-Solomon libraries to improve the existing architecture.
This paper also builds on the existing research: Tian Wang et al., "A Three-Layer Privacy Preserving Cloud Storage Scheme Based on Computational Intelligence in Fog Computing," IEEE TETCI, vol. 2, no. 1, pp. 3-12, Feb. 2018.
Bio-Cryptography Based Secured Data Replication Management in Cloud Storage - IJERA Editor
Cloud computing is a new way of achieving economical and efficient storage. A single-data-mart storage system is less secure because the data remains in a single data mart. This can lead to data loss from causes such as hacking or server failure. If an attacker chooses to attack a specific client, he can target that client's fixed cloud provider and try to gain access to the client's information. This makes the job easy for attackers; both inside attackers (malicious employees at a cloud provider) and outside attackers can benefit greatly from applying data mining. The single-data-mart storage architecture is thus the biggest security threat concerning data mining on the cloud, so this paper presents a secure replication approach that encrypts data using bio-cryptography and replicates it in a distributed data-mart storage system. The approach involves the encryption, replication, and storage of data.
Secure Distributed Deduplication Systems with Improved Reliability - 1crore projects
“CONTEXT, CONTENT, PROCESS” APPROACH TO ALIGN INFORMATION SECURITY INVESTMENT... - ClaraZara1
Today's business environment is highly dependent on complex technologies, and information is considered an important asset. Organizations are therefore required to protect their information infrastructure and follow an inclusive risk management approach. One way to achieve this is by aligning information security investment decisions with organizational strategy. A large number of information security investment models are available in the literature. These models are useful for optimal and cost-effective investment in information security. However, it is extremely challenging for a decision maker to select one model, or a combination of several, to decide on investments in information security controls. We propose a framework to simplify the task of selecting information security investment model(s). The proposed framework follows the "Context, Content, Process" approach, which is useful for evaluating and prioritizing investments in information security controls in alignment with the overall organizational strategy.
International Journal of Security, Privacy and Trust Management (IJSPTM) - ClaraZara1
With the increasing ease of transmitting data over the web, there is a greater need for adequate security mechanisms. Trust management is essential to the security framework of any network; in most traditional networks, both wired and wireless, centralized entities play pivotal roles in trust management. The International Journal of Security, Privacy and Trust Management (IJSPTM) is an open-access, peer-reviewed journal that provides a platform for exchanging ideas on new emerging trends that need more focus and exposure, and it will attempt to publish proposals that strengthen these goals.
More Related Content
Similar to A New Framework for Securing Personal Data Using the Multi-Cloud
Efficient and Empiric Keyword Search Using CloudIRJET Journal
This document discusses efficient and empirical keyword search using cloud computing. It proposes a secure and reliable keyword search scheme across multiple clouds that allows users to search for files privately and reliably. The proposed system uses an iterative encryption approach to ensure file privacy even if multiple cloud servers collude. It implements a bloom filter tree index structure and integrity verification algorithm to securely search encrypted files across clouds and detect malicious servers. The system aims to provide an efficient and secure solution for keyword search on outsourced encrypted data stored in multiple cloud servers.
The document discusses privacy and security issues related to cloud storage. It proposes a new privacy-preserving auditing scheme for cloud storage that uses an interactive challenge-response protocol and verification protocol. This allows a third party auditor to verify the integrity and identify corrupted data for a cloud storage user, while preserving data privacy. The scheme aims to be efficient, lightweight and privacy-preserving. Experimental results show the protocol is efficient and achieves its goals.
A scalabl e and cost effective framework for privacy preservation over big d...amna alhabib
This document proposes a scalable and cost-effective framework called SaC-FRAPP for preserving privacy over big data on the cloud. The key idea is to leverage cloud-based MapReduce to anonymize large datasets before releasing them to other parties. Anonymized datasets are then managed using HDFS to avoid re-computation costs. A prototype system is implemented to demonstrate that the framework can anonymize and manage anonymized big data sets in a highly scalable, efficient and cost-effective manner.
Secure distributed deduplication systems with improved reliability 2Rishikesh Pathak
1. The document proposes new distributed deduplication systems that improve reliability by distributing data chunks across multiple cloud servers. This addresses limitations of single-server deduplication systems where losing one server causes disproportionate data loss.
2. The systems introduce a deterministic secret sharing scheme to protect data confidentiality in distributed storage, instead of using convergent encryption. Secret shares of files are distributed across servers.
3. The distributed approach enhances reliability while supporting deduplication and ensuring data integrity and "tag consistency" to prevent replacement attacks. This represents the first work addressing reliability, confidentiality and consistency for distributed deduplication.
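Point 2 above refers to a deterministic secret sharing scheme. As a minimal stand-in, an XOR-based n-of-n split illustrates the general idea (the paper's actual scheme supports thresholds and derives its randomness deterministically from the file so identical files yield identical shares; this sketch does neither, and all names are illustrative):

```python
import os
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(data, n):
    # n-1 random shares plus one final share that XORs back to the data.
    shares = [os.urandom(len(data)) for _ in range(n - 1)]
    final = reduce(xor_bytes, shares, data)
    return shares + [final]

def recover_secret(shares):
    # XOR of all n shares reconstructs the original data.
    return reduce(xor_bytes, shares)
```

Each share alone is indistinguishable from random bytes, so a single compromised server learns nothing about the file.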
IRJET - Improving Data Availability by using VPC Strategy in Cloud Environ... - IRJET Journal
This document discusses improving data availability in cloud environments using virtual private cloud (VPC) strategies and data replication strategies (DRS). It proposes using VPC to define private networks in public clouds and deploying cloud resources into those private networks for improved security and control. It also proposes using DRS to store multiple copies of data across different nodes to increase data availability, reduce bandwidth usage, and provide fault tolerance. The proposed approach identifies popular data files for replication, selects the best storage sites based on factors like request frequency, failure probability, and storage usage, and decides when to replace replicas to optimize resource usage. A simulation showed that this hybrid VPC and DRS approach improved performance metrics such as response time, network usage, and load balancing.
The document proposes an approach to identify which intermediate datasets generated during cloud-based data processing need to be encrypted to preserve privacy, while avoiding encrypting all datasets which is inefficient and costly. It models the generation relationships between datasets and uses an upper-bound constraint on privacy leakage to determine which datasets exceed the threshold. This is formulated as an optimization problem to minimize privacy-preserving costs. Evaluation on real-world datasets shows the approach significantly reduces costs compared to fully encrypting all intermediate datasets.
Privacy-Preserving Public Auditing for Regenerating-Code-Based Cloud Storage - 1crore projects
IEEE PROJECTS 2015
1 Crore Projects is a leading provider and guide for IEEE projects and real-time project work.
It has provided guidance to thousands of students and supported them across all of its technology training.
Dot Net
DOTNET Project Domain list 2015
1. IEEE based on datamining and knowledge engineering
2. IEEE based on mobile computing
3. IEEE based on networking
4. IEEE based on Image processing
5. IEEE based on Multimedia
6. IEEE based on Network security
7. IEEE based on parallel and distributed systems
Java Project Domain list 2015
1. IEEE based on datamining and knowledge engineering
2. IEEE based on mobile computing
3. IEEE based on networking
4. IEEE based on Image processing
5. IEEE based on Multimedia
6. IEEE based on Network security
7. IEEE based on parallel and distributed systems
ECE IEEE Projects 2015
1. Matlab project
2. Ns2 project
3. Embedded project
4. Robotics project
Eligibility
Final Year students of
1. BSc (C.S)
2. BCA/B.E(C.S)
3. B.Tech IT
4. BE (C.S)
5. MSc (C.S)
6. MSc (IT)
7. MCA
8. MS (IT)
9. ME(ALL)
10. BE(ECE)(EEE)(E&I)
TECHNOLOGY USED AND FOR TRAINING IN
1. DOT NET
2. C sharp
3. ASP
4. VB
5. SQL SERVER
6. JAVA
7. J2EE
8. STRINGS
9. ORACLE
10. VB dotNET
11. EMBEDDED
12. MAT LAB
13. LAB VIEW
14. Multi Sim
CONTACT US
1 CRORE PROJECTS
Door No: 214/215,2nd Floor,
No. 172, Raahat Plaza, (Shopping Mall) ,Arcot Road, Vadapalani, Chennai,
Tamil Nadu, INDIA - 600 026
Email id: 1croreprojects@gmail.com
website:1croreprojects.com
Phone : +91 97518 00789 / +91 72999 51536
A Survey Paper on Removal of Data Duplication in a Hybrid Cloud - IRJET Journal
This document summarizes a research paper on removing data duplication in a hybrid cloud. It discusses how data deduplication techniques like single-instance storage and block-level deduplication can reduce storage needs by eliminating duplicate data. It also describes the types of cloud storage (public, private, hybrid) and cloud services (SaaS, PaaS, IaaS). The document proposes encrypting files with differential privilege keys to improve security when checking for duplicate content in a hybrid cloud and prevent unauthorized access during deduplication.
A hybrid cloud approach for secure authorized data deduplication - Ninad Samel
This document summarizes a research paper that proposes a hybrid cloud approach for secure authorized data deduplication. The paper presents a scheme that uses convergent encryption to encrypt files before uploading them to cloud storage. It also considers the differential privileges of users when performing duplicate checks, in addition to file content. A prototype is implemented to test the proposed authorized duplicate check scheme. Experimental results show the scheme incurs minimal overhead compared to normal cloud storage operations. The goal is to better protect data security while supporting deduplication in a hybrid cloud architecture.
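Convergent encryption, as used in the paper above, derives the encryption key from the file content itself, so identical files encrypt to identical ciphertexts and can be deduplicated. A minimal sketch of the idea, using a SHA-256 counter keystream as a stand-in for a real block cipher (illustrative only, not the paper's construction):

```python
import hashlib

def keystream(key, length):
    # Expand the key into a keystream by hashing key || counter (toy cipher).
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def convergent_encrypt(plaintext):
    key = hashlib.sha256(plaintext).digest()  # key derived from the content
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))
    return key, ct

def convergent_decrypt(key, ciphertext):
    return bytes(c ^ k for c, k in zip(ciphertext, keystream(key, len(ciphertext))))
```

Because the key is a function of the plaintext, two users uploading the same file produce the same ciphertext, which the storage provider can deduplicate without ever seeing the plaintext.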
Dynamic Resource Allocation and Data Security for Cloud - AM Publications
Cloud computing is the next generation of IT organization. Cloud computing moves software and databases to large centres where the management of services and data may not be fully trusted. In this system, we focus on cloud data storage security, which has been an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective scheme based on the Advanced Encryption Standard and the MD5 algorithm. Extensive security and performance analysis shows that the proposed scheme is highly efficient. In the proposed work we have developed efficient parallel data processing in clouds and present our research project for parallel security. Parallel security is a data processing framework that explicitly exploits dynamic storage along with data security. We have proposed a strong, formal model for data security and corruption detection in the cloud.
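The scheme described above combines AES for confidentiality with MD5 for correctness checking. The verification half can be sketched in a few lines (in a real deployment the data would be AES-encrypted before upload; this sketch shows only the MD5 integrity check, and the function names are illustrative):

```python
import hashlib

def store(data):
    # The verifier keeps the MD5 digest; the (normally encrypted) data
    # goes to the cloud provider.
    return data, hashlib.md5(data).hexdigest()

def verify(data, digest):
    # Recompute the digest over the retrieved data and compare.
    return hashlib.md5(data).hexdigest() == digest
```

Any corruption or tampering in the stored bytes changes the recomputed digest, so the comparison fails.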
Effective & Flexible Cryptography Based Scheme for Ensuring User's Data Secur... - ijsrd.com
Cloud computing has been envisioned as the next-generation architecture of IT enterprise. In contrast to traditional solutions, where the IT services are under proper physical, logical and personnel controls, cloud computing moves the application software and databases to the large data centers, where the management of the data and services may not be fully trustworthy. This unique attribute, however, poses many new security challenges which have not been well understood. In this article, we focus on cloud data storage security, which has always been an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective and flexible cryptography based scheme. Extensive security and performance analysis shows that the proposed scheme is highly efficient and resilient against malicious data modification attack.
Cooperative Schedule Data Possession for Integrity Verification in Multi-Clou... - IJMER
International Journal of Modern Engineering Research (IJMER) is a peer-reviewed online journal. It serves as an international archival forum of scholarly research related to engineering and science education.
Improved deduplication with keys and chunks in HDFS storage providers - IRJET Journal
The document proposes a new deduplication scheme for HDFS storage providers that improves reliability. It uses MD5 hashing to generate unique tags for files and blocks, and stores the tags in a metadata file. Ownership verification is done during upload/download by checking these tags. Encrypted data is distributed across block servers for reliability. Convergent keys derived from hashes encrypt the data blocks using 3DES. This ensures security while allowing deduplication. The scheme achieves both file-level and block-level deduplication and uses distributed key servers for reliable key management at large scale.
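The tagging step described above can be sketched as follows: each block is hashed, the per-block tags are kept in file order as metadata, and duplicate blocks are stored only once (chunk size and names are illustrative, not from the paper):

```python
import hashlib

CHUNK = 4  # toy chunk size; real systems use kilobyte-scale blocks

def chunk_tags(data, chunk_size=CHUNK):
    store_map = {}  # tag -> block (simulates the block servers)
    order = []      # tags in file order, kept in the metadata file
    for i in range(0, len(data), chunk_size):
        block = data[i:i + chunk_size]
        tag = hashlib.md5(block).hexdigest()
        order.append(tag)
        store_map.setdefault(tag, block)  # duplicate blocks stored once
    return order, store_map

def reassemble(order, store_map):
    return b"".join(store_map[tag] for tag in order)
```

Block-level deduplication falls out naturally: a repeated block maps to an existing tag, so only the tag list grows, not the stored data.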
IRJET - Auditing and Resisting Key Exposure on Cloud Storage - IRJET Journal
1. The document discusses auditing and resisting key exposure in cloud storage. It proposes a new framework called an auditing protocol with key-exposure resilience that allows integrity of stored data to still be verified even if the client's current secret key is exposed.
2. It formalizes the definition and security model for such a protocol and proposes an efficient practical construction. The security proof and asymptotic performance analysis show the proposed protocol is secure and efficient.
3. Key techniques used include periodic key updates, homomorphic linear authenticators, and a novel authenticator construction to boost forward security and provide proof of retrievability with the current design.
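The periodic key-update idea in point 3 can be illustrated with a one-way hash chain: each period's key is derived from the previous one, so exposing the current key reveals nothing about earlier keys and cannot forge earlier tags. This HMAC-based sketch conveys only the forward-security intuition, not the paper's homomorphic authenticator construction:

```python
import hashlib
import hmac

def next_key(current):
    # One-way update: the map is easy forward, infeasible backward.
    return hashlib.sha256(b"key-update" + current).digest()

def tag(key, block):
    # Authenticator for a data block under the current period's key.
    return hmac.new(key, block, hashlib.sha256).digest()
```

An auditor holding the period-t key can verify period-t tags, but an attacker who steals that key still cannot recompute or forge tags from earlier periods.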
Efficient technique for privacy preserving publishing of set valued data on c... - ElavarasaN GanesaN
The document proposes a technique for privacy-preserving publishing of set-valued data on cloud computing. It extends the existing Extended Quasi Identifier Partitioning (EQI-partitioning) technique by incorporating l-diversity and k-anonymity to reduce information loss. A multi-level accessibility model is also developed to provide security based on user access levels. Identity-based proxy re-encryption is used to encrypt the data according to sensitivity values and provide access to different user levels. The proposed method aims to reduce information loss while improving security when outsourcing sensitive set-valued data to the cloud.
This presentation describes various studies on using a multi-cloud storage architecture to provide data security and consistency to cloud users.
A Secure Multi-Owner Data Sharing Scheme for Dynamic Group in Public Cloud - IJCERT JOURNAL
In cloud computing, sharing group resources among cloud users is a major challenge, even though cloud computing provides a low-cost and well-organized platform for doing so. Owing to frequent changes of membership, sharing data in a multi-owner manner with an untrusted cloud remains a challenging issue. In this paper we propose a secure multi-owner data sharing scheme for dynamic groups in a public cloud. By applying AES encryption with a convergent key while uploading data, any cloud user can securely share data with others. Meanwhile, the storage overhead and encryption computation cost of the scheme are independent of the number of revoked users. In addition, we analyze the security of the scheme with rigorous proofs. One-Time Passwords are one of the simplest and most popular forms of authentication for securing access to accounts, and are often regarded as a stronger form of authentication in a multi-owner setting. Extensive security and performance analysis shows that the proposed scheme is highly efficient and satisfies the security requirements for secure group sharing in a public cloud.
This is my Capstone Project for my Masters in Computer Science 2023 at the Rochester Institute of Technology. I want to fully thank Dr. M. Mustafa Rafique and Dr. Hans-Peter Bischof for their guidance and support throughout this process.
This paper discusses how to improve and build upon existing data distribution algorithms for a fog computing environment. It uses AES and Reed-Solomon libraries to improve the existing architecture.
This paper also builds on existing research: Tian Wang et al., "A Three-Layer Privacy Preserving Cloud Storage Scheme Based on Computational Intelligence in Fog Computing," IEEE TETCI, vol. 2, no. 1, pp. 3-12, Feb. 2018.
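Reed-Solomon coding, mentioned above, adds redundancy so data survives lost shards. A single XOR parity chunk (RAID-4 style) is the simplest special case, tolerating one lost chunk; it is shown here only as a stand-in for a full Reed-Solomon library, and all names are illustrative:

```python
def add_parity(chunks):
    # One parity chunk = XOR of all data chunks (equal-length chunks assumed).
    parity = bytes(len(chunks[0]))
    for c in chunks:
        parity = bytes(a ^ b for a, b in zip(parity, c))
    return chunks + [parity]

def rebuild_missing(stored):
    # Exactly one entry is None; XOR of the survivors reconstructs it.
    present = [c for c in stored if c is not None]
    rebuilt = bytes(len(present[0]))
    for c in present:
        rebuilt = bytes(a ^ b for a, b in zip(rebuilt, c))
    out = list(stored)
    out[out.index(None)] = rebuilt
    return out[:-1]  # drop the parity chunk, return the data chunks
```

Real Reed-Solomon codes generalize this to tolerate several lost shards at once, at the cost of more parity data.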
Bio-Cryptography Based Secured Data Replication Management in Cloud Storage - IJERA Editor
Cloud computing offers a new way of economical and efficient storage. A single-data-mart storage system is less secure because the data remains in a single data mart. This can lead to data loss due to causes such as hacking and server failure. If an attacker chooses to attack a specific client, he can target that client's fixed cloud provider and try to gain access to the client's information. This makes the attackers' job easy; both inside and outside attackers can exploit data mining to a great extent. Inside attackers are malicious employees at a cloud provider. A single-data-mart storage architecture is thus the biggest security threat concerning data mining on the cloud, so this paper presents a secure replication approach that encrypts data using bio-cryptography and replicates it across a distributed data-mart storage system. The approach involves the encryption, replication, and storage of data.
Secure Distributed Deduplication Systems with Improved Reliability - 1crore projects
Similar to A New Framework for Securing Personal Data Using the Multi-Cloud (20)
“CONTEXT, CONTENT, PROCESS” APPROACH TO ALIGN INFORMATION SECURITY INVESTMENT... - ClaraZara1
Today's business environment is highly dependent on complex technologies, and information is considered an important asset. Organizations are therefore required to protect their information infrastructure and follow an inclusive risk management approach. One way to achieve this is by aligning information security investment decisions with organizational strategy. A large number of information security investment models have been proposed in the literature. These models are useful for optimal and cost-effective investments in information security. However, it is extremely challenging for a decision maker to select one model, or a combination of several, to decide on investments in information security controls. We propose a framework to simplify the task of selecting information security investment model(s). The proposed framework follows the “Context, Content, Process” approach, which is useful for evaluating and prioritizing investments in information security controls in alignment with the overall organizational strategy.
International Journal of Security, Privacy and Trust Management (IJSPTM) - ClaraZara1
With the increasing simplicity of transmitting data over the web, there is a greater need for adequate security mechanisms. Trust management is essential to the security framework of any network. In most traditional networks, both wired and wireless, centralized entities play pivotal roles in trust management. The International Journal of Security, Privacy and Trust Management (IJSPTM) is an open access peer reviewed journal that provides a platform for exchanging ideas on new and emerging trends that need more focus and exposure, and will attempt to publish proposals that strengthen our goals.
Analysis of Media Discourse on Intellectual Property Rights Related to Metave... - ClaraZara1
This study examines the evolving discourse on intellectual property rights (IPR) within the metaverse's Korean context, facilitated by the BIGKinds analytics program. As the metaverse transitions from a nascent concept to a complex reality, it reshapes digital interactions and poses new challenges for IPR, necessitating a comprehensive investigation into societal interest and legal discourse. The findings reveal a significant surge in metaverse-related IPR engagement over the past three years, aligning with the broader digital shift amid the COVID-19 pandemic. This pattern suggests an increasing need for nuanced legal approaches to copyright, trademark, and design patent protection in virtual environments. The findings thus underscore the urgency of legal reform to accommodate the metaverse's unique characteristics, the necessity of international collaboration on IPR in a borderless digital domain, and the intersection of technological advances, like NFTs and blockchain, with legal frameworks impacting creators' rights. As a result, this study provides policymakers and the digital community with real-time guidance on protecting intellectual property amid the transformative growth of the metaverse.
International Conference on Information Technology Convergence Services & AI ... - ClaraZara1
International Conference on Information Technology Convergence Services & AI (ITCAI 2024) will provide an excellent international forum for sharing knowledge and new research results in all areas of Information Technology Convergence Services & AI. The Conference focuses on all technical and practical aspects of ITC & AI. The goal of this conference is to bring together researchers and practitioners from academia and industry to focus on understanding Information Technology Convergence and services, AI, and establishing new collaborations in these areas.
6th International Conference on Machine Learning & Applications (CMLA 2024) - ClaraZara1
6th International Conference on Machine Learning & Applications (CMLA 2024) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of on Machine Learning & Applications.
10th International Conference on Artificial Intelligence and Soft Computing (... - ClaraZara1
10th International Conference on Artificial Intelligence and Soft Computing (AIS 2024) will provide an excellent international forum for sharing knowledge and results in theory, methodology, and applications of Artificial Intelligence, Soft Computing. The Conference looks for significant contributions to all major fields of the Artificial Intelligence, Soft Computing in theoretical and practical aspects. The aim of the Conference is to provide a platform to the researchers and practitioners from both academia as well as industry to meet and share cutting-edge development in the field.
3rd International Conference on Artificial Intelligence Advances (AIAD 2024) - ClaraZara1
3rd International Conference on Artificial Intelligence Advances (AIAD 2024) will act as a major forum for the presentation of innovative ideas, approaches, developments, and research projects in the area advanced Artificial Intelligence. It will also serve to facilitate the exchange of information between researchers and industry professionals to discuss the latest issues and advancement in the research area. Core areas of AI and advanced multi-disciplinary and its applications will be covered during the conferences.
10th International Conference on Software Engineering (SOFT 2024) - ClaraZara1
10th International Conference on Software Engineering (SOFT 2024) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of Software Engineering and Applications. The goal of this conference is to bring together researchers and practitioners from academia and industry to focus on understanding Modern software engineering concepts and establishing new collaborations in these areas.
A New Framework for Securing Personal Data Using the Multi-Cloud - ClaraZara1
Relying on a single cloud as a storage service is not a proper solution for a number of reasons; for instance, the data could be captured while being uploaded to the cloud, or stolen from the cloud using a stolen ID. In this paper, we propose a solution that aims at offering secure data storage for mobile cloud computing based on a multi-cloud scheme. The proposed solution takes advantage of multi-clouds, data cryptography, and data compression to secure the distributed data: it splits the data into segments, encrypts the segments, compresses them, and distributes them across multiple clouds while keeping one segment in the mobile device's memory, which prevents extracting the data if the distributed segments are intercepted.
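The pipeline in the abstract (split, encrypt, compress, distribute, keep one segment on the device) can be sketched as follows. A SHA-256 counter keystream stands in for a real cipher, and all names and parameters are illustrative, not taken from the paper:

```python
import hashlib
import zlib

def _stream(key, length):
    # Counter-mode hash keystream; stand-in for a real cipher such as AES.
    out, ctr = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:length]

def protect(data, n_clouds, key):
    # Split into n_clouds + 1 segments; the first stays on the device.
    seg_len = -(-len(data) // (n_clouds + 1))  # ceiling division
    segments = [data[i:i + seg_len] for i in range(0, len(data), seg_len)]
    # Encrypt each segment, then compress it, per the abstract's pipeline.
    processed = [zlib.compress(bytes(p ^ k
                                     for p, k in zip(s, _stream(key, len(s)))))
                 for s in segments]
    return processed[0], processed[1:]  # (local segment, cloud segments)

def recover(local, cloud_segments, key):
    out = b""
    for seg in [local] + cloud_segments:
        enc = zlib.decompress(seg)
        out += bytes(c ^ k for c, k in zip(enc, _stream(key, len(enc))))
    return out
```

An interceptor of the cloud segments cannot reconstruct the file: the segments are encrypted, and one segment never leaves the device.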
8th International Conference on Networks and Security (NSEC 2024) - ClaraZara1
8th International Conference on Networks and Security (NSEC 2024) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of networks and communications security. The aim of the conference is to provide a platform for researchers and practitioners from both academia and industry to meet and share cutting-edge developments in the field.
DOCUMENT SELECTION USING MAPREDUCE - Yenumula B Reddy and Desmond Hill - ClaraZara1
Big data refers to structured, unstructured, and semi-structured data in volumes that are difficult to manage and costly to store. Using exploratory analysis techniques to understand such raw data, while carefully balancing the benefits of storage and retrieval techniques, is an essential part of Big Data. The research discusses MapReduce issues and a framework for the MapReduce programming model and its implementation. The paper includes the analysis of Big Data using MapReduce techniques and the identification of a required document from a stream of documents. Identifying a required document is part of security in a stream of documents in the cyber world; the document may be significant in business, medicine, social matters, or terrorism.
10th International Conference on Data Mining (DaMi 2024) - ClaraZara1
The 10th International Conference on Data Mining (DaMi 2024) provides a forum for researchers to present their work in a peer-reviewed setting.
12th International Conference of Artificial Intelligence and Fuzzy Logic (AI ... - ClaraZara1
The 12th International Conference of Artificial Intelligence and Fuzzy Logic (AI & FL 2024) provides a forum for researchers to present their work in a peer-reviewed setting. Authors are solicited to contribute to the conference by submitting articles that illustrate research results, projects, survey works, and industrial experiences describing significant advances in the following areas, though contributions are not limited to these topics.
An Effective Semantic Encrypted Relational Data Using K-Nn Model - ClaraZara1
Data exchange and data publishing are becoming an important part of business and academic practices. Data owners need to maintain rights over the datasets they share. A right-protection mechanism can establish ownership of shared data without restricting its usage across a wide range of machine learning and mining tasks. The approach provides two algorithms: one based on Nearest Neighbors (NN) and one that preserves the Minimum Spanning Tree (MST). The K-NN protocol guarantees that relations between objects remain unaltered. The algorithms provide both right protection and utility preservation. The right-protection scheme is based on watermarking, and the watermarking methodology preserves the distance relationships.
Introduction: e-waste definition, sources of e-waste, hazardous substances in e-waste, effects of e-waste on environment and human health, need for e-waste management, e-waste handling rules, waste minimization techniques for managing e-waste, recycling of e-waste, disposal and treatment methods of e-waste, mechanism of extraction of precious metals from leaching solution, global scenario of e-waste, e-waste in India, case studies.
The CBC machine is a common diagnostic tool used by doctors to measure a patient's red blood cell count, white blood cell count and platelet count. The machine uses a small sample of the patient's blood, which is then placed into special tubes and analyzed. The results of the analysis are then displayed on a screen for the doctor to review. The CBC machine is an important tool for diagnosing various conditions, such as anemia, infection and leukemia. It can also help to monitor a patient's response to treatment.
Comparative analysis between traditional aquaponics and reconstructed aquapon... - bijceesjournal
The aquaponic system of planting is a method that does not require soil usage. It is a method that only needs water, fish, lava rocks (a substitute for soil), and plants. Aquaponic systems are sustainable and environmentally friendly. Its use not only helps to plant in small spaces but also helps reduce artificial chemical use and minimizes excess water use, as aquaponics consumes 90% less water than soil-based gardening. The study applied a descriptive and experimental design to assess and compare conventional and reconstructed aquaponic methods for reproducing tomatoes. The researchers created an observation checklist to determine the significant factors of the study. The study aims to determine the significant difference between traditional aquaponics and reconstructed aquaponics systems propagating tomatoes in terms of height, weight, girth, and number of fruits. The reconstructed aquaponics system’s higher growth yield results in a much more nourished crop than the traditional aquaponics system. It is superior in its number of fruits, height, weight, and girth measurement. Moreover, the reconstructed aquaponics system is proven to eliminate all the hindrances present in the traditional aquaponics system, which are overcrowding of fish, algae growth, pest problems, contaminated water, and dead fish.
International Conference on NLP, Artificial Intelligence, Machine Learning an... - gerogepatton
International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (NLAIM 2024) offers a premier global platform for exchanging insights and findings in the theory, methodology, and applications of NLP, Artificial Intelligence, Machine Learning, and their applications. The conference seeks substantial contributions across all key domains of NLP, Artificial Intelligence, Machine Learning, and their practical applications, aiming to foster both theoretical advancements and real-world implementations. With a focus on facilitating collaboration between researchers and practitioners from academia and industry, the conference serves as a nexus for sharing the latest developments in the field.
Advanced control scheme of doubly fed induction generator for wind turbine us... - IJECEIAES
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces the hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system including the required equation to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram which sets the priorities and requirements of the system is presented. The proposed approach allows setup to advance their power stability, especially during power outages. The presented information supports researchers and plant owners to complete the necessary analysis while promoting the deployment of clean energy. The result of a case study that represents a dairy milk farmer supports the theoretical works and highlights its advanced benefits to existing plants. The short return on investment of the proposed approach supports the paper's novelty approach for the sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line which enhances the safety of the electrical network
artificial intelligence and data science contents.pptxGauravCar
What is artificial intelligence? Artificial intelligence is the ability of a computer or computer-controlled robot to perform tasks that are commonly associated with the intellectual processes characteristic of humans, such as the ability to reason.
› ...
Artificial intelligence (AI) | Definitio
Discover the latest insights on Data Driven Maintenance with our comprehensive webinar presentation. Learn about traditional maintenance challenges, the right approach to utilizing data, and the benefits of adopting a Data Driven Maintenance strategy. Explore real-world examples, industry best practices, and innovative solutions like FMECA and the D3M model. This presentation, led by expert Jules Oudmans, is essential for asset owners looking to optimize their maintenance processes and leverage digital technologies for improved efficiency and performance. Download now to stay ahead in the evolving maintenance landscape.
Embedded machine learning-based road conditions and driving behavior monitoringIJECEIAES
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsVictor Morales
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressions
International Journal of Security, Privacy and Trust Management (IJSPTM) Vol 4, No 3/4, November 2015
DOI: 10.5121/ijsptm.2015.4402
A NEW FRAMEWORK FOR SECURING PERSONAL DATA USING THE MULTI-CLOUD

Hassan Saad Alqahtani¹, Paul Sant¹ and Ghita Kouadri-Mostefaoui²

¹IRAC, UCMK, Milton Keynes, UK
²Department of Computer Science, UCL, London, UK
ABSTRACT

Relying on a single cloud as a storage service is not a proper solution for a number of reasons; for instance, the data could be captured while being uploaded to the cloud, or stolen from the cloud using a stolen ID. In this paper, we propose a solution that aims at offering secure data storage for mobile cloud computing based on the multi-clouds scheme. The proposed solution takes advantage of multi-clouds, data cryptography, and data compression to secure the distributed data: it splits the data into segments, encrypts and compresses the segments, and distributes them via multi-clouds while keeping one segment in the mobile device's memory, which prevents the data from being extracted even if the distributed segments are intercepted.
KEYWORDS
Multi-cloud security, mobile cloud computing, cloud security, secure storage, untrusted environment
1. INTRODUCTION
Similar to other ICT solutions, cloud computing services face open challenges in terms of security and privacy. One of the most recent security breaches related to the cloud is the iCloud privacy breach that exposed some celebrities' personal pictures to the public [1]. This incident drew attention to 1) the authentication techniques that were applied and 2) the efficiency of the applied encryption method [1]. Multiple-cloud models have emerged as a potential solution for overcoming single-cloud limitations and obstacles. The National Institute of Standards and Technology (NIST) defines a multiple-cloud as a set of geographically distributed clouds that can be used in a serial or simultaneous way [2]. The multiple-cloud computing paradigm helps to: i) promote independence; ii) enhance security; iii) increase redundancy; iv) optimise operational costs; and v) improve the quality of delivered services. It allows two or more distributed cloud services to collaborate and work together, serially or simultaneously, regardless of whether they are dependent on each other [2]. [3] identifies the purpose of developing multiple-cloud models as overcoming the limitations of the single cloud by distributing dependency, trust, privacy, accessibility, and security among multiple clouds. This study concludes that using a single cloud as a storage medium is insecure [4-6]. In this paper, this statement is confirmed by a systematic literature review, which identifies the existing security issues/threats and highlights the most critical ones. This research proposes an approach that provides a secure data storage mechanism using a multi-cloud computing model, validated by an Android app prototype. The structure of the paper is as follows: the motivation for this research is presented in Section 2; the literature review in Section 3; details of the proposed approach in Section 4; and, finally, the conclusion and future directions in Section 5.
2. MOTIVATION
This study proposes a new approach to securing data in the cloud using a multiple-cloud scheme. It develops a new mechanism/algorithm for providing a secure personal data storage system using smart devices. The proposed mechanism relies on a number of encryption methods in order to guarantee the integrity of the stored data. Data confidentiality and integrity are preserved by splitting the original data file and distributing it across the multi-cloud. Before distribution, the file's segments are encrypted, which enhances the security level. In order to guarantee the privacy and confidentiality of the targeted file, one file segment remains stored on the mobile device to allow future data retrieval and reconstruction.
3. RELATED WORK
We studied nine core approaches that have been developed to enhance the security and availability of the delivered service. Table 1 shows a comparison between the studied approaches based on the identified criteria, while Table 2 defines the criteria used. It is important to highlight that some approaches/algorithms have not been included in this study, for instance NubiSave [7], Cleversafe [8], POTSHARDS [9], and the Cryptographic Cloud Storage System (CS2) [10], for the following reasons:
- Some approaches are too similar, lack novelty/contribution, or have been developed from the same core algorithm. It is obvious that many algorithms/approaches have been developed based on the Byzantine Fault Tolerance (BFT) algorithm, for instance [11-13].
- There is a lack of efficiency in approaches not originally developed for cloud computing. This is clearly the case for approaches that were developed for a distributed computing environment and were later proposed as potential solutions for the cloud.
- There is a lack of documentation, testing, or development, as in the following two commercial products: Cumulus4j (http://cumulus4j.org) and TrustedSafe (http://www.trustedsafe.de/).
The multiple-cloud concept has been inspired by the information dispersal concept, which is not new but remains efficient and popular [14, 15]. This concept aims to enhance a system's confidentiality, availability, and reliability by distributing the information across more than one location [16, 17].
The Byzantine Fault Tolerance (BFT) approach has been proposed as a protocol to be used over distributed systems. Technically, BFT aims to manage the communication process between the end user and the replicated data/system [14, 18, 19]. A number of approaches have been inspired by BFT, one of which is TClouds [6, 16], which has been supported by the European Union (EU). This project focuses on applications that are critical in terms of availability and security. TClouds is presented to the end user as a platform capable of improving the quality of the delivered service in terms of security and availability. The researchers argue that relying on the application layer to guarantee the availability and security of delivered services is not an efficient approach, and that proposed solutions need to cover the platform and infrastructure as well.
For real-time cloud services, [20] developed an approach that claims to guarantee the reliability of the infrastructure used. This approach maintains the reliability of each virtual machine after each computational cycle, as a result of the continuous change of resources over time. However, it consumes the provided resources because of the need to collect, analyse, and compute the resources' status. The approach improves reliability by carrying out the required tasks on the most reliable resources, using a proactive mechanism to prevent unreliable resources from performing any task. The reliability level is determined by analysing the targeted resource's performance. DepSKY is a system that has been developed across distributed cloud storage [5]. This system has two versions: DepSKY-Availability and DepSKY-Confidentiality and Availability. It shows a degree of inefficiency for some applications because it needs a large number of communications between all the involved data writers, which could be very expensive.
The Redundant Array of Cloud Storage (RACS) has been developed as a proxy for cloud storage [21]. RACS aims to enhance the availability of stored data and prevent vendor lock-in issues by dispersing the targeted file across a number of cloud storage providers. InterCloud Storage (IC Storage) [22] is similar to RACS; however, IC Storage was developed to guarantee the dependability of the clouds used and to overcome asynchrony issues by utilising the client protocol-failure tolerance developed by [23] for distributed storage.
IRIS is an approach developed by [24] that aims to enhance the authentication level. Technically, IRIS relies on Proofs of Retrievability (POR), which were developed to prove to the client that the stored data is recoverable. IRIS supports multiple writers, but has deficiencies in terms of access management and granted privileges; it is also costly in terms of the communications consumed. Tahoe - The Least-Authority File System (Tahoe-LAFS) was developed by [25] as a solution for securing distributed storage systems in order to increase the availability of stored data. To apply this solution across cloud storage, the cloud itself must be capable of executing code, because Tahoe-LAFS needs to perform some computations in the clouds used.
In [26], the authors propose an approach for cloud storage entitled CloudProof. This approach aims to guarantee the security of stored data by focusing on auditability aspects. It also requires code execution as part of the cloud storage; offloading many tasks to the clouds used increases the cost in terms of required resources and consumed communication. [27] developed Robust Data Sharing with Key-Value Stores (RDSKVS), which helps to build a trusted distributed cloud storage system that supports multiple writers and multiple readers of the stored data. RDSKVS is concerned with the availability and freshness of the shared data, which means that if users need additional security features, they have to adopt additional tools to deliver them. This approach saves communication costs because it does not require any communication between the involved users; however, it is not efficient in terms of cost and functionality.
High Availability and Integrity Layer (HAIL) has been developed by [28] as an updated version of Redundant Arrays of Inexpensive Disks (RAID). One of HAIL's core disadvantages is its inability to provide a live data version, which limits it to static files only. Besides this, it requires code execution across the provided clouds in order to prevent any kind of latency that might be caused by executing the code across other platforms. Also, the segments are distributed across the clouds in the same order for each file, which could be considered a vulnerability: the lack of randomisation makes it easier to trace the data segments. The client does not check the integrity of all the segments; instead, integrity validation is performed over specific segments only, which could allow a creeping-corruption attack over time. Additionally, HAIL does not support infrastructure heterogeneity, and it is supposed to run over federated or intra-cloud paradigms. Technically, HAIL is consumed as an integrated layer through the provided infrastructure.
Table 1. Multiple Cloud computing security approaches properties comparison
Table 2. Comparison Criteria Explanation

- Environment: The targeted computing environment/architecture (distributed or cloud).
- Cloud Module: The targeted multiple-cloud paradigm (single, hybrid, inter-cloud, federated, or multi-clouds).
- Customers: The approach's expected end user (enterprise or individual customer).
- Platform: The programming language that has been used for developing the approach.
- Computation: Whether the approach needs to execute code in off-load mode (through the cloud).
- Storage: Whether the approach can be consumed as cloud storage, a computation platform, or both.
- Multi-users: The approach's capability to support more than one user.
- Consistency: The ability of the cloud service provider to deliver a fresh version of the file to end users consuming it in parallel.
- Confidentiality: The ability of the approach to protect and maintain the confidentiality of the involved files.
- Integrity: The capability of implementing the required measurements in order to verify the integrity of the file/block.
- Retrievability: The ability of the service provider to prove possession of the targeted file.
- Authenticity: Log records for verifying whether the person who creates, modifies, or deletes is authorised to perform such tasks.
- Auditing: Monitoring in order to prove any breach or suspicious behaviour that has occurred.
- Access control: The approach's capability to manage and control access to the files (privilege management).
The majority of the developed approaches are relatively expensive in terms of configuration, development, and operational costs. In addition, all the studied approaches are based on intra-cloud or federated-cloud models, which are costly even for small enterprises. Both of these models (intra-cloud and federated-cloud) are suited to large and medium enterprises due to their characteristics (prior agreement and the coupling level). In other words, no solution has been developed that can be implemented through multiple-cloud models and used by individuals. Additionally, all the developed approaches vary in terms of interconnection protocols, internal layers, or general algorithms. Besides that, all the existing approaches increase the complexity of the system used and require a very high level of expertise to configure and implement. Also, all the studied approaches lack portability.
In addition, there is a lack of products that could be provided to customers and used immediately to enhance the level of delivered security, integrity, authenticity, and redundancy. Nor are there products which can be used by users with a minimum required level of experience, regardless of whether the customer is an enterprise or a single user. Additionally, the developed approaches cannot deliver an optimum solution able to solve/consider the majority of security aspects. This means that the approach used needs to be combined with other approaches to deliver the required security level; in other words, the total operational cost becomes too high due to the need to configure, manage, maintain, and consume additional approaches in conjunction with the main one. Besides that, some approaches require code execution across the provided clouds, in order to prevent any kind of latency that might be caused by executing the code across other platforms. In addition, executing code over the clouds used might introduce the possibility of attacks if it does not follow good practice.
To sum up, the literature review performed clearly shows the need to develop an approach capable of maintaining the security, integrity, privacy, and availability of stored data for individuals. Besides that, the required experience level must be reduced, as must the operational costs. Our approach aims to cover that gap and achieve the study aims.
4. PROPOSED APPROACH
Developing a cloud system that is stable and capable of delivering a very high level of security and availability cannot be achieved by relying on the higher layer of the delivered system (the software); instead, the lower layer (the infrastructure) must be involved. We will develop an approach that combines security and availability and deliver it to the end user. We believe that the end user should define the settings in terms of the required security and privacy, instead of relying on the service provider. In addition, the data encryption process must be performed on the end user's mobile device instead of in the cloud. This guarantees the privacy of the stored data. Additionally, exchanging encrypted data is much safer, and the end user can be sure that the providers cannot access or alter the data as long as they do not have the key.
Keeping one segment on the end user's machine prevents any attempt to recover the distributed data, because an attacker must have all the segments, together with the key, in order to recover the data. Also, by keeping the last segment on the end user's device, the attacker cannot identify the first and last segments of the targeted data, because all the exchanged segments have a constant size.
This approach relies on, firstly, dividing, encrypting, and storing (distributing) data on several protected multi-cloud storage facilities and, secondly, storing one segment only on the smart device. The proposed approach will achieve the study aim through the following:
- Using a chaotic-map encryption technique, a strong mathematical algorithm, to encrypt the targeted data.
- Storing one segment locally on the mobile device in order to prevent reconstruction of the whole chunk of data.
- Avoiding unauthenticated access by distributing the data into a multi-cloud environment and imposing two authentication levels.
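The segmentation-and-distribution idea above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cipher is a throwaway XOR placeholder standing in for the chaotic map of Section 4.3, and the "clouds" are plain lists standing in for real cloud storage clients.

```python
# Sketch only: fixed-size segments, a placeholder cipher, and round-robin
# distribution that retains the final segment on the device.
SEGMENT_SIZE = 16  # constant segment size hides which segment is first/last

def split(data: bytes, size: int = SEGMENT_SIZE) -> list[bytes]:
    """Split data into fixed-size segments, zero-padding the last one."""
    padded = data + b"\x00" * (-len(data) % size)
    return [padded[i:i + size] for i in range(0, len(padded), size)]

def xor_encrypt(segment: bytes, key: bytes) -> bytes:
    """Placeholder cipher: XOR with a repeating key (stand-in only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(segment))

def distribute(data: bytes, key: bytes, clouds: list) -> bytes:
    """Encrypt each segment; upload all but the last, keep the last locally."""
    segments = [xor_encrypt(s, key) for s in split(data)]
    for i, seg in enumerate(segments[:-1]):
        clouds[i % len(clouds)].append(seg)  # round-robin distribution
    return segments[-1]  # retained on the mobile device
```

Because XOR is its own inverse, recovery simply fetches the remote segments, appends the locally held one, decrypts with the same key, and strips the padding; an attacker holding only the remote segments can never complete the file.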
4.1. Multiple-Cloud Computing Model
There are various existing models and forms of multiple clouding; consequently, it is very important to compare these models in order to select the one that best fits and is capable of delivering the expected service level. The library-based multi-cloud model has been selected as the most suitable paradigm. In the multi-cloud computing model, more than one independent cloud is used to execute the requested tasks by consuming the provided resources/capabilities; here the customers take responsibility for resource/capability management, task scheduling, and load balancing [29]. In the library-based multi-cloud model, users develop their own service broker through a unified Application Programming Interface (API) [30]. The developed broker takes the form of libraries embedded in the user's access point (host) [31]. This model offers a higher availability level than the basic multi-cloud model, because the multi-cloud library is associated with the user's machine. However, users must be qualified to develop their own multi-cloud libraries. On the other hand, this model does not require any prior agreement between the involved clouds.
4.2. Framework Design
This section discusses the overall architecture of the proposed framework. It illustrates the
modules of the developed application and their functions/tasks. Figure 2 illustrates the approach
modules and their internal units.
4.2.1. Preference Module
This module is responsible for the cloud selection procedure, based on the end user's requirements (data size, number of segments, available storage), and for managing the involved cloud providers. In addition, the user can customise the security requirements via this module. Additionally, the secured data can be backed up through this module after calling the data logs from the data distribution module.
4.2.2. Data Management Module
This module is responsible for data fragmentation, recovery tasks, and the erasing of temporary data. Additionally, it takes responsibility for segment renaming. The renaming process helps to hide the identity of the exchanged data against phishing attacks, and facilitates the archiving of the secured data.
4.2.3. Data Security Module
This module is responsible for two key procedures: data encryption/decryption and data compression/decompression. The preferred encryption technique is applied through this module, and which technique is executed depends on the user's selection in the preference module.
4.2.4. Data Distribution Module
This module is responsible for the distribution procedure and for segment requests (for recovery). It relies on the cloud storage APIs to provide an abstraction layer that ensures stable and reliable connections. The distribution process is based on two main factors: the user's security requirements and the status of each involved cloud. The status of an involved cloud comprises:
- Access (correct user name and password).
- The service status (offline, online).
- The available storage.
- The size of the segment being sent (some service providers impose a maximum file size).
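The four status checks above amount to a simple eligibility predicate per cloud. The sketch below illustrates this; `CloudStatus` and its field names are hypothetical, not taken from the paper's prototype.

```python
# Hypothetical cloud-status record and the eligibility check it implies.
from dataclasses import dataclass

@dataclass
class CloudStatus:
    authenticated: bool   # correct user name and password
    online: bool          # service status (offline/online)
    free_bytes: int       # available storage
    max_file_bytes: int   # provider's maximum accepted file size

def eligible(cloud: CloudStatus, segment_size: int) -> bool:
    """A cloud may receive a segment only if all four status checks pass."""
    return (cloud.authenticated
            and cloud.online
            and cloud.free_bytes >= segment_size
            and segment_size <= cloud.max_file_bytes)
```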
Figure 2. Proposed approach architecture design
4.3. Chaotic Logistic Map
Traditional data encryption algorithms such as the Data Encryption Standard (DES), Triple DES (3DES), the Advanced Encryption Standard (AES), the International Data Encryption Algorithm (IDEA), and the linear feedback shift register (LFSR) treat the plaintext as either a block cipher or a data stream, and are not suitable for fast encryption of large data (for example, colour images). Software implementations of these traditional algorithms for image encryption are even more complicated because of the high correlation between image pixels; in other words, these mechanisms are not appropriate for encrypting digital pictures, especially on smart devices [32]. These techniques are expensive in terms of consumed time, computational resources, and power. Additionally, applying them to large pictures is less efficient than applying them to small pictures. For these reasons, chaos techniques are better suited to picture encryption, and can provide security together with savings in speed, power, and computational resources. The structure of the core chaos-based image encryption algorithm is shown in Figure 3 [33]. This algorithm represents the core encryption technique used in our approach. It is worth mentioning that the applied encryption techniques will not be limited to the chaotic logistic map; other techniques will be included as part of this approach.
Figure 3. General Chaos Algorithm Architecture [33]
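To make the chaotic-logistic-map idea concrete, the sketch below shows one common construction (an illustration only, not the exact algorithm of [33]): the map x_{n+1} = r·x_n·(1 − x_n) is iterated to produce a keystream that is XORed with the data, with the initial value and parameter (x0, r) acting as the secret key.

```python
# Illustrative chaotic keystream cipher; the key (x0, r) and quantisation
# scheme are assumptions for this sketch, not taken from [33].
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Iterate the logistic map and quantise each state to one byte."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)           # chaotic regime for r near 4.0
        out.append(int(x * 256) % 256)  # quantise state to a byte
    return bytes(out)

def chaotic_xor(data: bytes, x0: float = 0.3141, r: float = 3.9999) -> bytes:
    """Encrypt/decrypt by XOR with the keystream (XOR is its own inverse)."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```

The map's sensitivity to initial conditions is what makes (x0, r) usable as a key: a tiny change in x0 produces a keystream that rapidly diverges from the original.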
4.4. Approach Workflow
Figure 4 shows the key operational tasks associated with the distribution and recovery procedures respectively. For the distribution procedure, each distributed segment is recorded together with its associated cloud, in order to facilitate the recovery procedure. Conversely, the segments are wiped from the clouds after recovery, and the logs are updated. The time consumed by these two procedures will vary, owing to the variation among the cloud providers holding each segment and in their status.
Figure 4. Distribution & Recovery procedures
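The record-then-wipe bookkeeping implied by Figure 4 can be sketched as a small log of (segment index, cloud) pairs; `SegmentLog` and its method names are illustrative, not part of the prototype described here.

```python
# Hypothetical distribution log: segments are recorded against their cloud
# during distribution, replayed in order for recovery, then cleared.
class SegmentLog:
    def __init__(self) -> None:
        self._entries: list[tuple[int, str]] = []  # (segment index, cloud id)

    def record(self, index: int, cloud: str) -> None:
        """Log which cloud received which segment during distribution."""
        self._entries.append((index, cloud))

    def recovery_plan(self) -> list[tuple[int, str]]:
        """Return segments in index order so the file can be reassembled."""
        return sorted(self._entries)

    def wipe(self) -> None:
        """Clear the log after the segments are wiped from the clouds."""
        self._entries.clear()
```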
Figure 5 illustrates the flow of operations. It also shows how the mobile application can interact with and update the multiple-cloud status, and at which stage the on-site data storing and data archiving need to be completed. In order to enhance the delivered security level, the customer can keep one of the segments on the device itself. This means that the data can never be decrypted/read/modified outside the smartphone.
Splitting the targeted file into segments is very important for several reasons:
• Distributing over the multi-clouds.
• Avoiding cloud maximum-file-size problems.
In addition, splitting the file enhances the utilisation of the provided bandwidth and facilitates load balancing. It also facilitates the encryption/decryption processes by reducing the size of the computed files. Technically, data splitting improves processing throughput by allowing pipelined and parallel processing, which reduces the operational cost and the time consumed [34]. Compressing the distributed data can help reduce each segment's size before it is uploaded to the cloud, which reduces the consumed time and bandwidth and saves storage in the cloud. Technically, the compression efficiency depends on the type of data being compressed (text, image, or video). For instance, text files compress well, whereas compressing PDF files cannot efficiently reduce their size because these files are already compressed. For that reason, compressing all files indiscriminately will not reduce size or save time and operational cost; it might consume system resources without benefit. Given the limited resources of mobile devices, and in order to avoid this compression problem, the technique developed by [35] will be used to choose the algorithm that offers the highest compression and to skip incompressible files.
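A minimal version of the "skip incompressible files" idea is to compress a segment only when doing so actually shrinks it. The sketch below uses zlib as a stand-in compressor; it does not reproduce the algorithm-selection technique of [35].

```python
# Compress-only-if-smaller: keeps a one-bit flag so recovery knows whether
# to decompress. zlib here is an illustrative choice of compressor.
import zlib

def maybe_compress(segment: bytes) -> tuple[bytes, bool]:
    """Return (payload, compressed?); keep the original if zlib cannot shrink it."""
    candidate = zlib.compress(segment, level=6)
    if len(candidate) < len(segment):
        return candidate, True
    return segment, False  # incompressible (e.g. already-compressed data)

def unpack(payload: bytes, was_compressed: bool) -> bytes:
    """Invert maybe_compress using the stored flag."""
    return zlib.decompress(payload) if was_compressed else payload
```

Redundant data (text) takes the compressed path; already-compressed or random data falls through unchanged, so no time or battery is wasted on it.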
Figure 5. The proposed framework’s workflow
5. CONCLUSION
To sum up, we have proposed an approach that offers secure data storage over multi-clouds for mobile computing users, to ensure the security of the stored data. The study shows the importance of cloud computing services and the features that encourage users to consume cloud services instead of traditional IT solutions. Then, the limitations of the cloud computing security approaches covered by previous studies were discussed in order to identify the current security weaknesses that we need to overcome. The proposed approach overcomes the single-cloud storage limitations by shifting to multi-cloud schemes, and it defeats phishing attacks effectively by keeping one segment of the distributed data on the mobile device. As for future work, the proposed approach can be enhanced by adding more techniques and features to help improve the delivered security levels; it will offer a diversity of security levels for users to select from. Finally, we aim to extend the developed mobile application to cover the case where end users use their own (private) cloud storage.
REFERENCES
[1] Kelion, L., 2014. Apple toughens iCloud security after celebrity breach, Available at:
http://goo.gl/vyxS3S [Last accessed on October 20, 2015].
[2] Hogan, M., Liu, F., Sokol, A. & Tong, J. 2011, "Nist cloud computing standards roadmap", NIST
Special Publication, vol. 35.
[3] Vukolic, M. 2010, "The Byzantine Empire in the Intercloud", SIGACT News, vol. 41, no. 3, pp. 105-
111.
[4] AlZain, M.A., Soh, B. & Pardede, E. 2013, "A Byzantine Fault Tolerance Model for a Multi-cloud
Computing", Computational Science and Engineering (CSE), 2013 IEEE 16th International
Conference on 2013, IEEE, pp. 130-137.
[5] Bessani, A., Correia, M., Quaresma, B., André, F. & Sousa, P. 2013, "DepSky: dependable and secure
storage in a cloud-of-clouds", ACM Transactions on Storage (TOS), vol. 9, no. 4, pp. 12.
[6] Verissimo, P., Bessani, A. & Pasin, M. 2012, "The TClouds architecture: Open and resilient cloud-of-
clouds computing", Dependable Systems and Networks Workshops (DSN-W), 2012 IEEE/IFIP 42nd
International Conference on 2012, IEEE, pp. 1-6.
[7] Spillner, J., Bombach, G., Matthischke, S., Muller, J., Tzschichholz, R. & Schill, A. 2011,
"Information Dispersion over Redundant Arrays of Optimal Cloud Storage for Desktop Users",
Utility and Cloud Computing (UCC), 2011 Fourth IEEE International Conference on 2011, pp. 1-8.
[8] Resch, J.K. & Plank, J.S. 2011, "AONT-RS: Blending Security and Performance in Dispersed Storage
Systems", Proceedings of the 9th USENIX Conference on File and Storage Technologies, USENIX
Association, Berkeley, CA, USA, pp. 14.
[9] Storer, M.W., Greenan, K.M., Miller, E.L. & Voruganti, K. 2009, "POTSHARDS—a secure,
recoverable, long-term archival storage system", ACM Transactions on Storage (TOS), vol. 5, no. 2, pp. 5.
[10] Kamara, S., Papamanthou, C. & Roeder, T. "Cs2: A searchable cryptographic cloud storage system".
[11] Tchana, A., Broto, L. & Hagimont, D. 2012, "Approaches to cloud computing fault tolerance",
Computer, Information and Telecommunication Systems (CITS), 2012 International Conference on
2012, pp. 1-6.
[12] Zhao, W., Melliar-Smith, P.M. & Moser, L.E. 2010, "Fault Tolerance Middleware for Cloud
Computing", Cloud Computing (CLOUD), 2010 IEEE 3rd International Conference on 2010, pp. 67-
74.
[13] Wu, L., Liu, B. & Lin, W. 2013, "A Dynamic Data Fault-Tolerance Mechanism for Cloud Storage",
Emerging Intelligent Data and Web Technologies (EIDWT), 2013 Fourth International Conference on
2013, pp. 95-99.
[14] Rabin, M.O. 1989, "Efficient Dispersal of Information for Security, Load Balancing, and Fault
Tolerance", J. ACM, vol. 36, no. 2, pp. 335-348.
[15] Spillner, J. & Schill, A. 2014, "Towards Dispersed Cloud Computing", Communications and
Networking (BlackSeaCom), 2014 IEEE International Black Sea Conference on 2014, pp. 170-174.
[16] Bessani, A., Cutillo, L.A., Ramunno, G., Schirmer, N. & Smiraglia, P. 2013, "The TClouds platform:
concept, architecture and instantiations", Proceedings of the 2nd International Workshop on
Dependability Issues in Cloud Computing 2013, ACM, pp. 1.
[17] Spillner, J. & Muller, J. 2014, "Tutorial on Distributed Data Storage: From Dispersed Files to Stealth
Databases", Utility and Cloud Computing (UCC), 2014 IEEE/ACM 7th International Conference on
2014, pp. 535-536.
[18] Correia, M., Costa, P., Pasin, M., Bessani, A.N., Ramos, F.M. & Verissimo, P. 2012, "On the
Feasibility of Byzantine Fault-Tolerant MapReduce in Clouds-of-Clouds.", SRDS 2012, pp. 448-453.
International Journal of Security, Privacy and Trust Management (IJSPTM) Vol 4, No 3/4, November 2015
[19] Garraghan, P., Townend, P. & Xu, J. 2011, "Byzantine fault-tolerance in federated cloud computing",
Service Oriented System Engineering (SOSE), 2011 IEEE 6th International Symposium on 2011,
IEEE, pp. 280-285.
[20] Malik, S. & Huet, F. 2011, "Adaptive Fault Tolerance in Real Time Cloud Computing", Services
(SERVICES), 2011 IEEE World Congress on 2011, pp. 280-287.
[21] Abu-Libdeh, H., Princehouse, L. & Weatherspoon, H. 2010, "RACS: a case for cloud storage
diversity", Proceedings of the 1st ACM symposium on Cloud computing 2010, ACM, pp. 229-240.
[22] Cachin, C., Haas, R. & Vukolic, M. 2010, Dependable storage in the Intercloud.
[23] Chockler, G., Guerraoui, R., Keidar, I. & Vukolic, M. 2008, Reliable distributed storage.
[24] Stefanov, E., van Dijk, M., Juels, A. & Oprea, A. 2012, "Iris: A scalable cloud file system with
efficient integrity checks", Proceedings of the 28th Annual Computer Security Applications
Conference 2012, ACM, pp. 229-238.
[25] Wilcox-O'Hearn, Z. & Warner, B. 2008, "Tahoe: the least-authority filesystem", Proceedings of the
4th ACM international workshop on Storage security and survivability 2008, ACM, pp. 21-26.
[26] Popa, R.A., Lorch, J.R., Molnar, D., Wang, H.J. & Zhuang, L. 2011, "Enabling Security in Cloud
Storage SLAs with CloudProof.", USENIX Annual Technical Conference.
[27] Basescu, C., Cachin, C., Eyal, I., Haas, R., Sorniotti, A., Vukolic, M. & Zachevsky, I. 2012, "Robust
data sharing with key-value stores", Dependable Systems and Networks (DSN), 2012 42nd Annual
IEEE/IFIP International Conference on 2012, IEEE, pp. 1-12.
[28] Bowers, K.D., Juels, A. & Oprea, A. 2009, "HAIL: a high-availability and integrity layer for cloud
storage", Proceedings of the 16th ACM conference on Computer and communications security 2009,
ACM, pp. 187-198.
[29] Ferrer, A. J., Hernández, F., Tordsson, J., Elmroth, E., Ali-Eldin, A., Zsigri, C., et al. 2012,
“OPTIMIS: A holistic approach to cloud service provisioning”, Future Generation Computer
Systems, 28(1), 66-77.
[30] Kecskemeti, G., Kertesz, A., Marosi, A., & Kacsuk, P. 2012, “Interoperable resource management for
establishing federated clouds”, IGI Global Theory and Practice, Hershey, pp.18-35.
[31] Petcu, D., Crăciun, C., Neagul, M., Panica, S., Di Martino, B., Venticinque, S., et al. 2011,
“Architecturing a sky computing platform. Towards a Service-Based Internet”, ServiceWave 2010
Workshops, pp. 1-13.
[32] Ismail, I.A., Amin, M. & Diab, H. 2010, "A Digital Image Encryption Algorithm Based A
Composition of Two Chaotic Logistic Maps.", IJ Network Security, vol. 11, no. 1, pp. 1-10.
[33] Lian, S., Sun, J. & Wang, Z. 2005, "Security analysis of a chaos-based image encryption algorithm",
Physica A: Statistical Mechanics and its Applications, vol. 351, no. 2–4, pp. 645-661.
[34] Ashokkumar, S., Karuppasamy, K., Srinivasan, B. & Balasubramanian, V. 2010, "Parallel Key
Encryption for CBC and Interleaved CBC", International Journal of Computer Applications, vol. 2,
pp. 21-25.
[35] Harnik, D., Kat, R., Margalit, O., Sotnikov, D. & Traeger, A. 2013, "To zip or not to zip: effective
resource usage for real-time compression", Proceedings of the 11th USENIX Conference on File
and Storage Technologies.
Authors
Dr Paul Sant joined the Department of Computer Science and Technology (UoB) in September 2005 as a
Lecturer, became a Senior Lecturer in September 2006, and was promoted to Principal Lecturer in
August 2011. He completed his PhD at King's College London in 2003 with a thesis entitled
"Algorithmics of edge-colouring pairs of 3-regular trees", preceded by a BSc in Computer Science
from the University of Liverpool (1999). He is an active member of the British Computer Society, a
Chartered Information Technology Professional (CITP), and a Fellow of the Higher Education
Academy.
Dr Ghita Kouadri Mostefaoui is a member of the Advanced Teaching Group in the Department of Computer
Science, University College London. She was awarded her PhD in adaptive security jointly by
the University of Fribourg and the University Paris VI. Her research interests include cloud computing,
automatic extraction of software models, and computer science education. She is a Fellow of the Higher
Education Academy.
Hassan Saad Alqahtani started his PhD in March 2014 at the University of Bedfordshire. His research
interests include cloud computing, mobile cloud computing, cyber security, and encryption. He received
his Master's degree from Teesside University in 2012 and a postgraduate certificate in
Telecommunication and Information Systems from the University of Essex.