This document summarizes a research paper that proposes a hybrid cloud approach for secure authorized data deduplication. The paper presents a scheme that uses convergent encryption to encrypt files before uploading them to cloud storage. It also considers the differential privileges of users when performing duplicate checks, in addition to file content. A prototype is implemented to test the proposed authorized duplicate check scheme. Experimental results show the scheme incurs minimal overhead compared to normal cloud storage operations. The goal is to better protect data security while supporting deduplication in a hybrid cloud architecture.
DISTRIBUTED SCHEME TO AUTHENTICATE DATA STORAGE SECURITY IN CLOUD COMPUTING
Cloud computing is revolutionizing the current generation of IT enterprise. It moves databases and application software to large data centres, where the management of services and data may not be fully trustworthy, whereas conventional IT service solutions are under proper logical, physical and personnel controls. This attribute, however, poses security challenges which have not been well understood. This work concentrates on cloud data storage security, which has always been an important aspect of quality of service (QoS). In this paper, we designed and simulated an adaptable and efficient scheme, with some prominent features, to guarantee the correctness of user data stored in the cloud. A homomorphic token is used for distributed verification of erasure-coded data; with this scheme, we can identify misbehaving servers. In contrast to past works, our scheme supports effective and secure dynamic operations on data blocks, such as data insertion, deletion and modification. Security and performance analysis shows that the proposed scheme is highly resilient against malicious data modification, complex failures and server colluding attacks.
ANALYSIS OF ATTACK TECHNIQUES ON CLOUD BASED DATA DEDUPLICATION TECHNIQUES
ABSTRACT
Data in the cloud is increasing rapidly, and this huge amount of data is stored in various data centers around the world. Data deduplication allows lossless compression by removing duplicate data, so these data centers are able to utilize storage efficiently by eliminating redundancy. Attacks on cloud computing infrastructure are not new, but attacks based on the deduplication feature in the cloud are relatively recent and have gained urgency. Attacks on deduplication features in the cloud environment can happen in several ways and can give away sensitive information. Though the deduplication feature facilitates efficient storage usage and bandwidth utilization, it has drawbacks. In this paper, data deduplication features are closely examined, and the behavior of data deduplication depending on its various parameters is explained and analyzed.
The capability of selectively sharing encrypted data with different users via public cloud storage may greatly ease security concerns over inadvertent data leaks in the cloud. A key challenge in designing such encryption schemes lies in the efficient management of encryption keys. The desired flexibility of sharing any group of selected documents with any group of users demands different encryption keys for different documents. However, this also implies the need to securely distribute to users a large number of keys for both encryption and search; those users will have to securely store the received keys and submit an equally large number of keyword trapdoors to the cloud in order to search over the shared data. The implied cost of secure communication, storage, and computation clearly renders this approach impractical. In this work, a data owner only needs to distribute a single key to a user for sharing a very large number of documents, and the user only needs to submit a single trapdoor to the cloud for querying the shared documents. User revocation is used for key updating, and forward secrecy and backward secrecy are preserved.
NEW SECURE CONCURRENCY MANAGEMENT APPROACH FOR DISTRIBUTED AND CONCURRENT ACCES...
Handing over critical data to a cloud provider should come with guarantees of security and availability for data at rest, in motion, and in use. Many alternative systems exist for storage services, but data confidentiality solutions for the database-as-a-service paradigm are still immature. We propose a novel architecture that integrates the cloud database services paradigm with data confidentiality and the execution of concurrent operations on encrypted data. This method supports geographically distributed clients connecting directly to an encrypted cloud database and executing concurrent and independent operations, including operations that modify the database structure. The proposed architecture has the further advantage of removing intermediate proxies that limit the flexibility, availability, and expandability properties that are inbuilt in cloud-based systems. The efficacy of the proposed architecture is evaluated by theoretical analyses and extensive experimental results from a prototype implementation of the TPC-C standard benchmark for various categories of clients and network latencies. We also propose a multi-keyword ranked search method for encrypted cloud databases, which simultaneously fulfills privacy requirements; the proposed scheme can return not only the exact matching files but also files including terms latently semantically associated with the query keyword.
Nowadays, data on the internet is growing into terabytes and exabytes, so there is a need to store this data, which cloud computing has fulfilled. Though cloud services appear efficient and cost-effective, there remain challenges in cloud computing such as data security and authentication. In cloud storage, the owner's data is stored on remotely located cloud servers, so the owner does not have any direct control over the data. If the data in the cloud is modified by the cloud, the Third Party Auditor (TPA), or any other person, there is no guarantee that the owner gets information about the modification. The TPA is a third-party auditor experienced in checking the integrity of data; the TPA verifies whether files stored in the cloud have been modified. Our scheme provides a solution to this problem such that if there is any modification to the data, the owner will get information about the change. The scheme only provides information about the change in the data; it does not keep the data intact or secure it from modification in the cloud.
A Study of A Method To Provide Minimized Bandwidth Consumption Using Regenera...
Cloud storage systems protect data from corruption by keeping redundant data to tolerate storage failures, and lost data should be repaired when storage fails. Regenerating codes provide fault tolerance by striping data across multiple servers while using less repair traffic than traditional erasure codes during failure recovery. Previous research implemented a practical Data Integrity Protection (DIP) scheme for regenerating-coding-based cloud storage: Functional Minimum-Storage Regenerating (FMSR) codes are used to construct FMSR-DIP codes, which allow clients to remotely verify the integrity of random subsets of long-term archival data in a multi-server setting. The problem is to optimize bandwidth consumption when repairing multiple failures; cooperative repair of multiple failures can help further reduce bandwidth consumption.
A NEW FRAMEWORK FOR SECURING PERSONAL DATA USING THE MULTI-CLOUD
Relying on a single cloud as a storage service is not a proper solution for a number of reasons; for instance, the data could be captured while uploaded to the cloud, and the data could be stolen from the cloud using a stolen ID. In this paper, we propose a solution that aims at offering secure data storage for mobile cloud computing based on the multi-clouds scheme. The proposed solution takes advantage of multi-clouds, data cryptography, and data compression to secure the distributed data: by splitting the data into segments, encrypting the segments, compressing the segments, and distributing the segments via multi-clouds while keeping one segment in the mobile device's memory, which will prevent extracting the data if the distributed segments have been intercepted.
Crypto multi tenant: an environment of secure computing using cloud SQL
Today's most active research area of computing is cloud computing, due to its ability to diminish the costs associated with virtualization, its high availability and dynamic resource pools, and the increase in computing efficiency it brings. But it still has drawbacks, such as privacy and security concerns. This paper is thoroughly focused on the security of multi-tenant data arising from the virtualization feature of cloud computing. We use the AES-128 bit algorithm and cloud SQL to protect sensitive data before storing it in the cloud. When an authorized customer requests use of the data, the data is first decrypted and then provided to the customer. Multi-tenant infrastructure is supported by Google, which prefers pushing content in short iteration cycles. As customers are distributed and their demands can arise anywhere, anytime, data cannot be stored at a single site; it must be available at different sites as well, and for fast access by different users from different places Google is well suited. To achieve high reliability and availability, data is encrypted before being stored in the database and updated after every use. It is very easy to use without requiring any software. The authenticated user can recover their encrypted and decrypted data, affording efficient data storage security in the cloud.
Survey on cloud backup services of personal storage
Abstract: In the widespread cloud environment, cloud services are growing tremendously due to the large amount of personal computing data. The deduplication process is used to avoid redundant data. A cloud storage environment for data backup of personal computing devices faces various challenges of source deduplication for cloud backup services: 1) low deduplication efficiency due to exclusive access to a large amount of data and the limited system resources of the PC-based client site; 2) low data transfer efficiency, because the deduplicated data transferred from source to backup server is typically small but must often cross the WAN. Keywords: Cloud computing, Deduplication, cloud backup, application awareness
Multi-Level Data Security Model for Big Data on Public Cloud: A New Model
With the advent of cloud computing, big data has emerged as a crucial technology. Certain types of cloud provide consumers with free services like storage and computational power. This paper makes use of infrastructure as a service, where the storage service from public cloud providers is leveraged by an individual or organization. The paper emphasizes a model which can be used by anyone without any cost. Users can store confidential data without security issues, as the data is altered in such a way that it cannot be understood by an intruder, yet the user can retrieve the original data within no time. The proposed security model effectively and efficiently provides robust security while data is on cloud infrastructure, as well as while data is being migrated towards cloud infrastructure or vice versa.
Privacy-Preserving Public Auditing for Data in the Cloud
A Hybrid Cloud Approach for Secure Authorized Deduplication
Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique has been proposed to encrypt the data before outsourcing. To better protect data security, this paper makes the first attempt to formally address the problem of authorized data deduplication. Different from traditional deduplication systems, the differential privileges of users are further considered in the duplicate check besides the data itself. We also present several new deduplication constructions supporting authorized duplicate check in a hybrid cloud architecture. Security analysis demonstrates that our scheme is secure in terms of the definitions specified in the proposed security model. As a proof of concept, we implement a prototype of our proposed authorized duplicate check scheme and conduct testbed experiments using our prototype. We show that our proposed authorized duplicate check scheme incurs minimal overhead compared to normal operations.
Effective & Flexible Cryptography Based Scheme for Ensuring User's Data Secur...
Cloud computing has been envisioned as the next-generation architecture of IT enterprise. In contrast to traditional solutions, where the IT services are under proper physical, logical and personnel controls, cloud computing moves the application software and databases to the large data centers, where the management of the data and services may not be fully trustworthy. This unique attribute, however, poses many new security challenges which have not been well understood. In this article, we focus on cloud data storage security, which has always been an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective and flexible cryptography based scheme. Extensive security and performance analysis shows that the proposed scheme is highly efficient and resilient against malicious data modification attack.
A Hybrid Cloud Approach for Secure Authorized De-Duplication
Cloud backup is used for people's personal storage, reducing the maintenance burden and managing the structure and storage space. The challenging process is deduplication, in both local and global backup deduplication. Prior work provides only local-storage deduplication or, conversely, only global-storage deduplication to improve storage capacity and processing time. In this paper, the proposed system is called ALG-Dedupe, an Application-aware Local-Global Source Deduplication system that provides efficient deduplication with low system load, a shortened backup window, and increased power efficiency for the user's personal storage. In the proposed system, large data is partitioned into smaller parts called chunks; any redundancy the data may contain is removed before it is stored in the storage area.
Public Key Encryption algorithms Enabling Efficiency Using SaaS in Cloud Comp...
The greatest challenge in cloud computing is security, which plays a key role at the point of end-user access. This paper's proposed concept mainly deals with security for end users who are connected through public networks and who want their applications or services protected from unauthorized persons. In this area, encryption and decryption methods such as RSA, 3DES, MD5, Blowfish, etc. may be applied, and these services can be utilized at the end-user access point in cloud computing. The problem is that encrypting and decrypting messages, services and applications takes a lot of time and requires substantial processing capability. For that problem we introduce the use of cloud computing's SaaS model: it is scalable, so whenever required we can utilize the SaaS model. Cloud computing is the use of computing resources (hardware and software) delivered as a service over the Internet. Earlier approaches also faced the problem that, with key sizes such as 64 bits in various algorithms, encrypting the data could take a long time.
A Secure Multi-Owner Data Sharing Scheme for Dynamic Group in Public Cloud
In cloud computing, sharing group resources among cloud users is a major challenge, and cloud computing provides a low-cost, well-organized solution. Due to frequent changes of membership, sharing data in a multi-owner manner on an untrusted cloud is still a challenging issue. In this paper we propose a secure multi-owner data sharing scheme for dynamic groups in the public cloud. By providing AES encryption with a convergent key while uploading the data, any cloud user can securely share data with others. Meanwhile, the storage overhead and encryption computation cost of the scheme are independent of the number of revoked users. In addition, we analyze the security of this scheme with rigorous proofs. One-Time Passwords are one of the easiest and most popular forms of authentication for securing access to accounts, and are often referred to as a secure and stronger form of authentication in the multi-owner setting. Extensive security and performance analysis shows that our proposed scheme is highly efficient and satisfies the security requirements for public-cloud-based secure group sharing.
International Journal of Science and Research (IJSR)
ISSN (Online): 2319-7064
Index Copernicus Value (2013): 6.14 | Impact Factor (2013): 4.438
Volume 4 Issue 8, August 2015
www.ijsr.net
Licensed Under Creative Commons Attribution CC BY
A Hybrid Cloud Approach for Secure Authorized Deduplication

Jagadish¹, Dr. Suvarna Nandyal²

¹M.Tech. (CSE), Department of Computer Science & Engineering, Poojya Doddappa Appa College of Engineering, Gulbarga, Karnataka, India
²Professor and HOD, Department of Computer Science & Engineering, Poojya Doddappa Appa College of Engineering, Gulbarga, Karnataka, India
Abstract: Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique has been proposed to encrypt the data before outsourcing. To better protect data security, this work makes the first attempt to formally address the problem of authorized data deduplication. Different from traditional deduplication systems, the differential privileges of users are further considered in the duplicate check besides the data itself. The work also presents several new deduplication constructions supporting authorized duplicate check in a hybrid cloud architecture. Security analysis demonstrates that our scheme is secure in terms of the definitions specified in the proposed security model. As a proof of concept, the work implements a prototype of the proposed authorized duplicate check scheme and conducts testbed experiments using the prototype. The work shows that the proposed authorized duplicate check scheme incurs minimal overhead compared to normal operations.
Keywords: Data deduplication, Convergent encryption, Confidentiality, Hybrid cloud, Authorized Duplicate check.
1. Introduction
Cloud computing enables new business models and cost-effective resource usage. Instead of maintaining their own data center, companies can concentrate on their core business and purchase resources when needed. Especially when combining publicly accessible clouds with a privately maintained virtual infrastructure in a hybrid cloud, hybrid cloud technology can open up new opportunities for businesses. Today's cloud service providers offer both highly available storage and massively parallel computing resources at relatively low cost. As cloud computing becomes prevalent, an increasing amount of data is being stored in the cloud and shared by users with specified privileges, which define the access rights of the stored data. One critical challenge of cloud storage services is the management of the ever-increasing volume of data. Data deduplication is a specialized data compression technique for eliminating duplicate copies of repeating data in storage.
Deduplication can take place at either the file level or the block level; file-level deduplication eliminates duplicate copies of the same file. Traditional encryption, while providing data confidentiality, is incompatible with data deduplication. Specifically, traditional encryption requires different users to encrypt their data with their own keys; thus, identical data copies of different users will lead to different ciphertexts, making deduplication impossible. Convergent encryption has been proposed to enforce data confidentiality while making deduplication feasible. It encrypts/decrypts a data copy with a convergent key, which is obtained by computing the cryptographic hash value of the content of the data copy. After key generation and data encryption, users retain the keys and send the ciphertext to the cloud. Since the encryption operation is deterministic and is derived from the data content, identical data copies will generate the same convergent key and hence the same ciphertext.

A hybrid cloud is a combined form of private clouds and public clouds, in which some critical data resides in the enterprise's private cloud while other data is stored in and accessible from a public cloud. Hybrid clouds seek to deliver the advantages of scalability, reliability, rapid deployment and potential cost savings of public clouds, together with the security and increased control and management of private clouds.
Figure 1.1: Architecture of Cloud Computing
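To make the convergent-encryption idea concrete, the following is a minimal sketch in Python, not the authors' implementation. It assumes the third-party cryptography package; the key here is the SHA-256 hash of the file content (the paper derives its convergent key with SHA-1, but AES requires a 16/24/32-byte key), and the nonce is derived deterministically from the key so that identical plaintexts always yield identical ciphertexts, which is exactly what makes deduplication of encrypted data possible.

```python
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def convergent_encrypt(data: bytes) -> tuple[bytes, str, bytes]:
    """Encrypt data under a key derived from its own content.

    Returns (key, tag, ciphertext). The tag, a hash of the key, can be
    sent to the server for the duplicate check without revealing the key.
    """
    key = hashlib.sha256(data).digest()          # convergent key K = H(M)
    nonce = hashlib.sha256(key).digest()[:12]    # deterministic nonce, so
                                                 # equal files -> equal ciphertext
    ciphertext = AESGCM(key).encrypt(nonce, data, None)
    tag = hashlib.sha256(key).hexdigest()        # tag T(M) = H(K) for dedup check
    return key, tag, ciphertext

def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce = hashlib.sha256(key).digest()[:12]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# Two users encrypting the same file independently produce the same
# ciphertext, so the storage server can keep a single copy.
c1 = convergent_encrypt(b"quarterly report")
c2 = convergent_encrypt(b"quarterly report")
assert c1 == c2
```

Note the design choice this sketch illustrates: determinism, normally a weakness of an encryption scheme, is deliberately retained here because it is what allows the server to detect duplicates without ever seeing the plaintext.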
The critical challenge of cloud storage or cloud computing is the management of the continuously increasing volume of data. Data deduplication, or single instancing, essentially refers to the elimination of redundant data; however, an index of all data is still retained should that data ever be required. In general, data deduplication eliminates the duplicate copies of repeating data.
This paper is organized as follows: Section 1 discusses the introduction, Section 2 describes related work, Section 3 details the system design and implementation, Section 4 presents the performance evaluation of our system design, and finally Section 5 presents some concluding remarks.
2. Related Work
“A secure cloud backup system with assured deletion and version control”, A. Rahumed, H. C. H. Chen, Y. Tang, P. P. C. Lee, and J. C. S. Lui [1], presented cloud storage as an emerging service model that enables individuals and enterprises to outsource the storage of data backups to remote cloud providers at low cost. The results show that FadeVersion only adds minimal performance overhead over a traditional cloud backup service that does not support assured deletion.

“A reverse deduplication storage system optimized for reads to latest backups”, C. Ng and P. Lee [2], presented RevDedup, a deduplication system designed for VM disk image backup in virtualization environments. RevDedup has several design goals: high storage efficiency, low memory usage, high backup performance, and high restore performance for the latest backups. They extensively evaluate their RevDedup prototype using different workloads and validate their design goals.

“Role-based access controls”, D. Ferraiolo and R. Kuhn [3], described that while Mandatory Access Controls (MAC) are appropriate for multilevel secure military applications, Discretionary Access Controls (DAC) are often perceived as meeting the security processing needs of industry and civilian government.

“Secure deduplication with efficient and reliable convergent key management”, J. Li, X. Chen, M. Li, J. Li, P. Lee, and W. Lou [4], proposed Dekey, an efficient and reliable convergent key management scheme for secure deduplication. They implement Dekey using the Ramp secret sharing scheme and demonstrate that it incurs small encoding/decoding overhead compared to the network transmission overhead of regular upload/download operations.

“Reclaiming space from duplicate files in a serverless distributed file system”, J. R. Douceur, A. Adya, W. J. Bolosky, D. Simon, and M. Theimer [5], presented the Farsite distributed file system, which provides availability by replicating each file onto multiple desktop computers. Measurement of over 500 desktop file systems shows that nearly half of all consumed space is occupied by duplicate files. The mechanism includes 1) convergent encryption, which enables duplicate files to be coalesced into the space of a single file, even if the files are encrypted with different users' keys, and 2) SALAD, a Self-Arranging, Lossy, Associative Database for aggregating file content and location information in a decentralized, scalable, fault-tolerant manner.

“A secure data deduplication scheme for cloud storage”, J. Stanek, A. Sorniotti, E. Androulaki, and L. Kencl [6], noted that as private users outsource their data to cloud storage providers, recent data breach incidents make end-to-end encryption an increasingly prominent requirement; data deduplication can be effective for popular data, whilst semantically secure encryption protects unpopular content.

“Weak leakage-resilient client-side deduplication of encrypted data in cloud storage”, J. Xu, E.-C. Chang, and J. Zhou [7], described a secure client-side deduplication scheme with the following advantage: their scheme protects data confidentiality (and some partial information) against both outside adversaries and an honest-but-curious cloud storage server, whereas Halevi et al. trust the cloud storage server with data confidentiality.

“Secure and constant cost public cloud storage auditing with deduplication”, J. Yuan and S. Yu [8], observed that data integrity and storage efficiency are two important requirements for cloud storage. The authors' proposed scheme is also characterized by constant real-time communication and computational cost on the user side.

“Privacy aware data intensive computing on hybrid clouds”, K. Zhang, X. Zhou, Y. Chen, X. Wang, and Y. Ruan [9], observed that the emergence of cost-effective cloud services offers organizations great opportunity to reduce their cost and increase productivity. Their system, called Sedic, leverages the special features of MapReduce to automatically partition a computing job according to the security levels of the data it works on.

“GQ and Schnorr identification schemes: proofs of security against impersonation under active and concurrent attacks”, M. Bellare and A. Palacio [10], provided the proof for GQ based on the assumed security of RSA under one-more inversion, an extension of the usual one-wayness assumption. Both results extend to establish security against impersonation under concurrent attack.
3. Methodology
The basic objective of this work is the problem of privacy-preserving deduplication in cloud computing, and the proposed system focuses on these aspects:
1) Differential Authorization: Each authorized user is able to get an individual token for his file to perform the duplicate check, based on his privileges.
2) Authorized Duplicate Check: An authorized user is able to use his individual private keys to generate a query for a certain file and the privileges he owns with the help of the private cloud, while the public cloud performs the duplicate check directly and tells the user if there is any duplicate. A sketch of such privilege-bound tokens follows this list.
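As an illustration of the two properties above, the sketch below is hypothetical, not taken from the paper: each privilege is bound to its own secret key held by the private cloud, and the duplicate-check token is an HMAC of the file's fingerprint under that key, so a user can only obtain tokens for privileges he actually holds. The privilege names and key values are invented for the example.

```python
import hashlib
import hmac

# Hypothetical privilege keys, held only by the private cloud server.
PRIVILEGE_KEYS = {
    "director": b"secret-key-director",
    "team_leader": b"secret-key-team-leader",
    "engineer": b"secret-key-engineer",
}

def file_token(file_data: bytes, privilege: str) -> str:
    """Issue a duplicate-check token for a file under one privilege.

    The token binds the file fingerprint to the privilege key, so the
    same file yields different tokens under different privileges.
    """
    fingerprint = hashlib.sha1(file_data).digest()  # file tag, as in Sec. 3.2
    key = PRIVILEGE_KEYS[privilege]                 # issued only to holders
    return hmac.new(key, fingerprint, hashlib.sha256).hexdigest()

# The same file produces different tokens under different privileges,
# so a duplicate check only matches within the same privilege class.
data = b"design document"
assert file_token(data, "engineer") != file_token(data, "director")
assert file_token(data, "engineer") == file_token(data, "engineer")
```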
3.1 Proposed System
In the proposed system, convergent encryption is used to enforce data confidentiality. A data copy is encrypted under a key derived by hashing the data itself, and this convergent key is used to encrypt and decrypt the data copy. Furthermore, unauthorized users cannot decrypt the ciphertext even if they collude with the S-CSP (storage cloud service provider). Security analysis demonstrates that the system is secure in terms of the definitions specified in the proposed security model.
Figure 1: Architecture for Authorized Deduplication
This work describes a company where the employee details, such as name, password, email id, contact number and designation, are registered by the admin or owner of the company. Based on their user id and password, employees of the company are able to perform operations such as file upload, file download, and duplicate checks on the files according to their privileges.
There are three entities defined in the hybrid cloud architecture of authorized deduplication.

Data Users: A user is an entity that wants to outsource data storage to the S-CSP (storage cloud service provider) and access the data later. In a storage system supporting deduplication, the user only uploads unique data and does not upload any duplicate data, to save upload bandwidth; the duplicate data may be owned by the same user or by different users. Each file is protected with the convergent encryption key and privilege keys to realize authorized deduplication with differential privileges.

Private Cloud: This is a new entity for facilitating users' secure use of cloud services. The private keys for privileges are managed by the private cloud, which provides file tokens to users. Specifically, since the computing resources at the data user/owner side are restricted and the public cloud is not fully trusted in practice, the private cloud is able to provide the data user/owner with an execution environment and infrastructure working as an interface between the user and the public cloud.

S-CSP (storage cloud service provider): This is an entity that provides a data storage service in the public cloud. The S-CSP provides the data outsourcing service and stores data on behalf of the users. To reduce the storage cost, the S-CSP eliminates the storage of redundant data via deduplication and keeps only unique data. In this paper, we assume that the S-CSP is always online and has abundant storage capacity and computation power. A toy model of this behaviour is sketched below.
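The following minimal sketch, an illustration rather than the paper's code, models the S-CSP's role: it keeps an index from duplicate-check tokens to stored ciphertexts, stores each unique ciphertext only once, and records every owner of a file.

```python
class StorageCSP:
    """Toy model of the S-CSP: deduplicated storage keyed by file token."""

    def __init__(self):
        self.store = {}   # token -> ciphertext (kept once per unique file)
        self.owners = {}  # token -> set of user ids holding the file

    def duplicate_check(self, token: str) -> bool:
        return token in self.store

    def register_owner(self, user_id: str, token: str) -> None:
        self.owners.setdefault(token, set()).add(user_id)

    def upload(self, user_id: str, token: str, ciphertext: bytes) -> None:
        if token not in self.store:
            self.store[token] = ciphertext  # first copy: store it once
        self.register_owner(user_id, token)  # always record the owner

    def download(self, user_id: str, token: str) -> bytes:
        if user_id not in self.owners.get(token, set()):
            raise PermissionError("user does not own this file")
        return self.store[token]
```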
3.2 SHA-1 Algorithm Description

In the proposed system, the convergent key for each file is generated using the Secure Hash Algorithm 1 (SHA-1). The steps of this algorithm are given below.

Step 1: Padding. Pad the message with a single one bit followed by zero bits until the final block has 448 bits, then append the size of the original message in bits as an unsigned 64-bit integer.

Step 2: Initialization. Initialize the 5 hash words (h0, h1, h2, h3, h4) to the specific constants defined in the SHA-1 standard.

Step 3: Message schedule (for each 512-bit block). Allocate an 80-word array for the message schedule. Set the first 16 words to the 512-bit block split into 16 words; each remaining word i is generated as word[i-3] XOR word[i-8] XOR word[i-14] XOR word[i-16], rotated 1 bit to the left.

Step 4: Rounds. Loop 80 times, calculating the round function SHAfunction() and the constant K (both depend on the current round number), then update:
e = d
d = c
c = b rotated left 30
b = a
a = (a rotated left 5) + SHAfunction() + e + K + word[i]
After the 80 rounds, add a, b, c, d and e to the hash words.

Step 5: Output. The message digest is the concatenation (h0, h1, h2, h3, h4).
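The steps above map directly onto code. The following is an illustrative pure-Python SHA-1, for study only; a real system would call a vetted library such as hashlib. The padding, the five hash words, the 80-word schedule with the 1-bit rotation, and the 80 update rounds all appear exactly as described.

```python
import struct

def sha1(message: bytes) -> str:
    # Step 2: the five hash words, initialized to the standard constants.
    h0, h1, h2, h3, h4 = (0x67452301, 0xEFCDAB89, 0x98BADCFE,
                          0x10325476, 0xC3D2E1F0)

    # Step 1: padding - a one bit, zeros to 448 bits mod 512,
    # then the original length in bits as an unsigned 64-bit integer.
    ml = len(message) * 8
    message += b'\x80'
    while len(message) % 64 != 56:
        message += b'\x00'
    message += struct.pack('>Q', ml)

    def rol(n, b):  # 32-bit left rotation
        return ((n << b) | (n >> (32 - b))) & 0xFFFFFFFF

    for i in range(0, len(message), 64):
        # Step 3: 80-word schedule from the 512-bit block.
        w = list(struct.unpack('>16I', message[i:i + 64]))
        for j in range(16, 80):
            w.append(rol(w[j-3] ^ w[j-8] ^ w[j-14] ^ w[j-16], 1))

        # Step 4: 80 rounds; f and K depend on the round number.
        a, b, c, d, e = h0, h1, h2, h3, h4
        for j in range(80):
            if j < 20:
                f, k = (b & c) | (~b & d), 0x5A827999
            elif j < 40:
                f, k = b ^ c ^ d, 0x6ED9EBA1
            elif j < 60:
                f, k = (b & c) | (b & d) | (c & d), 0x8F1BBCDC
            else:
                f, k = b ^ c ^ d, 0xCA62C1D6
            a, b, c, d, e = ((rol(a, 5) + f + e + k + w[j]) & 0xFFFFFFFF,
                             a, rol(b, 30), c, d)

        # Add the round results back into the hash words.
        h0, h1, h2, h3, h4 = ((h0 + a) & 0xFFFFFFFF, (h1 + b) & 0xFFFFFFFF,
                              (h2 + c) & 0xFFFFFFFF, (h3 + d) & 0xFFFFFFFF,
                              (h4 + e) & 0xFFFFFFFF)

    # Step 5: the digest is the concatenation of the five hash words.
    return ''.join(f'{h:08x}' for h in (h0, h1, h2, h3, h4))

# Standard test vector for SHA-1.
assert sha1(b"abc") == "a9993e364706816aba3e25717850c26c9cd0d89d"
```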
3.3 Flow Chart
The private keys for the privileges are managed by the private cloud, which answers the file token requests from the users. The interface offered by the private cloud allows the user to submit files and queries to be securely stored and computed, respectively.
Figure 2: Flow Diagram of the proposed work.
In the deduplication system, the hybrid cloud architecture is introduced to solve the problem of unauthorized deduplication of files. The private keys for privileges are not issued to users directly; they are kept and managed by the private cloud server. To perform the duplicate check for a file, the user first sends a request to the private cloud server to obtain the corresponding file token. Based on the result of the duplicate check, the user either uploads the file or proves ownership of it. If the check passes, the private cloud server finds the corresponding privileges of the user in its stored table and sends them to the user, who can then upload the file. In the same way, the user can download files from the storage cloud. A sketch of this flow follows.
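The sketch below simulates this token-request and duplicate-check flow with in-memory stand-ins for the private cloud and the S-CSP; all class names, keys and privilege labels are hypothetical placeholders for the paper's protocol.

import hashlib
import hmac

class PrivateCloud:
    # Stand-in for the private cloud: holds privilege keys and answers
    # file-token requests (the keys and privilege labels are invented).
    def __init__(self):
        self.privilege_keys = {"director": b"key-director", "team_leader": b"key-tl"}
        self.user_privileges = {"alice": "director", "bob": "team_leader"}

    def file_token(self, user, tag):
        # The token binds the file tag to the user's privilege key, so
        # only users holding the right privilege derive matching tokens.
        key = self.privilege_keys[self.user_privileges[user]]
        return hmac.new(key, tag.encode(), hashlib.sha1).hexdigest()

class SCSP:
    # Stand-in for the public storage cloud: keeps one copy per token.
    def __init__(self):
        self.store = {}

    def duplicate_check(self, token):
        return token in self.store

    def upload(self, token, ciphertext):
        self.store.setdefault(token, ciphertext)

# Walk-through: the user requests a token, runs the duplicate check,
# and uploads only if the file is not already stored.
private_cloud, scsp = PrivateCloud(), SCSP()
data = b"quarterly report"
tag = hashlib.sha1(data).hexdigest()
token = private_cloud.file_token("alice", tag)
if not scsp.duplicate_check(token):
    scsp.upload(token, data)  # in practice, the convergent ciphertext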
4. Results and Discussion
We conduct a testbed evaluation of our prototype. Our evaluation focuses on comparing the overhead induced by the authorization steps, including file token generation and share token generation, against the convergent encryption and file upload steps. We evaluate the overhead by varying different factors.
Figure 3: Registration screen
Fig. 3 shows the initial registration screen. The admin can add the information of different employees; here, the admin registers an employee as a director.
Figure 4: Successful registration
Fig. 4 shows the successful registration of an employee with valid information.
Figure 5: Selection of team leader
Fig. 5 shows that, after receiving valid information from an employee, the admin selects a team leader.
Figure 6: Successful registration of a team leader.
Figure 7: Login as Director.
Figure 8: Choosing a file.
Fig. 8 shows that every user can upload files to the cloud; users are also given access permissions to upload and download files in the cloud.
Figure 9: Access permission to team leader.
Fig. 9 shows that access permission can be given to various roles such as team leader, engineer, etc. The file is then uploaded to the Amazon cloud; the required information, such as the employee name, is fetched from the hybrid cloud, and the file is stored in encrypted form.
Figure 10: Set of private and public cloud servers.
In the back end, the registered employees can be displayed along with the tokens generated by the private cloud for the files. If the same file is given by another user, the private cloud generates the same token, and a tag is generated for the duplicate file. Unique files have no tags, which is represented as 'none' (a minimal sketch of this back-end view follows). In this project, the time required to encrypt and store the files in the Amazon cloud is measured and shown in the file encryption chart, with the file name along the x-axis and the encryption time in milliseconds along the y-axis.
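A minimal sketch of such a back-end view, with hypothetical names and file contents, could maintain a token-to-tag index in which unique files carry the tag 'none' and later copies are marked as duplicates:

import hashlib

index = {}  # token -> (first owner, tag)

def register_upload(user, data):
    token = hashlib.sha1(data).hexdigest()  # identical files yield identical tokens
    if token in index:
        owner, _ = index[token]
        index[token] = (owner, "duplicate (also uploaded by %s)" % user)
    else:
        index[token] = (user, "none")  # unique file, no tag

register_upload("alice", b"design doc")
register_upload("bob", b"design doc")     # same content: tagged as duplicate
register_upload("carol", b"status note")  # unique: tag stays 'none'
for token, (owner, tag) in index.items():
    print(token[:8], owner, tag)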
When three files of different sizes (427 KB, 672 KB and 2.15 MB) are uploaded, they are stored in encrypted form in the Amazon cloud. The time required to encrypt and store these files depends on the network speed and was measured as 484 ms, 203 ms and 453 ms respectively; the times were recorded in Notepad and are shown in Figure 11.
Figure 11: File Encryption Time Chart
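Per-file encryption times of this kind could be collected with a simple wrapper like the sketch below, which isolates the local encryption cost; the file names and the encrypt callback are placeholders, and the times in Figure 11 additionally reflect network effects.

import time

def timed_encrypt(path, encrypt):
    # Returns the encryption time for one file in milliseconds.
    with open(path, "rb") as f:
        data = f.read()
    start = time.perf_counter()
    encrypt(data)
    return (time.perf_counter() - start) * 1000.0

# Example with placeholder file names:
# for name in ("file_427kb.bin", "file_672kb.bin", "file_2150kb.bin"):
#     print(name, "%.0f ms" % timed_encrypt(name, convergent_encrypt))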
5. Conclusion and Future Work
In this project, the notion of authorized data deduplication was proposed to protect data security by including the differential privileges of users in the duplicate check. We presented several new deduplication constructions supporting authorized duplicate check in a hybrid cloud architecture, in which the duplicate-check tokens of files are generated by the private cloud server with private keys. As a proof of concept, we implemented a prototype of our proposed authorized duplicate check scheme and conducted testbed experiments on it. We showed that our authorized duplicate check scheme incurs minimal overhead compared to convergent encryption and network transfer.
Future work: The present model excludes the security problems that may arise in its practical deployment; addressing these remains future work. The scheme saves storage by deduplicating data, provides authorization to private firms, and protects the confidentiality of important and sensitive data.