Despite the cost-effective and flexible characteristics of cloud computing, some clients are reluctant to adopt this paradigm due to emerging security and privacy concerns. Organizations such as those in the healthcare and payment card industries, where confidentiality of information is vital, are not confident enough to trust the security techniques and privacy policies offered by cloud service providers. Malicious attackers have violated cloud storage to steal, view, manipulate and tamper with clients' data, and attacks on cloud storage are extremely challenging to detect and mitigate. In order to formulate privacy-preserving cloud storage, in this research paper we propose an improved technique consisting of five contributions: a resilient role-based access control mechanism, partial homomorphic cryptography, metadata generation with sound steganography, an efficient third-party auditing service, and a data backup and recovery process. We implemented these components using Java Enterprise Edition with GlassFish Server. Finally, we evaluated our proposed technique by penetration testing, and the results showed that clients' data is intact and protected from malicious attackers.
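The abstract above does not say which partial homomorphic scheme it uses; the Paillier cryptosystem is a common choice and illustrates the idea that a server can combine ciphertexts it cannot read. A minimal sketch with deliberately tiny demo primes (wholly insecure at this size, for readability only):

```python
# Toy Paillier cryptosystem illustrating *partial* (additive) homomorphic
# encryption: multiplying two ciphertexts yields a ciphertext of the SUM of
# the plaintexts, so an untrusted server can aggregate values it cannot read.
# Tiny demo primes -- illustrative only, not secure.
import math
import random

p, q = 1789, 1867                  # hypothetical demo parameters
n = p * q
n2 = n * n
g = n + 1                          # standard simple choice g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:     # r must be coprime to n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(42), encrypt(58)
assert decrypt((c1 * c2) % n2) == 100   # E(42) * E(58) decrypts to 42 + 58
```

The scheme is "partial" because only addition of plaintexts is supported; multiplication of two encrypted values would require a fully homomorphic scheme.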
Accessing secured data in cloud computing environment - IJNSA Journal
The number of businesses using cloud computing has increased dramatically over the last few years due to attractive features such as scalability, flexibility, fast start-up and low costs. Services provided over the web range from using the provider's software and hardware to managing security and other issues. Some of the biggest challenges at this point are providing privacy and data security to subscribers of public cloud servers. The efficient encryption technique presented in this paper can be used for secure access to and storage of data on a public cloud server, and for moving and searching encrypted data through communication channels while protecting data confidentiality. This method ensures data protection against both external and internal intruders. Data can be decrypted only with the key provided by the data owner, while the public cloud server is unable to read encrypted data or queries. Answering a query does not depend on its size and is done in constant time. Data access is managed by the data owner. The proposed scheme also allows detection of unauthorized modifications.
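The abstract does not describe its search mechanism; one common way to get constant-time queries over encrypted records (an illustrative sketch, not the cited paper's actual scheme) is to index documents under keyed pseudorandom tokens, so the server answers with a single hash-table lookup without ever seeing the keyword:

```python
# Sketch of keyword search over outsourced records via HMAC-derived tokens:
# the server stores {token: [record ids]} and answers a query with one dict
# lookup, never learning the plaintext keyword. Illustrative only.
import hmac, hashlib

SECRET = b"data-owner-key"          # held by the data owner, not the server

def token(keyword: str) -> bytes:
    return hmac.new(SECRET, keyword.encode(), hashlib.sha256).digest()

# The owner builds the index locally, then uploads only tokens and ids.
index = {}
for rec_id, words in {1: ["cloud", "audit"], 2: ["cloud", "backup"]}.items():
    for w in words:
        index.setdefault(token(w), []).append(rec_id)

def server_search(trapdoor: bytes):
    return index.get(trapdoor, [])  # constant-time lookup; keyword stays hidden

assert server_search(token("cloud")) == [1, 2]
assert server_search(token("backup")) == [2]
```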
A Survey on Different Techniques Used in Decentralized Cloud Computing - Editor IJCATR
This paper proposes various methods for anonymous authentication of data stored in the cloud. The cloud verifies the authenticity of the user without knowing the user's identity before storing data. The paper also adds access control, in which only valid users are able to decrypt the stored information. These schemes also prevent replay attacks and support creation, modification, and reading of data stored in the cloud. Moreover, the authentication and access control scheme is decentralized and robust, unlike other access control schemes designed for clouds, which are centralized, while the communication, computation, and storage overheads remain comparable to centralized approaches. The aim of this paper is to cover many security issues arising in cloud computing and different schemes to prevent security risks in the cloud. Storage-as-a-Service offered by cloud service providers (CSPs) is a paid facility that enables organizations to outsource their sensitive data to be stored on remote servers. The paper proposes a cloud-based storage scheme that allows the data owner to benefit from the facilities offered by the CSP and enables indirect mutual trust between them, and surveys different authentication techniques and algorithms for cloud security.
As technology advances, more and more clients would like to store their data in the public cloud, since the cloud allows a client to store large amounts of data and to use that data from anywhere over the internet. New security problems need to be solved to keep the client's data in the cloud intact, and the client has to feel that their outsourced data is protected in the cloud. Addressing these security problems, we propose "A Novel Approach for Data Uploading and Remote Data Integrity Checking based on Public Key Cryptography" (ANDURIC-PKC). We give the formal definition, system model and security model. A concrete ANDURIC-PKC protocol is then built in the generic group model, and certificate management is not required. The protocol is efficient and flexible, and can be proven secure under the Computational Diffie-Hellman problem. Based on the original client's authorization, the proposed protocol can realize data integrity checking.
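The remote integrity checking described above is public-key based; the challenge-response flow itself can be sketched with symmetric HMAC tags standing in for the paper's cryptographic tags (an illustration of the audit flow only, not the ANDURIC-PKC protocol):

```python
# Minimal remote-integrity challenge/response sketch: the owner tags each
# block before upload, then later challenges random block indices and checks
# the returned blocks against the tags. HMAC stands in for the public-key
# tags of the actual protocol -- illustrative only.
import hmac, hashlib, random

KEY = b"owner-verification-key"     # kept by the client/verifier

def tag(index: int, block: bytes) -> bytes:
    return hmac.new(KEY, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

blocks = [b"block-0 data", b"block-1 data", b"block-2 data"]
tags = [tag(i, b) for i, b in enumerate(blocks)]   # uploaded with the data

def audit(stored_blocks, stored_tags, n_challenges=2):
    # Verifier samples random indices; server must return matching blocks.
    for i in random.sample(range(len(stored_blocks)), n_challenges):
        if not hmac.compare_digest(tag(i, stored_blocks[i]), stored_tags[i]):
            return False
    return True

assert audit(blocks, tags)
tampered = [blocks[0], b"corrupted!", blocks[2]]
assert not audit(tampered, tags, n_challenges=3)   # tampering is detected
```

Challenging a random sample rather than every block is what keeps the audit cheap; the detection probability grows with the number of challenged indices.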
Data Stream Controller for Enterprise Cloud Application - IJSRD
Cloud computing is an emerging computing paradigm where computing resources are provided as services over the Internet while residing in a large data center. Even though it enables us to dynamically provision servers to address a wide range of needs, this paradigm brings forth many new challenges for data security and access control, as users outsource their sensitive data to clouds that are beyond the trusted domain of the data owners. The tenant need not be concerned with how the PaaS system achieves expansion under high load. MAC systems differ in that the security policy is defined for the entire system, typically by administrators. Information flow control (IFC) is a MAC approach, developed originally from military information management methodologies. IFC can be used to enforce more general policies, using appropriate labeling and checking schemes. The labels can be used to manage both confidentiality and integrity concerns, tracking "secrecy" and "quality" of data, respectively. Decentralized Information Flow Control (DIFC) is an approach to security that allows application writers to control how data flows between the pieces of an application and the outside world. As applied to privacy, DIFC allows untrusted software to compute with private data while trusted security code controls the release of that data. As applied to integrity, DIFC allows trusted code to protect untrusted software from unexpected inputs.
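The secrecy/integrity labeling just described can be condensed into one flow rule. A simplified model (not any particular DIFC system's API): data may flow from A to B only if B's secrecy labels cover A's (nothing leaks) and A's integrity labels cover B's (nothing is contaminated).

```python
# Sketch of a DIFC-style flow check over secrecy and integrity label sets.
# Simplified model of the labeling described above, for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Labels:
    secrecy: frozenset = frozenset()
    integrity: frozenset = frozenset()

def can_flow(src: Labels, dst: Labels) -> bool:
    # Secrecy may only grow along a flow; integrity may only shrink.
    return src.secrecy <= dst.secrecy and dst.integrity <= src.integrity

medical = Labels(secrecy=frozenset({"patient"}))
public  = Labels()
vetted  = Labels(integrity=frozenset({"reviewed"}))

assert can_flow(public, medical)        # raising secrecy is fine
assert not can_flow(medical, public)    # would leak patient data
assert not can_flow(public, vetted)     # unvetted input can't gain integrity
```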
Security Check in Cloud Computing through Third Party Auditor - ijsrd.com
In cloud computing, data owners host their data on cloud servers and users (data consumers) can access the data from those servers. Due to the data outsourcing, however, an independent auditing service is required to check the integrity of the data in the cloud. Some existing remote integrity checking methods can only serve static archived data and thus cannot be used by the auditing service, since data in the cloud can be dynamically updated. An efficient and secure dynamic auditing protocol is therefore required to convince data owners that their data are correctly stored in the cloud. In this paper, we first design an auditing framework for cloud storage systems and a privacy-preserving auditing protocol. Then, we extend our auditing protocol to support dynamic data operations efficiently and securely in the random oracle model.
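Dynamic auditing schemes commonly rely on an authenticated structure such as a Merkle hash tree, so that the auditor keeps only a small root value while the server proves any block with a logarithmic path. A sketch of that building block (illustrative, not the paper's exact protocol; assumes a power-of-two number of blocks for brevity):

```python
# Merkle hash tree sketch: build(), proof(), verify(). The auditor stores
# only the root; the server supplies a block plus sibling hashes to prove
# the block is the one originally stored. Power-of-two leaf count assumed.
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def build(leaves):
    level = [h(x) for x in leaves]
    tree = [level]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        tree.append(level)
    return tree                           # tree[-1][0] is the root

def proof(tree, i):
    path = []
    for level in tree[:-1]:
        sib = i ^ 1
        path.append((level[sib], sib < i))  # (sibling hash, sibling-on-left?)
        i //= 2
    return path

def verify(root, leaf, path):
    node = h(leaf)
    for sib, left in path:
        node = h(sib + node) if left else h(node + sib)
    return node == root

blocks = [b"b0", b"b1", b"b2", b"b3"]
tree = build(blocks)
root = tree[-1][0]
assert verify(root, b"b2", proof(tree, 2))
assert not verify(root, b"tampered", proof(tree, 2))
```

When a block is updated, only the hashes on its path to the root change, which is what makes audits over dynamically updated data cheap.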
Achieving Secure, Scalable and Fine-grained Cloud Computing report - Kiran Girase
Cloud computing is also facing many challenges that, if not well resolved, may impede its fast growth. Data security, as in many other applications, is among the challenges that raise great concerns from users when they store sensitive information on cloud servers. These concerns originate from the fact that cloud servers are usually operated by commercial providers, which are very likely to be outside the trusted domain of the users. Data confidentiality against cloud servers is hence frequently desired when users outsource data for storage in the cloud.
Secure Data Storage in Cloud Using Encryption and Steganography - iosrjce
IOSR Journal of Computer Engineering (IOSR-JCE) is a double-blind peer-reviewed international journal that provides rapid publication (within a month) of articles in all areas of computer engineering and its applications. The journal welcomes publication of high-quality papers on theoretical developments and practical applications in computer technology. Original research papers, state-of-the-art reviews, and high-quality technical notes are invited for publication.
Messages addressed to specific users can be decrypted by the Key Generation Centre (KGC) by generating their private keys. The data owner wants the data to be delivered only to the specified user and not to an unauthorized person; that is, the data owner makes their private data accessible only to authorized persons. We propose attribute-based encryption together with an escrow arrangement, meaning a written agreement delivered to a third party, to overcome this problem. Attribute-Based Encryption (ABE) is a type of public-key encryption in which the private key of a user and the ciphertext depend upon attributes. It is a promising cryptographic approach.
Enhanced Data Partitioning Technique for Improving Cloud Data Storage Security - Editor IJMTER
Cloud computing is a model for enabling on-demand network access to shared configurable computing resources (e.g. networks, servers, storage, applications, and services). It is based on virtualization and distributed computing technologies. Cloud data storage systems enable users to store data efficiently on a server without any trouble over data resources, and users can easily store and retrieve their data remotely. The two biggest concerns about cloud data storage are reliability and security. Clients are not likely to entrust their data to a third party or company without a guarantee that they will be able to access their information whenever they want. In the existing system, the data are stored in the cloud using dynamic data operations with computation, which forces the user to keep a copy for later updating and verification against data loss. Different distributed storage auditing techniques are used to overcome the problem of data loss. The recent work in this paper presents a data partitioning technique for data storage that provides a digital signature for every data partition and user; this technique allows users to upload or retrieve data by matching the digital signatures provided to them. The method ensures high cloud storage integrity, enhanced error localization, and easy identification of misbehaving servers and unauthorized access to the cloud server. Hence this work aims to store data securely in reduced space with less time and computational cost.
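The partition-and-sign idea above can be sketched briefly: split the data into fixed-size chunks, attach an authentication tag to each, and verify tags on retrieval to localize exactly which chunk (and hence which server) misbehaved. HMAC stands in here for the digital signatures the paper describes; this is an illustration, not the paper's scheme.

```python
# Data partitioning with per-chunk authentication tags. Verification returns
# the indices of corrupted chunks, which is what enables error localization.
import hmac, hashlib

KEY = b"user-signing-key"   # stand-in for the user's signing key
CHUNK = 8

def sign(i: int, chunk: bytes) -> bytes:
    return hmac.new(KEY, bytes([i]) + chunk, hashlib.sha256).digest()

def partition(data: bytes):
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    return [(c, sign(i, c)) for i, c in enumerate(chunks)]

def verify(stored):
    return [i for i, (c, t) in enumerate(stored)
            if not hmac.compare_digest(sign(i, c), t)]

stored = partition(b"confidential cloud payload")
assert verify(stored) == []            # all chunks intact
stored[1] = (b"tampered", stored[1][1])
assert verify(stored) == [1]           # error localized to chunk 1
```

Binding the chunk index into the tag also prevents a server from swapping two validly signed chunks.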
A Novel Information Accountability Framework for Cloud Computing - IJMER
International Journal of Modern Engineering Research (IJMER) is a peer-reviewed, online journal. It serves as an international archival forum of scholarly research related to engineering and science education.
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of engineering and technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of engineering and technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of engineering and technology.
Enhanced security framework to ensure data security in cloud using security b... - eSAT Journals
Abstract: Data security and access control are challenging research areas in cloud computing. Cloud service users upload their private and confidential data to the cloud. As the data is transferred between server and client, it has to be protected from unauthorized entries into the server by authenticating users and giving the data high security priority. Experts therefore always recommend using different passwords for different logins, but a normal person cannot possibly follow that advice and memorize all their usernames and passwords; that is where password managers come in. The purpose of this paper is to secure data from unauthorized persons using the Security Blanket algorithm.
Implementation of user authentication as a service for cloud network - Salam Shah
There are many security risks for the users of cloud computing, but organizations are still switching towards the cloud. The cloud provides data protection and a huge amount of storage, used remotely or virtually. Organizations have not adopted cloud computing completely due to some security issues, and research in cloud computing focuses more on privacy and security in the newly categorized attack surface. User authentication is an additional overhead for companies besides managing the availability of cloud services. This paper proposes a model that provides a central authentication technique, so that secured access to resources can be given to users instead of adopting assorted ad hoc user authentication techniques. The model is also implemented as a prototype.
Research Report on Preserving Data Confidentiality & Data Integrity in ... - Manish Sahani
ABSTRACT: Currently, cloud-based applications are very popular, but preserving the confidentiality of the user's data is a huge task to accomplish. Keeping this need in mind, a solution is proposed here which preserves data confidentiality and integrity in a cloud environment. To provide data confidentiality we use the AES algorithm, by virtue of which the secret data is converted to ciphertext and it becomes very difficult for an unauthorized user to recover the meaningful plaintext. The basic emphasis is also on data integrity, so that the user's data cannot be duplicated or copied. Keywords: Data Confidentiality, Data Integrity, AES algorithm
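The abstract pairs AES confidentiality with an integrity goal. AES is not in the Python standard library, so the sketch below substitutes a toy SHA-256 keystream cipher (NOT secure, illustration only) to show the encrypt-then-MAC pattern, in which any modification of the ciphertext is detected before decryption:

```python
# Encrypt-then-MAC sketch. The XOR keystream cipher here is a toy stand-in
# for AES; the pattern (encrypt, then tag the ciphertext, then verify the
# tag before decrypting) is what provides the integrity guarantee.
import hmac, hashlib

ENC_KEY, MAC_KEY = b"toy-enc-key", b"toy-mac-key"   # hypothetical demo keys

def keystream(key: bytes, n: int) -> bytes:
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(pt: bytes):
    ct = bytes(a ^ b for a, b in zip(pt, keystream(ENC_KEY, len(pt))))
    tag = hmac.new(MAC_KEY, ct, hashlib.sha256).digest()
    return ct, tag

def decrypt(ct: bytes, tag: bytes):
    expected = hmac.new(MAC_KEY, ct, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("integrity check failed: ciphertext was modified")
    return bytes(a ^ b for a, b in zip(ct, keystream(ENC_KEY, len(ct))))

ct, tag = encrypt(b"patient record #17")
assert decrypt(ct, tag) == b"patient record #17"
```

In a real deployment the toy cipher would be replaced by AES-GCM or AES-CTR with a fresh nonce per message.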
Secure File Sharing Using Access Control.docx - jeffreye3
Secure File Sharing
Secure File Sharing Using Access Control Raviprakash Ganji
Computer Security
Project: Secure File Sharing Using Access Control
Name: Raviprakash Ganji
Institution: New England College.
Abstract
Presently, sharing important documents is very risky, so we have created a hybrid solution for file storage on the cloud. In this advanced system, a user can choose a file from his phone and enter a key for that file. For uploading, the file is broken into two separate chunks, and these chunks are encrypted with the AES and DES algorithms respectively and then stored on the cloud server side. User 1 then chooses another user with whom he will share the file. User 2 receives an SMS with an encrypted key; we use the Blowfish algorithm for key encryption. User 2 logs into the application using his credentials. On the home page, he can see all the files shared with him. When the user attempts to access them, the application checks for the SMS automatically, and when the encrypted SMS from User 1 is found, the application starts the decryption process. For decryption, both encrypted chunks are decrypted one by one and then merged.
Secure File Sharing: Presentation. Cloud storage systems have been a source of fascination for online clients, who want easy access anywhere and whenever they need it. Many online service providers have succeeded in serving individual clients, industrialists and businessmen by holding their data in the cloud with reliability and security. The number of mobile users who need to use resources or services on the move, with the help of their cell phones, from cloud-based systems is rapidly increasing, and this use of cloud resources for the storage and transfer of data by many kinds of users is a challenging task. The cloud environment offered by online service providers can be of the public, private or hybrid kind; the cloud client chooses the type of cloud environment depending on the client's security or exposure policy. Many IT giants are using cloud services to reduce on-premises costs, which are greater than what they pay the online service providers. The cloud systems offered by different vendors vary with respect to performance and pricing, and their design strategies differ in order to achieve competitive outcomes in terms of efficient administration, reduced cost and verified data storage. The general advantages of the cloud system are easy sharing, syncing, and off-site data st.
Abstract: Cloud computing models are gaining ubiquitous acceptance due to the manifold conveniences they provide. However, security and privacy problems are the main obstacles holding back the universal adoption of this new emerging technology. Various research efforts concentrate on enhancing security at the software as well as hardware levels of the cloud, but these approaches do not furnish a complete security solution, and the data security measures are still kept under the access control of the service provider. Trusted Computing is another research direction: it furnishes a set of tools, controlled by third-party technologies, to secure virtual machines from the cloud computing providers. While these approaches provide tools for consumers to assess and monitor the security aspects of their data, they do not give cloud consumers a high degree of control. The new emerging DCS approach aims to put the security of data in the hands of the data owners, but the DCS concept is elucidated in many different ways and there is no standardized cloud computing framework for applying this approach.
Cloud Auditing With Zero Knowledge PrivacyIJERA Editor
The Cloud computing is a latest technology which provides various services through internet. The Cloud server allows user to store their data on a cloud without worrying about correctness & integrity of data. Cloud data storage has many advantages over local data storage. User can upload their data on cloud and can access those data anytime anywhere without any additional burden. The User doesn’t have to worry about storage and maintenance of cloud data. But as data is stored at the remote place how users will get the confirmation about stored data. Hence Cloud data storage should have some mechanism which will specify storage correctness and integrity of data stored on a cloud. The major problem of cloud data storage is security .Many researchers have proposed their work or new algorithms to achieve security or to resolve this security problem. In this paper, we proposed a Shamir’s Secrete sharing algorithm for Privacy Preservation for data Storage security in cloud computing. We can achieve confidentiality, integrity and availability of the data. It supports data dynamics where the user can perform various operations on data like insert, update and delete as well as batch auditing where multiple user requests for storage correctness will be handled simultaneously which reduce communication and computing cost.
Preserving Privacy Policy- Preserving public auditing for data in the cloudinventionjournals
International Journal of Engineering and Science Invention (IJESI) is an international journal intended for professionals and researchers in all fields of computer science and electronics. IJESI publishes research articles and reviews within the whole field Engineering Science and Technology, new teaching methods, assessment, validation and the impact of new technologies and it will continue to provide information on the latest trends and developments in this ever-expanding subject. The publications of papers are selected through double peer reviewed to ensure originality, relevance, and readability. The articles published in our journal can be accessed online.
Secure Data Sharing In an Untrusted CloudIJERA Editor
Cloud computing is a huge area which basically provides many services on the basis of pay as you go. One of the fundamental services provided by cloud is data storage. Cloud provides cost efficiency and an efficient solution for sharing resource among cloud users. A secure and efficient data sharing scheme for groups in cloud is not an easy task. On one hand customers are not ready to share their identity but on other hand want to enjoy the cost efficiency provided by the cloud. It needs to provide identity privacy, multiple owner and dynamic data sharing without getting effected by the number of cloud users revoked. In this paper, any member of a group can completely enjoy the data storing and sharing services by the cloud. A secure data sharing scheme for dynamic cloud users is proposed in this paper. For which it uses group signature and dynamic broadcast encryption techniques such that any user in a group can share the information in a secured manner. Additionally the permission option is proposed for the security reasons. This means the file access permissions are generated by the admin and given to the user using Role Based Access Control (RBA) algorithm. The file access permissions are read, write and delete. In this, owner can provide files with options and accepts the users using that option. The revocation of cloud user is a function generated by the Admin for security purpose. The encryption computational cost and storage overhead is not dependent on the number of users revoked. We analyze the security by proofs and produce the cloud efficiency report using cloudsim.
Improving Efficiency of Security in Multi-CloudIJTET Journal
Abstract--Due to risk in service availability failure and the possibilities of malicious insiders in the single cloud, a movement towards “Multi-clouds” has emerged recently. In general a multi-cloud security system there is a possibility for third party to access the user files. Ensuring security in this stage has become tedious since, most of the activities are done in network. In this paper, an enhanced security methodology has been introduced in order to make the data stored in cloud more secure. Duple authentication process introduced in this concept defends malicious insiders and shields the private data. Various disadvantages in traditional systems like unauthorized access, hacking have been overcome in this proposed system and a comparison made with the traditional systems in terms of performance and computational time have shown better results.
Insuring Security for Outsourced Data Stored in Cloud EnvironmentEditor IJCATR
The cloud storage offers users with infrastructure flexibility, faster deployment of applications and data, cost
control, adaptation of cloud resources to real needs, improved productivity, etc. Inspite of these advantageous factors, there
are several deterrents to the widespread adoption of cloud computing remain. Among them, security towards the correctness
of the outsourced data and issues of privacy lead a major role. In order to avoid security risk for the outsourced data, we
propose the dynamic audit services that enables integrity verification of untrusted and outsourced storages. An interactive
proof system (IPS) with the zero knowledge property is introduced to provide public auditability without downloading raw
data and protect privacy of the data. In the proposed system data owner stores the large number of data in cloud after e
encrypting the data with private key and also send public key to third party auditor (TPA) for auditing purpose. TPA in
clouds and it’s maintained by CSP. An Authorized Application (AA), which holds a data owners secret key (sk) and
manipulate the outsourced data and update the associated IHT stored in TPA. Finally Cloud users access the services through
the AA. Our system also provides secure auditing while the data owner outsourcing the data in the cloud. And after
performing auditing operations, security solutions are enhanced for the purpose of detecting malicious users with the help of
Certificate Authority
In the last few years, cloud computing has grown from being a promising business concept to one of the fastest growing segments of the IT industry. Now, recession-hit companies are increasingly realizing that simply by tapping into the cloud they can gain fast access to best-of-breed business applications or drastically boost their infrastructure resources, all at negligible cost. But as more and more information on individuals and companies is placed in the cloud, concerns are beginning to grow about just how safe an environment it is. This paper discusses security issues, requirements and challenges that cloud service providers (CSP) face during cloud engineering. Recommended security standards and management models to address these are suggested for technical and business community.
Cloud Computing Using Encryption and Intrusion Detectionijsrd.com
Cloud computing provides many benefits to the users such as accessibility and availability. As the data is available over the cloud, it can be accessed by different users. There may be sensitive data of organization. This is the one issue to provide access to authenticated users only. But the data can be accessed by the owner of the cloud. So to avoid getting data being accessed by the cloud owner, we will use the intrusion detection system to provide security to the data. The other issue is to save the data backup in other cloud in encrypted form so that load balancing can be done. This will help the user with data availability in case of failure of one cloud.
E-Mail Systems In Cloud Computing Environment Privacy,Trust And Security Chal...IJERA Editor
In this paper, SMCSaaS is proposed to secure email system based on Web Service and Cloud Computing
Model. The model offers end-to-end security, privacy, and non-repudiation of PKI without the associated
infrastructure complexity. The Proposed Model control risks in Cloud Computing like Insecure Application
Programming Interfaces, Malicious Insiders, Data Loss Shared Technology Vulnerabilities, or Leakage,
Account, Service, Traffic Hijacking and Unknown Risk Profile
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Building RAG with self-deployed Milvus vector database and Snowpark Container...Zilliz
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Sarfraz Nawaz Brohi et al. / Journal of Computer Science 10 (2): 210-223, 2014
In online banking systems, users are required to complete a set of security requirements to perform an operation; they follow these processes to protect their confidential information from malicious attacks, so there has to be a trade-off among security, privacy and performance.
For example, in order to hire a trusted cloud admin, CSPs conduct strict background checks and multiple detailed interviews, while some CSPs state that they have strict security procedures in place for all employees who have access to machines that store clients' files. Although these are essential security steps, without a complete trustworthy solution for defending against insider attacks, a malicious insider can easily obtain passwords, cryptographic keys and files and gain access to clients' records (Francisco et al., 2011). When a client's data confidentiality has been breached, the client would never have any knowledge of the unauthorized access, mostly due to the lack of control and transparency in the cloud provider's security practices and policies.
2. ORGANIZATIONS AND CLOUD SECURITY
Organizations that are required to follow well-defined data security standards such as HIPAA and PCI DSS do not trust the existing security and privacy policies offered by CSPs (Mervat et al., 2012; Shucheng et al., 2010). They feel a lack of control while storing confidential records at off-premises storage, and they are concerned that malicious users might gain illegal access to their records. In order to defend against attacks, CSPs build their cloud infrastructure with efficient security controls such as network firewalls, secure cryptographic processors, anti-malware, honeypots and access control mechanisms. However, cloud storages using these techniques are still subject to implementation vulnerabilities (Rocha and Correia, 2011).
Clients require full confidence that attackers are not able to steal, view or tamper with their data. We know from past studies that security and privacy on the cloud are breached by external or internal attackers (Ling et al., 2011). External attacks are issued by hackers who steal clients' confidential records with the objective of obtaining cash. These attacks may also be carried out by IT personnel belonging to competitors of the CSP or the client, with the intention of damaging the brand reputation of the CSP or of abusing and misusing the client's files. CSPs secure their physical and virtual infrastructure using various tools and techniques to protect data and systems from outsider attacks. However, we found that the existing solutions are not adequate to preserve the client's privacy. It has also been identified that internal employees of a CSP may become malicious as well (Catteddu and Hogben, 2009).
Inside attackers, such as malicious employees of CSPs, intentionally exceed their privileged accesses in a negative manner to affect confidentiality (Adrian et al., 2012). In contrast to an external hacker, a malicious insider can attack the computing infrastructure relatively easily and with less hacking knowledge, since they have a detailed description of the underlying infrastructure. CSPs have admitted full awareness of possible malicious insiders and normally claim to have a solution.
Science Publications
2.1. Related Work
Ateniese et al. (2007) proposed the Provable Data Possession (PDP) model to ensure that external storage retains the client's file with the required integrity policies. Using the PDP model, the data owner (i.e., the client) first preprocesses the file F, adding some additional data or expanding it, so that the new file is F'. The client then generates Verification Metadata (VMd), denoted as M, for F' before sending the file to the server for storage. The generated M is stored locally and the client may delete the personal copy of F, but prior to the deletion, the client executes a data possession challenge to make sure that the server has successfully retained the file. A yes or no response from the server verifies the existence of the file. While uploading the file, the client may encrypt it, or it may already be encrypted. To verify file integrity, the client issues a challenge R requesting the server to compute a hash over the stored file. The server computes the hash and sends the result P to the client, and the client verifies the integrity by comparing it with the locally stored metadata M.
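The challenge-response step above can be sketched as follows. This is a minimal illustration assuming a plain SHA-256 hash; the actual PDP scheme uses homomorphic verifiable tags, and all names here are illustrative:

```java
import java.security.MessageDigest;

// Sketch of a PDP-style possession check: the client keeps metadata M locally,
// later challenges the server to hash the stored file and compares the result P.
public class PdpCheck {
    static String hash(byte[] data) throws Exception {
        StringBuilder sb = new StringBuilder();
        for (byte b : MessageDigest.getInstance("SHA-256").digest(data))
            sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] fPrime = "preprocessed file F'".getBytes();
        String m = hash(fPrime);            // client-side metadata M, stored locally

        // ... client uploads fPrime and may delete its own copy ...
        byte[] stored = fPrime.clone();     // the server's copy

        String p = hash(stored);            // server's response P to the challenge
        System.out.println(p.equals(m) ? "file retained" : "integrity violated");
    }
}
```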
Wang et al. (2010) introduced an improved technique for verifying data integrity on the cloud by utilizing the concept of a Third Party Auditor (TPA). Since the client does not have the knowledge and expertise for the auditing process, the TPA conducts auditing services on the client's behalf. The TPA is a certified and trusted entity for conducting scheduled or requested audits. This technique enables the client to send the file divided into a number of blocks, so file F is denoted as a sequence of n blocks, i.e., m1, …, mi, …, mn. For each block, a Message Authentication Code (MAC) is generated using the following function:
MAC: K × {0,1}* → {0,1}^l

where K denotes the key space.

Using the TPA concept, the cloud user first generates the public and private key parameters, signs each block and sends the file with its VMd to the server. When the TPA receives a file block number, it verifies the block's signature; it quits with a failure indication if verification fails, or continues if the result is true.

Venkatesh et al. (2012) provided a solution similar to (Ateniese et al., 2007; Wang et al., 2010), but enhanced them by adding Rivest-Shamir-Adleman (RSA)-based storage security. The client generates a signature for each block using the RSA secret key rsk and a hashing algorithm, Ti = (H(mi)·g^mi)^rsk, and forms the signature set Φ = {Ti} over all blocks. A Merkle Hash Tree (MHT) is constructed over the blocks. The client then signs the root R of the MHT with the secret key as sig_rsk(H(R)) = (H(R))^rsk. In the next step, the client sends the file, its signatures and the signed root {F, Φ, sig_rsk(H(R))} to the server and deletes the local copies of Φ and sig_rsk(H(R)). During the auditing process, the client issues a challenge to the server for a specific file block; the server computes the proof and sends the result back to the client. The client verifies data integrity using the MHT root and authenticates it with the secret key.

Ranchal et al. (2010) argued that involving a TPA in cloud services may add risk to confidentiality: the TPA may not act as expected (for instance, when the TPA is compromised by an external hacker or a malicious insider). They propose a solution for protecting information integrity in the cloud without a TPA: an Identity Management (IDM) approach using an active bundle scheme, which includes sensitive data, metadata and a virtual machine. In order to maintain confidentiality, the Personally Identifiable Information (PII) is packed and encrypted inside the active bundle. The virtual machine monitors and manages the program inside the bundle that actually controls access to the bundle.

Prasadreddy et al. (2011) focused on a key and data management technique that preserves the privacy of cloud users. They proposed a web-browser plug-in that enables the client to store the key and the data with isolated CSPs. In this architecture, no mutual communication between the two CSPs is required; the plug-in handles all the necessary tasks. For example, if the file is stored with Amazon, the keys are stored with Google, so Amazon cannot illegally access the file, since it is obfuscated and can only be de-obfuscated using the corresponding key. Each file and its key have a unique tag. When the user wants to access the file, the data-storing CSP sends the obfuscated data with its associated tag to the plug-in, and the plug-in then sends a request to the key-storing CSP by providing the tag. The key-storing CSP checks the tag, identifies the corresponding key and sends it to the plug-in, which finally de-obfuscates the data using that secret key.
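The dual-CSP separation of obfuscated data and keys can be sketched as follows. XOR obfuscation, the tag scheme and all names here are illustrative stand-ins, not the cited paper's implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the plug-in mediating between a data-storing CSP and a key-storing CSP.
public class IsolationPlugin {
    static final Map<String, byte[]> dataCsp = new HashMap<>(); // tag -> obfuscated file
    static final Map<String, Byte> keyCsp = new HashMap<>();    // tag -> secret key

    static byte[] xor(byte[] data, byte key) {
        byte[] out = data.clone();
        for (int i = 0; i < out.length; i++) out[i] ^= key;
        return out;
    }

    static void store(String tag, byte[] file, byte key) {
        dataCsp.put(tag, xor(file, key)); // data CSP sees only obfuscated bytes
        keyCsp.put(tag, key);             // key CSP sees only the key
    }

    static byte[] fetch(String tag) {
        byte[] obfuscated = dataCsp.get(tag);
        byte key = keyCsp.get(tag);       // plug-in requests the key using the tag
        return xor(obfuscated, key);
    }

    public static void main(String[] args) {
        store("tag-42", "salary records".getBytes(), (byte) 0x5A);
        System.out.println(new String(fetch("tag-42")));   // salary records
    }
}
```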
3. A PRIVACY PRESERVED OFF-PREMISES CLOUD STORAGE
Considering these requirements, we propose a privacy-preserving technique that enables an organization dealing with sensitive information to store its confidential data at off-premises cloud storage without security and privacy concerns. We have designed as well as implemented an improved technique which focuses on achieving the following set of requirements:
• Only privileged users can access the system under a strict access control policy
• Grant adequate control to the client over his data, such as handling encryption, decryption and VMd generation tasks
• The client can process the data and perform certain transactions without decryption
• The client can consistently monitor his files on the cloud without revealing any data to an illegal authority
• The client can efficiently restore unintentionally written, modified or violated records
3.1. System Architecture
The proposed system architecture consists of three end-users, i.e., the client, the Trusted TPA (TTPA) and the cloud admin. From the client's perspective, we have assumed that the cloud admin is the most untrusted user, the TTPA is semi-trusted and the client's personal admin is fully trusted. This is based on the assumption that the TTPA may have been hijacked by a malicious attacker. In order to ensure a safe computing environment, each involved entity is only permitted to perform its privileged tasks, with the added security of Role-Based Access Control (RBAC) and a random Security Code Generator (SCG). The cloud server is the central component of the overall system. It analyzes the access control mechanism and performs the corresponding operations requested by privileged users, as shown in Fig. 1.
Fig. 1. System architecture
For maintaining data confidentiality and integrity, the responsibilities of the client admin are as follows:
• Homomorphic encryption of data prior to its arrival at the cloud storage
• Decrypt, update and download data throughout the entire computing life cycle
• Safe delivery of significant parameters such as hashes, private and public keys
• Ensure, with the assistance of the TTPA admin, that data on the cloud is intact
The TTPA admin belongs to an auditing authority and is responsible for the following tasks:
• Conducting the auditing services on behalf of the client
• Initiating the scheduled or requested auditing process as directed by the client admin
• Providing a response to the client about the status of the data, i.e., whether it is tampered, deleted, manipulated or intact
• Requesting the cloud admin to start the data recovery process from backup cloud storage
• Storing the client's parameters such as keys and VMd
Fig. 2. Encryption
3.2. System Workflow
The process is initiated by the client admin, who requests the cloud server to generate the private and public keys. Let us examine a simple scenario. For instance, the client admin wants to store a file named EMP.txt, containing the organization's employees' confidential records, at the cloud storage. The cloud server requires the file and the public key for the encryption process, as represented in Fig. 2.
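The partially homomorphic encryption step can be sketched with a minimal textbook Paillier cryptosystem, which supports addition over ciphertexts. This is an illustrative sketch, not the paper's implementation; the primes and values below are toy parameters:

```java
import java.math.BigInteger;
import java.security.SecureRandom;

// Minimal textbook Paillier (additively homomorphic) sketch.
public class Paillier {
    final BigInteger n, n2, g, lambda, mu;
    final SecureRandom rnd = new SecureRandom();

    Paillier(BigInteger p, BigInteger q) {
        n = p.multiply(q);
        n2 = n.multiply(n);
        g = n.add(BigInteger.ONE);                       // g = n + 1
        BigInteger p1 = p.subtract(BigInteger.ONE), q1 = q.subtract(BigInteger.ONE);
        lambda = p1.divide(p1.gcd(q1)).multiply(q1);     // lcm(p-1, q-1)
        mu = lambda.modInverse(n);                       // valid because g = n + 1
    }

    BigInteger encrypt(BigInteger m) {
        BigInteger r;
        do { r = new BigInteger(n.bitLength(), rnd).mod(n); }
        while (r.signum() == 0 || !r.gcd(n).equals(BigInteger.ONE));
        return g.modPow(m, n2).multiply(r.modPow(n, n2)).mod(n2);
    }

    BigInteger decrypt(BigInteger c) {
        BigInteger l = c.modPow(lambda, n2).subtract(BigInteger.ONE).divide(n);
        return l.multiply(mu).mod(n);
    }

    public static void main(String[] args) {
        SecureRandom sr = new SecureRandom();
        Paillier ph = new Paillier(BigInteger.probablePrime(64, sr),
                                   BigInteger.probablePrime(72, sr));
        BigInteger c1 = ph.encrypt(BigInteger.valueOf(1200));
        BigInteger c2 = ph.encrypt(BigInteger.valueOf(34));
        // multiplying ciphertexts adds the underlying plaintexts
        System.out.println(ph.decrypt(c1.multiply(c2).mod(ph.n2)));   // 1234
    }
}
```

This additive property is what lets the client admin perform certain transactions on the stored records without decrypting them first.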
The auditing reports will be shared among all three involved parties to determine the integrity status. The cloud admin is responsible for the successful recovery of any records that have been violated by authorized or unauthorized users.
Fig. 3. Decryption
Fig. 5. Sound Steganography
Fig. 4. Metadata generation
Data inside the file is homomorphically encrypted from the stream during the uploading process, so that when the file arrives at the server it is completely encrypted and stored. The file EMP.txt is renamed Encrypted.txt when it is saved at the cloud storage. Whenever required, the server needs the client's private key to decrypt the selected file, as shown in Fig. 3.
Each attribute is retrieved from storage, decrypted and presented for the client's view. Since the data is homomorphically encrypted, the client admin can also perform a certain number of transactions without actually decrypting the contents. The client admin requests the server to compute the VMd for the file. The cloud server divides the file into an equal number of blocks. In this example, Encrypted.txt is divided into four blocks according to its size, and the server generates metadata for the entire file as well as for each of its blocks, as shown in Fig. 4.
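The block-wise metadata generation can be sketched as follows. SHA-256 is an assumption here; the paper does not name its hash function, and the field names are illustrative:

```java
import java.security.MessageDigest;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

// Splits the (already encrypted) file into equal blocks and hashes each block
// plus the whole file, mirroring the VMd structure described in the text.
public class MetadataGenerator {
    static Map<String, String> generateVmd(byte[] data, int numBlocks) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        Map<String, String> vmd = new LinkedHashMap<>();
        int size = (data.length + numBlocks - 1) / numBlocks;   // ceiling division
        for (int b = 0; b < numBlocks; b++) {
            byte[] block = Arrays.copyOfRange(data, b * size,
                                              Math.min((b + 1) * size, data.length));
            vmd.put("block-" + (b + 1), hex(md.digest(block)));
        }
        vmd.put("file", hex(md.digest(data)));
        return vmd;
    }

    static String hex(byte[] h) {
        StringBuilder sb = new StringBuilder();
        for (byte b : h) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        Map<String, String> vmd = generateVmd("encrypted employee records".getBytes(), 4);
        System.out.println(vmd.keySet());   // [block-1, block-2, block-3, block-4, file]
    }
}
```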
Fig. 6. Decoded Sound-1
The client admin downloads and stores the VMd in his local storage and requests the cloud server to encrypt the public key, private key and VMd. The cloud server encrypts these parameters using sound steganography, as shown in Fig. 5, and transfers them to the TTPA admin as two isolated sound files, i.e., sound-1 containing the public key and VMd, and sound-2 containing the private key.
At this stage, the client admin proceeds with the deletion of the cryptographic keys and VMd from local storage and requests the TTPA admin to initiate the auditing process. In order to audit the client's files, the TTPA admin needs to extract the public key and VMd from sound-1 by requesting the cloud server, as shown in Fig. 6.
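The paper does not specify its embedding scheme; a common approach for sound steganography is to hide the payload in the least-significant bits of PCM audio samples. The following is an illustrative sketch under that assumption, using an in-memory array as a stand-in for decoded WAV samples:

```java
// LSB embedding of key/metadata bytes into 16-bit PCM audio samples.
public class SoundStego {
    static short[] embed(short[] samples, byte[] payload) {
        if (payload.length * 8 > samples.length)
            throw new IllegalArgumentException("carrier too small");
        short[] out = samples.clone();
        for (int i = 0; i < payload.length * 8; i++) {
            int bit = (payload[i / 8] >> (i % 8)) & 1;
            out[i] = (short) ((out[i] & ~1) | bit);   // overwrite the lowest bit
        }
        return out;
    }

    static byte[] extract(short[] samples, int length) {
        byte[] payload = new byte[length];
        for (int i = 0; i < length * 8; i++)
            payload[i / 8] |= (samples[i] & 1) << (i % 8);
        return payload;
    }

    public static void main(String[] args) {
        short[] carrier = new short[4096];          // stand-in for decoded WAV samples
        for (int i = 0; i < carrier.length; i++) carrier[i] = (short) (1000 + i);
        short[] stego = embed(carrier, "publicKey||VMd".getBytes());
        System.out.println(new String(extract(stego, 14)));   // publicKey||VMd
    }
}
```

Changing only the lowest bit of each sample keeps the audio perceptually unchanged, which is why the TTPA admin can receive ordinary-sounding files.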
Fig. 9. Data Violation
Fig. 7. Auditing Process
Fig. 10. Auditing Report after Integrity Violation
However, file integrity may be violated by any user; for example, Fig. 9 represents malicious users having modified the first block of the file by replacing an encrypted record with garbage characters such as "xx".
The auditing report clearly indicates the integrity violation and its location, i.e., block-1, as shown in Fig. 10. The TTPA admin requests the cloud admin to overcome the violation and recover the data back to its original state.
Upon receiving the request, the cloud admin views the auditing reports and identifies that block-1 is violated. The cloud admin then issues a request to the cloud server to recover block-1, as shown in Fig. 11.
Fig. 8. Initial Auditing Report
The TTPA admin downloads these parameters to his local storage and proceeds with the auditing process by providing them to the cloud server. Fresh metadata for the stored file is computed by the cloud server and compared with the old metadata as shown in Fig. 7.
Based on the auditing results, the cloud server creates the auditing reports and shares them among the involved users. Since the file is securely saved with the required integrity, the auditing reports indicate that the integrity of the file and of each block is well maintained, as shown in Fig. 8.
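The block-level comparison that produces such a report can be sketched independently of the paper's DSA-based snippets (Section 3.3.3): plain SHA-1 digests stand in here for the SHA1withDSA signatures, and the file contents, block size and names are illustrative.

```java
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;

public class BlockAudit {
    // Fresh SHA-1 digest per fixed-size block (last block may be shorter).
    static List<byte[]> blockDigests(byte[] file, int blockSize) throws Exception {
        List<byte[]> digests = new ArrayList<>();
        for (int off = 0; off < file.length; off += blockSize) {
            MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            sha1.update(file, off, Math.min(blockSize, file.length - off));
            digests.add(sha1.digest());
        }
        return digests;
    }

    // Compare stored verification metadata with fresh digests and report
    // the 1-based index of the first violated block, or 0 if intact.
    static int firstViolatedBlock(List<byte[]> storedVMd, byte[] file, int blockSize)
            throws Exception {
        List<byte[]> fresh = blockDigests(file, blockSize);
        for (int i = 0; i < fresh.size(); i++)
            if (!MessageDigest.isEqual(storedVMd.get(i), fresh.get(i)))
                return i + 1;
        return 0;
    }

    public static void main(String[] args) throws Exception {
        byte[] file = "rec-01;rec-02;rec-03;rec-04;".getBytes();  // 4 blocks of 7 bytes
        List<byte[]> vmd = blockDigests(file, 7);                 // stored at setup time
        file[0] = 'x'; file[1] = 'x';                             // garbage in block-1, as in Fig. 9
        System.out.println("violated block: " + firstViolatedBlock(vmd, file, 7));
        // prints "violated block: 1"
    }
}
```

Because the violation is localized to a block index, only that block needs to be fetched from the backup storage during recovery, which is the efficiency argument made in Section 3.4.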
After the successful recovery, the cloud admin will alert the TTPA admin to restart the auditing process in order to ensure that the data is intact. The auditing reports after recovery indicate that integrity is well maintained, as shown in Fig. 12, and that there is no data loss. At this point, the TTPA admin will inform the client admin to proceed with other operations such as uploading more files, updating existing records or downloading files permanently.
Fig. 11. Data Recovery Process
Fig. 12. Auditing Report after Data Recovery
Since the client admin has already deleted all the parameters from local storage, the client will retrieve both sound files from the TTPA admin and request the cloud server to extract the public and private key from sound-1 and sound-2 respectively.
3.3. System Implementation
This section describes the implementation of our suggested contributions: the resilient RBAC mechanism, partial homomorphic cryptography, metadata generation and sound steganography, the efficient third-party auditing service and the data backup and recovery process. The code snippets used throughout this paper are written in Java and compiled using the NetBeans 6.9.1 runtime environment with GlassFish Server version 3.0.
3.3.1. Resilient Role-based Access Control Mechanism
According to the security recommendations provided by the Cloud Security Alliance (CSA), the data owner is responsible for enforcing access control policies whereas the CSP is responsible for their implementation (CSA, 2011). We assumed that the data owner, together with its IT professionals and legal authorities, will specify and enforce the access control policies by signing a Service Level Agreement (SLA) with the CSP. We have implemented the access policies of the client by using RBAC.
Three major roles were created for accessing the system, i.e., Client, TTPA and Cloud Admin, by using the GlassFish Server file realm security domain. The authorization of a privileged user to perform a particular operation is further implemented using Enterprise JavaBeans and security annotations. We assume that the client will set the following access privileges for each involved user, as shown in Fig. 13. Operations such as decryption and encryption of the client's data are associated only with the client admin, the data recovery process is associated with the cloud admin and conducting the auditing service is associated with the TTPA admin, whereas viewing audit reports is associated with all three involved roles. These accesses are controlled using the following code snippet:
Fig. 13. Role-based Access Control Privileges
@RolesAllowed("Client Admin")
public String decryptDataFiles() throws Exception {}
@RolesAllowed("Client Admin")
public String encryptDataFiles() throws Exception {}
@RolesAllowed("Cloud Admin")
public String dataRecoveryProcess() throws Exception {}
@RolesAllowed("TTPA Admin")
public String initiateAuditingProcess() throws Exception {}
@RolesAllowed({"Client Admin","TTPA Admin","Cloud Admin"})
public String viewAuditingReports() throws Exception {}
In order to further enhance the security of access control, we have implemented a Security Code Generator (SCG) process that generates a random twelve-character code consisting of symbols, numbers and upper- and lower-case letters. The SCG was implemented using the following code snippet:
Random r=new Random();
String alphabet[]={"a","b","c","d","e","f","g","h","i","j","k","l",
    "m","n","o","p","q","r","s","t","u","v","w","x","y","z"};
String symbol[]={"~","!","@","$","%","^","&","*","(",")","?"};
String alphabetTwo[]={"A","B","C","D","E","F","G","H","I","J","K","L",
    "M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z"};
String alpha;
int number;
String alphaTwo;
String alphaThree;
int c=6;
int c2=c/2;
int nc2=0-c2;
String finalCode="";
for(int x=nc2;x<0;x++){
    int alphaNum=r.nextInt(26);
    alpha=alphabet[alphaNum];
    number=r.nextInt(10);
    String numStr=Integer.toString(number);
    int symbolNum=r.nextInt(11);
    alphaTwo=symbol[symbolNum];
    int alphaNumTwo=r.nextInt(26);
    alphaThree=alphabetTwo[alphaNumTwo];
    finalCode=finalCode+alpha+numStr+alphaTwo+alphaThree;
}
Any transaction such as decryption or auditing cannot proceed unless the correct security code has been requested and provided by the authorized user. Once the code is requested, it is generated and sent via email or SMS to the appropriate user to proceed with the transaction. On successful entry of the security code, the system performs the operation; otherwise the user needs to request a fresh code.
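A hardening note on the snippet above: java.util.Random is statistically random but predictable, so for security codes the JDK's SecureRandom is the safer source. The following sketch keeps the same four-characters-per-round, three-round shape as the paper's SCG (class name and character pools are illustrative, matching the arrays above):

```java
import java.security.SecureRandom;

public class SecurityCodeGenerator {
    private static final String LOWER   = "abcdefghijklmnopqrstuvwxyz";
    private static final String DIGITS  = "0123456789";
    private static final String SYMBOLS = "~!@$%^&*()?";
    private static final String UPPER   = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    private static final SecureRandom RNG = new SecureRandom();

    // Twelve characters: three rounds of lower-case letter, digit,
    // symbol, upper-case letter, drawn from a cryptographic RNG.
    static String generate() {
        StringBuilder code = new StringBuilder(12);
        for (int round = 0; round < 3; round++) {
            code.append(LOWER.charAt(RNG.nextInt(LOWER.length())));
            code.append(DIGITS.charAt(RNG.nextInt(DIGITS.length())));
            code.append(SYMBOLS.charAt(RNG.nextInt(SYMBOLS.length())));
            code.append(UPPER.charAt(RNG.nextInt(UPPER.length())));
        }
        return code.toString();
    }

    public static void main(String[] args) {
        System.out.println(generate());   // e.g. a fresh 12-character code
    }
}
```

The fixed position pattern guarantees at least three characters from each class; shuffling the result would remove that positional predictability if desired.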
3.3.2. Partial Homomorphic Cryptography
A limitation of cloud systems is that the client cannot process data on the cloud while it remains encrypted (Hibo et al., 2011); the client is required to decrypt the data prior to processing. We have implemented partial homomorphic cryptography, which enables the client to perform a certain number of operations on encrypted data. It is not possible to perform an unlimited number of operations, however, due to the unavailability of a practical implementation of fully homomorphic cryptography. Considering the security features of asymmetric algorithms, we implemented the homomorphic version of the RSA algorithm to encrypt, decrypt and process the encrypted data on the cloud. The random homomorphic public and private keys are generated using the following code snippet:
BigInteger p=BigInteger.probablePrime(N/2,random);
BigInteger q=BigInteger.probablePrime(N/2,random);
n=p.multiply(q);
BigInteger phi_n=(p.subtract(one)).multiply(q.subtract(one));
publicKey=new BigInteger("65537");
privateKey=publicKey.modInverse(phi_n);
// where N refers to bit-length of returned BigInteger.
The public and private keys are generated and stored with the user. Using the public key, the client admin can then start the encryption by using the following code, which is multiplicative homomorphic:
message.modPow(publicKey,n);
// where message refers to the data being encrypted.
Similarly, data will be decrypted using the following code:
encrypted.modPow(privateKey,n);
// where encrypted refers to the cipher data.
Since the data is encrypted using RSA multiplicative homomorphism, the client processes the encrypted data by performing transactions that are multiplicative in nature. For example, when the file EMP.txt is stored at the cloud, the client can increase, modify or delete the salary and commission rate of an employee. Some demo queries are implemented as follows:
Query-1: Increase the commission rate of an employee.
inputFactor=2;
encrypt(inputFactor,publicKey).
inputFactor.multiply(commission_rate).
Query-2: Alter and update the salary of an employee to 1350.
inputFactor=1350;
multiplicativeFactor=1;
encrypt(inputFactor,publicKey).
encrypt(multiplicativeFactor,publicKey).
amount=inputFactor.multiply(multiplicativeFactor).
salary=amount.
Similarly, various other operations can be performed on encrypted records.
3.3.3. Metadata Generation and Sound Steganography
When data is stored at the cloud, the client admin needs to generate VMd by using file- and block-level hash calculation methods. The client first generates the VMd for the entire file and then for each of its blocks. The file is automatically divided by the server into N blocks according to its size. For example, if the file F is 8 MB, it will be divided into four equal chunks F1, F2, F3 and F4, each of size 2 MB. The VMd for the entire file is calculated using the Digital Signature Algorithm (DSA) with SHA-1, as represented in the following code snippet:
Signature dsa=Signature.getInstance("SHA1withDSA","SUN");
dsa.initSign(privateKey);
dsa.update(fileData);
verificationMetadata=dsa.sign();
Similarly, VMd will be generated for each block, but instead of the file, the block will be provided as the parameter, i.e., dsa.update(fileBlock_n), where n refers to a specific block number. After the metadata generation process is accomplished, the client admin will send the VMd and public key to the TTPA admin for conducting the auditing process. The client admin will also send his private key for secure storage. However, to verify the data integrity, the TTPA admin only requires the VMd and the client's public key. In order to protect these parameters from man-in-the-middle and other malicious attacks, the parameters (keys and VMd) are first encoded using the Base64 encoding scheme for transmission over the network and secondly by using sound steganography techniques, as represented in the following code snippet:
BASE64Encoder encoder=new BASE64Encoder();
String message=encoder.encode(parameter);
// Sound steganography.
Sound s=new Sound(inputFile);
int nbrOfSamples=message.length()*3;
// where message refers to the value being encoded.
smoothAmpValues(s,nbrOfSamples+3);
for(int g=0;g<nbrOfSamples;g++) {
    int asciiValue=(int) message.charAt(g/3);
    int digit=getDecimalDigit(asciiValue,g%3);
    int ampValue;
    if(s.getSampleValueAt(g)>=0) {
        ampValue=s.getSampleValueAt(g)+digit;
    } else {
        ampValue=s.getSampleValueAt(g)-digit;
    }
    if(ampValue>32767){
        ampValue=ampValue-10;
    }
    if(ampValue<-32768){
        ampValue=ampValue+10;
    }
    s.setSampleValueAt(g,ampValue);
}
With the completion of the steganography process, two encoded sound files will be created: sound-1 contains the public key and VMd and sound-2 contains the private key. The TTPA admin is only privileged to decode sound-1, since the private key belongs to the client admin and is sent merely for secure storage.
3.3.4. Efficient Third-Party Auditing Service
In order to conduct the auditing process, the TTPA admin will request the cloud server to generate fresh VMd for the client's file stored at the cloud. However, the server requires the client admin's public key and VMd from the TTPA admin to generate the auditing reports. Since the parameters sent by the client are encoded using sound steganography and Base64, all parameters will be decoded using the following code snippet:
Sound s=new Sound(filename);
SoundSample[] samples=s.getSamples();
int[] digit=new int[10];
String message="";
// where message contains the desired data decoded from the sound.
int sampleIndex=0;
boolean nullReached=false;
while (!nullReached) {
    digit[0]=getDecimalDigit(samples[sampleIndex].getValue(),0);
    digit[1]=getDecimalDigit(samples[sampleIndex+1].getValue(),0);
    digit[2]=getDecimalDigit(samples[sampleIndex+2].getValue(),0);
    int asciiValue=digit[0]+digit[1]*10+digit[2]*100;
    if (digit[0]==digit[1]&&digit[1]==digit[2]&&digit[2]==0)
        nullReached=true;
    message+=(char) asciiValue;
    sampleIndex+=3;
}
BASE64Decoder decoder=new BASE64Decoder();
byte[] parameters=decoder.decodeBuffer(message);
// where message contains the encoded parameters.
Due to access control, the TTPA admin is not privileged to decode the sound file containing the private key, as that belongs only to the access privilege of the client admin. It would, however, be decoded in the same fashion as the other parameters. Once the decoding process is accomplished, the TTPA admin will extract the client admin's public key and VMd and provide them to the server; the server then starts calculating the fresh metadata by generating the digital signature for each block and for the entire file as follows:
sigNew=Signature.getInstance("SHA1withDSA","SUN");
sigNew.initVerify(publicKey);
FileInputStream fis=new FileInputStream(fileName);
BufferedInputStream bufin=new BufferedInputStream(fis);
byte[] buffer=new byte[1024];
int len;
while (bufin.available()!=0){
    len=bufin.read(buffer);
    sigNew.update(buffer,0,len);
}
bufin.close();
When the new signatures are generated, the TTPA admin will request the cloud server to start a comparison process to verify the integrity as follows:
boolean verifies=sigNew.verify(signOfFileOrBlockNumber);
// where signOfFileOrBlockNumber is the stored VMd signature.
if(verifies==true){
    report.write("Integrity is Maintained");
}else{
    report.write("Integrity is Violated"+"\n");
}
The auditing process is recorded and a report is shared among all associated entities to view the auditing results. If a TTPA report encounters an integrity
violation, a request will be sent to the cloud admin for recovery of the violated records from the backup cloud storage.
3.3.5. Data Backup and Recovery Process
We assumed that the client's primary cloud storage is in Kuala Lumpur while the secondary or backup location is the New York datacenter. Under unwanted circumstances such as natural disasters, malicious attacks or unwanted modifications, data will be recovered from the New York datacenter without any loss or leakage. During the auditing process, if the TTPA admin identifies an integrity violation, it will not generate a success signal for the client to proceed with further processing until the data is successfully recovered from the backup storage to its state of correctness. By viewing the auditing reports, the cloud admin will initiate the backup and recovery process. Only the violated file or block is recovered; there is no need to recover the un-violated records. This process takes place using the following code snippet:
byte[] buff=new byte[size];
int bytesRead;
originalFile=new File("File.txt");
originalFileStream=new FileInputStream(originalFile);
backupFile=new File("BackupFile.txt");
backupFileStream=new FileInputStream(backupFile);
fw=new FileWriter("Recovered.txt");
bw=new BufferedWriter(fw);
String newData;
int i=1;
while ((bytesRead=originalFileStream.read(buff,0,size))>0){
    if(i<5){
        if(i==1){
            // Violated block-1: take its contents from the backup copy.
            backupFileStream.read(buff,0,size);
            newData=new String(buff);
            bw.write(newData);
        }else{
            // Un-violated blocks are written back unchanged.
            newData=new String(buff);
            bw.write(newData);
        }
    }
    i++;
} // end while
bw.close();
Once the violated, lost or damaged records are recovered, the TTPA admin will re-initiate the auditing process to verify their correctness. Since the data is successfully backed up, the auditing results will be positive and the TTPA will send a success message to the client for further tasks such as downloading, uploading new files or processing. As the data stored on the cloud is homomorphically encrypted, for decrypted downloading the client needs to extract his private key from the encoded sound file.
3.4. Advantages of the Proposed Technique
The data owner has a sufficient degree of control over his data, since the client admin is privileged to perform the encryption, decryption and metadata generation tasks. Performing these tasks does not require in-depth technical knowledge of cryptography, security or cloud computing; an intermediate IT admin can handle these operations efficiently. Unlike other cloud systems, the client admin is not required to encrypt the data before sending it for cloud storage. Using our system, data is encrypted automatically while it is being uploaded, before it arrives at the cloud storage. Data is encrypted in real-time directly from the stream without being stored at any location, so when it arrives at the cloud storage it is fully encrypted. Similarly, for the decryption process, data is not decrypted at the CSP's end and then transferred to the client, because this could raise concerns of confidentiality violation. Data departs from the cloud storage encrypted and is decrypted in real-time from the download stream; contents can be downloaded or viewed once they arrive at the client's end. With the use of RBAC and the random security code generator, the client can ensure that only authorized users access the system as per their privileges, which provides a sense of assurance to the client.
For performing any transaction, the client admin is not required to decrypt the records, due to the implementation of partial homomorphic cryptography, which saves time and bandwidth cost. In order to give the client the full advantage of acquiring a cloud storage service, using our approach the client admin stores the VMd and the private and public keys with the TTPA, not at the local machine; these parameters are encoded and can be decoded only by the privileged user. For the integrity checks, the TTPA initiates an efficient auditing process in which it is easy to determine the location of violated data within a huge file, so as to enable efficient recovery. During the recovery process, the cloud admin checks the auditing report to determine the actual data that needs to be recovered instead of recovering the entire file.
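That multiplicative transactions genuinely need no decryption can be checked directly against the keygen and modPow snippets of Section 3.3.2. The following standalone sketch (method and variable names are illustrative) uses textbook RSA without padding, which is fine for demonstrating the homomorphic property but not for production use:

```java
import java.math.BigInteger;
import java.security.SecureRandom;

public class HomomorphismCheck {
    // Multiply a plaintext by a factor entirely in the encrypted domain,
    // then decrypt: E(a) * E(b) mod n == E(a * b) for textbook RSA.
    static BigInteger multiplyEncrypted(long value, long factor) {
        SecureRandom random = new SecureRandom();
        BigInteger one = BigInteger.ONE;
        BigInteger p = BigInteger.probablePrime(512, random);
        BigInteger q = BigInteger.probablePrime(512, random);
        BigInteger n = p.multiply(q);
        BigInteger phi = p.subtract(one).multiply(q.subtract(one));
        BigInteger publicKey = new BigInteger("65537");
        BigInteger privateKey = publicKey.modInverse(phi);

        BigInteger encValue  = BigInteger.valueOf(value).modPow(publicKey, n);
        BigInteger encFactor = BigInteger.valueOf(factor).modPow(publicKey, n);
        BigInteger encProduct = encValue.multiply(encFactor).mod(n);  // no decryption here
        return encProduct.modPow(privateKey, n);
    }

    public static void main(String[] args) {
        // Query-2 of Section 3.3.2: update a salary to 1350 (factor 1),
        // or double a value (factor 2), on ciphertexts only.
        System.out.println(multiplyEncrypted(1350, 2));   // prints 2700
    }
}
```

The product of the two ciphertexts decrypts to the product of the two plaintexts, which is exactly the property the demo queries rely on.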
4. EVALUATION
In order to verify the objective of this research, i.e., preserving the privacy of the client's data at off-premises cloud storage, we evaluated the security of the implemented system using a penetration testing strategy in a lab-based environment by creating a network of three workstations (one for each user). The system was checked against various attacks that might be launched by an external hacker or a malicious insider such as the cloud or TTPA admin. The evaluation results are represented in Table 1. The evaluation process is based on the following attack scenarios:
• External hackers launch attacks during the data transmission process to steal the client's confidential records, and remotely access the cloud system in order to exploit the access control privileges of the client admin by stealing his authentication credentials
• Internal as well as external hackers breach the physical security barriers deployed by the CSP to gain unauthorized and direct access to the client's data in order to view, modify or delete the actual contents. They also attack the secure storage of the TTPA to steal or manipulate the significant parameters of the client, i.e., keys and VMd
• A malicious TTPA admin attempts to extract the client's private key to decrypt the data

Table 1. Evaluation results
Attack ID | Type | Issued by | Attacked assets and operation | Solution to preserve privacy | Final data status
A001 | External | External Hacker | Data while uploading | Partial Homomorphic Cryptography, Auditing Process, RBAC | Intact
A002 | Internal & external | Cloud Admin, External Hacker | Data at storage | Partial Homomorphic Cryptography, Auditing Process, Data Backup and Recovery, RBAC | Intact
A003 | Internal & external | TTPA Admin, External Hacker | Private key, VMd at storage | Sound Steganography, RBAC | Intact
A004 | External | External Hacker | Private key, VMd while transferring | Sound Steganography, RBAC | Intact
A005 | External | External Hacker | Data while downloading | Partial Homomorphic Cryptography, RBAC | Intact

4.1. Results
During the experiments, we identified that the client's privacy always remains intact despite the attacks launched by several malicious users. For example, if an expert hacker is able to attack the data during transfer (downloading, uploading) or at the storage, privacy is not affected, because before the data departs from the client it is encrypted and it remains encrypted throughout the entire process, even while it is stored or processed at the cloud storage. When attackers gain access, they obtain nothing meaningful besides the cipher text, and if an attacker violates the integrity at the physical cloud storage, it is immediately identified during the auditing process and the data is recovered back to its original state from the backup storage.
Similarly, when a malicious TTPA admin wants to extract the private key of the client, the attacker will not be able to decode it because it is encoded as sound. Even if an attacker obtains the private key, he cannot decipher the client's data, since the decrypt process can only be initiated by the client after successfully logging in with the required credentials. Unauthorized users cannot perform any operation; even if they break the security of the login menu, they need to request the random security code, and the code is only sent to privileged users under the implemented RBAC.
We concluded that, using the proposed technique, the client's privacy, i.e., data confidentiality and integrity, is preserved at off-premises cloud computing storage despite threatening attacks.
4.2. Discussion
Various researchers across the globe have formulated valuable contributions in the form of models, techniques and algorithms to overcome the security and privacy concerns of adopting the cloud paradigm. In this research, we analyzed the significant contributions of (Ateniese et al., 2007; Wang et al., 2010; Venkatesh et al., 2012; Ranchal et al., 2010; Prasadreddy et al., 2011). We proposed an improved and enhanced technique by
overcoming the limitations of existing work and acquiring their strengths. For instance, we developed an efficient third-party auditing process as suggested by (Ateniese et al., 2007; Wang et al., 2010), but further enhanced its efficiency by implementing partial homomorphic cryptography, whereby users are not only able to store and audit their data files but can also perform transactions on their encrypted data. Venkatesh et al. (2012) proposed an RSA-based cryptography technique which facilitates users to encrypt their data only; that technique does not have the capability of processing the encrypted data. Similarly, as (Ranchal et al., 2010) argued that involving a TPA may add risk to the confidentiality of the client's data, we overcame this issue by implementing a sound steganography process which secures the significant parameters of the client from malicious activities of the TTPA; hence the user's parameters can be stored without any security concerns. Prasadreddy et al. (2011) provided a solution for securing the client's data by storing the data and the associated keys with isolated CSPs; although this is a good suggestion, we believe that involving multiple CSPs may increase the communication and security complexity. To improve this approach, we implemented a concept of encoding the cryptographic keys both at storage and during transfer. Keys and data can only be decrypted by the relevant privileged authorities, due to the implementation of RBAC and the SCG. In order to overcome data violation issues, we further enhanced the existing work by implementing the data backup and recovery process, in which the user's data is efficiently recovered from the secondary or backup cloud storage without any loss or damage.
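The key/VMd encoding this comparison turns on can be reproduced with the JDK's supported API: java.util.Base64 replaces the internal sun.misc.BASE64Encoder/BASE64Decoder classes used in the paper's snippets, which were removed from modern JDKs. A minimal sketch (class name illustrative):

```java
import java.util.Base64;

public class ParameterCodec {
    // Text-armor a parameter (key bytes or VMd) for transfer, prior to
    // the sound steganography step of Section 3.3.3.
    static String encode(byte[] parameter) {
        return Base64.getEncoder().encodeToString(parameter);
    }

    // Recover the raw parameter bytes on the receiving side.
    static byte[] decode(String message) {
        return Base64.getDecoder().decode(message);
    }

    public static void main(String[] args) {
        String armored = encode("vmd-block-1".getBytes());
        System.out.println(new String(decode(armored)));   // prints "vmd-block-1"
    }
}
```

Base64 is an encoding, not encryption; confidentiality of the parameters still rests on the steganography and access control layers described above.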
The proposed technique can be further enhanced by implementing a secure computing environment for multiple clients, TTPA and cloud admins working together to protect data confidentiality and integrity, and by analyzing and evaluating the performance of the system together with its security and privacy capabilities.
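As groundwork for the SSL-based hardening proposed in the conclusion, the JSSE API bundled with the JDK can report which TLS protocol versions the runtime supports. This is only a capability probe, not the deployment configuration (which, for this system, would live in the GlassFish HTTP listener settings):

```java
import javax.net.ssl.SSLContext;

public class TlsCheck {
    // List the SSL/TLS protocol versions available to the default
    // JSSE provider on this JVM.
    static String[] supported() {
        try {
            return SSLContext.getDefault().getSupportedSSLParameters().getProtocols();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        for (String protocol : supported())
            System.out.println(protocol);
    }
}
```

Any modern JVM will list at least one TLS version here; protecting the upload/download streams then comes down to enabling an HTTPS listener with one of these protocols.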
6. REFERENCES
Adrian, D., S. Creese and M. Goldsmith, 2012. Insider
attacks in cloud computing. Proceedings of 11th
International Conference on Trust, Security and
Privacy in Computing and Communications, Jun.
25-27, IEEE Xplore Press, England, pp: 857-862.
DOI: 10.1109/TrustCom.2012.188
Ateniese, G., R. Burns, R. Curtmola, J. Herring and L.
Kissner et al., 2007. Provable data possession at
untrusted stores. Proceedings of the 14th ACM
Conference on Computer and Communications
Security, Oct. 29-Nov. 02, ACM Press, USA., pp:
598-609. DOI: 10.1145/1315245.1315318
Catteddu, D. and G. Hogben, 2009. Cloud Computing: Benefits, Risks and Recommendations for Information Security. European Network and Information Security Agency (ENISA).
CSA, 2011. Security Guidance for Critical Areas of
Focus in Cloud Computing v3.0. USA.
Francisco, R., S. Abreu and M. Correia, 2011. The final
frontier: Confidentiality and privacy in the cloud.
Computer, 44: 44-50. DOI: 10.1109/MC.2011.223
Hibo, H., J. Xu, C. Ren and B. Choi, 2011. Processing
private queries over untrusted data cloud through
privacy homomorphism. Proceedings of the 27th
International Conference on Data Engineering, Apr.
11-16, IEEE Xplore Press, Germany, pp: 601-612.
DOI: 10.1109/ICDE.2011.5767862
Khayyam, H., Q. Ilkin, B. Vusale and B. Mammad,
2012. Cloud Computing for business. Proceedings
of the 6th International Conference on Application
of Information and Communication Technologies,
Oct. 17-19, IEEE Xplore Press, Georgia, pp: 1-4.
DOI: 10.1109/ICAICT.2012.6398514
Ling, L., X. Lin, L. Jing and C. Zhang, 2011. Study on
the third-party audit in cloud storage service.
Proceedings of the International Conference on
Cloud and Service Computing, Dec. 12-14, IEEE
Xplore Press, Hong Kong, pp: 220-227. DOI:
10.1109/CSC.2011.6138525
Mervat, B., S. Brohi, S. Chuprat and A. Jamalul-lail,
2012. A Study on significance of adopting cloud
computing paradigm in healthcare sector.
Proceedings of the 2012 International Conference on
Cloud Computing Technologies, Applications and
Management, Dec. 8-10, IEEE Xplore Press, UAE,
pp: 65-68. DOI: 10.1109/ICCCTAM.2012.6488073
5. CONCLUSION
This paper provides a data privacy preserving solution for organizations dealing with sensitive information to adopt the off-premises cloud paradigm. We proposed and presented an improved technique and evaluated it by penetration testing. The results showed that the client's data is intact and protected from malicious attackers.
For future work, we will continue to focus on privacy concerns to maintain the confidentiality and integrity of the client's data. We intend to deploy the proposed system over Secure Sockets Layer (SSL) to enhance the security capabilities during data transfer and thereby protect the system from man-in-the-middle and other session hijacking attacks. Other plans include enhancing the functionality of the proposed technique.
Venkatesh, M., M.R. Sumalatha and C. SelvaKumar, 2012. Improving public auditability, data possession in data storage security for cloud computing. Proceedings of the International Conference on Recent Trends in Information Technology, Apr. 19-21, IEEE Xplore Press, India, pp: 463-467. DOI: 10.1109/ICRTIT.2012.6206835
Wang, C., Q. Wang, K. Ren and W. Lou, 2010. Privacy-preserving public auditing for data storage security in cloud computing. Proceedings of the 30th International Conference on Computer Communications, Mar. 14-19, IEEE Xplore Press, USA, pp: 1-9. DOI: 10.1109/INFCOM.2010.5462173
Yashpalsinh, J. and K. Modi, 2012. Cloud computing
concepts, architecture and challenges. Proceedings
of the International Conference on Computing,
Electronics and Electrical Technologies, Mar. 21-22,
IEEE Xplore Press, India, pp: 877-880. DOI:
10.1109/ICCEET.2012.6203873
Prasadreddy, P., T. Srinivasa and S. Phani, 2011. A threat free architecture for privacy assurance in cloud computing. Proceedings of the IEEE World Congress on Services, Jul. 4-9, IEEE Xplore Press, USA, pp: 564-568. DOI: 10.1109/SERVICES.2011.11
Ranchal, R., B. Bhargava, L. Ben and L. Lilien, 2010. Protection of identity information in cloud computing without trusted third party. Proceedings of the 29th IEEE Symposium on Reliable Distributed Systems, Oct. 31-Nov. 3, IEEE Xplore Press, India, pp: 368-372. DOI: 10.1109/SRDS.2010.57
Rocha, F. and M. Correia, 2011. Lucy in the sky without
diamonds: Stealing confidential data in the cloud.
Proceedings of the 41st International Conference on
Dependable Systems and Networks Workshops, Jun.
27-30, IEEE Xplore Press, Hong Kong, pp: 129-134.
DOI: 10.1109/DSNW.2011.5958798
Shucheng, Y., C. Wang, K. Ren and W. Lou, 2010. Achieving secure, scalable and fine-grained data access control in cloud computing. Proceedings of the 30th International Conference on Computer Communications, Mar. 14-19, IEEE Xplore Press, USA, pp: 1-9. DOI: 10.1109/INFCOM.2010.5462174