Cloud computing makes extensive use of the internet to store huge amounts of data. It provides high-quality service at low cost, with scalability and reduced hardware and software management requirements. Security plays a vital role in the cloud because data is handled by a third party, which makes it the biggest concern. The proposed mechanism focuses on these cloud security issues. When a complete file is stored at a single location, an attack on that location can destroy the data. In the proposed work, therefore, instead of storing a complete file at one location, the file is divided into fragments and each fragment is stored at a different location. The fragments are further secured by assigning a hash key to each one, so even a successful attack does not reveal all the information about a particular file. Replication of the fragments is also performed, with a strong authentication process based on key generation. Files and fragments can also be updated automatically online: instead of downloading the whole file, only the affected fragment needs to be downloaded for an update, which saves considerable time.
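The abstract does not spell out the algorithm, so the following is only a minimal Python sketch of the core idea: split a file into fragments, tag each fragment with its own hash key, and assign replicas to distinct locations. The fragment size, the node names, and the use of SHA-256 are assumptions made for illustration.

```python
import hashlib

FRAGMENT_SIZE = 4096  # assumed fragment size in bytes
LOCATIONS = ["node-a", "node-b", "node-c", "node-d"]  # hypothetical storage nodes

def fragment_file(data: bytes):
    """Split the file into fixed-size fragments, each with its own hash key."""
    fragments = []
    for i in range(0, len(data), FRAGMENT_SIZE):
        chunk = data[i:i + FRAGMENT_SIZE]
        # The hash key lets each fragment be verified (and updated) independently.
        fragments.append({"index": i // FRAGMENT_SIZE,
                          "hash": hashlib.sha256(chunk).hexdigest(),
                          "data": chunk})
    return fragments

def place_fragments(fragments, replicas=2):
    """Assign each fragment (and its replicas) to different locations."""
    placement = {}
    for frag in fragments:
        start = frag["index"] % len(LOCATIONS)
        nodes = [LOCATIONS[(start + r) % len(LOCATIONS)] for r in range(replicas)]
        placement[frag["hash"]] = nodes
    return placement
```

Because each fragment carries its own hash key, an update only needs to download, modify, and re-hash the affected fragment rather than the whole file, which is where the claimed time saving comes from.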
This document discusses a proposed system for secure multi-cloud data storage using cryptographic techniques. The key aspects are:
1) The system splits user files into multiple encrypted chunks and stores each chunk in different private or public clouds. This prevents any single entity from accessing the full file.
2) Encryption is performed on the file chunks before storage using a symmetric encryption algorithm (AES is proposed) with an encryption key only known to the user.
3) During access, the encrypted chunks are retrieved, decrypted, and merged to reconstruct the original file (a sketch follows the list).
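As a rough illustration of steps 1) to 3), the sketch below splits a file into chunks, encrypts each chunk with AES-GCM under a key held only by the user, and reassembles the plaintext on retrieval. The chunk size, the AESGCM primitive from the cryptography package, and the chunk-to-cloud assignment are assumptions; the document only names AES.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

CHUNK = 1 << 16                            # assumed 64 KiB chunks
key = AESGCM.generate_key(bit_length=256)  # known only to the user
aes = AESGCM(key)

def encrypt_chunks(data: bytes):
    """Encrypt each chunk separately so chunks can live in different clouds."""
    out = []
    for i in range(0, len(data), CHUNK):
        nonce = os.urandom(12)
        ct = aes.encrypt(nonce, data[i:i + CHUNK], None)
        out.append((nonce, ct))  # e.g. chunk i could be sent to cloud i % N
    return out

def decrypt_chunks(chunks):
    """Retrieve, decrypt, and merge the chunks to rebuild the file."""
    return b"".join(aes.decrypt(nonce, ct, None) for nonce, ct in chunks)
```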
This document discusses security issues related to cloud computing. It begins with an introduction to cloud computing models, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). It then discusses potential security attacks on clouds, such as denial-of-service attacks and man-in-the-middle attacks. Security concerns with moving data and applications to the cloud are outlined, and techniques for securely publishing data in the cloud are presented. The document concludes that security in cloud computing is challenging due to the complexity of clouds, but that assurance of secure and mission-critical operations is important.
Fragmentation of Data in Large-Scale System For Ideal Performance and Security (Editor IJCATR)
Cloud computing is becoming a prominent trend that offers a number of significant advantages. One of its foundational advantages is the pay-as-per-use model, in which customers pay according to the services they use. At present, growing storage availability is driving data generation, and such large amounts of data need to be farmed out to external providers. There is an indefinitely large number of Cloud Service Providers (CSPs), and relying on them is an increasing trend among organizations and customers because it reduces the burden of maintenance and local data storage. In cloud computing, however, transferring data to the control of a third-party administrator gives rise to security concerns: within the cloud, data may be compromised through attacks by unauthorized users and nodes. To protect data in the cloud, stronger security measures are therefore required, along with optimization of the data retrieval time. The proposed system addresses both security and performance. In the DROPS methodology, files are first divided into fragments, and the fragmented data is replicated over the cloud nodes. Each node stores only a single fragment of a particular file, which ensures that no meaningful information is revealed to an attacker even after a successful attack. The nodes are separated using T-coloring to prevent an attacker from guessing the fragments' locations. Complete data security is thus ensured by the DROPS methodology.
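The exact graph model used by DROPS is not reproduced in this summary, but the idea behind T-coloring, choosing node "colors" so that the difference between any two chosen nodes never falls in a forbidden set T, can be sketched as follows. The node numbering and the contents of T are assumptions for illustration.

```python
def t_coloring_placement(num_fragments, nodes, T=frozenset({0, 1})):
    """Greedily pick one node per fragment so that the pairwise distance
    between any two chosen nodes is never in the forbidden set T.
    (T contains 0, so no node is ever reused.)"""
    chosen = []
    for _ in range(num_fragments):
        for node in nodes:
            if all(abs(node - c) not in T for c in chosen):
                chosen.append(node)
                break
        else:
            raise RuntimeError("not enough sufficiently separated nodes")
    return chosen

# Example: place 3 fragments on nodes 0..9, forbidding adjacent nodes.
print(t_coloring_placement(3, range(10)))  # [0, 2, 4]
```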
This document discusses security issues related to cloud computing. It begins with an introduction to cloud computing models including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). It then discusses potential security threats in cloud computing like denial of service attacks, side channel attacks, and man-in-the-middle cryptographic attacks. The document proposes a layered framework for assured cloud computing and techniques for secure publication of data in the cloud, including encryption. It concludes that achieving end-to-end security in cloud computing will be challenging due to complexity, but that more secure operations can be ensured even if some parts of the cloud fail.
E-Mail Systems In Cloud Computing Environment Privacy, Trust And Security Chal... (IJERA Editor)
In this paper, SMCSaaS is proposed as a secure email system based on the Web Service and Cloud Computing model. The model offers the end-to-end security, privacy, and non-repudiation of PKI without the associated infrastructure complexity. The proposed model controls cloud computing risks such as Insecure Application Programming Interfaces; Malicious Insiders; Data Loss or Leakage; Shared Technology Vulnerabilities; Account, Service, and Traffic Hijacking; and Unknown Risk Profile.
IRJET - Using Downtoken Secure Group Data Sharing on Cloud (IRJET Journal)
The document proposes a secure group data sharing scheme on the cloud using key-aggregate searchable encryption (KASE). In the proposed scheme, a data owner can generate a single download token (DT) to share a group of encrypted files with multiple users. The users only need to upload the DT to the cloud to search and download the shared files. This reduces the complexity of managing multiple encryption keys compared to traditional schemes. The scheme provides security, support for dynamic changes, and low computation and communication costs for file access and key updates.
Cloud Computing Using Encryption and Intrusion Detection (ijsrd.com)
Cloud computing provides many benefits to users, such as accessibility and availability. Because data is available over the cloud, it can be accessed by different users, and it may include an organization's sensitive data. One issue is therefore providing access to authenticated users only. But the data can still be accessed by the owner of the cloud, so to prevent the cloud owner from accessing it, an intrusion detection system is used to secure the data. The other issue is saving a data backup in another cloud in encrypted form so that load balancing can be done; this keeps the data available to the user if one cloud fails.
Multi-part Dynamic Key Generation For Secure Data Encryption (CSCJournals)
Storage of user or application-generated user-specific private, confidential data on a third party storage provider comes with its own set of challenges. Although such data is usually encrypted while in transit, securely storing such data at rest presents unique security challenges. The first challenge is the generation of encryption keys to implement the desired threat containment. The second challenge is secure storage and management of these keys. This can be accomplished in several ways. A naive approach can be to trust the boundaries of a secure network and store the keys within these bounds in plain text. A more sophisticated method can be devised to calculate or infer the encryption key without explicitly storing it. This paper focuses on the latter approach. Additionally, the paper also describes the implementation of a system that in addition to exposing a set of REST APIs for secure CRUD operations also provides a means for sharing the data among specific users.
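The paper's multi-part derivation is only described at a high level here, but the general pattern of calculating an encryption key on demand instead of storing it can be sketched with a standard key-derivation function. Combining a user secret with a per-object identifier via PBKDF2 is an assumption used purely for illustration:

```python
import hashlib
import os

def derive_key(user_secret: bytes, object_id: str, salt: bytes) -> bytes:
    """Recompute the encryption key on demand from parts the system can
    reconstruct, so the key itself is never stored anywhere."""
    context = user_secret + object_id.encode("utf-8")
    return hashlib.pbkdf2_hmac("sha256", context, salt, 200_000, dklen=32)

salt = os.urandom(16)  # stored alongside the ciphertext; need not be secret
key = derive_key(b"user passphrase", "doc-42", salt)
```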
Iaetsd cloud computing and security challenges (Iaetsd Iaetsd)
This document summarizes security challenges in cloud computing. It discusses how the distributed nature of cloud computing introduces security risks to confidential data and resources. It outlines several types of security threats like data breaches, malware injection, and network attacks. It also examines security requirements like confidentiality, integrity, and authentication. Finally, the document notes challenges like ensuring security, managing resources, and maintaining performance and interoperability remain open issues for cloud computing.
Semantic annotation, which is considered one of the semantic web applicative aspects, has been adopted by researchers from different communities as a paramount solution that improves searching and retrieval of information by promoting the richness of the content. However, researchers are facing challenges concerning both the quality and the relevance of the semantic annotations attached to the annotated document against its content as well as its semantics, without ignoring those regarding automation process which is supposed to ensure an optimal system for information indexing and retrieval. In this article, we will introduce the semantic annotation concept by presenting a state of the art including definitions, features and a classification of annotation systems. Systems and proposed approaches in the field will be cited, as well as a study of some existing annotation tools. This study will also pinpoint various problems and limitations related to the annotation in order to offer solutions for our future work.
fog computing provide security to the data in cloud (priyanka reddy)
Fog computing extends cloud computing by providing security and data processing capabilities at the edge of the network, close to end users and devices. It aims to address issues like high latency and bandwidth usage that can occur when all data processing is done in the cloud. Fog computing deploys computing, storage, and applications closer to end devices and users in order to improve response times for latency-sensitive applications like smart grids and connected vehicles. It creates a distributed network that balances resources between the cloud and edge devices.
Fog computing extends cloud computing by providing security and data processing capabilities at the edge of the network, close to end users and devices. It aims to address issues like high latency and bandwidth usage that can occur when all data processing is done in the cloud. Fog computing deploys computing, storage, and applications between end devices and cloud data centers so that data can be processed locally when needed. This helps enable real-time applications like smart energy grids that require low latency responses by running applications on edge devices instead of sending all data to the cloud.
A Review on Key-Aggregate Cryptosystem for Climbable Knowledge Sharing in Clo... (Editor IJCATR)
Data sharing is an important functionality in cloud storage. In this article, we show how to securely, efficiently, and flexibly share data with others in cloud storage. We describe new public-key cryptosystems which produce constant-size ciphertexts such that efficient delegation of decryption rights for any set of ciphertexts is possible. The novelty is that one can aggregate any set of secret keys and make them as compact as a single key, yet encompassing the power of all the keys being aggregated. In other words, the secret key holder can release a constant-size aggregate key for flexible choices of ciphertext set in cloud storage, while the other encrypted files outside the set remain confidential. This compact aggregate key can be conveniently sent to others or stored in a smart card with very limited secure storage. We provide formal security analysis of our schemes in the standard model. We also describe other applications of our schemes. In particular, our schemes give the first public-key patient-controlled encryption for flexible hierarchy, which was previously unknown.
An Approach towards Shuffling of Data to Avoid Tampering in Cloud (IRJET Journal)
This document proposes an approach to secure data stored in the cloud by using data shuffling, access control policies, and deduplication. It discusses encrypting user data using AES before uploading it to the cloud. An administrator can control the shuffling of encrypted data between servers at regular intervals to avoid tampering. Access control policies require authorized users to authenticate with a secret key before performing file operations. Deduplication prevents duplicate files from being stored to reduce storage usage by hashing and comparing file contents. The proposed approach aims to enhance security, prevent data tampering and duplication, and efficiently use cloud storage.
IRJET - A Novel Approach Implementing Deduplication using Message Locked Encr... (IRJET Journal)
This document proposes a novel approach to implementing data deduplication on the cloud using message locked encryption. It aims to overcome limitations of existing deduplication techniques like convergent encryption by using erasure code technology, encryption algorithms like DES and MD5 hashing, and tokenization to securely store and protect client data on the cloud. The proposed system gives clients proof of ownership of their data by allowing them to choose who can access their files and see any changes made over time. The system architecture involves a client uploading encrypted data to the cloud, and recipients selected by the client being able to access and retrieve encrypted pieces of the data.
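Message-locked (convergent) encryption derives the key deterministically from the content, so identical files produce identical ciphertexts and can be deduplicated without the server reading them. The sketch below illustrates only that property; it uses SHA-256 and AES-GCM rather than the DES and MD5 named in the summary, and the fixed nonce is acceptable here only because each content-derived key encrypts exactly one message.

```python
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

ZERO_NONCE = b"\x00" * 12  # safe only because each key encrypts one message

def convergent_encrypt(plaintext: bytes):
    """Key = hash(plaintext): identical files yield identical ciphertexts,
    so the cloud can deduplicate them without seeing the plaintext."""
    key = hashlib.sha256(plaintext).digest()           # message-locked key
    ciphertext = AESGCM(key).encrypt(ZERO_NONCE, plaintext, None)
    tag = hashlib.sha256(ciphertext).hexdigest()       # dedup lookup tag
    return key, tag, ciphertext
```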
Providing Secure Cloud for College Campus (vivatechijri)
In colleges, data stored on the server can be accessed by any college staff member, student, or professor. Data is very important and should not be altered or accessed without the permission of its owner, but in this type of medium-scale organization the server can be accessed by anyone. A better approach to maintaining data security and sustainable storage is the cloud. The cloud provides user management for authentication and authorized access to stored data. Since data is uploaded to the cloud through a network, its security during this phase is very important; encryption algorithms can be used to protect it from hackers. The cloud also provides an efficient way to carry out operations such as uploading and downloading data. Efficient use of storage should be a primary concern, for which a data deduplication technique can be applied; using this technique, uploading of duplicate files can be avoided.
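A minimal sketch of the hash-and-compare deduplication step mentioned above might look like this; the in-memory index is a stand-in for whatever lookup structure the cloud actually uses:

```python
import hashlib

stored = {}  # content hash -> storage path (stand-in for the cloud's index)

def upload(path: str, data: bytes) -> str:
    """Store a file only if its content hash has not been seen before."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in stored:
        return f"duplicate of {stored[digest]}, upload skipped"
    stored[digest] = path
    return f"stored {path}"
```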
This document discusses dynamic security techniques for content management repository systems. It begins by introducing content management systems, digital rights management, and digital asset management. It then proposes using a variant security approach that changes the encryption and decryption formulas used for each file (referred to as S(i), S(i+1), etc.) to make the system more difficult to attack. The document outlines how content would be encrypted before distribution, accessed through user authentication with a key distribution center, and viewed using a program tool that handles decryption. This dynamic changing of the security formulas with each file is proposed to improve protection of content distribution, modification, fabrication and secrecy.
Cloud computing involves clusters of servers connected over a network that allow users to access computational resources and pay only for what they use. While cloud computing provides advantages like flexibility and cost savings, security is a main concern as user data is stored remotely. Fog computing is a new technique that extends cloud computing by providing additional security measures and isolating user data at the network edge to enhance privacy. It aims to place data closer to end users to improve security in cloud environments.
Secure Data Sharing in Cloud through Limiting Trust in Third Party/Server (IRJET Journal)
This document proposes a method for secure data sharing in clouds that limits trust in third party servers. It discusses shortcomings of existing approaches that rely fully on third parties for security operations like encryption and access control. The proposed method uses a two-layer encryption scheme, where the data owner performs lower layer encryption and the third party performs upper layer encryption. This limits the third party's access to the raw data. The owner also maintains control over access rights by directly providing keys for lower layer encryption to authorized users. The method is implemented and evaluated experimentally using Java and cloud services. It aims to address security and privacy concerns when outsourcing data storage while still offloading some operations to a third party.
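The essence of the two-layer scheme, where the owner applies the lower encryption layer and the third party applies the upper one so that the third party never sees plaintext, can be sketched with Fernet from the cryptography package. Fernet is an illustrative stand-in; the paper's actual ciphers are not specified in this summary.

```python
from cryptography.fernet import Fernet

owner_key = Fernet.generate_key()   # held by the data owner, given to users
server_key = Fernet.generate_key()  # held by the third party

def store(plaintext: bytes) -> bytes:
    inner = Fernet(owner_key).encrypt(plaintext)  # lower layer: owner
    return Fernet(server_key).encrypt(inner)      # upper layer: third party

def fetch(stored: bytes) -> bytes:
    inner = Fernet(server_key).decrypt(stored)    # server strips its layer
    return Fernet(owner_key).decrypt(inner)       # only key holders get data
```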
IRJET - Securing Cloud Data Under Key Exposure (IRJET Journal)
This document proposes a new auditing mechanism to improve the efficiency and security of attribute-based encryption for securing cloud data. The existing single attribute authority model results in long wait times for users to obtain secret keys. The proposed approach employs multiple attribute authorities that can share the work of key distribution to reduce wait times. A central authority generates keys for verified users, while each attribute is managed by its own authority. The mechanism can also detect incorrectly verified users to enhance security. Analysis shows the auditing mechanism improves cloud security performance compared to previous single authority schemes.
Bio-Cryptography Based Secured Data Replication Management in Cloud Storage (IJERA Editor)
Cloud computing is a new way of achieving economical and efficient storage. A single-data-mart storage system is less secure because the data remains within a single data mart, which can lead to data loss from causes such as hacking and server failure. If an attacker chooses to attack a specific client, he can target a fixed cloud provider and try to gain access to the client's information. This makes the attacker's job easy, and both inside and outside attackers can benefit from data mining to a great extent, where inside attackers are malicious employees at a cloud provider. The single-data-mart storage architecture is thus the biggest security threat concerning data mining on the cloud, so this paper presents a secure replication approach that encrypts data using bio-cryptography and replicates it in a distributed data-mart storage system. The approach involves the encryption, replication, and storage of data.
Improving Efficiency of Security in Multi-Cloud (IJTET Journal)
Due to the risk of service availability failure and the possibility of malicious insiders in a single cloud, a movement towards "multi-clouds" has emerged recently. In a typical multi-cloud security system, there is a possibility for a third party to access user files, and ensuring security at this stage has become tedious since most activities are done over the network. In this paper, an enhanced security methodology is introduced to make the data stored in the cloud more secure. The duple (two-step) authentication process introduced in this concept defends against malicious insiders and shields private data. Various disadvantages of traditional systems, such as unauthorized access and hacking, are overcome in the proposed system, and a comparison with traditional systems in terms of performance and computational time shows better results.
The document discusses data security and access control. It emphasizes that data security is important for both individuals and organizations to protect data stored in databases. As technology advances, data becomes more vulnerable to security breaches. Effective data security requires confidentiality, integrity, and availability. Access control systems are important to ensure data secrecy by checking user privileges and authorizations. Additional security measures like encryption can further enhance data protection. The paper focuses on access control and privacy requirements to examine how to guarantee data security.
IRJET - Analysis of Cloud Security and Performance for Leakage of Critical... (IRJET Journal)
1) The DROPS methodology divides files into fragments and replicates each fragment to different cloud nodes to improve security and performance of outsourced data. Each node stores only a single fragment, so a successful attack would not reveal meaningful information.
2) Nodes storing fragments are separated using graph T-coloring to restrict an attacker's ability to guess fragment locations.
3) The methodology aims to increase the effort required for an attacker to retrieve data after intrusion while minimizing data loss. It evaluates both security and performance when securing and managing outsourced data in the cloud.
A survey on cloud security issues and techniques (ijcsa)
This document summarizes security issues and techniques related to cloud computing. It discusses common cloud security threats such as multi-tenancy, elasticity, insider and outsider attacks, loss of control, data loss, network attacks, malware injection, and flooding attacks. The document also outlines techniques for securing data in the cloud, including authentication, encryption, privacy, availability, and information management. Finally, it briefly discusses cloud computing security standards like SAML, OAuth, OpenID and SSL/TLS.
This document discusses domain data security on cloud computing. It begins by defining domains as a way to partition data for security, notifications, and reporting purposes. Data is highly secure within a domain. The document then discusses how distributing data across different domains based on regions can improve security and access. It analyzes security issues in cloud environments and discusses authentication, encryption, and other techniques used for data security in cloud computing. Segregating data by domain allows for faster access, easier maintenance, and higher security according to the document.
IRJET - Privacy Preserving and Proficient Identity Search Techniques for C... (IRJET Journal)
This document presents a privacy preserving and efficient identity search technique for cloud data security. It proposes a scheme using visual-encryption techniques to overcome issues with untrusted cloud storage. The existing methodology uses data signing algorithms but has limitations as the private key depends on the security of one computer. The proposed system uses visual-cryptographic encryption, which scrambles data using an algorithm requiring a key to decrypt. It involves users uploading encrypted files, administrators approving requests to view files through live video verification, and decryption using the appropriate key. The scheme aims to securely store large volumes of data while allowing identity verification for file access on the cloud.
A Secure Multi-Owner Data Sharing Scheme for Dynamic Group in Public Cloud (IJCERT JOURNAL)
In cloud computing, outsourcing group resources among cloud users is a major challenge for which cloud computing provides a low-cost and well-organized solution. Due to frequent changes of membership, sharing data in a multi-owner manner with an untrusted cloud is still a challenging issue. In this paper, we propose a secure multi-owner data sharing scheme for dynamic groups in a public cloud. By applying AES encryption with a convergent key while uploading the data, any cloud user can securely share data with others. Meanwhile, the storage overhead and encryption computation cost of the scheme are independent of the number of revoked users. In addition, the security of the scheme is analyzed with rigorous proofs. The One-Time Password is one of the easiest and most popular forms of authentication for securing access to accounts, and one-time passwords are often regarded as a secure and stronger form of authentication in the multi-owner setting. Extensive security and performance analysis shows that the proposed scheme is highly efficient and satisfies the security requirements for public-cloud-based secure group sharing.
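The one-time-password component can be illustrated with the standard HOTP construction from RFC 4226. Whether the paper uses HOTP specifically is not stated, so treat this as a generic OTP sketch:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)
    mac = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and user share `secret`; each login consumes one counter value.
print(hotp(b"shared-secret", counter=1))
```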
Hybrid model for detection of brain tumor using convolution neural networks (CSITiaesprime)
The development of aberrant brain cells, some of which may turn cancerous, is known as a brain tumor. Magnetic resonance imaging (MRI) scans are the most common technique for finding brain tumors, and information about the aberrant tissue growth in the brain is discernible from the MRI scans. In numerous research papers, machine learning and deep learning algorithms are used to detect brain tumors. When these algorithms are applied to MRI images, predicting a brain tumor takes very little time, and better accuracy makes it easier to treat patients; this forecast lets the radiologist make speedy decisions. The proposed work creates a hybrid convolution neural network (CNN) model, using a CNN for feature extraction and logistic regression (LR) for classification. The pre-trained visual geometry group 16 (VGG16) model is used for feature extraction; to reduce the complexity and the number of parameters to train, the last eight layers of VGG16 are eliminated. From this transformed model, the features are extracted as a vector array and fed into different machine learning classifiers, such as support vector machine (SVM), naive Bayes (NB), LR, extreme gradient boosting (XGBoost), AdaBoost, and random forest, for training and testing. The performance of the different classifiers is compared, and the CNN-LR hybrid combination outperforms the remaining classifiers. The recall, precision, F1-score, and accuracy of the proposed CNN-LR model are 94%, 94%, 94%, and 91%, respectively.
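A minimal sketch of the described pipeline, a frozen truncated VGG16 feeding a logistic-regression classifier, is given below. Using include_top=False with global average pooling is an approximation of the paper's removal of VGG16's last eight layers, and the input shape and data arrays are placeholders:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from tensorflow.keras.applications import VGG16

# Frozen, truncated VGG16: convolutional base only, pooled to a vector.
base = VGG16(weights="imagenet", include_top=False,
             pooling="avg", input_shape=(224, 224, 3))

def extract_features(images: np.ndarray) -> np.ndarray:
    """images: (n, 224, 224, 3) preprocessed MRI slices -> (n, 512) vectors."""
    return base.predict(images, verbose=0)

# X_train, y_train, X_test are placeholder arrays of MRI images and labels:
# clf = LogisticRegression(max_iter=1000).fit(extract_features(X_train), y_train)
# preds = clf.predict(extract_features(X_test))
```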
Implementing Lee's model to apply fuzzy time series in forecasting bitcoin price (CSITiaesprime)
Over time, cryptocurrencies like Bitcoin have attracted investors' and speculators' interest. Bitcoin's dramatic rise in value in recent years has caught the attention of many who see it as a promising investment asset. After all, Bitcoin investment is inseparable from the Bitcoin price volatility that investors must mitigate. This research aims to use Lee's Fuzzy Time Series approach to forecast the price of Bitcoin. Lee's Fuzzy Time Series is a time series analysis method for getting around ambiguity and uncertainty in time series data; Ching-Cheng Lee first introduced this approach in his research on time series prediction. The method is a development of several previous fuzzy time series (FTS) models, namely Song and Chissom and Cheng and Chen. According to most previous studies, Lee's model conveys more precise forecasting results than the classic FTS models. This study used first and second orders, where the researchers obtained error values of 5.419% for the first order and 4.042% for the second order, which means that the forecasting results are excellent. Of the two orders, however, only the first order can be used to predict the next period's Bitcoin price: in the second order, the resulting relations for the next period have no groups in their fuzzy logical relationship group (FLRG), so the price in the next period cannot be predicted. This study contributes by giving investors and the general public a factor to consider in keeping, selling, or purchasing cryptocurrencies.
Sentiment analysis of online licensing service quality in the energy and mine... (CSITiaesprime)
The Ministry of Energy and Mineral Resources of the Republic of Indonesia regularly assessed public satisfaction with its online licensing services. Users rated their satisfaction at 3.42 on a scale of 4, below the organization's average of 3.53. Evaluating public service performance is crucial for quality improvement. Previous research relied solely on survey data to assess public satisfaction; this study goes further by analyzing user feedback in text form from an online licensing application to identify negative aspects of the service that need enhancement. The dataset spanned September 2019 to February 2023, with 24,112 entries. The classification method was chosen based on the highest accuracy value among decision tree, random forest, naive Bayes, stochastic gradient descent, logistic regression (LR), and k-nearest neighbor. The text data was converted into numerical form using CountVectorizer and term frequency-inverse document frequency (TF-IDF) techniques, along with unigrams and bigrams for dividing sentences into word segments. LR with bigram CountVectorizer ranked highest, at 89% for average precision, F1-score, and recall, compared to the other five classification methods. The sentiment polarity was 36.2% negative, and the negative sentiment revealed public expectations that the ministry improve the top three aspects: system, mechanism, and procedure; infrastructure and facilities; and service specification product types.
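The winning configuration, logistic regression over bigram CountVectorizer features, corresponds to a standard scikit-learn pipeline; the sketch below assumes the feedback texts and sentiment labels are already loaded:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# texts: list of user feedback strings; labels: sentiment classes.
model = make_pipeline(
    CountVectorizer(ngram_range=(2, 2)),  # bigrams, as in the paper
    LogisticRegression(max_iter=1000),
)
# model.fit(texts_train, labels_train)
# print(model.score(texts_test, labels_test))
```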
Trends in sentiment of Twitter users towards Indonesian tourism: analysis wit... (CSITiaesprime)
This research analyzes the sentiment of Twitter users regarding tourism in Indonesia using the keyword "wonderful Indonesia" as the tourism promotion identity. This study aims to gain a deeper understanding of the public sentiment towards "wonderful Indonesia" through social media data analysis. The novelty obtained provides new insights into valuable information about Indonesian tourism for the government and relevant stakeholders in promoting Indonesian tourism and enhancing tourist experiences. The method used is tweet analysis and classification using the K-nearest neighbor (KNN) algorithm to determine the positive, neutral, or negative sentiment of the tweets. The classification results show that the majority of tweets (65.1% out of a total of 14,189 tweets) have a neutral sentiment, indicating that most tweets with the "wonderful Indonesia" tagline are related to advertising or promoting Indonesian tourism. However, the percentage of tweets with positive sentiment (33.8%) is higher than those with negative sentiment (1.1%). This study also achieved training results with an accuracy rate of 98.5%, precision of 97.6%, recall of 98.5%, and F1-score of 98.1%. However, reassessment is needed in the future as Twitter users' sentiments can change along with the development of Indonesian tourism itself.
The impact of usability in information technology projects (CSITiaesprime)
Achieving success in information system and technology (IS/IT) projects is a complex and multifaceted endeavour that has proven difficult. The literature is replete with project failures, but identifying the critical success factors contributing to favourable outcomes remains challenging. The triad of Time-Cost-Quality is widely accepted as key to achieving project success. While time and cost can be quantified and measured, quality is a more complex construct that requires different metrics and measurement approaches. Utilizing the PRISMA Methodology, this study initiated a comprehensive search across literature databases and identified 142 relevant articles pertaining to the specified keywords. A subset of ten articles was deemed suitable for further examination through rigorous screening and eligibility assessments. Notably, a primary finding indicates that despite recognizing usability as a critical element, there is a tendency to neglect usability enhancements due to time and resource constraints. Regarding the influence of usability on project success, the active involvement of end-users emerges as a pivotal factor. Moreover, fostering the enhancement of Human Computer Interaction (HCI) knowledge within the development team is essential. Failure to provide good usability can lead to project failure, undermining user satisfaction and adoption of the technology.
Collecting and analyzing network-based evidenceCSITiaesprime
Since nearly the beginning of the Internet, malware has been a significant deterrent to productivity for end users, both personal and business related. Due to the pervasiveness of digital technologies in all aspects of human lives, it is increasingly unlikely that a digital device is involved as goal, medium or simply ‘witness’ of a criminal event. Forensic investigations include collection, recovery, analysis, and presentation of information stored on network devices and related to network crimes. These activities often involve wide range of analysis tools and application of different methods. This work presents methods that helps digital investigators to correlate and present information acquired from forensic data, with the aim to get a more valuable reconstructions of events or action to reach case conclusions. Main aim of network forensic is to gather evidence. Additionally, the evidence obtained during the investigation must be produced through a rigorous investigation procedure in a legal context.
Agile adoption challenges in insurance: a systematic literature and expert re...CSITiaesprime
The drawback of agile is struggled to function in large businesses like banks, insurance companies, and government agencies, which are frequently associated with cumbersome processes. Traditional software development techniques were cumbersome and pay more attention to standardization and industry, this leads to high costs and prolonged costs. The insurance company does not embrace change and agility may find themselves distracted and lose customers to agile competitors who are more relevant and customer-centric. Thus, to investigate the challenges and to recognize the prospect of agile adoption in insurance industry, a systematic literature review (SLR) in this study was organized and validated by expert review from professional with expertise in agile. The project performance domain from project management body of knowledge (PMBOK) was applied to align the challenges and the solution. Academicians and practitioners can acquire the perception and knowledge in having exceeded understanding about the challenge and solution of agile adoption from the results.
Exploring network security threats through text mining techniques: a comprehe...CSITiaesprime
In response to the escalating cybersecurity threats, this research focuses on leveraging text mining techniques to analyze network security data effectively. The study utilizes user-generated reports detailing attacks on server networks. Employing clustering algorithms, these reports are grouped based on threat levels. Additionally, a classification algorithm discerns whether network activities pose security risks. The research achieves a noteworthy 93% accuracy in text classification, showcasing the efficacy of these techniques. The novelty lies in classifying security threat report logs according to their threat levels. Prioritizing high-risk threats, this approach aids network management in strategic focus. By enabling swift identification and categorization of network security threats, this research equips organizations to take prompt, targeted actions, enhancing overall network security.
An LSTM-based prediction model for gradient-descending optimization in virtua...CSITiaesprime
A virtual learning environment (VLE) is an online learning platform that allows many students, even millions, to study according to their interests without being limited by space and time. Online learning environments have many benefits, but they also have some drawbacks, such as high dropout rates, low engagement, and students' self-regulated behavior. Evaluating and analyzing the students' data generated from online learning platforms can help instructors to understand and monitor students learning progress. In this study, we suggest a predictive model for assessing student success in online learning. We investigate the effect of hyperparameters on the prediction of student learning outcomes in VLEs by the long short-term memory (LSTM) model. A hyperparameter is a parameter that has an impact on prediction results. Two optimization algorithms, adaptive moment estimation (Adam) and Nesterov-accelerated adaptive moment estimation (Nadam), were used to modify the LSTM model's hyperparameters. Based on the findings of research done on the optimization of the LSTM model using the Adam and Nadam algorithm. The average accuracy of the LSTM model using Nadam optimization is 89%, with a maximum accuracy of 93%. The LSTM model with Nadam optimisation performs better than the model with Adam optimisation when predicting students in online learning.
Generalization of linear and non-linear support vector machine in multiple fi...CSITiaesprime
Support vector machines (SVMs) are a set of related supervised learning methods used for classification and regression. They belong to a family of generalized linear classifiers. In other terms, SVM is a classification and regression prediction tool that uses machine learning theory to maximize predictive accuracy. In this article, the discussion about linear and non-linear SVM classifiers with their functions and parameters is investigated. Due to the equality type of constraints in the formulation, the solution follows from solving a set of linear equations. Besides this, if the under-consideration problem is in the form of a non-linear case, then the problem must convert into linear separable form with the help of kernel trick and solve it according to the methods. Some important algorithms related to sentimental work are also presented in this paper. Generalization of the formulation of linear and non-linear SVMs is also open in this article. In the final section of this paper, the different modified sections of SVM are discussed which are modified by different research for different purposes.
Designing a framework for blockchain-based e-voting system for LibyaCSITiaesprime
A transition to democratic rule is considered the first step down a long road towards Libya’s recovery and prosperity. Thus, it strives to improve the country’s elections by introducing new technologies. A blockchain is a distributed ledger that is characterised by independence and security. Therefore, it has been widely applied in various fields ranging from credit encryption and digital currency. With the development of internet technology, electronic voting (E-voting) systems have been greatly popularised. However, they suffer from various security threats, which create a sense of distrust among existing systems. Integrating blockchain with online elections is a promising trend, which could lead to make an election transparent, immutable, reliable, and more secure. In this paper, we present a literature review and a case analysis of blockchain technology. Moreover, a framework for an E-voting system based on blockchain is proposed. The methodology is adopted on the basis of three activities, they are identification of the relevant literature about E-voting, system modelling, and the determination of suitable technological tools. The framework is secure and reliable. Thus, it could help increase the number of voters and ensure a high level of participation, as well as facilitate free and fair electoral processes.
Development reference model to build management reporter using dynamics great...CSITiaesprime
The digital technology transformation impacts changes in business patterns that require companies to innovate to act appropriately in making strategic decisions quickly, precisely, and accurately to increase efficiency, be practical company performance, and impacts changes in business patterns that require companies to innovate to act appropriately in making strategic decisions quickly to improve the performance. An enterprise resource planning (ERP) system is one step toward achieving performance. ERP system is one step to achieving performance. ERP system is essential for companies to automate the efficiency of business processes. The decisions from management in implementing the ERP system are necessary for ERP implementation to be successful. However, in practice, companies still experience complexity. For that, it needs to be considered related a business process reference model is essential to enhance efficiency in implementing the ERP used. This research discusses the business process reference model based on the ERP dynamics great plain (GP) application aggregated using management reporter (MR) to help users better understand the practical overview. The methodology utilizes a reference model based on Microsoft Dynamics GP guidelines with a business process redesign approach. This contributes to developing business processes to help users understand using the ERP dynamics GP application.
Social media and optimization of the promotion of Lake Toba tourism destinati...CSITiaesprime
Tourism is one of the largest contributors to Indonesia's foreign exchange earnings, surpassing taxation, energy, and gas. This study seeks to investigate the use of social media to optimize the promotion of Lake Toba as a tourist destination, which has been impacted by the COVID-19 pandemic. Using interview techniques and live field observations, it was discovered that social media, particularly the Instagram platform, play a significant role in promoting Lake Toba tourism. The Department of Culture and Tourism of the North Sumatra Province uses landscape photography as its primary promotion method, which has proved to be more effective and interesting than conventional methods such as the distribution of brochures or the use of manuals. The capture procedure and techniques for landscape photography were carried out by professional photographers in collaboration with the Department of Culture and Tourism of the North Sumatra Province. In addition to providing information, tourism sumut's Instagram account functions as a platform to raise public awareness about Lake Toba tourism and as a promotional medium for North Sumatra's tourist attractions on an international scale. Department of Culture and Tourism of the North Sumatra Province collaborates with travel agencies and local communities to disseminate Lake Toba tourism information.
Caring jacket: health monitoring jacket integrated with the internet of thing...CSITiaesprime
One of the policies that have been made by the World Health Organization (WHO) and the Indonesian government during this COVID-19 pandemic, is to use an oximeter for self-isolation patients. The oximeter is used to monitor the patient if happy hypoxia which is a silent killer, happens to the patient. To maintain body endurance, exercise is needed by COVID-19 patients, but doing too much exercise can also cause decreased immunity. That’s why fatigue level and exercise intensity need to be monitored. When exercising, social distancing protocol should be also reminded because can lower COVID-19 spreading up to 13.6%. To solve this issue, the Caring Jacket is proposed which is a health monitoring jacket integrated with an IoT system. This jacket is equipped with some sensors and the global positioning system (GPS) for tracking. The data from the test showed the temperature reading accuracy is up to 99.38%, the oxygen rate up to 97.31%, the beats per minute (BPM) sensor up to 97.82%, and the precision of all sensors is 97.00% compared with a calibrated device.
Net impact implementation application development life-cycle management in ba...CSITiaesprime
Digital transformation in the banking sector creates a lot of demand for application development, either new development or application enhancement. Continuous demand for reimagining, revamping, and running applications reliably needs to be supported by collaboration tools. Several big banks in Indonesia use Atlassian products, including Jira, Confluence, Bamboo, Bitbucket, and Crowd, to support strategic company projects. We need to measure the net impact of application development life-cycle management (ADLM) as a collaboration tool. Using the deLone and McLean model, process questionnaire data from banks in Indonesia that use ADLM. Processing data using structural equation modeling (SEM), multiple variables are analyzed statistically to establish, estimate, and test the causation model. The conclusions highlight that system quality strongly affected only User Satisfaction (p=0.049 and β=0.39). Information quality strongly affected use (p=0.001 and β=0.84) and strongly affected user satisfaction (p=0.169 and β=0.28). Service quality strongly affected only use (p=0.127 and β=0.31). Conclusion research verifies the information system's achievement approach described by DeLone and McLean. Importantly, it was discovered that system usability and quality were key indicators of ADLM success. To fulfill their objective, ADLM must be developed in a way that is simple to use, adaptable, and functional.
Circularly polarized metamaterial Antenna in energy harvesting wearable commu...CSITiaesprime
When battery powered sensors are spread out in places that are sometimes hard to reach, sustaining them become difficult. Therefore, to develop this technology on a large scale such as in the internet of things (IoT) scenario, it is necessary to figure out how to power them. The proffered solution in this work, is to get energy from the environment using energy harvesting Antennas. This work presents a wearable circular polarized efficient receiving and transmitting sensors for medical, IoT, and communication systems at the frequency range of WLAN, and GSM from 900 MHz up to 6 GHz. Using a cascaded system block of a circularly polarized Antenna, a rectifier and t-matching network, the design was successfully simulated. A DC charging voltage of 2.8V was achieved to power-up batteries of the wearable and IoT sensors. The major contribution of this work is the tri-band Antenna system which is able to harvest reflected Wi-Fi frequencies and also GSM frequencies combined in a miniaturized manner. This innovative configuration is a step forward in building devices with over 80% duty cycle.
An ensemble approach for the identification and classification of crime tweet...CSITiaesprime
Twitter is a famous social media platform, which supports short posts limited to 280 characters. Users tweet about many topics like movie reviews, customer service, meals they just ate, and awareness posts. Tweets carrying information about some crime scenes are crime tweets. Crime tweets are crucial and informative and separate classification is required. Identification and classification of crime tweets is a challenging task and has been the researcher’s latest interest. The researchers used different approaches to identify and classify crime tweets. This research has used an ensemble approach for the identification and classification of crime tweets. Tweepy and Twint libraries were used to collect datasets from Twitter. Both libraries use contrasting methods for extracting tweets from Twitter. This research has applied many ensemble approaches for the identification and classification of crime tweets. Logistic regression (LR), support vector machine (SVM), k-nearest neighbor (KNN), decision tree (DT), and random forest (RF) Classifier assigned with the weights of 1,2,1,1 and 1 respectively ensemble together by a soft weighted Voting classifier along with term frequency – inverse document frequency (TF-IDF) vectorizer gives the best performance with an accuracy of 96.2% on the testing dataset.
National archival platform system design using the microservice-based service...CSITiaesprime
Archives play a vital function concerning the dynamics of people and nations as an instrument to treasure information in diverse domains of politics, society, economics, culture, science, and technology. The acceleration of digital transformation triggers the implementation of a smart government that supports better public services. The smart government encourages a national archival system to facilitate archive producers and users. The four electronic-based government system (SPBE) factors in the archival sector and open archival information system (OAIS) as a data preservation standard are the benchmarks in developing this study's national archival platform system. An improved service computing system engineering (SCSE) framework adapted to the microservice architecture is used to aid the design of the national archival platform system. The proposed design met the four-factor service design validation of coupling, cohesion, complexity, and reusability. Also, the prototype suggests what resources are needed to put the design into action by passing the performance test of availability measurement.
Antispoofing in face biometrics: A comprehensive study on software-based tech...CSITiaesprime
The vulnerability of the face recognition system to spoofing attacks has piqued the biometric community's interest, motivating them to develop anti-spoofing techniques to secure it. Photo, video, or mask attacks can compromise face biometric systems (types of presentation attacks). Spoofing attacks are detected using liveness detection techniques, which determine whether the facial image presented at a biometric system is a live face or a fake version of it. We discuss the classification of face anti-spoofing techniques in this paper. Anti-spoofing techniques are divided into two categories: hardware and software methods. Hardware-based techniques are summarized briefly. A comprehensive study on software-based countermeasures for presentation attacks is discussed, which are further divided into static and dynamic methods. We cited a few publicly available presentation attack datasets and calculated a few metrics to demonstrate the value of anti-spoofing techniques.
The antecedent e-government quality for public behaviour intention, and exten...CSITiaesprime
An The main objective of the study is to identify the antecedent of leadership quality, public satisfaction and public behaviour intention of e-government service. Also, this study integrated e-government quality to expectation-confirmation model. In order to achieve these goals, observational research was then carried out to collect primary information, using the method of data dissemination and obtaining the opinion of 360 from the public using the e-government service and some of the e-government and software quality experts. The results of the study show that the positive association among the e-government services quality and public perceived usefulness, public expectation confirmation, leadership quality and public satisfaction that also play a positive role on the public behavior intention.
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications.He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
High security mechanism: Fragmentation and replication in the cloud with auto update in the system
Computer Science and Information Technologies
Vol. 1, No. 2, July 2020, pp. 78~83
ISSN: 2722-3221, DOI: 10.11591/csit.v1i2.p78-83
Journal homepage: http://iaesprime.com/index.php/csit

High security mechanism: fragmentation and replication in the cloud with auto update in the system
Shrutika Khobragade, Rohini Bhosale, Rahul Jiwane
Department of Computer Engineering, Mumbai University, Pillai HOC College of Engineering and Technology, India
Article Info

Article history:
Received Jan 22, 2020
Revised May 10, 2020
Accepted May 25, 2020

Keywords:
Auto update mechanism
Cloud security
File fragmentation
File replication
Version control

ABSTRACT

Cloud computing makes immense use of the internet to store huge amounts of data. It provides high-quality service at low cost and with scalability, while requiring little hardware and software management. Security plays a vital role in the cloud: because the data is handled by a third party, security is the biggest concern. The proposed mechanism focuses on security issues in the cloud. A file stored at a single location can be affected by an attack, and the data will then be lost. So, in this work, instead of storing a complete file at one location, the file is divided into fragments and each fragment is stored at a different location. The fragments are further secured by giving each fragment its own hash key, so this mechanism will not reveal all the information about a particular file even after a successful attack. Replication of the fragments is also performed, protected by a strong authentication process using key generation. Auto update of a fragment or of any file is supported as well: a file or a fragment can be updated online, and instead of downloading the whole file, a single fragment can be downloaded for the update. This methodology saves considerable time.

This is an open access article under the CC BY-SA license.
Corresponding Author:
Rahul Jiwane
Department of Computer Engineering
Mumbai University
Pillai HOC College of Engineering and Technology, Rasayani, Dist. Raigad, Maharashtra, India
Email: rjiwane@mes.ac.in
1. INTRODUCTION
Cloud computing underpins the growing use of networking sites and other forms of interpersonal computing. A large amount of resources, such as cloud storage, data, and software applications, are accessed online, and this makes the privacy and security of the data an important concern. Cloud computing is a flexible, cost-effective, and authenticated delivery platform for providing business and consumer IT services over the internet. It also presents an added level of risk, as essential services are often outsourced to a third party, which makes it difficult to maintain data security and privacy, demonstrate compliance, and support data and service availability. The cloud computing paradigm has reformed the control and management of the information technology infrastructure [1]. Cloud computing is characterized by on-demand self-service, resource pooling, elasticity, ubiquitous network access, and measured service [2], [3]. However, the benefits of imperceptible management (from the user's perspective), low cost, easy access, and greater resilience come with increased security concerns which have to be taken care of.
Historically, computer software was not written with security in mind, but because of the increasing frequency and sophistication of malicious attacks against information systems, modern software design methodologies include security as a primary objective. With cloud computing systems seeking to meet multiple
objectives, such as cost, performance, reliability, maintainability, and security, trade-offs have to be made. Any cloud server is vulnerable to an attacker with unlimited time and physical access to the server. Additionally, physical problems could cause the server to have downtime. This would be a loss of availability, one of the key principles of the security triad of confidentiality, integrity, and availability (CIA).
The data that is outsourced to a public cloud must be secured. Unauthorized data access by other users and processes, whether accidental or deliberate, should be prevented [4]. As stated above, any weak entity may put the whole cloud at risk. In such a framework, the security mechanism should significantly increase the effort an attacker must spend to retrieve any appreciable amount of data, even after a successful intrusion or attack on the cloud. Moreover, the amount of loss due to data leakage must also be minimized. Mei et al. [5] describe a scheme of fragmentation and replication that allocates files over multiple servers, yet it remains exposed to attack because the files are stored at fixed, particular locations. Juels and Opera [6] presented a technique called the Iris file system, which ensures the freshness, integrity, and availability of data in a cloud. The scheme of Kappes et al. [7] cannot handle the deliberate exposure of critical information in the case of improper sanitization; it stores files as blocks, which may lead to improper sanitization.
The use of a trusted third party to provide security services in an outsourced cloud environment has been advocated for addressing cloud computing security issues [2]. The authors used a public key infrastructure (PKI) to enhance trust in the authentication, integrity, and confidentiality of data; at the user level, tamper-proof devices such as smart cards were used for storing the keys. Tang et al. [8] utilized public key cryptography and a trusted third party to provide data security in cloud environments; however, they did not use a PKI, in order to reduce overheads. The system needs to be more secure and should be accessible only by authorized persons. The data stored in the cloud needs to be secured, and proper encryption keys have to be used for verification of the files stored in it.
2. PROPOSED SYSTEM MODEL
The proposed model ensures the security of the data stored on the cloud and provides a better solution for raising both the security and the performance level, limiting the exposure that comes with third-party administrative control. The data that needs to be secured is in the form of files. Since diverse files are stored on the cloud, here a particular file is uploaded and then the fragmentation process is performed. Each fragment of that file is secured with a randomly generated hash key. Fragments are generated with equal sizes, and each fragment of a particular file is placed at a different location. Controlled replication is maintained, in which each fragment is replicated only once, to improve security. When an attacker or hacker compromises a given fragment, it does not reveal all the information of the file. Various attacks, such as data recovery, cross-VM attacks, improper media sanitization, and VM escape, can be handled by this methodology. In the proposed system, a file uploaded on the cloud needs to be secured because it is data outsourced to a third party. A file is generally stored on a cloud at one particular location, which is not secure, as an attacker can easily access or attack that file and obtain its information through various malicious attacks or intrusions. So instead of storing a single file at a particular location, the file is fragmented according to its size and the fragments are placed at different locations, so that an attacker who gains access to one fragment will not learn the information of the whole file. Security is the major concern and a file must be thoroughly secured, so a different key is used for each fragment of a single file. A replica of the file is also maintained, to preserve the availability of lost or old files.
2.1. Design goals of cloud computing
The design goals of the proposed system are: to provide a good authentication system that allows only authorized users to log in and operate; to improve security as well as performance; to fragment each file and store each fragment at a different location; to use controlled replication to decrease the chance of data loss and to increase performance, availability, and reliability; and to provide a file to the client even when there is a runtime error in the network.
As illustrated in Figure 1, the registration process is done first by collecting the user's information. An authorized user then logs in to the system. The proposed system provides protection for files that are otherwise very susceptible to attacks. The user uploads a file to the system by providing a file id and file name. The uploaded file is fragmented in such a way that no fragment contains any meaningful information on its own, and each fragment needs its own encryption key. Fragmentation divides the file into pieces of equal size; if the file does not divide evenly, the remaining bytes are stored in a further fragment. This helps keep an attacker from finding the location of every fragment, since each fragment is stored at a different location. A key is generated for each fragment from the content of the file, so that an attacker cannot obtain the data of each fragment. Fragments are placed at different nodes, making it difficult
for the attacker to reach the whole file through a single node. After the fragmentation process, the fragments are replicated on different nodes in such a way that access time stays low, which also increases performance. The proposed system uses controlled replication, which is required to balance ideal performance with greater security.
Figure 1. Proposed architecture
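As a concrete illustration of the controlled replication described above, the following minimal Python sketch places exactly one replica of each fragment on a node other than the one holding its primary copy. The node names, the placement map, and the first-eligible-node choice are illustrative assumptions, not a policy prescribed by the paper.

def replicate_once(placement, nodes):
    """Return a replica node for every fragment, distinct from its primary node."""
    replicas = {}
    for frag_id, primary in placement.items():
        candidates = [n for n in nodes if n != primary]
        # take the first eligible node; a real system could pick by load or distance
        replicas[frag_id] = candidates[0]
    return replicas

nodes = ["node-A", "node-B", "node-C"]
placement = {"frag0": "node-A", "frag1": "node-B", "frag2": "node-C"}
print(replicate_once(placement, nodes))  # e.g. {'frag0': 'node-B', 'frag1': 'node-A', 'frag2': 'node-A'}

Because each fragment is copied exactly once, storage and bandwidth costs stay bounded while a lost fragment can still be served from its replica.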
3. PROPOSED SYSTEM TECHNIQUE
The steps of the algorithm are as follows.
3.1. Fragment placement
The initial step is to select the file to fragment. The selected file is fragmented based on its size, and each fragment is stored at a different location, i.e., on a different node. The process is repeated until every fragment is assigned to a node. The file size should be more than 20 KB.
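The fragment-placement step can be sketched in Python as follows. Only the equal-size split, the leftover-bytes handling, and the 20 KB minimum come from the text above; the fragment count, the node list, and the one-node-per-fragment assignment are illustrative assumptions.

import os

MIN_FILE_SIZE = 20 * 1024  # the scheme requires files larger than 20 KB

def fragment_file(path, num_fragments=4):
    """Split a file into equal-size fragments; leftover bytes go to the last one."""
    size = os.path.getsize(path)
    if size <= MIN_FILE_SIZE:
        raise ValueError("file must be larger than 20 KB")
    chunk = size // num_fragments
    fragments = []
    with open(path, "rb") as f:
        for i in range(num_fragments):
            n = chunk if i < num_fragments - 1 else size - chunk * i
            fragments.append(f.read(n))
    return fragments

def place_fragments(fragments, nodes):
    """Assign each fragment to a distinct node; needs at least one node per fragment."""
    if len(nodes) < len(fragments):
        raise ValueError("need at least one distinct node per fragment")
    return {f"frag{i}": nodes[i] for i in range(len(fragments))}

# usage sketch:
# placement = place_fragments(fragment_file("report.bin"), ["node-A", "node-B", "node-C", "node-D"])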
3.2. General flow algorithm
S = {I, P, R, O}
Where,
I is the set of initial inputs to the system.
I = {i1, i2, i3}
i1 = File given by the user.
i2 = Download request from user.
i3 = Download request from client.
P is the set of processes (procedures, functions, or methods).
P = {p1, p2, p3, p4, p5, p6, p7, p8}
p1 = Registration and authentication.
p2 = Uploading a file to the cloud server.
p3 = Fragmentation of the file received from the user, with a separate hash key for each fragment.
p4 = Replication of that file.
p5 = Download request from user.
p6 = Download request from client.
p7 = Collection and reassembly of fragments.
p8 = Downloading the original file.
R is a set of rules or constraints.
R = {r1}
r1 = File accessed from a replica.
O is a set of outputs.
O = {o1}
o1 = Downloading the original file.
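To make the general flow concrete, here is a compact, self-contained Python sketch of the upload path (p2 to p4, collapsed to fragmentation with per-fragment hash keys) and the download path (p5 to p8). The in-memory dict standing in for cloud nodes, the node names, and the fragment count are illustrative assumptions; encryption and key handling are shown in the sections below.

import hashlib

cloud = {}  # node name -> {fragment id: fragment bytes}

def upload(data, nodes, k=3):                       # p2 and p3
    chunk = len(data) // k
    frags = [data[i * chunk:(i + 1) * chunk] for i in range(k - 1)]
    frags.append(data[(k - 1) * chunk:])            # leftover bytes join the last fragment
    hash_keys = []
    for i, frag in enumerate(frags):
        cloud.setdefault(nodes[i], {})[f"frag{i}"] = frag
        hash_keys.append(hashlib.sha512(frag).hexdigest())  # per-fragment hash key
    return hash_keys

def download(nodes, k=3):                           # p5 to p8
    frags = [cloud[nodes[i]][f"frag{i}"] for i in range(k)]
    return b"".join(frags)                          # o1: the original file

nodes = ["node-A", "node-B", "node-C"]
keys = upload(b"x" * 100, nodes)
assert download(nodes) == b"x" * 100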
3.3. Use of the Rijndael algorithm
The Rijndael encryption algorithm is used for the encryption and decryption of sensitive information. Standardized as the advanced encryption standard (AES), it is a symmetric key encryption algorithm. It offers a strong combination of security, performance, efficiency, ease of implementation, flexibility, high speed, and versatility across a variety of platforms, and it runs efficiently on large computers, desktops, and small devices such as smart cards. The block and key sizes can in fact be chosen independently from 128, 160, 192, 224, and 256 bits. Rijndael is simple to implement and uses very little system memory.
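As a hedged illustration of this step, the sketch below encrypts and decrypts a single fragment with AES (the standardized Rijndael) in GCM mode, assuming the third-party pyca/cryptography package is installed. The mode choice and the 256-bit key size are assumptions; the paper only specifies a symmetric Rijndael cipher.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_fragment(fragment: bytes, key: bytes) -> bytes:
    nonce = os.urandom(12)                          # unique nonce per encryption
    return nonce + AESGCM(key).encrypt(nonce, fragment, None)

def decrypt_fragment(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)           # 256-bit AES key
blob = encrypt_fragment(b"fragment contents", key)
assert decrypt_fragment(blob, key) == b"fragment contents"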
3.4. Use of the SHA-512 algorithm
The SHA-512 algorithm is used for the hashing task. The secure hash algorithm (SHA) family was designed by the United States National Security Agency. SHA-512 is faster than SHA-256 on 64-bit machines because it needs fewer rounds per byte (80 rounds for 128-byte blocks, compared to SHA-256's 64 rounds for 64-byte blocks); the trade-off is that storing a 512-bit hash is more expensive. Although SHA-512 needs somewhat more cycles per byte than some other hash functions, the time it takes to generate a hash value is much smaller, so SHA-512 is an efficient as well as a secure hashing algorithm.
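A minimal sketch of per-fragment hashing with Python's standard hashlib follows; the fragment contents and identifiers are illustrative.

import hashlib

def fragment_hash(fragment: bytes) -> str:
    """Return the SHA-512 digest used as the fragment's hash key."""
    return hashlib.sha512(fragment).hexdigest()

fragments = [b"fragment-0", b"fragment-1"]
hash_keys = {f"frag{i}": fragment_hash(f) for i, f in enumerate(fragments)}
# each 128-hex-character digest is stored alongside its fragment's location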
3.5. Use of a random key generation algorithm
Random key generation produces random values that are used for encryption and decryption. The keys are generated using a random number generator or a pseudorandom number generator that produces random data. In our system, a random key is generated at the time the user downloads data, which makes it much harder for a hacker or attacker to guess the key.
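Such keys can be drawn from a cryptographically strong source, as in this small Python sketch using the standard secrets module; the key length and the six-digit one-time password format are assumptions made for the example.

import secrets

download_key = secrets.token_bytes(32)   # 256-bit one-time key issued for a download
otp = f"{secrets.randbelow(10**6):06d}"  # 6-digit one-time password for login verification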
3.6. Properties of the proposed system
The properties of the proposed system are as follows:
- Uses a one-time password verification scheme.
- Privacy of data is maintained from the third-party organization.
- Files are divided into fragments that are stored at various locations.
- A key is generated for each fragment.
- A strong encryption algorithm is used.
- Version control is used for retrieving old files.
- Auto update and auto replication of files are performed.
- Controlled replication is used for greater security.
- The system is implemented on a real cloud.
There are different modules in the system, as shown in Figure 2. They are as follows:
- Upload: This module uploads the file to the cloud. When a file is uploaded, a file id is generated for each new file; the user then gives the file a name, which must be unique.
- Edit: This module edits a particular fragment of a file. The details of all uploaded files are shown in a table. When a file is uploaded it is divided into fragments, encrypted, and stored across various nodes, and the hash value of each fragment is sent to the email id the user provided during registration. To edit, the user selects a fragment, but the matching hash value has to be entered to retrieve its content.
- Edit full file: This module updates the whole file. The updated file is uploaded and renamed as a new file, and the old file is stored as a backup.
- Download: This module downloads the file, which is stored in encrypted form. A single fragment or the whole file can be downloaded. When the data is downloaded using the authorized key, it is first decrypted, then the fragments are combined from the various nodes, and the file is presented in its original form. This is the procedure implemented for uploading a file and for the subsequent fragmentation, replication, and download of the original file.
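To illustrate the time saved by fragment-level auto update, here is a minimal Python sketch of the Edit path: only the affected fragment is fetched, verified against its hash key, modified, and stored back, instead of moving the whole file. The in-memory store, the node name, and the edit callback are illustrative assumptions.

import hashlib

def update_fragment(store, node, frag_id, edit, expected_hash):
    """Download one fragment, verify its hash key, apply the edit, store it back."""
    fragment = store[node][frag_id]
    if hashlib.sha512(fragment).hexdigest() != expected_hash:
        raise PermissionError("hash key does not match; edit refused")
    new_fragment = edit(fragment)
    store[node][frag_id] = new_fragment
    return hashlib.sha512(new_fragment).hexdigest()  # new hash key for the user

store = {"node-A": {"frag0": b"hello world"}}
h = hashlib.sha512(b"hello world").hexdigest()
new_h = update_fragment(store, "node-A", "frag0", lambda b: b.upper(), h)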
Figure 2. Proposed system flow
4. RESULTS AND DISCUSSION
Figure 3 shows the gap analysis between the existing and the proposed system. The existing system gives the file to the client whenever a request is granted; it used no traditional cryptographic technique for the security of the data, and no recovery of data was maintained.
Figure 3. Result analysis
The details of Figure 3 are as follows:
- Security of fragment: In the existing system no cryptographic technique was used, but in the proposed system a special cryptographic technique is used to increase the security of the data in each fragment.
- Lost data recovery: In the existing system, if the replica of a fragment was deleted there was no recovery mechanism; in the proposed system, even if any fragment is deleted, it can be retrieved using version control.
As the above details show, these are the two parameters that differ from the existing system.
Table 1. Comparison of the two systems
Sr. No. | Existing system | Proposed system
1 | Only the data is secured. | Security is given to the data, and a traditional cryptographic technique is used.
2 | If a file, a fragment, or a whole block is lost, it cannot be recovered. | Recovery of a file or fragment is done through version control.
3 | The whole file must be downloaded and updated, which is time consuming. | Instead of downloading the whole file, a particular fragment can be updated online, or only the required fragment is downloaded for update; this saves time.
4 | No separate hash key is used for fragments. | A separate key is maintained for each fragment to make it more secure.
5. CONCLUSION
A scheme is proposed for the security of users' data when the data is stored in the cloud. The growth of cloud computing raises security concerns because of its core technology, so this system provides a better solution for achieving both security and performance by using techniques such as a secure cryptographic scheme, fragmentation, and replication. Fragmentation is used to protect data from a single-point disaster, while replication is useful for maintaining availability, reliability, and performance in failure situations. However, extra replication can also result in high storage cost or a drop in the system's overall performance due to excessive use of bandwidth, so controlled replication is used here. The scheme utilizes the Rijndael algorithm to create the encryption key that the user receives when requesting file access from the data owner, and it uses the SHA-512 algorithm to generate a hash key for each fragment after division.
REFERENCES
[1] K. Hashizume, D. G. Rosado, E. Fernandez-Medina, and E. B. Fernandez, “An analysis of security issues for cloud computing,” J. Internet Serv. Appl., vol. 4, no. 1, pp. 1-13, 2013, doi: 10.1186/1869-0238-4-5.
[2] D. Zissis and D. Lekkas, “Addressing cloud computing security issues,” Future Generation Computer Systems, vol. 28, no. 3, pp. 583-592, 2012, doi: 10.1016/j.future.2010.12.006.
[3] M. Hogan, F. Liu, A. Sokol, and J. Tong, “NIST cloud computing standards roadmap,” NIST Special Publication, 2011. [Online]. Available: https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=909024.
[4] N. Khan, M. L. M. Kiah, S. U. Khan, and S. A. Madani, “Towards secure mobile cloud computing: a survey,” Future Generation Computer Systems, vol. 29, no. 5, pp. 1278-1299, 2013, doi: 10.1016/j.future.2012.08.003.
[5] A. Mei, L. V. Mancini, and S. Jajodia, “Secure dynamic fragment and replica allocation in large-scale distributed file systems,” IEEE Transactions on Parallel and Distributed Systems, vol. 14, no. 9, pp. 885-896, Sept. 2003, doi: 10.1109/TPDS.2003.1233711.
[6] A. Juels and A. Opera, “New approaches to security and availability for cloud data,” Communications of the ACM, vol. 56, no. 2, pp. 64-73, 2013, doi: 10.1145/2408776.2408793.
[7] G. Kappes, A. Hatzieleftheriou, and S. V. Anastasiadis, “Virtualization-aware access control for multitenant filesystems,” 2014 30th Symposium on Mass Storage Systems and Technologies (MSST), 2014, pp. 1-6, doi: 10.1109/MSST.2014.6855543.
[8] Y. Tang, P. P. C. Lee, J. C. S. Lui, and R. Perlman, “Secure overlay cloud storage with access control and assured deletion,” IEEE Transactions on Dependable and Secure Computing, vol. 9, no. 6, pp. 903-916, Nov.-Dec. 2012, doi: 10.1109/TDSC.2012.49.