Engaging in cloud computing domain projects can provide numerous advantages, as cloud technology continues to transform the way businesses operate and deliver services. Here are some key advantages of working on cloud computing projects:
Scalability: Cloud computing allows for easy scalability, enabling organizations to scale their resources up or down based on demand. This ensures that they only pay for the resources they actually use, optimizing costs.
Cost Efficiency: Cloud computing eliminates the need for upfront infrastructure investment and the maintenance of physical servers. This can lead to significant cost savings, especially for smaller businesses or startups with limited capital.
Flexibility and Agility: Cloud services provide flexibility in terms of deployment options, programming languages, and frameworks. This agility allows developers to experiment with new ideas and bring products to market more quickly.
Global Reach: Cloud services are typically offered through a network of data centers located around the world. This enables organizations to provide services globally with low latency and high availability.
Resource Optimization: Cloud platforms offer tools for optimizing resource usage, such as automatic scaling, load balancing, and serverless computing. This ensures that resources are allocated efficiently, improving overall performance.
Security and Compliance: Leading cloud service providers invest heavily in security measures, including data encryption, identity and access management, and regular security audits. Leveraging cloud services can enhance the overall security posture of an organization.
Collaboration and Remote Work: Cloud computing facilitates collaboration among team members, allowing them to access and work on shared resources from anywhere with an internet connection. This is particularly valuable for remote or distributed teams.
Disaster Recovery and Business Continuity: Cloud services provide robust disaster recovery options, including backup and data replication. This ensures that organizations can quickly recover from data loss or system failures, contributing to business continuity.
Automatic Updates and Maintenance: Cloud service providers handle infrastructure maintenance, updates, and patching. This frees up IT teams from routine tasks, allowing them to focus on strategic initiatives and innovation.
Innovation Acceleration: Cloud computing provides access to a wide range of cutting-edge technologies, such as artificial intelligence, machine learning, and Internet of Things (IoT). Organizations can leverage these technologies to drive innovation in their products and services.
Elasticity: Cloud services offer elasticity, allowing organizations to dynamically adjust their computing resources to match changing workloads. This ensures optimal performance during peak demand periods.
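The elasticity described above is often implemented as a target-tracking rule: compare observed utilization to a target and resize the fleet proportionally. A minimal sketch, with hypothetical function names and thresholds (not any provider's real API):

```python
# Illustrative sketch of a target-tracking auto-scaling rule of the kind
# cloud platforms use to match capacity to load; names and thresholds
# here are hypothetical.

def desired_instances(current: int, cpu_pct: int, target_pct: int = 60,
                      min_n: int = 1, max_n: int = 10) -> int:
    """Scale the instance count so average CPU utilization approaches target_pct.

    Applies the proportional rule new = ceil(current * observed / target),
    clamped to [min_n, max_n], using integer ceiling division.
    """
    if current <= 0:
        raise ValueError("current instance count must be positive")
    raw = -(-current * cpu_pct // target_pct)  # ceiling division on integers
    return max(min_n, min(max_n, raw))
```

For example, a fleet of 4 instances at 90% CPU grows to 6, while the same fleet at 30% CPU shrinks to 2, so capacity tracks demand in both directions.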
Effective & Flexible Cryptography Based Scheme for Ensuring User's Data Secur...ijsrd.com

Cloud computing has been envisioned as the next-generation architecture of IT enterprise. In contrast to traditional solutions, where IT services are under proper physical, logical and personnel controls, cloud computing moves application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. This unique attribute, however, poses many new security challenges which have not been well understood. In this article, we focus on cloud data storage security, which has always been an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective and flexible cryptography-based scheme. Extensive security and performance analysis shows that the proposed scheme is highly efficient and resilient against malicious data modification attacks.
The document proposes TEES (Traffic and Energy saving Encrypted Search), an architecture for efficient encrypted search over mobile cloud storage. TEES aims to reduce computation time, energy consumption, and network traffic for file retrieval compared to traditional encrypted search schemes. It offloads computation of relevance scores from mobile devices to the cloud to reduce energy usage. It also simplifies the search and retrieval process to reduce network traffic and retrieval time through a single communication round trip. TEES enhances security by adding noise to term frequency distributions and hiding the association between keywords and files. Experiments show TEES reduces computation time by 23-46% and energy consumption by 35-55% compared to traditional schemes.
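The relevance-score offloading that TEES performs can be illustrated with a plain TF-IDF ranking done server-side: the cloud scores files against the keyword and returns only the top hit, so the client needs a single round trip. This toy omits the encrypted index entirely; the index layout and names below are assumptions for illustration:

```python
import math

# Toy sketch of the ranking step TEES offloads to the cloud. The server
# holds a keyword -> {file_id: term_frequency} index (assumed pre-built),
# computes TF-IDF style scores, and returns only the top-ranked file ids.
# The real TEES operates on an encrypted index; no crypto is shown here.

def rank_files(index: dict, total_files: int, keyword: str, top_k: int = 1):
    postings = index.get(keyword, {})
    if not postings:
        return []
    # idf grows as the keyword appears in fewer files
    idf = math.log(1 + total_files / len(postings))
    scored = sorted(postings.items(), key=lambda kv: kv[1] * idf, reverse=True)
    return [file_id for file_id, _tf in scored[:top_k]]
```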
Survey on Privacy- Preserving Multi keyword Ranked Search over Encrypted Clou...Editor IJMTER
With the advent of cloud computing, data owners are motivated to outsource their complex data management systems from local sites to the commercial public cloud for greater flexibility and economic savings. But to protect data privacy, sensitive data has to be encrypted before outsourcing. Considering the large number of data users and documents in the cloud, it is crucial for the search service to allow multi-keyword queries and to provide result similarity ranking to meet the need for effective data retrieval. Related works on searchable encryption focus on single-keyword or Boolean keyword search, and rarely differentiate the search results. We first propose a basic MRSE (multi-keyword ranked search over encrypted data) scheme using secure inner product computation, and then significantly improve it to meet different privacy requirements under two levels of threat models. The Incremental High Utility Pattern Transaction Frequency Tree (IHUPTF-Tree) is designed according to the transaction frequency of items (in descending order) to obtain a compact tree.
Using high utility patterns, the items can be arranged efficiently: the tree structure sorts the items, and the frequent patterns are obtained from the sorted items. The frequent-pattern items are then retrieved from the database using a hybrid tree (H-Tree) structure, so execution becomes faster. Finally, the frequent-pattern items that satisfy the threshold value are displayed.
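The inner-product ranking idea behind MRSE can be sketched in the clear: each document and each query is a 0/1 keyword vector, and their inner product counts matched keywords, which drives the ranking. The real scheme hides these vectors behind secret invertible matrix transformations, which this toy omits:

```python
# Toy illustration of inner-product ranking for multi-keyword search.
# Only the ranking arithmetic is shown; the "secure" part of MRSE
# (splitting and matrix-masking the vectors) is deliberately left out.

def to_vector(keywords, dictionary):
    """Encode a keyword set as a 0/1 vector over a fixed dictionary."""
    return [1 if w in keywords else 0 for w in dictionary]

def inner_product(u, v):
    return sum(a * b for a, b in zip(u, v))

def ranked_search(query_kw, docs, dictionary):
    """Return document ids sorted by number of matched query keywords."""
    q = to_vector(query_kw, dictionary)
    scores = {doc_id: inner_product(q, to_vector(kws, dictionary))
              for doc_id, kws in docs.items()}
    return sorted(scores, key=scores.get, reverse=True)
```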
This document summarizes a research paper that proposes a scheme for ensuring security and reliability of data stored in the cloud. The scheme utilizes erasure coding to redundantly store encrypted data fragments across multiple cloud servers. It generates homomorphic tokens that allow auditing of the data storage and identification of any misbehaving servers. The scheme supports secure dynamic operations like modification, deletion and append of cloud data files. Analysis shows the scheme is efficient and resilient against various security threats like server compromises or failures. It ensures storage correctness and fast localization of data errors in the cloud.
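The redundancy half of such schemes can be illustrated with the simplest possible erasure code, a single XOR parity block: any one lost fragment can be rebuilt from the survivors. Production systems spread (m, k) Reed-Solomon coded fragments across many servers; this sketch keeps only the core idea:

```python
# Minimal erasure-coding sketch: XOR parity over equal-length fragments.
# Losing any single fragment is recoverable from the rest plus the parity.

def encode(fragments: list) -> bytes:
    """Return the XOR parity of equal-length byte fragments."""
    parity = bytes(len(fragments[0]))
    for frag in fragments:
        parity = bytes(a ^ b for a, b in zip(parity, frag))
    return parity

def recover(surviving: list, parity: bytes) -> bytes:
    """Rebuild the one missing fragment: XOR the survivors with the parity."""
    return encode(surviving + [parity])
```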
Enhancement of the Cloud Data Storage Architectural Framework in Private CloudINFOGAIN PUBLICATION
Data storage in the cloud typically resides in a service-providing environment collocated with data from different clients. Institutions or organizations moving sensitive and regulated data into the cloud must account for the means by which access to the data is controlled and the data is kept secure. Data can take many forms: for cloud-based application development, it includes the application programs, scripts, and configuration settings, along with the development tools; for deployed applications, it includes records and other content created or used by the applications, as well as account information about the users of the applications. Access controls are one means of keeping data away from unauthorized users; encryption is another. Access controls are typically identity-based, which makes authentication of the user's identity an important issue in cloud computing. This research paper focuses on a cloud data storage architectural framework for encrypted data.
Secure distributed deduplication systems with improved reliability 2Rishikesh Pathak
1. The document proposes new distributed deduplication systems that improve reliability by distributing data chunks across multiple cloud servers. This addresses limitations of single-server deduplication systems where losing one server causes disproportionate data loss.
2. The systems introduce a deterministic secret sharing scheme to protect data confidentiality in distributed storage, instead of using convergent encryption. Secret shares of files are distributed across servers.
3. The distributed approach enhances reliability while supporting deduplication and ensuring data integrity and "tag consistency" to prevent replacement attacks. This represents the first work addressing reliability, confidentiality and consistency for distributed deduplication.
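A hedged sketch of what "deterministic" secret sharing buys deduplication: if the share-generating polynomial's coefficients are derived from the file itself rather than from fresh randomness, identical files always yield identical shares, so the servers can detect duplicates. The field size, share counts, and hash-derivation below are illustrative, not the paper's construction:

```python
import hashlib

# Deterministic secret sharing toy: both the secret and the polynomial
# coefficients come from hashes of the file, so shares_of(f) is a pure
# function of f. Parameters are illustrative only.

P = 2**127 - 1  # Mersenne prime used as the field modulus

def shares_of(file_bytes: bytes, n: int = 4, k: int = 2):
    """Return n shares of a (k, n) scheme, derived deterministically."""
    secret = int.from_bytes(hashlib.sha256(file_bytes).digest()[:15], "big")
    # Coefficients hashed from the file with a counter: no randomness.
    coeffs = [secret] + [
        int.from_bytes(hashlib.sha256(file_bytes + bytes([i])).digest()[:15], "big")
        for i in range(1, k)
    ]
    def poly(x):
        return sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]
```

Because the output depends only on the file bytes, two independent uploads of the same file are recognizably identical server-side, which is exactly the property convergent encryption provides for ciphertexts.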
DISTRIBUTED SCHEME TO AUTHENTICATE DATA STORAGE SECURITY IN CLOUD COMPUTINGijcsit
Cloud computing is the revolution in the current generation of IT enterprise. Cloud computing displaces databases and application software to large data centres, where the management of services and data may not be predictable, whereas conventional solutions keep IT services under proper logical, physical and personnel controls. This attribute, however, introduces security challenges which have not been well understood. This paper concentrates on cloud data storage security, which has always been an important aspect of quality of service (QoS). We designed and simulated an adaptable and efficient scheme to guarantee the correctness of user data stored in the cloud, along with some prominent additional features. A homomorphic token is used for distributed verification of erasure-coded data, which lets us identify misbehaving servers. In contrast to prior work, our scheme supports effective and secure dynamic operations on data blocks, such as insertion, deletion and modification. Security and performance analysis shows that the proposed scheme is highly resilient against malicious data modification, colluding servers and complex failures.
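The precompute-then-challenge audit workflow in such schemes can be sketched with a keyed MAC standing in for the homomorphic tokens (which additionally verify erasure-coded blocks algebraically; this toy shows only the audit loop, and all names are assumptions):

```python
import hashlib
import hmac

# Audit-workflow sketch: before outsourcing, the user precomputes a keyed
# MAC per block; later she challenges a server for a block and checks the
# returned bytes against the stored token. A server that lost or altered
# the block cannot produce matching bytes without the key.

def precompute_tokens(key: bytes, blocks: list) -> list:
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks]

def audit(key: bytes, tokens: list, index: int, returned_block: bytes) -> bool:
    """Verify a server's response for block `index` against its token."""
    computed = hmac.new(key, returned_block, hashlib.sha256).digest()
    return hmac.compare_digest(tokens[index], computed)
```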
S.A.kalaiselvan toward secure and dependable storage serviceskalaiselvanresearch
This document proposes a scheme for ensuring data integrity and dependability for cloud storage systems. The scheme utilizes erasure coding to distribute data across multiple servers and generate verification tokens to enable lightweight auditing. It allows users to efficiently audit that their data is intact and to identify any misbehaving servers. The scheme also supports secure dynamic operations like updates, deletes and appends while maintaining the same level of integrity assurance. Analysis shows the scheme is efficient and resilient to various security threats from malicious servers.
IRJET- Mutual Key Oversight Procedure for Cloud Security and Distribution of ...IRJET Journal
The document proposes a mutual key oversight procedure for cloud security and distribution of data based on a hierarchy method. It discusses using attribute-based encryption to encrypt data before outsourcing it to the cloud. The proposed scheme uses a hierarchical structure with a cloud authority, domain authorities, and users to provide security and scalability. It allows both private and public uploading and sharing of files within this hierarchy.
IRJET- A Survey on Remote Data Possession Verification Protocol in Cloud StorageIRJET Journal
This document summarizes a survey on remote data possession verification protocols for cloud storage. It begins with an abstract describing the problem of verifying integrity of outsourced data files on remote cloud servers. It then provides background on remote data possession verification (RDPV) protocols and discusses related work on ensuring data integrity and supporting dynamic operations. The document describes the system framework, RDPV protocol, use of homomorphic hash functions, and an optimized implementation using an operation record table to efficiently support dynamic operations like modifications. It concludes that the presented efficient and secure RDPV protocol is suitable for cloud storage applications.
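The homomorphic hash property such protocols rely on, h(a + b) = h(a) · h(b) mod p for h(m) = g^m mod p, lets a verifier check a combined proof over several blocks against the product of their individually stored hashes. A demonstration with toy-sized parameters (real schemes use far larger groups):

```python
# Homomorphic hash sketch: h(m) = G**m mod P, so hashing a sum of blocks
# equals multiplying their hashes. Parameters are toy-sized for clarity.

P = 2**61 - 1   # small prime modulus (illustrative only)
G = 3           # generator (illustrative only)

def h(block: int) -> int:
    return pow(G, block, P)

def combined_proof(blocks: list) -> int:
    """What the prover returns: the hash of the combined blocks."""
    return h(sum(blocks))

def verify(stored_hashes: list, proof: int) -> bool:
    """Check the proof against the product of per-block hashes."""
    expected = 1
    for x in stored_hashes:
        expected = (expected * x) % P
    return proof == expected
```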
Enabling Integrity for the Compressed Files in Cloud ServerIOSR Journals
This document proposes a scheme for enabling data integrity for compressed files stored in cloud servers. The scheme encrypts some bits of data from each data block using an RSA algorithm and polynomial hashing to generate hash values. These hash values are stored at the client and used to verify integrity by checking responses from the cloud server against the stored hashes. The scheme aims to minimize computational and storage overhead for clients by compressing files, encrypting only some data bits, and requiring clients to store just two secret functions rather than the full data. This allows integrity checks with low bandwidth consumption suitable for thin clients like mobile devices.
Dynamic Resource Allocation and Data Security for CloudAM Publications
Cloud computing is the next generation of IT organization. Cloud computing moves software and databases to large data centres, where the management of services and data may not be fully trusted. In this system, we focus on cloud data storage security, which has been an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective scheme based on the Advanced Encryption Standard (AES) and the MD5 algorithm. Extensive security and performance analysis shows that the proposed scheme is highly efficient. In the proposed work we have developed efficient parallel data processing in clouds and present our research project for parallel security. Parallel security is a data processing framework that explicitly exploits dynamic storage along with data security. We have proposed a strong, formal model for data security on the cloud and for corruption detection.
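The encrypt-then-digest workflow the abstract describes can be sketched as follows. Python's standard library has no AES, so a keyed hash-based XOR stream stands in for the cipher purely to keep the example self-contained; the MD5 digest-verification flow is the point, not the cipher:

```python
import hashlib

# Sketch of an AES+MD5-style workflow: encrypt before upload, store an MD5
# digest of the plaintext, and verify the digest on retrieval to detect
# corruption. The _stream cipher below is a stand-in, NOT real AES.

def _stream(key: bytes, n: int) -> bytes:
    """Deterministic keystream from a key (toy substitute for a cipher)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def protect(key: bytes, plaintext: bytes):
    """Return (ciphertext, md5 digest of the plaintext)."""
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, _stream(key, len(plaintext))))
    return ciphertext, hashlib.md5(plaintext).hexdigest()

def recover_and_check(key: bytes, ciphertext: bytes, digest: str) -> bytes:
    """Decrypt and verify integrity; raise if the digest does not match."""
    plaintext = bytes(a ^ b for a, b in zip(ciphertext, _stream(key, len(ciphertext))))
    if hashlib.md5(plaintext).hexdigest() != digest:
        raise ValueError("integrity check failed")
    return plaintext
```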
iaetsd Controlling data deduplication in cloud storageIaetsd Iaetsd
This document discusses controlling data deduplication in cloud storage. It proposes an architecture that provides duplicate check procedures with minimal overhead compared to normal cloud storage operations. The key aspects of the proposed system are:
1) It uses convergent encryption to encrypt data for privacy while still allowing for deduplication of duplicate files.
2) It introduces a private cloud that manages user privileges and generates tokens for authorized duplicate checking in a hybrid cloud architecture.
3) It evaluates the overhead of the proposed authorized duplicate checking scheme and finds it incurs negligible overhead compared to normal cloud storage operations.
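The convergent encryption in point 1 is the standard trick for deduplicating encrypted data: deriving the key from the plaintext itself makes identical files produce identical ciphertexts and identical dedup tags, while different files stay mutually unreadable. A self-contained sketch (a hash-based XOR stream stands in for the block cipher):

```python
import hashlib

# Convergent encryption toy: key = H(plaintext), so encryption is a pure
# function of the file and duplicate uploads yield identical ciphertexts.
# The XOR keystream is a stand-in for a real block cipher.

def convergent_key(plaintext: bytes) -> bytes:
    return hashlib.sha256(plaintext).digest()

def encrypt(plaintext: bytes):
    """Return (ciphertext, dedup tag) derived deterministically from the file."""
    key = convergent_key(plaintext)
    stream, counter = b"", 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, stream))
    tag = hashlib.sha256(ciphertext).hexdigest()  # used for the duplicate check
    return ciphertext, tag
```

The server compares tags to detect duplicates without ever seeing a plaintext, which is why this primitive is the usual basis for encrypted deduplication.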
IJERA (International Journal of Engineering Research and Applications) is an international online, ... peer-reviewed journal (www.ijera.com).
The document proposes a scheme for secure access to outsourced databases in cloud computing environments. It uses a modified row-based encryption technique where each database row is encrypted with a separate key. The scheme also uses client-side memory and selective encryption to increase performance. A modified Diffie-Hellman key exchange algorithm is used to establish secure communication between the cloud service provider and users, inhibiting malicious outsiders. The scheme aims to overcome security challenges like information leakage, key management, revocation handling and user authentication that exist in current solutions for securing outsourced databases.
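The paper's modified Diffie-Hellman variant is not specified in this summary, so the sketch below shows only the classic baseline exchange on which it builds, with a toy-sized prime for readability:

```python
import secrets

# Classic Diffie-Hellman key agreement: each party publishes G**priv mod P,
# and both arrive at the same shared secret G**(a*b) mod P. Parameters here
# are toy-sized; real deployments use standardized large groups.

P = 2**127 - 1  # toy prime modulus (illustrative only)
G = 3           # generator (illustrative only)

def keypair():
    """Return (private exponent, public value) for one party."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_secret(my_priv: int, their_pub: int) -> int:
    return pow(their_pub, my_priv, P)
```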
Cooperative Schedule Data Possession for Integrity Verification in Multi-Clou...IJMER
EXPLORING WOMEN SECURITY BY DEDUPLICATION OF DATAIRJET Journal
The document proposes a new deduplication system that supports authorized duplicate checks and separates the file structure from its content. It uses the AES algorithm to encrypt information for security, while the MD5 and SHA-1 algorithms are used to recognize duplicate records. The private cloud is equipped with secret-key authentication to provide greater security. By separating file structure and content and supporting authorized duplicate checks, the proposed framework provides efficient deduplication while ensuring data security in the cloud.
Guaranteed Availability of Cloud Data with Efficient CostIRJET Journal
This document discusses efficient and cost-effective methods for hosting data across multiple cloud storage providers (multi-cloud) to ensure high data availability and reduce costs. It proposes distributing data among different cloud providers using replication and erasure coding techniques. This approach guarantees data availability even if one cloud provider fails and minimizes monetary costs by taking advantage of varying cloud pricing models and data access patterns. The technique is shown to save around 20% of costs while providing high flexibility to handle data and pricing changes over time.
International Journal of Computational Engineering Research(IJCER)ijceronline
A novel cloud storage system with support of sensitive data applicationijmnct
Most users are willing to store their data in the cloud storage system and use the many facilities of the cloud, but their sensitive data applications face potentially serious security threats. In this paper, the security requirements of sensitive data applications in the cloud are analyzed, and an improved structure for the typical cloud storage system architecture is proposed. A hardware USB-Key is used in the proposed architecture to strengthen the security of user identity and of the interaction between users and the cloud storage system. Moreover, drawing on the idea of active data protection, a data security container is introduced into the system to secure the data transmission process by encapsulating the encrypted data and adding appropriate access control and data management functions. Static data blocks are replaced with a dynamic, executable data security container. An enhanced security architecture for the cloud storage terminal software is then proposed for better adaptation to users' specific requirements; its functions and components are customizable. The proposed architecture can also detect whether the execution environment conforms to pre-defined environment requirements.
Revocation based De-duplication Systems for Improving Reliability in Cloud St...IRJET Journal
1) The document discusses improving the reliability of deduplication systems in cloud storage by implementing user revocation along with Shamir's secret sharing scheme and ramp secret sharing scheme.
2) Deduplication systems aim to eliminate redundant data and achieve single instance storage, but reliability and security are ongoing issues when users are revoked.
3) The paper proposes using Shamir's secret sharing algorithm and ramp secret sharing scheme for encryption to maintain reliability when users are removed by allowing the data to be rechecked for duplication.
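Shamir's (k, n) scheme mentioned above can be sketched directly: any k of the n shares reconstruct the secret via Lagrange interpolation at x = 0, while fewer than k shares reveal nothing about it. Field size here is toy-scale for readability:

```python
import random

# Shamir (k, n) secret sharing over a prime field: the secret is the
# constant term of a random degree-(k-1) polynomial, and each share is a
# point on that polynomial. Any k points determine it uniquely.

P = 2**61 - 1  # prime field modulus (toy-scale)

def split(secret: int, n: int = 5, k: int = 3):
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

The ramp variant the paper pairs with this trades some of the secrecy guarantee for smaller shares; the interpolation machinery is the same.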
International Journal of Engineering and Science Invention (IJESI)inventionjournals
The document proposes a Session Based Ciphertext Policy Attribute Based Encryption (SB-CP-ABE) method for access control in cloud storage. SB-CP-ABE aims to enable efficient key refreshing and revocation in ciphertext policy attribute based encryption (CP-ABE) schemes. It introduces the concept of associating private keys with sessions, so that key updates and revocations only need to be done at session boundaries, avoiding the need for frequent re-encryption of ciphertexts. The method can be generically applied to existing CP-ABE schemes to improve their practicality for cloud storage environments.
Improving Data Storage Security in Cloud using Hadoop - IJERA Editor
The rising abuse of information stored in large cloud data centres emphasizes the need to safeguard the data. Despite strict authentication policies, cloud users' data, even when transferred over a secure channel, is vulnerable to numerous attacks once it reaches the data centres. The most widely adopted methodology for safeguarding cloud data is encryption, but encrypting large volumes of data deployed in the cloud is a time-consuming process. For secure transmission, AES encryption is used, providing a secure way to transfer sensitive information from the sender to the intended receiver; the purpose is to make the information unreadable to everyone except the receiver. Compressing the data also improves utilization of storage space in the cloud environment. The approach is augmented with Hadoop's MapReduce paradigm, which works in parallel. The experimental results reflect the effectiveness of the methodology in improving the security of data in the cloud environment.
Cloud computing is the revolution in the current generation of IT enterprise. It displaces databases and application software to large data centres, where the management of services and data may not be fully trustworthy, whereas conventional IT services are under proper logical, physical and personnel controls. This shift introduces security challenges that have not been well understood. This work concentrates on cloud data storage security, which has always been an important aspect of quality of service (QoS). In this paper, we designed and simulated an adaptable and efficient scheme to guarantee the correctness of user data stored in the cloud, with several prominent features. A homomorphic token is used for distributed verification of erasure-coded data, allowing misbehaving servers to be identified. Unlike prior works, our scheme supports effective and secure dynamic operations on data blocks, such as data insertion, deletion and modification. The security and performance analysis shows that the proposed scheme is highly resilient against malicious data modification, convoluted failures and server-colluding attacks.
S.A. Kalaiselvan - Toward secure and dependable storage services - kalaiselvanresearch
This document proposes a scheme for ensuring data integrity and dependability for cloud storage systems. The scheme utilizes erasure coding to distribute data across multiple servers and generate verification tokens to enable lightweight auditing. It allows users to efficiently audit that their data is intact and to identify any misbehaving servers. The scheme also supports secure dynamic operations like updates, deletes and appends while maintaining the same level of integrity assurance. Analysis shows the scheme is efficient and resilient to various security threats from malicious servers.
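The erasure-coding idea, spreading data across servers so a lost block can be rebuilt from the survivors, can be illustrated with the simplest possible code: a single XOR parity block. Real deployments use Reed-Solomon-style codes that tolerate multiple failures; this sketch tolerates exactly one:

```python
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data, k):
    """Split data into k equal blocks plus one XOR parity block."""
    if len(data) % k:
        data += b"\0" * (k - len(data) % k)   # pad to a multiple of k
    size = len(data) // k
    blocks = [data[i * size:(i + 1) * size] for i in range(k)]
    parity = blocks[0]
    for b in blocks[1:]:
        parity = xor_bytes(parity, b)
    return blocks + [parity]

def recover(blocks, lost_index):
    """Rebuild the block at lost_index by XOR-ing the surviving blocks."""
    survivors = [b for i, b in enumerate(blocks) if i != lost_index and b is not None]
    rebuilt = survivors[0]
    for b in survivors[1:]:
        rebuilt = xor_bytes(rebuilt, b)
    return rebuilt
```

Each of the k + 1 blocks would live on a different server; losing any one server leaves enough information to rebuild its block.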
IRJET- Mutual Key Oversight Procedure for Cloud Security and Distribution of ... - IRJET Journal
The document proposes a mutual key oversight procedure for cloud security and distribution of data based on a hierarchy method. It discusses using attribute-based encryption to encrypt data before outsourcing it to the cloud. The proposed scheme uses a hierarchical structure with a cloud authority, domain authorities, and users to provide security and scalability. It allows both private and public uploading and sharing of files within this hierarchy.
IRJET- A Survey on Remote Data Possession Verification Protocol in Cloud Storage - IRJET Journal
This document summarizes a survey on remote data possession verification protocols for cloud storage. It begins with an abstract describing the problem of verifying integrity of outsourced data files on remote cloud servers. It then provides background on remote data possession verification (RDPV) protocols and discusses related work on ensuring data integrity and supporting dynamic operations. The document describes the system framework, RDPV protocol, use of homomorphic hash functions, and an optimized implementation using an operation record table to efficiently support dynamic operations like modifications. It concludes that the presented efficient and secure RDPV protocol is suitable for cloud storage applications.
Enabling Integrity for the Compressed Files in Cloud Server - IOSR Journals
This document proposes a scheme for enabling data integrity for compressed files stored in cloud servers. The scheme encrypts some bits of data from each data block using an RSA algorithm and polynomial hashing to generate hash values. These hash values are stored at the client and used to verify integrity by checking responses from the cloud server against the stored hashes. The scheme aims to minimize computational and storage overhead for clients by compressing files, encrypting only some data bits, and requiring clients to store just two secret functions rather than the full data. This allows integrity checks with low bandwidth consumption suitable for thin clients like mobile devices.
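The pattern these integrity schemes share, precomputing digests client-side and later challenging the server, can be sketched with a plain SHA-256 challenge-response. This is simpler than the paper's RSA-and-polynomial-hashing construction and only conveys the protocol flow:

```python
import hashlib, os

def make_tokens(blocks, num_challenges):
    """Client side: precompute (index, nonce, digest) verification tokens."""
    tokens = []
    for _ in range(num_challenges):
        i = int.from_bytes(os.urandom(4), "big") % len(blocks)
        nonce = os.urandom(16)
        digest = hashlib.sha256(nonce + blocks[i]).hexdigest()
        tokens.append((i, nonce, digest))
    return tokens

def server_respond(blocks, index, nonce):
    """Server side: prove possession of block `index` for the given nonce."""
    return hashlib.sha256(nonce + blocks[index]).hexdigest()

def verify(token, response):
    return token[2] == response
```

The client stores only the small tokens, not the data, so each check costs one hash and a short message, which suits thin clients.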
Dynamic Resource Allocation and Data Security for Cloud - AM Publications
Cloud computing is the next generation of IT organization. It moves software and databases to large data centres, where the management of services and data may not be fully trusted. In this system, we focus on cloud data storage security, which is an important aspect of quality of service. To ensure the correctness of users' data in the cloud, we propose an effective scheme based on the Advanced Encryption Standard and the MD5 algorithm. Extensive security and performance analysis shows that the proposed scheme is highly efficient. In the proposed work we have also developed efficient parallel data processing in clouds and present our research project on parallel security, a data processing framework that explicitly exploits dynamic storage along with data security. We propose a strong, formal model for data security in the cloud and for corruption detection.
Controlling data deduplication in cloud storage - IAETSD
This document discusses controlling data deduplication in cloud storage. It proposes an architecture that provides duplicate check procedures with minimal overhead compared to normal cloud storage operations. The key aspects of the proposed system are:
1) It uses convergent encryption to encrypt data for privacy while still allowing for deduplication of duplicate files.
2) It introduces a private cloud that manages user privileges and generates tokens for authorized duplicate checking in a hybrid cloud architecture.
3) It evaluates the overhead of the proposed authorized duplicate checking scheme and finds it incurs negligible overhead compared to normal cloud storage operations.
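Convergent encryption, the building block in aspect 1 above, derives the key from the content itself, so identical files encrypt to identical ciphertexts and can be deduplicated without revealing the plaintext. A toy version using a SHA-256 counter-mode keystream (real systems use AES keyed with the content hash):

```python
import hashlib

def keystream(key, length):
    """Deterministic keystream from SHA-256 in counter mode (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def convergent_encrypt(plaintext):
    """Key = hash of content, so equal files always yield equal ciphertexts."""
    key = hashlib.sha256(plaintext).digest()
    cipher = bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))
    return key, cipher

def convergent_decrypt(key, cipher):
    return bytes(c ^ k for c, k in zip(cipher, keystream(key, len(cipher))))
```

Because two users uploading the same file produce byte-identical ciphertexts, the cloud can keep one copy; only holders of the content-derived key can decrypt it.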
IJERA (International Journal of Engineering Research and Applications) is an international online, ... peer reviewed journal. For more details or to submit your article, please visit www.ijera.com
The document proposes a scheme for secure access to outsourced databases in cloud computing environments. It uses a modified row-based encryption technique where each database row is encrypted with a separate key. The scheme also uses client-side memory and selective encryption to increase performance. A modified Diffie-Hellman key exchange algorithm is used to establish secure communication between the cloud service provider and users, inhibiting malicious outsiders. The scheme aims to overcome security challenges like information leakage, key management, revocation handling and user authentication that exist in current solutions for securing outsourced databases.
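The Diffie-Hellman exchange at the heart of such schemes fits in a few lines. This sketch uses a small 64-bit prime for readability, where real deployments use RFC 3526 MODP groups or elliptic curves, and it shows the unmodified textbook exchange rather than the paper's variant:

```python
import secrets

# Toy parameters for illustration only: 2^64 - 59 is prime, but far too small
# for real security; G = 5 is an arbitrary generator choice.
P = 0xFFFFFFFFFFFFFFC5
G = 5

def dh_keypair():
    """Pick a private exponent and publish G^priv mod P."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def dh_shared(priv, other_pub):
    """Both sides compute the same value: G^(a*b) mod P."""
    return pow(other_pub, priv, P)
```

The cloud provider and the user each run `dh_keypair()`, exchange public values, and derive the same shared secret without ever transmitting it.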
Cooperative Schedule Data Possession for Integrity Verification in Multi-Clou... - IJMER
International Journal of Modern Engineering Research (IJMER) is Peer reviewed, online Journal. It serves as an international archival forum of scholarly research related to engineering and science education.
EXPLORING WOMEN SECURITY BY DEDUPLICATION OF DATA - IRJET Journal
The document proposes a new deduplication system that supports authorized duplicate check and separates file structure and content. It uses the AES algorithm to encrypt data for security, while MD5 and SHA-1 hashes are used to detect duplicate files. The private cloud is equipped with secret-key authentication to provide greater security. By separating file structure and content and supporting authorized duplicate check, the proposed system provides efficient deduplication while ensuring data security in the cloud.
Guaranteed Availability of Cloud Data with Efficient Cost - IRJET Journal
This document discusses efficient and cost-effective methods for hosting data across multiple cloud storage providers (multi-cloud) to ensure high data availability and reduce costs. It proposes distributing data among different cloud providers using replication and erasure coding techniques. This approach guarantees data availability even if one cloud provider fails and minimizes monetary costs by taking advantage of varying cloud pricing models and data access patterns. The technique is shown to save around 20% of costs while providing high flexibility to handle data and pricing changes over time.
International Journal of Computational Engineering Research (IJCER) - ijceronline
International Journal of Computational Engineering Research (IJCER) is an international online journal published monthly in English. It publishes original research work that contributes significantly to furthering scientific knowledge in engineering and technology.
3. ELYSIUMPRO TITLES WITH ABSTRACTS 2023-2024
EPRO_CLD_001 Enabling Balanced Data Deduplication in Mobile Edge Computing
In the mobile edge computing (MEC) environment, edge servers with storage and computing resources are
deployed at base stations within users’ geographic proximity to extend the capabilities of cloud computing to
the network edge. An edge storage system (ESS) is composed of connected edge servers in a specific area and ensures low-latency services for users. However, the high data storage overhead incurred by edge servers’ limited storage capacities is a key challenge in ensuring the performance of applications deployed on an ESS. Data
deduplication, as a classic data reduction technology, has been widely applied in cloud storage systems. It also
offers a promising solution to reducing data redundancy in ESSs. However, the unique characteristics of MEC,
such as edge servers’ geographic distribution and coverage, render cloud data deduplication mechanisms
obsolete.
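The baseline mechanism being adapted here, content-addressed deduplication, can be sketched as a fingerprint index that stores each unique chunk once. Edge-aware placement across geographically distributed servers, which the abstract argues is the hard part in MEC, is deliberately out of scope:

```python
import hashlib

class DedupStore:
    """Content-addressed chunk store: a chunk is kept once, referenced many times."""

    def __init__(self):
        self.chunks = {}     # fingerprint -> chunk bytes
        self.refcount = {}   # fingerprint -> number of references

    def put(self, chunk):
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in self.chunks:
            self.chunks[fp] = chunk          # new unique chunk: store it
        self.refcount[fp] = self.refcount.get(fp, 0) + 1
        return fp

    def stored_bytes(self):
        """Physical bytes kept, regardless of how many logical copies exist."""
        return sum(len(c) for c in self.chunks.values())
```

Uploading the same chunk twice returns the same fingerprint and costs no extra storage; only the reference count grows.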
EPRO_CLD_002 Key Reduction in Multi-Key and Threshold Multi-Key Homomorphic Encryptions
by Reusing Error
As cloud computing and AI as a Service are increasingly provided, it is necessary to deal with privacy-sensitive data. There are two outsourcing scenarios for such data: i) many clients participate dynamically; ii) many clients are pre-determined. The solutions for protecting sensitive data in both cases are
the multi-key homomorphic encryption (MKHE) scheme and the threshold multi-key homomorphic encryption
(TMKHE) scheme. However, these schemes may be difficult to run for clients with limited resources. In addition, the large size of the evaluation keys, in particular the multiplication and rotation keys, increases the communication between the clients and the server providing the outsourcing service. The evaluation keys that the server must hold are also enormous, in particular the multiplication and rotation keys, which are essential for the bootstrapping operation.
EPRO_CLD_003 Data Lake Architecture for Storing and Transforming Web Server Access Log Files
Web server access log files are text files containing important data about server activities, client requests
addressed to a server, server responses, etc. Large-scale analysis of these data can contribute to various
improvements in different areas of interest. The main problem lies in storing these files in their raw form over a long time, so that analysis processes can be run at any time and information extracted as a foundation for high-quality decisions. Our research focuses on offering an economical, secure, and high-performance solution for storing large amounts of raw log files. The proposed system implements a Data Lake (DL)
architecture in cloud using Azure Data Lake Storage Gen2 (ADLS Gen2) for extract–load–transform (ELT)
pipelines. This architecture allows large volumes of data to be stored in their raw form. Afterwards they can be
subjected to transformation and advanced analysis processes without the need of a structured writing scheme.
The main contribution of this paper is to provide a solution that is affordable and more accessible to perform
web server access log data ingestion, storage and transformation over the newest technology, Data Lake.
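The transform step of such an ELT pipeline amounts to parsing raw access-log lines into structured records. A minimal sketch for the Common Log Format follows; ADLS Gen2 ingestion and storage are outside the scope of this snippet:

```python
import re

# Common Log Format, e.g.:
# 127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def transform(raw_line):
    """Turn one raw access-log line into a structured record (the T of ELT)."""
    m = LOG_PATTERN.match(raw_line)
    if not m:
        return None          # keep unparsable lines raw for later inspection
    rec = m.groupdict()
    rec["status"] = int(rec["status"])
    rec["size"] = 0 if rec["size"] == "-" else int(rec["size"])
    return rec
```

Because the lake stores the raw lines untouched, a parser like this can be rerun later with a richer schema without re-ingesting anything.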
EPRO_CLD_004 A Robust Selective Encryption Scheme for H.265/HEVC Video
To protect the information of video stream, many selective video encryption schemes have been proposed based
on the H.265/HEVC video. However, most of the existing algorithms are not robust, thus failing to decrypt
under packet loss. To further improve the robustness capability of video protection, a robust selective encryption
scheme is proposed in this paper. In H.265/HEVC standard, video is encoded into multiple slices, and the slices
are decoded independently. Inspired by the feature, each slice is individually encrypted using RC4 stream
cipher. The pseudorandom binary sequence (PRBS) for one slice is related to encoding parameters and the
SHA-256 hash value of the corresponding slice header, thus ensuring the real-time update of the PRBS and
increasing the resistance to chosen-plaintext attacks. A two-round shifting algorithm is designed to scramble non-zero coefficients of the transform units (TUs); then motion vector difference (MVD) parameters, quantized transform coefficients (QTCs) and intra prediction modes (IPMs) are selected for encryption.
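The per-slice keying idea can be sketched as follows: each slice's keystream is seeded from a hash of its own header, so every slice decrypts independently even under packet loss. This toy derives the key from the header hash alone, whereas the paper's PRBS also mixes in encoding parameters, and RC4 itself is considered weak for new designs:

```python
import hashlib

def rc4_keystream(key, length):
    """Standard RC4 key scheduling plus keystream generation."""
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0
    for _ in range(length):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def encrypt_slice(slice_header, payload):
    """Per-slice key from SHA-256 of the slice header; XOR makes this its own inverse."""
    key = hashlib.sha256(slice_header).digest()
    ks = rc4_keystream(key, len(payload))
    return bytes(p ^ k for p, k in zip(payload, ks))
```

Losing one slice's packets leaves all other slices decryptable, because no keystream state is shared across slices.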
EPRO_CLD_005 VeriDedup: A Verifiable Cloud Data Deduplication Scheme With Integrity and
Duplication Proof
Data deduplication is a technique to eliminate duplicate data in order to save storage space and enlarge upload
bandwidth, which has been applied by cloud storage systems. However, a cloud storage provider (CSP) may tamper with user data or cheat users into paying for unused storage for duplicate data that is only stored once. Although
previous solutions adopt message-locked encryption along with Proof of Retrievability (PoR) to check the
integrity of deduplicated encrypted data, they ignore proving the correctness of duplication check during data
upload and require the same file to be derived into same verification tags, which suffers from brute-force attacks
and restricts users from flexibly creating their own individual verification tags. In this paper, we propose a
verifiable deduplication scheme called VeriDedup to address the above problems. It can guarantee the
correctness of duplication check and support flexible tag generation for integrity check over encrypted data
deduplication in an integrative way.
EPRO_CLD_006 Task Scheduling Mechanisms for Fog Computing: A Systematic Survey
In the Internet of Things (IoT) ecosystem, some processing is done near data production sites at higher speeds
without the need for high bandwidth by combining Fog Computing (FC) and cloud computing. Fog computing
offers advantages for real-time systems that require high speed internet connectivity. Due to the limited
resources of fog nodes, one of the most important challenges of FC is to meet dynamic needs in real-time.
Therefore, one of the issues in the fog environment is the optimal assignment of tasks to fog nodes. An efficient
scheduling algorithm should reduce various qualitative parameters such as cost and energy consumption, taking
into account the heterogeneity of fog nodes and the commitment to perform tasks within their deadlines. This
study provides a detailed taxonomy to gain a better understanding of the research issues and distinguishes
important challenges in existing work.
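A minimal baseline from this design space, deadline-ordered greedy assignment to heterogeneous fog nodes, can be sketched as follows; the surveyed schemes layer energy and cost objectives on top of this kind of skeleton:

```python
def schedule(tasks, nodes):
    """Greedy scheduler: tasks sorted by deadline, each placed on the node that
    finishes it soonest; a task no node can finish in time is rejected.
    tasks: [(name, work_units, deadline)], nodes: {name: speed_units_per_sec}."""
    finish = {n: 0.0 for n in nodes}          # node -> time it becomes free
    placement, rejected = {}, []
    for name, work, deadline in sorted(tasks, key=lambda t: t[2]):
        best, best_done = None, None
        for node, speed in nodes.items():
            done = finish[node] + work / speed
            if best_done is None or done < best_done:
                best, best_done = node, done
        if best_done <= deadline:
            finish[best] = best_done
            placement[name] = best
        else:
            rejected.append(name)             # would miss its deadline everywhere
    return placement, rejected
```

The heterogeneity shows up as per-node speeds; an infeasible task is rejected early instead of delaying tasks with tighter deadlines.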
EPRO_CLD_007 Data Secure De-Duplication and Recovery Based on Public Key Encryption With
Keyword Search
In the current era of information explosion, users’ demand for data storage is increasing, and data on the cloud
has become the first choice of users and enterprises. Cloud storage facilitates users to backup and share data,
effectively reducing users’ storage expenses. However, duplicate data from different users is stored multiple times, leading to a sharp decrease in the storage utilization of cloud servers. Duplicates can be removed directly when data is stored in plaintext form, but cloud servers are semi-trusted and usually need to store data after encryption to
protect user privacy. In this paper, we focus on how to achieve secure de-duplication and recover data in
ciphertext for different users, and determine whether the indexes of public key searchable encryption and the
matching relationship of trapdoor are equal in ciphertext to achieve secure de-duplication. For the duplicate file,
the data user’s re-encryption key for the file is appended to the ciphertext chain table of the stored copy.
EPRO_CLD_008 Enhancing Security and Privacy Preservation of Sensitive Information in e-Health
Datasets Using FCA Approach
Advances in data collection, storage, and processing in e-Health systems have recently increased the importance
and popularity of data mining in the health care field. However, the high sensitivity of the handled and shared
data, brings a high risk of information disclosure and exposure. It is therefore important to hide sensitive
relationships by modifying the shared data. This major information security threat has, therefore, mandated the
requirement of hiding/securing sensitive relationships of shared data. As a large number of data mining activities
that attempt to identify interesting patterns from databases depend on locating frequent item sets, further
investigation of frequent item sets requires privacy-preserving techniques. To solve many difficult combinatorial problems, such as the data distribution problem, both exact and heuristic algorithms have been used. Exact algorithms are considered optimal for such problems, but they suffer from a scalability bottleneck, as they are limited to medium-sized instances only. Heuristic algorithms, on the other hand, are scalable; however, they perform poorly on security and privacy preservation.
EPRO_CLD_009 Real-Time Detection Schemes for Memory DoS (M-DoS) Attacks on Cloud
Computing Applications
Memory Denial of Service (M-DoS) attacks refer to a class of cyber-attacks that aim to exhaust the memory
resources of a system, rendering it unavailable to legitimate users. This type of attack is particularly dangerous
in cloud computing environments, where multiple users share the same resources. Detection and mitigation of
M-DoS attacks in real-time is a challenging task, as they often involve a large number of low-rate requests,
making it difficult to distinguish them from legitimate traffic. Several real-time detection schemes have been
proposed to identify and mitigate M-DoS attacks in cloud computing environments. These schemes can be
broadly classified into two categories: signature-based and anomaly-based detection. Signature-based detection
methods rely on the identification of specific patterns or characteristics of known M-DoS attack techniques,
while anomaly-based detection methods identify abnormal behaviour that deviates from the normal pattern of usage.
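A minimal anomaly-based detector of the kind described flags clients whose current request rate deviates sharply from their own history; production schemes combine many more signals, since M-DoS traffic is designed to look low-rate:

```python
from statistics import mean, stdev

def detect_anomalies(history, current, k=3.0):
    """Flag clients whose current request rate exceeds mean + k * stddev of
    their own past rates. history: {client: [past rates]}, current: {client: rate}."""
    flagged = []
    for client, rates in history.items():
        mu, sigma = mean(rates), stdev(rates)
        if current.get(client, 0) > mu + k * sigma:
            flagged.append(client)
    return flagged
```

A per-client baseline avoids penalizing naturally busy tenants, which a single global threshold would misclassify.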
EPRO_CLD_010 A Review on Protection and Cancelable Techniques in Biometric Systems
Mechanisms that provide access to services and applications constitute an essential part of cloud computing, IoT, and the broad field of digital systems in general. Biometric techniques aim to
manage the access to such systems based on personal data; however, some biometric traits are openly exposed
in the daily life, and in consequence, they are not secret, e.g., voice or face in social networks. In many cases,
biometric data are non-cancelable and non-renewable when compromised. This document examines the
vulnerabilities and proposes hardware and software countermeasures for the protection and confidentiality of
biometric information using randomly created supplementary information. Consequently, a taxonomy is
proposed according to the operating principle and the type of supplementary information supported by
protection techniques, analyzing the security, privacy, revocability, renewability, computational complexity,
and distribution of biometric information.
EPRO_CLD_011 A Novel Hybrid Multikey Cryptography Technique for Video Communication
Online video streaming is becoming more widespread in people’s everyday entertainment routines. Protecting
copyright and piracy has become a key concern in real-time video streaming systems. This research provides a
revolutionary multi-key and hybrid cryptography approach to offer security. This work describes the software
implementation of video encryption and decryption employing continuous systems based on the Elliptic Curve
Cryptography approach as pseudorandom encryption key generators. This approach creates several keys to
encrypt and decrypt small chunks of video files that are produced dynamically based on the video data. The
suggested approach was implemented on the Android platform, where applications for sender and recipients
had been created to enable streaming. The security and performance of the proposed system have been examined
by implementing it on devices and streaming videos. The outcomes demonstrate superiority in terms of
performance and security.
EPRO_CLD_012 CloudpredNet: An Ultra-Short-Term Movement Prediction Model for Ground-
Based Cloud Image
Ground-based cloud images can provide information on weather and cloud conditions, which are important for
cloud monitoring and PV power generation forecasting. Prediction of short-time cloud movement from images
is a major means of intra-hourly irradiation forecasting for solar power generation and is also important for
precipitation forecasting. However, there is a lack of advanced and complete methods for cloud movement
prediction from ground-based cloud images, and traditional techniques based on image processing and motion
vector calculations have difficulty in predicting cloud morphological changes, which makes accurate prediction
of cloud motion (especially nonlinear motion) very challenging. Therefore, this paper proposes CloudpredNet,
a ground-based cloud ultra-short-term movement prediction model based on an “encoder-generator”
architecture.
EPRO_CLD_013 A Rankable Boolean Searchable Encryption Scheme Supporting Dynamic Updates
in a Cloud Environment
At present, three problems exist in searchable encryption for cloud storage services. Firstly, most traditional searchable encryption schemes support only single-keyword search and fail to perform Boolean searches; even in the few schemes that support Boolean search, storage efficiency is unsatisfactory. Secondly, most existing schemes do not support dynamic keyword updates, so update efficiency is low. Thirdly, most existing schemes cannot meet all user demands, such as performing ranked searching over files according to the importance of keywords. To solve these problems, a rankable Boolean searchable encryption scheme
supporting dynamic updates in a cloud environment (RBDC) is proposed. By using Paillier and GM encryption
algorithms, secure indices supporting dynamic updating are established. Based on applicable knowledge
gleaned from cryptography and set theory, the indices of keyword intersections and the intersection search
trapdoors are constructed to achieve multi-keyword Boolean search
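The mechanics of trapdoor-based Boolean search can be sketched with an encrypted inverted index. This illustration substitutes HMAC tokens for the paper's Paillier and GM constructions, so it conveys the query flow, not the scheme's security or ranking properties:

```python
import hmac, hashlib

def trapdoor(key, keyword):
    """Deterministic keyword token; the server never sees the plaintext keyword."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).hexdigest()

def build_index(key, docs):
    """docs: {doc_id: [keywords]} -> {token: set(doc_ids)}, stored server-side."""
    index = {}
    for doc_id, words in docs.items():
        for w in words:
            index.setdefault(trapdoor(key, w), set()).add(doc_id)
    return index

def boolean_and_search(index, trapdoors):
    """Conjunctive (AND) query: intersect the posting sets of all trapdoors."""
    results = None
    for t in trapdoors:
        postings = index.get(t, set())
        results = postings if results is None else results & postings
    return results or set()
```

The server answers queries by set intersection over opaque tokens; only the key holder can turn keywords into trapdoors.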
EPRO_CLD_014 Efficacy of Bluetooth-Based Data Collection for Road Traffic Analysis and
Visualization Using Big Data Analytics
Ground-based cloud images provide information on weather and cloud conditions, which plays an important
role in cloud cover monitoring and photovoltaic power generation forecasting. However, cloud motion
prediction from ground-based cloud images still lacks advanced and complete methods, and traditional
techniques based on image processing and motion vector calculation struggle to predict cloud
morphological changes. In this paper, we propose a cloud motion prediction method based on Cascade Causal
Long Short-Term Memory (CCLSTM) and a Super-Resolution Network (SR-Net). First, CCLSTM is used to
estimate the shape and speed of cloud motion. Second, the Super-Resolution Network, built with
perceptual losses, reconstructs the CCLSTM output to make it sharper. We tested our method on
Atmospheric Radiation Measurement (ARM) Climate Research Facility total sky imager (TSI) images. The
experiments showed that the method can predict sky cloud changes over the next few time steps.
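For intuition, the classic motion-vector baseline that such deep models improve upon can be sketched in a few lines: estimate a single integer displacement between two frames, then predict the next frame by applying the same shift again. The frames and blob below are invented toy data; CCLSTM and SR-Net learn far richer, nonlinear motion and appearance changes.

```python
# Minimal motion-extrapolation baseline: find the shift that best maps
# the previous frame onto the current one, then extrapolate it forward.

def shift(frame, dy, dx, fill=0):
    """Shift a 2D grid by (dy, dx), filling vacated cells."""
    h, w = len(frame), len(frame[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                out[ny][nx] = frame[y][x]
    return out

def estimate_shift(prev, curr, radius=2):
    """Brute-force search for the shift minimising absolute difference."""
    best, best_err = (0, 0), float("inf")
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            cand = shift(prev, dy, dx)
            err = sum(abs(a - b) for ra, rb in zip(cand, curr)
                      for a, b in zip(ra, rb))
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def predict_next(prev, curr):
    """Assume constant motion: re-apply the last estimated shift."""
    dy, dx = estimate_shift(prev, curr)
    return shift(curr, dy, dx)

# A toy "cloud" blob drifting one pixel right per frame:
f0 = [[0, 0, 0, 0], [0, 9, 0, 0], [0, 0, 0, 0]]
f1 = shift(f0, 0, 1)
f2 = predict_next(f0, f1)
print(f2[1])  # [0, 0, 0, 9]
```

This baseline can only translate pixels; it cannot represent cloud growth, dissipation, or deformation, which is exactly the gap the learned predictor targets.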
EPRO_CLD_015 RFSE-GRU: Data Balanced Classification Model for Mobile Encrypted Traffic in Big
Data Environment
With the widespread use of mobile technologies and the Internet, traffic in mobile networks is increasing. This
situation has made the classification of traffic an important element for data security and network management.
However, encryption of traffic in modern networks makes it difficult to classify traffic with traditional methods.
In this study, a novel deep learning-based model is proposed for classifying encrypted mobile traffic data.
The proposed model, RFSE-GRU, combines the Gated Recurrent Unit (GRU) algorithm with feature selection and
data balancing. The features most meaningful for classification are selected with the Random Forest
algorithm. In addition, the Synthetic Minority Oversampling Technique (SMOTE) and the Edited Nearest
Neighbor (ENN) undersampling algorithm are used together to reduce the negative impact of data imbalance
on classification performance.
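The SMOTE-then-ENN balancing idea can be illustrated on 1-D toy data: SMOTE synthesises minority points by interpolating between minority neighbours, and ENN then prunes points whose nearest neighbours mostly disagree with their label. This is a simplified sketch on invented data; in practice the standard multi-dimensional SMOTE and ENN implementations (e.g. from imbalanced-learn) would be used on real traffic features.

```python
# Toy 1-D sketch of SMOTE oversampling followed by ENN cleaning.
import random

def smote_1d(minority, n_new, rng):
    """Create synthetic minority points by interpolating neighbours."""
    synth = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)
        synth.append(a + rng.random() * (b - a))
    return synth

def enn_1d(points, labels, k=3):
    """Drop points whose k nearest neighbours mostly disagree in label."""
    keep_pts, keep_lbl = [], []
    for i, (p, lab) in enumerate(zip(points, labels)):
        others = sorted((abs(p - q), labels[j])
                        for j, q in enumerate(points) if j != i)
        votes = [l for _, l in others[:k]]
        if votes.count(lab) >= len(votes) / 2:
            keep_pts.append(p)
            keep_lbl.append(lab)
    return keep_pts, keep_lbl

rng = random.Random(0)
majority = [rng.gauss(0.0, 1.0) for _ in range(20)]   # class 0
minority = [rng.gauss(5.0, 0.5) for _ in range(4)]    # class 1 (rare)
minority += smote_1d(minority, 16, rng)               # oversample to 20
points = majority + minority
labels = [0] * len(majority) + [1] * len(minority)
points, labels = enn_1d(points, labels)               # prune noisy borders
print(labels.count(0), labels.count(1))
```

After balancing, both classes contribute comparably to training, which is what mitigates the bias a GRU classifier would otherwise learn from the skewed traffic distribution.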
EPRO_CLD_016 Secure Scheme for Locating Disease-Causing Genes Based on Multi-Key
Homomorphic Encryption
Genes are of great significance for the prevention and treatment of some diseases. A key challenge is how to
locate pathogenic genes by analyzing genetic data obtained from different medical institutions while
protecting the privacy of patients' genetic data. In this paper, we present a secure scheme for
locating disease-causing genes based on Multi-Key Homomorphic Encryption (MKHE), which reduces the risk
of leaking genetic data. First, we combine MKHE with a frequency-based pathogenic gene location function.
The medical institutions use MKHE to encrypt their genetic data. The cloud then homomorphically evaluates
specific gene-locating circuits on the encrypted genetic data. Second, whereas most location circuits are
designed only for locating monogenic diseases, we propose two location circuits (TH-intersection and Top-q)
that can locate the disease-causing genes of polygenic diseases. Third, we construct a directed decryption
protocol in which the users involved in the homomorphic evaluation can appoint a target user who can obtain
the final decryption result.
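The frequency-based aggregation underlying the two location circuits can be sketched on plaintext counts: Top-q returns the q variants with the highest combined frequency across institutions, while a threshold-intersection keeps only the variants that exceed a threshold at every institution. The variant names and counts below are invented, and in the actual scheme this aggregation is evaluated homomorphically over MKHE ciphertexts so no institution reveals its raw counts.

```python
# Plaintext sketch of Top-q and threshold-intersection gene location.
from collections import Counter

def top_q(per_site_counts, q):
    """Top-q: the q variants with the highest combined frequency."""
    total = Counter()
    for counts in per_site_counts:
        total.update(counts)
    return [variant for variant, _ in total.most_common(q)]

def th_intersection(per_site_counts, threshold):
    """TH-intersection: variants above the threshold at every site."""
    sets = [{v for v, c in counts.items() if c >= threshold}
            for counts in per_site_counts]
    return set.intersection(*sets)

site_a = Counter({"rs123": 9, "rs456": 2, "rs789": 5})
site_b = Counter({"rs123": 7, "rs456": 6, "rs789": 1})
print(top_q([site_a, site_b], 2))            # ['rs123', 'rs456']
print(th_intersection([site_a, site_b], 5))  # {'rs123'}
```

The threshold variant is stricter: a variant frequent overall but absent at one site (like rs456 in the example, which passes Top-q but not TH-intersection at threshold 5 only because it clears 5 at one site, not both) can be filtered out.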
EPRO_CLD_017 A MapReduce Based Approach for Secure Batch Satellite Image Encryption
The overarching goal of this research was to examine the state of satellite imagery security, which is
deteriorating as demand rises. The investigation focuses on the most common approaches to safeguarding
satellite images during transmission across networks that are not protected by standard encryption. Since
satellite imagery can be encrypted both in transit and at rest on a computer's hard drive, we put the
proposed Image Encryption System to the test by applying it to a collection of satellite photos. The study
methodology centres on encrypting data concurrently with running MapReduce jobs. This is carried out in the
Hadoop ecosystem, where an innovative method of analysing random numbers for use in image encryption is put
to the test. The encryption was processed using MapReduce in the Hadoop ecosystem, and images were saved as
BMP files with added security metadata. The experiments were evaluated against four indicators. It was found
that the processing time for batch encryption grew in proportion to the volume of computation, and running
cluster, map, and reduce processes over encrypted images exposed load-balancing difficulties and
inefficiencies.
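The map/reduce structure of batch encryption can be sketched without a Hadoop cluster: a "map" step encrypts each image independently, and a "reduce" step aggregates statistics over the batch. The SHA-256 counter keystream, key, and file names below are invented for illustration; the paper tests its own random-number generation method inside a real Hadoop deployment.

```python
# Map/reduce-style sketch of batch image encryption with a per-image
# XOR keystream (illustrative construction, not the paper's method).
import hashlib
from functools import reduce

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Derive n keystream bytes by hashing key || nonce || counter."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce
                              + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def map_encrypt(item, key=b"demo-key"):
    """Map step: XOR each image with its own keystream."""
    name, data = item
    ks = keystream(key, name.encode(), len(data))
    return name, bytes(a ^ b for a, b in zip(data, ks))

def reduce_stats(acc, item):
    """Reduce step: aggregate simple batch statistics."""
    name, ct = item
    acc["images"] += 1
    acc["bytes"] += len(ct)
    return acc

batch = [("img1.bmp", b"\x00\x01\x02" * 4), ("img2.bmp", b"\xff" * 8)]
encrypted = list(map(map_encrypt, batch))
stats = reduce(reduce_stats, encrypted, {"images": 0, "bytes": 0})
print(stats)  # {'images': 2, 'bytes': 20}
```

Because XOR is its own inverse, applying `map_encrypt` to a ciphertext with the same key and name recovers the plaintext, and because every image is encrypted independently, the map step parallelises naturally, which is what makes the workload a good MapReduce fit (and what surfaces load imbalance when image sizes differ).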
EPRO_CLD_018 Multimedia Security Using Encryption: A Survey
Considering the current dependency on digital technology in modern society, the protection of multimedia is
highly important. Encryption is vital in modern digital communication, ensuring data confidentiality,
authentication, integrity, and non-repudiation. Multimedia encryption-based security techniques are becoming
increasingly important as they allow for the secure sharing of multimedia content on digital platforms. This
survey aims to review the state of secure and privacy-preserving encryption schemes applicable to digital
multimedia, such as digital images, digital video, and digital audio. An extensive analysis of the existing
cryptography schemes and multimedia encryption algorithms will be conducted to give an extensive overview
of the current state of security encryption schemes specifically designed for digital multimedia technology. The
survey results will be used to better understand the effectiveness and reliability of secure multimedia
encryption schemes and to contribute to developing efficient and secure encryption schemes in the future.
EPRO_CLD_019 SDTP: Accelerating Wide-Area Data Analytics With Simultaneous Data Transfer
and Processing
For the efficient analysis of geo-distributed datasets, cloud providers implement data-parallel jobs across geo-
distributed sites (e.g., datacenters and edge clusters), which are generally interconnected by wide-area network
links. However, current state-of-the-art geo-distributed data analytic methods fail to make full use of the
available network and computing resources. The main reason is that such geo-distributed methods must wait
for bottleneck sites to complete the corresponding transmission and computation in each phase. Furthermore,
such geo-distributed methods may be impractical under dynamic network bandwidth and diverse degrees of job
parallelism. To this end, we propose a Simultaneous Data Transfer and Processing (SDTP) mechanism to
accelerate wide-area data analytics, with the joint consideration of network bandwidth dynamics and job
parallelism. In the SDTP, a site can execute the computation, provided that it obtains the required input data.
EPRO_CLD_020 Deadline-Constrained Cost Minimisation for Cloud Computing Environments
The interest in performing scientific computations using commercially available cloud computing resources has
grown rapidly in the last decade. However, scheduling multiple workflows in cloud computing is challenging
due to non-functional constraints and multi-dimensional resource requirements. Scheduling algorithms
proposed in the literature use search-based approaches, which often incur very high computational overhead
and long execution times. In this paper, a Deadline-Constrained Cost Minimisation (DCCM) algorithm is
proposed for resource scheduling in cloud computing. In the proposed scheme, tasks are grouped based on
their deadline constraints and data dependencies. Compared to other approaches, DCCM focuses on meeting
the user-defined deadline by sub-dividing tasks into different levels based on their priorities. Simulation
results showed that DCCM achieved higher success rates than state-of-the-art approaches.
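The core trade-off, meeting each task's deadline at minimum cost, can be sketched with a greedy rule: for every task, pick the cheapest VM type that still finishes in time. The VM catalogue and tasks below are invented, and this toy omits the data dependencies and priority levels that DCCM also handles.

```python
# Toy greedy sketch of deadline-constrained cost minimisation.

# (name, speedup factor, cost per hour) -- hypothetical VM catalogue
VMS = [("small", 1.0, 1.0), ("medium", 2.0, 2.5), ("large", 4.0, 6.0)]

def schedule(tasks):
    """tasks: list of (base_hours, deadline_hours) pairs.
    Returns the chosen VM per task and the total cost."""
    plan, total = [], 0.0
    for base, deadline in tasks:
        # Keep only VM types whose runtime meets the deadline.
        feasible = [(base / speed * cost, name)
                    for name, speed, cost in VMS
                    if base / speed <= deadline]
        if not feasible:
            raise ValueError("no VM type meets the deadline")
        cost, name = min(feasible)     # cheapest feasible choice
        plan.append(name)
        total += cost
    return plan, total

plan, total = schedule([(4, 5), (4, 1.5), (4, 1)])
print(plan, total)  # ['small', 'large', 'large'] 16.0
```

A loose deadline lets the scheduler fall back to cheap, slow instances, while tight deadlines force expensive, fast ones; search-based schedulers explore this trade-off globally, which is where their computational overhead comes from.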
ElysiumPro | IEEE Final Year Projects | Best Internship Training | Inplant Training in Madurai
Call Us: +91 9944 79 3398
Facebook @ http://surl.li/ktzsz
Chat Now @ https://wa.link/rq387s
Visit Our Channel: @ http://surl.li/ktzsc
Mail Us: @ info@elysiumpro.in
Visit Us: @ http://surl.li/ktzuu