Cloud computing poses many security risks for its users, yet organizations continue to switch to the cloud, which offers data protection and large amounts of remotely (virtually) provisioned storage. Organizations have not adopted cloud computing completely because of outstanding security issues, and cloud research increasingly focuses on privacy and security across the newly categorized attack surface. Beyond managing the availability of cloud services, user authentication is an additional overhead for companies. This paper proposes a model that provides a central authentication technique, so that users can be given secure access to resources instead of relying on assorted, uncoordinated user authentication techniques. The model is also implemented as a prototype.
Cloud Computing Security: A broad set of policies, technologies, and controls deployed to protect data, applications, and the associated infrastructure of cloud computing.
Preventing Distributed Denial of Service Attacks in Cloud Environments (IJITCA Journal)
Distributed Denial of Service (DDoS) is a key threat to network security. A network is a group of nodes that interact with each other to exchange information, and information that is confidential to a node must be kept so. An attacker in the system may capture this private information and distort it, so security is a major issue. Among the several security attacks on networks, one of the major threats to Internet services is the DDoS attack: a malicious attempt to suspend services to a destination node. A DoS or DDoS attack is an attempt to make a network resource or machine unavailable to its intended users. Numerous approaches have been developed to prevent DoS and DDoS attacks, which can arise in two different ways: they may happen accidentally, or they may be caused by attackers. Various defense schemes have been developed against this attack. The main focus of this paper is to present the basis of DDoS attacks, DDoS attack types, DDoS attack components, and intrusion prevention systems for DDoS.
Review of Detection DDOS Attack Detection Using Naive Bayes Classifier for Ne... (journalBEEI)
Distributed Denial of Service (DDoS) is a type of attack whose volume, intensity, and mitigation costs continue to increase in this era. Attackers use many zombie computers to exhaust the resources available to a network, application, or service, so that authorized users cannot gain access or the network service goes down; this is a great loss for Internet users on networks affected by DDoS attacks. In network forensics, a crime that occurs against networked services can be pursued in court, and the attackers punished in accordance with the law. This research aims to develop a new approach to detecting DDoS attacks in which network traffic activity is statistically analyzed using the Naive Bayes method. Training and testing data were taken from network traffic at a core router in the Master of Information Technology Research Laboratory, University of Ahmad Dahlan, Yogyakarta. The new approach to detecting DDoS attacks is expected to work together with an Intrusion Detection System (IDS) to predict the presence of DDoS attacks.
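As a hedged illustration of the idea (not the paper's actual feature set or data), a Gaussian Naive Bayes classifier over per-window traffic features might look like this; the features, sample values, and equal-priors assumption are all invented for the sketch:

```python
import math

# Hypothetical training data: (packets_per_sec, unique_src_ratio) per window.
# Labels and feature choices are illustrative, not taken from the paper.
normal = [(120, 0.9), (150, 0.85), (100, 0.95), (130, 0.88)]
ddos   = [(9000, 0.1), (12000, 0.05), (8000, 0.2), (15000, 0.08)]

def gaussian_params(samples):
    """Per-feature mean and variance for a class."""
    cols = list(zip(*samples))
    return [(sum(c) / len(c),
             sum((x - sum(c) / len(c)) ** 2 for x in c) / len(c) + 1e-9)
            for c in cols]

def log_likelihood(x, params):
    """Sum of log Gaussian densities across features (naive independence)."""
    total = 0.0
    for xi, (mu, var) in zip(x, params):
        total += -0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
    return total

def classify(window, p_normal, p_ddos):
    # Equal class priors assumed; compare class-conditional likelihoods.
    if log_likelihood(window, p_ddos) > log_likelihood(window, p_normal):
        return "ddos"
    return "normal"

p_n, p_d = gaussian_params(normal), gaussian_params(ddos)
print(classify((11000, 0.07), p_n, p_d))  # high rate, few sources -> ddos
print(classify((125, 0.9), p_n, p_d))     # typical traffic -> normal
```

In practice the features would be extracted from the captured router traffic and the priors estimated from the training split.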
Design and implementation of a privacy preserved off premises cloud storage (sarfraznawaz)
Despite several cost-effective and flexible characteristics of cloud computing, some clients are reluctant to adopt this paradigm due to emerging security and privacy concerns. Organizations such as those in healthcare and the payment card industry, where confidentiality of information is vital, are not confident enough to trust the security techniques and privacy policies offered by cloud service providers. Malicious attackers have breached cloud storage to steal, view, manipulate, and tamper with clients' data, and attacks on cloud storage are extremely challenging to detect and mitigate. In order to formulate privacy-preserving cloud storage, in this research paper we propose an improved technique consisting of five components: a resilient role-based access control mechanism, partial homomorphic cryptography, metadata generation and sound steganography, an efficient third-party auditing service, and a data backup and recovery process. We implemented these components using Java Enterprise Edition with GlassFish Server. Finally, we evaluated our proposed technique by penetration testing, and the results showed that clients' data remains intact and protected from malicious attackers.
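To make "partial homomorphic cryptography" concrete: a partially homomorphic scheme supports one operation on ciphertexts. The sketch below uses textbook RSA's multiplicative homomorphism with toy parameters; it is not the paper's scheme, and real deployments use large keys and padding-aware constructions:

```python
# Toy RSA parameters; real systems use large primes and vetted libraries.
p, q = 61, 53
n = p * q                 # modulus, 3233
phi = (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)       # modular inverse (Python 3.8+)

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

# Multiplicative homomorphism: E(a) * E(b) mod n decrypts to a * b mod n,
# so the server can multiply values it cannot read.
a, b = 7, 6
c = (enc(a) * enc(b)) % n
print(dec(c))  # 42
```

An additively homomorphic scheme such as Paillier would support sums instead of products; which operation is needed depends on the queries the storage service must answer.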
Mitigating Various Attacks in Mobile Ad-hoc Networks Using Trust Based Approach (IJLT EMAS)
A Mobile Ad hoc Network (MANET) is a self-organizing, decentralized, infrastructure-less wireless network. Successful transmission of a data packet depends on the complete cooperation of every node in the network. These networks have no permanent base station, so each node acts as a router. Because of the open, decentralized, self-organizing nature of MANETs, they are vulnerable to various attacks, and security is therefore the main concern.
In this project we consider two attacks: the vampire attack and the DDoS attack. A vampire attack drains the energy of the nodes, while a DDoS attack exhausts the resources available to the network so that a node cannot provide any services. We discuss two methods as solutions to this problem: one prevents the attack from happening, and the other detects and recovers from the attacks.
Fog Computing: The Justifying Insider Data Stealing Attacks in the Cloud (IJSRD)
Cloud computing allows us to share and access our personal and business data, and with this technology communication becomes faster. But when users share their personal data, they begin to worry about its security. Existing data security paradigms such as encryption have failed to protect against data theft attacks, especially those committed by an insider at the cloud service provider. To overcome this problem, we propose a different approach to securing data in the cloud using offensive decoy technology (ODT). In this technique we monitor data access in the cloud and detect anomalous data access patterns. When unauthorized access is found and then verified using challenge questions, we launch a deception attack by returning large amounts of decoy information to the attacker. This protects against the illegal use of the user's real data. Experiments conducted in a local file setting provide evidence that this approach may deliver extraordinary levels of user data security in a cloud environment.
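The decision flow described (anomaly check, then challenge question, then decoy instead of real data) can be sketched as follows; the baseline rate, challenge mechanism, and decoy format are invented placeholders, not the paper's implementation:

```python
import secrets

BASELINE_RATE = 5  # files/min a legitimate user typically touches (assumed)

def is_anomalous(files_per_min: int) -> bool:
    # Crude stand-in for the paper's access-pattern profiling.
    return files_per_min > BASELINE_RATE * 4

def challenge_passed(answer: str, expected: str) -> bool:
    return secrets.compare_digest(answer, expected)

def serve(request_rate: int, answer: str, expected: str, real_doc: str) -> str:
    if is_anomalous(request_rate) and not challenge_passed(answer, expected):
        # Flood the suspected attacker with decoy content instead of real data.
        return "DECOY-" + secrets.token_hex(8)
    return real_doc

print(serve(3, "blue", "blue", "quarterly-report"))   # normal access: real file
print(serve(50, "red", "blue", "quarterly-report"))   # anomalous + failed challenge: decoy
```

The key property is that a failed challenge does not reveal failure: the attacker receives plausible-looking decoy data rather than an error.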
FILESHADER: ENTRUSTED DATA INTEGRATION USING HASH SERVER (csandit)
The importance of security is increasing in current network systems. We have found a significant security weakness in file integrity when people download or upload a file, and we propose a novel solution for ensuring the security of a file. In particular, hash values can be applied to verify a file, given the speed and architecture of file transfer. A hash server stores all the hash values, which are updated by the file provider, and a client can use these values to trust a file when downloading it. FileShader detects file changes correctly, and we observed that it did not show significant performance degradation. We expect that FileShader can be applied to current network systems practically and that it can raise the security level for all Internet users.
Detection of ICMPv6-based DDoS attacks using anomaly based intrusion detectio... (IJECEIAES)
Securing network systems has become an increasingly important discipline since the preliminary stages of Internet Protocol version 6 (IPv6) deployment began to be exploited by attackers. IPv6 improves on its predecessor in terms of security and brings new functionality and procedures, notably the Internet Control Message Protocol version 6 (ICMPv6). ICMPv6 is considered very important and forms the backbone of IPv6, being responsible for sending and receiving IPv6 control messages. However, IPv6 inherited many attacks from the previous Internet Protocol version 4 (IPv4), such as distributed denial of service (DDoS) attacks. DDoS is a thorny problem on the Internet and one of the most prominent attacks affecting networks, causing tremendous economic damage to individuals as well as organizations. In this paper, an exhaustive evaluation and analysis of anomaly-based detection of DDoS attacks against ICMPv6 messages is conducted, and the types of anomaly detection applicable to ICMPv6 DDoS flooding attacks in IPv6 networks are explained. We propose using a feature selection technique based on bio-inspired algorithms to select an optimal feature subset that has a positive impact on the accuracy of ICMPv6 DDoS attack detection. The review outlines the features and protection constraints of IPv6 intrusion detection systems, focusing mainly on DDoS attacks.
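To illustrate what "bio-inspired feature selection" can mean in the simplest case, here is a tiny (1+1) evolutionary search over feature subsets. The per-feature scores and the size penalty are invented stand-ins; in the paper's setting, fitness would be the detector's accuracy on ICMPv6 traffic with that subset:

```python
import random

random.seed(0)
N_FEATURES = 8
# Assumed per-feature usefulness; a real fitness call retrains the detector.
USEFUL = [0.9, 0.1, 0.8, 0.05, 0.7, 0.2, 0.6, 0.1]

def fitness(mask):
    # Reward useful features, penalize subset size (favor compact subsets).
    return sum(u for u, m in zip(USEFUL, mask) if m) - 0.25 * sum(mask)

def evolve(generations=200):
    best = [False] * N_FEATURES        # start from the empty subset
    for _ in range(generations):
        # Flip each bit with probability 1/N (standard (1+1) EA mutation).
        child = [not b if random.random() < 1 / N_FEATURES else b for b in best]
        if fitness(child) >= fitness(best):
            best = child
    return best

mask = evolve()
print([i for i, m in enumerate(mask) if m])  # indices of selected features
```

Published work typically uses richer bio-inspired searches (genetic algorithms, particle swarm, etc.), but the accept-if-not-worse loop above is the common core.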
Pattern Analysis and Signature Extraction for Intrusion Attacks on Web Services (IJNSA Journal)
The increasing popularity of web service technology is attracting hackers and attackers to the web services and the servers on which they run. Organizations therefore face the challenge of implementing adequate security for web services. A major threat is intruders who may maliciously try to access data or services. Automated methods of signature extraction extract the binary pattern blindly, resulting in more false positives. In this paper a semi-automated approach is proposed to analyze the attacks and generate signatures for web services. For data collection, apart from conventional SOAP data loggers, honeypots are also used, which collect small amounts of data of high value. To filter out the most suspicious part of the data, an SVM-based classifier is employed to aid the system administrator. By applying an attack-signature algorithm to the filtered data, a more balanced attack signature is extracted that results in fewer false positives and negatives. This helps the security administrator identify which web services are vulnerable or attacked most frequently.
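A naive sketch of pattern-based signature extraction: take the longest common substring across captured attack payloads as the candidate signature. The payloads below are invented, and the paper's algorithm is more balanced than this; the sketch only shows why blind extraction tends to over-generalize:

```python
def longest_common_substring(a: str, b: str) -> str:
    best = ""
    # Brute force; acceptable for short captured payloads.
    for i in range(len(a)):
        for j in range(i + len(best) + 1, len(a) + 1):
            if a[i:j] in b:
                best = a[i:j]
            else:
                break  # longer extensions of a failing substring also fail
    return best

def extract_signature(payloads):
    sig = payloads[0]
    for p in payloads[1:]:
        sig = longest_common_substring(sig, p)
    return sig

attacks = [
    "GET /svc?q=' OR 1=1--&x=1",
    "POST /svc body=' OR 1=1--",
    "GET /other?id=' OR 1=1-- HTTP",
]
print(extract_signature(attacks))
```

Here the shared SQL-injection fragment survives the intersection; a classifier-filtered corpus (as in the paper) keeps benign common substrings from ending up in the signature.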
Among the different online attacks threatening IT security, Denial of Service (DoS) and Distributed Denial of Service (DDoS) are the most devastating, and they have recently put security experts under enormous pressure to find efficient defense methods. A DoS attack can be performed in various ways with diverse codes and tools, and can be launched from different OSI model layers. This paper describes DoS and DDoS attacks in detail and explains how different types of attacks can be implemented and launched from different OSI model layers. It provides a better understanding of these increasing occurrences in order to improve
Distributed Digital Artifacts on the Semantic Web (Editor IJCATR)
Distributed digital artifacts incorporate cryptographic hash values into URIs, called trusty URIs, in a distributed environment, building high-quality, verifiable, and immutable web resources to prevent the rising man-in-the-middle attack. The greatest challenge of a centralized system is that it gives users no way to check whether data has been modified, and communication is limited to a single server. The solution is the distributed digital artifact system, in which resources are distributed among different domains to enable inter-domain communication. With the emerging developments of the web, attacks have increased rapidly, among which the man-in-the-middle attack (MIMA) is a serious issue that threatens user security. This work tries to prevent MIMA to an extent by providing self-reference and trusty URIs even in a distributed environment. Any manipulation of the data is efficiently identified, and further access to that data is blocked by informing the user that the uniform location has changed. The system uses self-reference so that each resource contains its trusty URI, a lineage algorithm for generating the seed, and the SHA-512 hash generation algorithm to ensure security. It is implemented on the Semantic Web, an extension of the World Wide Web, using RDF (Resource Description Framework) to identify resources. The framework was thus developed to overcome existing challenges by distributing digital artifacts on the Semantic Web, enabling secure communication between different domains across the network and thereby preventing MIMA.
Cloud Computing Using Encryption and Intrusion Detection (ijsrd.com)
Cloud computing provides many benefits to users, such as accessibility and availability. Because data is available over the cloud, it can be accessed by different users, and it may include an organization's sensitive data; one issue is therefore granting access to authenticated users only. But the data can also be accessed by the cloud's owner, so to prevent the owner from accessing it we use an intrusion detection system to secure the data. The other issue is saving the data backup on another cloud in encrypted form so that load balancing can be done; this helps the user with data availability in case one cloud fails.
E-Mail Systems In Cloud Computing Environment Privacy, Trust And Security Chal... (IJERA Editor)
In this paper, SMCSaaS is proposed as a secure email system based on the Web Service and Cloud Computing model. The model offers the end-to-end security, privacy, and non-repudiation of PKI without the associated infrastructure complexity. The proposed model controls cloud computing risks such as insecure application programming interfaces; malicious insiders; data loss or leakage; shared technology vulnerabilities; account, service, and traffic hijacking; and unknown risk profile.
Appraisal of the Most Prominent Attacks due to Vulnerabilities in Cloud Compu... (Salam Shah)
Cloud computing has attracted users due to the high speed and bandwidth of the Internet, and e-commerce systems make the best use of it. The cloud is accessed with a username and password and is completely dependent on the Internet. The threats to confidentiality, integrity, and authentication, along with the other vulnerabilities associated with the Internet, are therefore also associated with the cloud. The Internet and the cloud can be secured from threats by ensuring proper security and authorization: the channel between user and cloud server must be secured with a proper authorization mechanism. Research has been carried out and different models have been proposed by authors to ensure the security of clouds. In this paper, we critically analyze the already published literature on the security and authorization of the Internet and the cloud.
Challenges and Mechanisms for Securing Data in Mobile Cloud Computing (ijcnes)
Cloud computing enables users to utilize the services of computing resources, and nowadays the computing resources behind mobile applications are being delivered through the cloud. As the need for new mobile applications grows, the use of cloud computing cannot be overlooked. Cloud service providers serve data requests from remote servers, and the virtualization aspect of cloud computing facilitates better utilization of resources in mobile applications. The industry needs to address the foremost security risks in the underlying technology, as the cloud computing environment for mobile applications is aggravated by various security problems. This paper addresses the challenges of securing data in the cloud for mobile cloud computing, along with a few mechanisms to overcome them.
MIST Effective Masquerade Attack Detection in the Cloud (Kumar Goud)
Abstract: Cloud computing promises to significantly change the way we use computers and access and store our personal and business information. With these new computing and communications paradigms arise new data security challenges. Existing data protection mechanisms such as encryption have failed in preventing data theft attacks, especially those perpetrated by an insider to the cloud provider. We propose a different approach for securing data in the cloud using offensive decoy technology. We monitor data access in the cloud and detect abnormal data access patterns. When unauthorized access is suspected and then verified using challenge questions, we launch a disinformation attack by returning large amounts of decoy information to the attacker. This protects against the misuse of the user’s real data. Experiments conducted in a local file setting provide evidence that this approach may provide unprecedented levels of user data security in a Cloud environment.
Keywords: Mist, Insider data stealing, Bait information, Lure Files, Validating user
Single Sign-on Authentication Model for Cloud Computing using Kerberos (Deepak Bagga)
ABSTRACT
In today's organizations, the need for new resources keeps growing, and storage requirements for terabytes of data are generated every day. Cloud computing provides a cost-effective and efficient solution: it delivers on-demand resources as services to clients and is highly scalable and flexible. Although it benefits clients in several ways, storing data remotely introduces many security loopholes, such as attacks, data loss, and other security and authentication issues. In this paper we propose an authentication model for cloud computing based on the Kerberos protocol, providing single sign-on and protection against DDoS attacks. This model can help by filtering out unauthorized access and by reducing the burden, computation, and memory usage of the cloud for authentication checks on each client. It acts as a third party between cloud servers and clients to allow secure access to cloud services. We first review related work on cloud security issues and attacks, then discuss the proposed architecture, how it works, and the sequential process of message transmission. Finally, we show how it can protect against DDoS attacks, list some benefits, and explain how it provides single sign-on.
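To show the shape of a Kerberos-style single sign-on flow, here is a toy ticket issuer and validator using an HMAC-signed ticket instead of real Kerberos encryption. The names, key, and 5-minute lifetime are illustrative; an actual deployment would use the real Kerberos protocol (RFC 4120), not this sketch:

```python
import hashlib
import hmac
import json
import time

TGS_KEY = b"shared-secret-between-auth-server-and-tgs"  # assumed out-of-band

def issue_ticket(user: str, service: str, lifetime: int = 300) -> str:
    """Authentication server: bind user + service + expiry, sign with HMAC."""
    body = json.dumps({"user": user, "svc": service,
                       "exp": time.time() + lifetime})
    mac = hmac.new(TGS_KEY, body.encode(), hashlib.sha256).hexdigest()
    return body + "|" + mac

def validate_ticket(ticket: str, service: str) -> bool:
    """Service side: verify the MAC, the service binding, and the expiry."""
    body, _, mac = ticket.rpartition("|")
    expected = hmac.new(TGS_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, mac):
        return False  # forged or tampered ticket
    claims = json.loads(body)
    return claims["svc"] == service and claims["exp"] > time.time()

t = issue_ticket("alice", "storage")
print(validate_ticket(t, "storage"))  # True: one sign-on grants service access
print(validate_ticket(t, "billing"))  # False: ticket bound to another service
```

The single sign-on benefit is visible in the flow: the client authenticates once to obtain tickets, and individual cloud services only check tickets rather than re-verifying credentials.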
A survey on cloud security issues and techniques (ijcsa)
Today, cloud computing is an emerging way of computing in computer science: a set of resources and services offered over the network or Internet. Cloud computing extends various computing techniques such as grid computing and distributed computing, and today it is used in both industry and academia. The cloud facilitates its users by providing virtual resources via the Internet. As the field of cloud computing spreads, new techniques are developed; this growth of the cloud computing environment also increases the security challenges for cloud developers. Users of the cloud save their data there, so a lack of security in the cloud can lose users' trust.
In this paper we discuss some of the cloud security issues in aspects such as multi-tenancy, elasticity, and availability. The paper also discusses existing security techniques and approaches for a secure cloud, enabling researchers and professionals to learn about the different security threats, models, and tools that have been proposed.
Secure One Time Password OTP Generation for user Authentication in Cloud Envi...ijtsrd
Cloud computing is one of today's most exciting technologies due to its ability to reduce cost associated with computing. This technology worldwide used to improve the business infrastructure and performance. The major threat that the cloud is facing now is security. So, the user authentication is very important step in cloud environment. The traditional authentication user name and static password or PIN code can be easily broken by the skillful attacker. An Unauthorized user can easily enter into the system if he knows the user credentials. Encryption algorithms play a main role in information security systems. Efficient password generation for user authentication is an important problem in secure Cloud communications. In the paper, the One Time Password OTP approach is used to authenticate the cloud users. The generated OPT is encrypted by RSA public key encryption to be more secure for the cloud user authentication. So the third party is not required to generate OPT in the proposed paper. This paper can help to solve the user authentication problem in Cloud environment. Kyaw Swar Hlaing | Nay Aung Aung "Secure One Time Password (OTP) Generation for user Authentication in Cloud Environment" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3 | Issue-6 , October 2019, URL: https://www.ijtsrd.com/papers/ijtsrd28037.pdf Paper URL: https://www.ijtsrd.com/computer-science/computer-security/28037/secure-one-time-password-otp-generation-for-user-authentication-in-cloud-environment/kyaw-swar-hlaing
Secure Data Sharing In an Untrusted CloudIJERA Editor
Cloud computing is a huge area which basically provides many services on the basis of pay as you go. One of the fundamental services provided by cloud is data storage. Cloud provides cost efficiency and an efficient solution for sharing resource among cloud users. A secure and efficient data sharing scheme for groups in cloud is not an easy task. On one hand customers are not ready to share their identity but on other hand want to enjoy the cost efficiency provided by the cloud. It needs to provide identity privacy, multiple owner and dynamic data sharing without getting effected by the number of cloud users revoked. In this paper, any member of a group can completely enjoy the data storing and sharing services by the cloud. A secure data sharing scheme for dynamic cloud users is proposed in this paper. For which it uses group signature and dynamic broadcast encryption techniques such that any user in a group can share the information in a secured manner. Additionally the permission option is proposed for the security reasons. This means the file access permissions are generated by the admin and given to the user using Role Based Access Control (RBA) algorithm. The file access permissions are read, write and delete. In this, owner can provide files with options and accepts the users using that option. The revocation of cloud user is a function generated by the Admin for security purpose. The encryption computational cost and storage overhead is not dependent on the number of users revoked. We analyze the security by proofs and produce the cloud efficiency report using cloudsim.
Similar to Implementation of user authentication as a service for cloud network (20)
Critical evaluation of frontal image based gender classification techniquesSalam Shah
The face describes the personality of humans and has adequate importance in the identification and verification process. The human face provides, information as age, gender, face expression and ethnicity. Research has been carried out in the area of face detection, identification, verification, and gender classification to correctly identify humans. The focus of this paper is on gender classification, for which various methods have been formulated based on the measurements of face features. An efficient technique of gender classification helps in accurate identification of a person as male or female and also enhances the performance of other applications like Computer-User Interface, Investigation, Monitoring, Business Profiling and Human Computer Interaction (HCI). In this paper, the most prominent gender classification techniques have been evaluated in terms of their strengths and limitations.
A robust technique of brain mri classification using color features and k nea...Salam Shah
The analysis of MRI images is a manual process carried by experts which need to be automated to accurately classify the normal and abnormal images. We have proposed a reduced, three staged model having pre-processing, feature extraction and classification steps. In preprocessing the noise has been removed from grayscale images using a median filter, and then grayscale images have been converted to color (RGB) images. In feature extraction, red, green and blue channels from each channel of the RGB has been extracted because they are so much informative and easier to process. The first three color moments mean, variance, and skewness are calculated for each red, green and blue channel of images. The features extracted in the feature extraction stage are classified into normal and abnormal with K-Nearest Neighbors (k-NN). This method is applied to 100 images (70 normal, 30 abnormal). The proposed method gives 98.00% training and 95.00% test accuracy with datasets of normal images and 100% training and 90.00% test accuracy with abnormal images. The average computation time for each image was .06s.
An offline signature verification using pixels intensity levelsSalam Shah
Offline signature recognition has great importance in our day to day activities. Researchers are trying to use them as biometric identification in various areas like banks, security systems and for other identification purposes. Fingerprints, iris, thumb impression and face detection based biometrics are successfully used for identification of individuals because of their static nature. However, people’s signatures show variability that makes it difficult to recognize the original signatures correctly and to use them as biometrics. The handwritten signatures have importance in banks for cheque, credit card processing, legal and financial transactions, and the signatures are the main target of fraudulence. To deal with complex signatures, there should be a robust signature verification method in places such as banks that can correctly classify the signatures into genuine or forgery to avoid financial frauds. This paper, presents a pixels intensity level based offline signature verification model for the correct classification of signatures. To achieve the target, three statistical classifiers; Decision Tree (J48), probability based Naïve Bayes (NB tree) and Euclidean distance based k-Nearest Neighbor (IBk), are used.
For comparison of the accuracy rates of offline signatures with online signatures, three classifiers were applied on online signature database and achieved a 99.90% accuracy rate with decision tree (J48), 99.82% with Naïve Bayes Tree and 98.11% with K-Nearest Neighbor (with 10 fold cross validation). The results of offline signatures were 64.97% accuracy rate with decision tree (J48), 76.16% with Naïve Bayes Tree and 91.91% with k-Nearest Neighbor (IBk) (without forgeries). The accuracy rate dropped with the inclusion of forgery signatures as, 55.63% accuracy rate with decision tree (J48), 67.02% with Naïve Bayes Tree and 88.12% (with forgeries).
Testing desktop application police station information management systemSalam Shah
The police stations have adequate importance in the society to control the law and order situations of the country. In Pakistan, police stations manage criminal records and information manually. We have previously developed and improved a desktop application for the record keeping of the different registers of the police stations. The data of police stations is sensitive and that need to be handled within secured and fully functional software to avoid any unauthorized access. For the proper utilization of the newly developed software, it is necessary to test and analyze the system before deployment into the real environment. In this paper, we have performed the testing of an application. For this purpose, we have used Ranorex, automated testing tool for the functional and performance testing, and reported the results of test cases as pass or fail.
Navigation through citation network based on content similarity using cosine ...Salam Shah
The rate of scientific literature has been increased in the past few decades; new topics and information is added in the form of articles, papers, text documents, web logs, and patents. The growth of information at rapid rate caused a tremendous amount of additions in the current and past knowledge, during this process, new topics emerged, some topics split into many other sub-topics, on the other hand, many topics merge to formed single topic. The selection and search of a topic manually in such a huge amount of information have been found as an expensive and workforce-intensive task. For the emerging need of an automatic process to locate, organize, connect, and make associations among these sources the researchers have proposed different techniques that automatically extract components of the information presented in various formats and organize or structure them. The targeted data which is going to be processed for component extraction might be in the form of text, video or audio. The addition of different algorithms has structured information and grouped similar information into clusters and on the basis of their importance, weighted them. The organized, structured and weighted data is then compared with other structures to find similarity with the use of various algorithms. The semantic patterns can be found by employing visualization techniques that show similarity or relation between topics over time or related to a specific event. In this paper, we have proposed a model based on Cosine Similarity Algorithm for citation network which will answer the questions like, how to connect documents with the help of citation and content similarity and how to visualize and navigate through the document.
Risk management of telecommunication and engineering laboratorySalam Shah
The Telecommunication laboratory plays an important role in carrying out research in the different fields like Telecommunication, Information Technology, Wireless Sensor Networks, Mobile Networks and many other fields. Every Engineering University has a setup of laboratories for students particularly for Ph.D. scholars to work on the performance analysis of different Telecommunication Networks including WLANs, 3G/4G, and Long Term Evolution (LTE). The laboratories help students to have hand on practice on the theoretical concepts they have learned during the teachings at the university. The technical subjects have a practical part also which boosts the knowledge of students and learning of new ideas. The Telecommunication and Engineering laboratories are equipped with different electronic equipment’s like digital trainers, simulators etc. and some additional supportive devices like computers, air conditioners, projectors, and large screens, with power backup facility that creates the perfect environment for experimentation. The setup of Telecommunication and Engineering laboratories cost huge amount, required to purchase equipment, and maintain the equipment. In any working environment risk factor is involved. To handle and avoid risks there must be risk management policy to tackle with accidents and other damages during working in the laboratory, may it be human or equipment at risk. In this paper, we have proposed a risk management policy for the Telecommunication and Engineering laboratories, which can be generalized for similar type of laboratories in engineering fields of studies.
An evaluation of automated tumor detection techniques of brain magnetic reson...Salam Shah
Image processing is a technique developed by computer and Information technology scientist and being used in all field of research including medical sciences. The focus of this paper is the use of image processing in tumor detection from the brain Magnetic Resonance Imaging (MRI). For the brain tumor detection, Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) are the prominent imaging techniques, but most of the experts prefer MRI over CT. The traditional method of tumor detection in MRI images is a manual inspection which provides variations in the results when analyzed by different experts, therefore, in view of the limitations of the manual analysis of MRI, there is a need for an automated system that can produce globally acceptable and accurate results. There is enough amount of published literature available to replace the manual inspection process of MRI images with the digital computer system using image processing techniques. In this paper, we have provided a review of digital image processing techniques in the context of brain MRI processing and critically analyzed them for the identification of the gaps and limitations of the techniques so that the gaps can be filled and limitations of various techniques can be improved for precise and better results.
An appraisal of offline signature verification techniquesSalam Shah
Biometrics is being commonly used nowadays for the identification and verification of humans everywhere in the world. In biometrics humans unique characteristics like palm, fingerprints, iris etc. are being used. Pattern Recognition and image processing are the major areas where research on signature verification is carried out. Hand written Signature of an individual is also unique and for identification of humans are being used and accepted specially in the banking and other financial transactions. The hand written signatures due to its importance are at target of fraudulence. In this paper we have surveyed different papers on techniques that are currently used for the identification and verification of Offline signatures.
An application development for police stations in pakistanSalam Shah
Program slice is the part of program that may take the program off the path of the
desired output at some point of its execution. Such point is known as the slicing criterion.
This point is generally identified at a location in a given program coupled with the subset
of variables of program. This process in which program slices are computed is called
program slicing. Weiser was the person who gave the original definition of program slice
in 1979. Since its first definition, many ideas related to the program slice have been
formulated along with the numerous numbers of techniques to compute program slice.
Meanwhile, distinction between the static slice and dynamic slice was also made.
Program slicing is now among the most useful techniques that can fetch the particular
elements of a program which are related to a particular computation. Quite a large
numbers of variants for the program slicing have been analyzed along with the
algorithms to compute the slice. Model based slicing split the large architectures of
software into smaller sub models during early stages of SDLC. Software testing is
regarded as an activity to evaluate the functionality and features of a system. It verifies
whether the system is meeting the requirement or not. A common practice now is to
extract the sub models out of the giant models based upon the slicing criteria. Process of
model based slicing is utilized to extract the desired lump out of slice diagram. This
specific survey focuses on slicing techniques in the fields of numerous programing
paradigms like web applications, object oriented, and components based. Owing to the
efforts of various researchers, this technique has been extended to numerous other
platforms that include debugging of program, program integration and analysis, testing
and maintenance of software, reengineering, and reverse engineering. This survey
portrays on the role of model based slicing and various techniques that are being taken
on to compute the slices.
A review of slicing techniques in software engineeringSalam Shah
Program slice is the part of program that may take the program off the path of the
desired output at some point of its execution. Such point is known as the slicing criterion.
This point is generally identified at a location in a given program coupled with the subset
of variables of program. This process in which program slices are computed is called
program slicing. Weiser was the person who gave the original definition of program slice
in 1979. Since its first definition, many ideas related to the program slice have been
formulated along with the numerous numbers of techniques to compute program slice.
Meanwhile, distinction between the static slice and dynamic slice was also made.
Program slicing is now among the most useful techniques that can fetch the particular
elements of a program which are related to a particular computation. Quite a large
numbers of variants for the program slicing have been analyzed along with the
algorithms to compute the slice. Model based slicing split the large architectures of
software into smaller sub models during early stages of SDLC. Software testing is
regarded as an activity to evaluate the functionality and features of a system. It verifies
whether the system is meeting the requirement or not. A common practice now is to
extract the sub models out of the giant models based upon the slicing criteria. Process of
model based slicing is utilized to extract the desired lump out of slice diagram. This
specific survey focuses on slicing techniques in the fields of numerous programing
paradigms like web applications, object oriented, and components based. Owing to the
efforts of various researchers, this technique has been extended to numerous other
platforms that include debugging of program, program integration and analysis, testing
and maintenance of software, reengineering, and reverse engineering. This survey
portrays on the role of model based slicing and various techniques that are being taken
on to compute the slices.
A model for handling overloading of literature review process for social scienceSalam Shah
Literature review is an excruciating part in the process of research. It requires an analysis of
published material on the topic on interest. Moreover, for a new researcher, it is challenging
extract a great number of required objectives, including the problem identification,
no more great deal in this era of Information and Communication Technology (ICT), instead
overloading of the literature is a major problem and the great change to be handle. Often
postgraduate research students raise three questions to their peers and supervisors. First, how
many articles are sufficed for a good literature review? Second, how many past years
literature will be enough to meet the required level for a good literature review? And third,
this research paper a novel hypothetical model is proposed to answer first two questions; the
number of articles required for a good and reasonable literature review and number of years
backward the analysis of articles required for the same. Our results indicate that analysis of
data partially support our hypothetical model and its assumptions.
Keywords: literature review; hypothetical model; load reduction; proposal writing;
information systems.
Instructions for Submissions thorugh G- Classroom.pptxJheel Barad
This presentation provides a briefing on how to upload submissions and documents in Google Classroom. It was prepared as part of an orientation for new Sainik School in-service teacher trainees. As a training officer, my goal is to ensure that you are comfortable and proficient with this essential tool for managing assignments and fostering student engagement.
Operation “Blue Star” is the only event in the history of Independent India where the state went into war with its own people. Even after about 40 years it is not clear if it was culmination of states anger over people of the region, a political game of power or start of dictatorial chapter in the democratic setup.
The people of Punjab felt alienated from main stream due to denial of their just demands during a long democratic struggle since independence. As it happen all over the word, it led to militant struggle with great loss of lives of military, police and civilian personnel. Killing of Indira Gandhi and massacre of innocent Sikhs in Delhi and other India cities was also associated with this movement.
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...Levi Shapiro
Letter from the Congress of the United States regarding Anti-Semitism sent June 3rd to MIT President Sally Kornbluth, MIT Corp Chair, Mark Gorenberg
Dear Dr. Kornbluth and Mr. Gorenberg,
The US House of Representatives is deeply concerned by ongoing and pervasive acts of antisemitic
harassment and intimidation at the Massachusetts Institute of Technology (MIT). Failing to act decisively to ensure a safe learning environment for all students would be a grave dereliction of your responsibilities as President of MIT and Chair of the MIT Corporation.
This Congress will not stand idly by and allow an environment hostile to Jewish students to persist. The House believes that your institution is in violation of Title VI of the Civil Rights Act, and the inability or
unwillingness to rectify this violation through action requires accountability.
Postsecondary education is a unique opportunity for students to learn and have their ideas and beliefs challenged. However, universities receiving hundreds of millions of federal funds annually have denied
students that opportunity and have been hijacked to become venues for the promotion of terrorism, antisemitic harassment and intimidation, unlawful encampments, and in some cases, assaults and riots.
The House of Representatives will not countenance the use of federal funds to indoctrinate students into hateful, antisemitic, anti-American supporters of terrorism. Investigations into campus antisemitism by the Committee on Education and the Workforce and the Committee on Ways and Means have been expanded into a Congress-wide probe across all relevant jurisdictions to address this national crisis. The undersigned Committees will conduct oversight into the use of federal funds at MIT and its learning environment under authorities granted to each Committee.
• The Committee on Education and the Workforce has been investigating your institution since December 7, 2023. The Committee has broad jurisdiction over postsecondary education, including its compliance with Title VI of the Civil Rights Act, campus safety concerns over disruptions to the learning environment, and the awarding of federal student aid under the Higher Education Act.
• The Committee on Oversight and Accountability is investigating the sources of funding and other support flowing to groups espousing pro-Hamas propaganda and engaged in antisemitic harassment and intimidation of students. The Committee on Oversight and Accountability is the principal oversight committee of the US House of Representatives and has broad authority to investigate “any matter” at “any time” under House Rule X.
• The Committee on Ways and Means has been investigating several universities since November 15, 2023, when the Committee held a hearing entitled From Ivory Towers to Dark Corners: Investigating the Nexus Between Antisemitism, Tax-Exempt Universities, and Terror Financing. The Committee followed the hearing with letters to those institutions on January 10, 202
Synthetic Fiber Construction in lab .pptxPavel ( NSTU)
Synthetic fiber production is a fascinating and complex field that blends chemistry, engineering, and environmental science. By understanding these aspects, students can gain a comprehensive view of synthetic fiber production, its impact on society and the environment, and the potential for future innovations. Synthetic fibers play a crucial role in modern society, impacting various aspects of daily life, industry, and the environment. ynthetic fibers are integral to modern life, offering a range of benefits from cost-effectiveness and versatility to innovative applications and performance characteristics. While they pose environmental challenges, ongoing research and development aim to create more sustainable and eco-friendly alternatives. Understanding the importance of synthetic fibers helps in appreciating their role in the economy, industry, and daily life, while also emphasizing the need for sustainable practices and innovation.
Embracing GenAI - A Strategic ImperativePeter Windle
Artificial Intelligence (AI) technologies such as Generative AI, Image Generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to Academic Integrity with Higher Education Institutes (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, policies put in place too. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessments leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
How to Make a Field invisible in Odoo 17Celine George
It is possible to hide or invisible some fields in odoo. Commonly using “invisible” attribute in the field definition to invisible the fields. This slide will show how to make a field invisible in odoo 17.
Francesca Gottschalk - How can education support child empowerment.pptxEduSkills OECD
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
2024.06.01 Introducing a competency framework for languag learning materials ...Sandy Millin
http://sandymillin.wordpress.com/iateflwebinar2024
Published classroom materials form the basis of syllabuses, drive teacher professional development, and have a potentially huge influence on learners, teachers and education systems. All teachers also create their own materials, whether a few sentences on a blackboard, a highly-structured fully-realised online course, or anything in between. Despite this, the knowledge and skills needed to create effective language learning materials are rarely part of teacher training, and are mostly learnt by trial and error.
Knowledge and skills frameworks, generally called competency frameworks, for ELT teachers, trainers and managers have existed for a few years now. However, until I created one for my MA dissertation, there wasn’t one drawing together what we need to know and do to be able to effectively produce language learning materials.
This webinar will introduce you to my framework, highlighting the key competencies I identified from my research. It will also show how anybody involved in language teaching (any language, not just English!), teacher training, managing schools or developing language learning materials can benefit from using the framework.
Biological screening of herbal drugs: Introduction and Need for
Phyto-Pharmacological Screening, New Strategies for evaluating
Natural Products, In vitro evaluation techniques for Antioxidants, Antimicrobial and Anticancer drugs. In vivo evaluation techniques
for Anti-inflammatory, Antiulcer, Anticancer, Wound healing, Antidiabetic, Hepatoprotective, Cardio protective, Diuretics and
Antifertility, Toxicity studies as per OECD guidelines
Implementation of user authentication as a service for cloud network
International Journal of Grid and Distributed Computing
Vol. 9, No. 10 (2016), pp. 197-210
http://dx.doi.org/10.14257/ijgdc.2016.9.10.18
ISSN: 2005-4262 IJGDC
Copyright ⓒ 2016 SERSC

Implementation of User Authentication as a Service for Cloud Network

Masood Shah, Abdul Salam Shah and Imran Ijaz
SZABIST, Islamabad, Pakistan
engg.cisco@gmail.com, shahsalamss@gmail.com, imran-ijaz@live.com
Abstract
Although cloud computing exposes its users to many security risks, organizations continue to switch to the cloud because it provides data protection and a large amount of storage that can be used remotely or virtually. Even so, organizations have not adopted cloud computing completely because of these security issues, and research in cloud computing increasingly focuses on privacy and security across the newly categorized attack surfaces. Besides managing the availability of cloud services, companies also carry the additional overhead of user authentication. This paper proposes a model that provides a central authentication technique, so that users can be given secure access to resources instead of relying on a collection of uncoordinated user authentication techniques. The model is also implemented as a prototype.
Keywords: Cloud Computing, Cloud Security, Software Platform Infrastructure (SPI), Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS)
1. Introduction
Cloud computing provides centralized security for data and processes, along with high availability, but it also faces many security challenges that can cause huge losses to an organization through service degradation [1]. Cloud computing is mostly used as application software as a service and as a platform as a service for the operation of other applications. The cloud delivers services to users in the form of a virtual environment, without requiring them to buy and manage costly hardware and infrastructure [2]. Users can store huge amounts of data and benefit from networking services with load balancing, security, and fault tolerance. Because cloud customers store confidential data on the cloud, server security, firewalls, and intrusion detection systems are important [3].
The Cloud Security Alliance has recognized several risks in cloud computing: abuse and dishonest use of cloud computing, malicious insiders, vulnerabilities, data leakage, traffic hijacking, account and service hijacking, and insecure application programming interfaces [4]. Cloud computing must provide data confidentiality and protection of resources; the cloud security system protects resources and the authenticity of the data from attackers [5].
Cloud computing deployments include public, private, and hybrid clouds. In a public cloud, services are public and accessed through the public network; in a private cloud, private services are accessed through a private network; and the combination of public and private services is known as a hybrid cloud. The services provided by the cloud are categorized by the SPI model, whose layers are SaaS, PaaS, and IaaS. The general architecture of the cloud consists of the authentication server, the user, and the server [6-8].
International Journal of Grid and Distributed Computing, Vol. 9, No. 10 (2016); Copyright ⓒ 2016 SERSC
The remainder of the paper is organized as follows: Section 2 provides the
Literature Review; Section 3 contains the Problem Statement; Section 4 describes
Traditional Approaches; the Proposed Model is presented in Section 5; the
Implementation is provided in Section 6; Section 7 contains Testing, Evaluation and
Results; and finally Section 8 provides the Conclusion of the study.
2. Literature Review
According to Gruschka et al. [9], cloud computing has three different participant
classes: the user, the cloud, and the service. The cloud has three basic models, namely
IaaS, PaaS, and SaaS. The best-known attack surface in cloud computing is
service-to-user, where the most common attacks are buffer overflow attacks and SQL
injection; SSL certificate spoofing and Phishing attacks on mail clients are also common
in the cloud, as are flooding (denial of service) attacks on the cloud provider. The next
attack surface is user-to-cloud-provider; the common attack on this surface is Phishing,
through which an attacker causes a user to hand over control of the cloud provider's
services. Another attack surface is cloud-to-service, where the cloud organization faces
every kind of attack against the services running on the cloud. One drawback found in
the control system service is the Signature Wrapping attack, which makes it possible to
alter an eavesdropped message even though it carries a digital signature. The client and
server cannot communicate unless the server has authenticated the client. During
communication, a hash function is used to protect the integrity of the data and to stop
various attacks, and authentication is required again whenever the user wants to access
new services on the cloud. A denial of service attack is possible in this method when an
unauthorized user sends repeated requests to the server; man-in-the-middle and brute
force attacks are also possible.
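As an illustrative aside (our sketch, not code from [9]), the hash-based integrity check mentioned above can be realized with a keyed hash (HMAC): the sender tags each message under a shared key, and the receiver recomputes the tag, so any alteration of an eavesdropped message is detected.

```python
import hashlib
import hmac

def sign_message(key, message):
    """Compute an HMAC-SHA256 tag so the receiver can verify integrity."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify_message(key, message, tag):
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_message(key, message), tag)

key = b"shared-session-key"          # hypothetical key agreed at authentication
tag = sign_message(key, b"GET /storage/report.pdf")

assert verify_message(key, b"GET /storage/report.pdf", tag)   # intact: accepted
assert not verify_message(key, b"GET /storage/other", tag)    # altered: rejected
```

Note that Signature Wrapping specifically exploits partially signed XML; a keyed hash over the whole message, as here, leaves no unsigned region to wrap.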
Stolfo et al. [10] used decoy technology to supervise and sense unusual data
access. When illegal access is detected and confirmed, the system produces a large
amount of decoy data for the attacker, cutting off his access to the real user data.
According to the Cloud Security Alliance, a top threat to cloud computing is the
malicious insider, and the majority of users know this threat; a malicious insider
allowing an outsider to access your password key or your personal data is another
threat. Fog computing uses decoy information for the prevention of attacks; another
use is in online social networks by individual users. User behavior profiling is a
method to monitor the access patterns of normal users: how many times, and when,
the users access data in the cloud. The authors used behavior profiling and decoys
together for securing information. Using decoy information, honeypots and other
fake data can be produced to detect illegal entry to the cloud. The decoy information
puzzles the attacker so that he cannot distinguish the original information from the
fake (honeypot) information. The architecture specified in the paper is easily
implemented in the cloud (as a suggested file system). User behavior profiling and
decoys together give high-level security, and with the specified methods there is
less possibility of attack. In the authentication process, however, a
man-in-the-middle attack is still possible: the attacker can obtain useful information
from the data transmitted between the trusted authority and the hardware. A
masquerade attack is possible in the registration process.
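A minimal sketch of the user behavior profiling idea (our illustration, not the authors' implementation): keep a per-user history of access counts, flag sessions that deviate sharply from the norm, and serve a decoy document when behavior looks like a masquerader.

```python
from statistics import mean, pstdev

def is_anomalous(history, current, k=3.0):
    """Flag an access count more than k standard deviations above the norm."""
    mu, sigma = mean(history), pstdev(history)
    return current > mu + k * max(sigma, 1.0)

def respond(history, current, real_doc, decoy_doc):
    """Serve a decoy (honeypot) document when access looks anomalous."""
    return decoy_doc if is_anomalous(history, current) else real_doc

history = [12, 9, 11, 10, 13, 12]        # daily access counts of the real user
assert respond(history, 14, "real", "decoy") == "real"
assert respond(history, 80, "real", "decoy") == "decoy"   # bulk download: decoy
```

The decoy path never reveals to the requester that detection occurred, which is what confuses the attacker about which data is genuine.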
Yang et al. [11] described DDoS attacks and mechanisms to recover from or trace
back an attack. Basically, a DDoS attack targets the availability of cloud services.
The defensive techniques for DDoS attacks are detection, identification, and filtration.
For the detection of DDoS attacks, wavelet spectral analysis, statistical methods, and
machine learning are used. IP trace-back, probabilistic packet marking (PPM), and
deterministic packet marking (DPM) techniques are used to identify the attack source
in the identification phase. PPM needs less traffic than ICMP to rebuild the path and
recognize the attack source, but it requires additional calculation and packets
for the trace-back procedure. The SOA-Based Trace-back Approach (SBTA) is used to
trace back the actual cause of a DDoS attack in the cloud. SBTA is scalable, flexible, and
compatible: the infected client can recover his cloud through the trace-back tag and can
find the place from where the attack was launched. SBTA uses advanced packet
marking based on CEFS for path restoration. For defense against DDoS attacks,
the traffic control mechanisms known as ingress and egress filtering are used. Another
advanced method, StackPi, inserts a digital signature into IP packets to
prevent spoofing of the source address.
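The ingress/egress filtering mentioned above can be sketched in a few lines (our example; the prefix below is a documentation range, not a real deployment): an edge router forwards a packet only if its source address belongs to the prefix assigned to the network it arrives from, which blocks the spoofed sources that DDoS tools rely on.

```python
import ipaddress

# Hypothetical prefix assigned to the customer network (TEST-NET-3, RFC 5737).
ASSIGNED_PREFIX = ipaddress.ip_network("203.0.113.0/24")

def ingress_allowed(src):
    """Forward only packets whose source address matches the assigned prefix."""
    return ipaddress.ip_address(src) in ASSIGNED_PREFIX

assert ingress_allowed("203.0.113.7")      # legitimate source: forwarded
assert not ingress_allowed("10.0.0.1")     # spoofed source: dropped at the edge
```

This check stops an attacker inside the filtered network from forging arbitrary sources; StackPi-style packet marking addresses spoofing that originates elsewhere.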
Duncan et al. [12] discussed insider attacks. An insider is anyone who has
authorized access to the organization's network or information system. A home
computer is shared between family members, but for a cloud user this can be dangerous:
the data and resources can be spoiled by a family member who might innocently or
accidentally mutate the information or the system. Many children have more knowledge
than their parents and can harm the data through full access to the resources of the
cloud account; viewing all the web content, password hacking or cracking, or
downloading restricted material without leaving any history are some of the possible
misuses. A friend or family member can become more dangerous when there is
excessive trust in the relationship.
Infrastructure as a Service basically runs the hypervisor, which manages many
operating systems in virtual machines on the host operating system; the customer
has no concern about which virtual machine runs on which server. But this has an
impact on security, because customers have no idea how many copies of their
virtual machine exist, where they are located, and who accesses them. It is possible
that a malicious insider takes a copy of the customer's virtual machine to an
outside hypervisor, although this depends on the cloud provider. A malicious
insider can break the administrator password, access all the data on the
customer's virtual machine, and obtain a complete history of the virtual machine.
Riquet et al. [13] discussed firewalls, intrusion detection, and large-scale
coordinated attacks on the cloud. The authorized traffic defined by the security policy is
filtered through the firewall. The most common techniques used for detection are
anomaly-based detection and pattern matching. Anomaly-based detection
allows all normal traffic and raises an alarm when a deviation from normality is
detected, but there is a chance of false alarms. An IDS cannot detect port scan
attacks that are executed at a very slow rate. Security solutions include Snort and
commercial firewalls: Snort is an open-source IDS that analyzes the network and its
traffic in real time; the other solution is a commercial firewall.
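To make the two detection styles concrete, here is a toy combination of pattern matching and anomaly-based detection (our sketch, far simpler than Snort): payloads are checked against a small signature set, and request rates against a sliding-window threshold.

```python
from collections import deque

SIGNATURES = {b"' OR 1=1 --", b"<script>"}   # toy signature set

class MiniIDS:
    """Toy IDS: signature matching plus a sliding-window rate anomaly check."""
    def __init__(self, window_s=1.0, max_per_window=100):
        self.window_s = window_s
        self.max_per_window = max_per_window
        self.times = deque()

    def inspect(self, payload, now):
        if any(sig in payload for sig in SIGNATURES):
            return "alert:signature"         # pattern matching
        self.times.append(now)
        while self.times and now - self.times[0] > self.window_s:
            self.times.popleft()             # keep only the recent window
        if len(self.times) > self.max_per_window:
            return "alert:anomaly"           # deviation from the normal rate
        return "pass"

ids = MiniIDS(max_per_window=5)
assert ids.inspect(b"GET /index.html", 0.0) == "pass"
assert ids.inspect(b"name=' OR 1=1 --", 0.1) == "alert:signature"
for i in range(6):                           # burst: 6 more requests in 60 ms
    verdict = ids.inspect(b"GET /", 0.2 + i * 0.01)
assert verdict == "alert:anomaly"
```

Note the weakness the paper points out: a port scan spread out slowly enough never exceeds the window threshold, so the anomaly check above would miss it.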
Khorshed et al. [14] proposed an approach based on machine learning over several
attack types. In cloud computing, threats are created by cyber-attacks: data loss,
account service and traffic hijacking are among them. For hybrid clouds, digital identity
management in its present form is not enough, and the unknown risk profile stems
from a lack of transparency, with audit logs provided unwillingly. The model focuses
on detecting attacks as they start or while they are occurring, and informs the user
what kind of attack is happening. Machine learning techniques have been used for the
known attacks: the technique takes proactive action to solve the problem if an attack
exists, and also informs the owner of the data and the security administration about
the known attacks. The performance information is available to cloud clients, and
evaluating attacks from this data complements both. Denial-of-functionality attacks,
dangerous internal attacks, attacks targeting shared memory, and Phishing attacks
are all possible in the cloud environment.
3. Problem Statement
The cloud environment is insecure because the cloud is hosted on the internet
and is always available, which makes it an attractive target. In the cloud, users are
authenticated with different mechanisms for web services, and different companies use
different techniques and mechanisms for authentication purposes. These
authentication techniques and mechanisms can be vulnerable if not handled by trained
IT professionals. The overhead of the manual authentication process can be handed
over to another company providing excellent authentication services after
signing a Service Level Agreement (SLA), or dealt with by IT professionals using a
proper authentication method. To solve the authentication problem, we have
proposed a model which is basically a modification of already-proposed models.
4. Traditional Model/Approach
In the initial setup of hosting web services, there are three basic components, i.e.,
Web Services, File Transfer Protocol (FTP), and Data Sharing. The main
disadvantage of the traditional model is direct access to the different resources [15].
Companies focus only on service availability, and not on secure, strong user
authentication; the issue is the same for mail services and storage services.
Companies should focus on strong, secure authentication along with service
availability. The traditional model can be seen in Figure 1.
Figure 1. Traditional Model
5. Proposed Model
Traditional models emphasize availability, but the proposed model focuses on
secure authentication along with the availability of cloud services. To
avoid direct accessibility of resources, there should be a cloud gateway with
authentication and source selection as a web interface provided by the service provider.
The user will first be authenticated and only then allowed to use the desired
services, with access restricted to those specific services; if the user fails to
authenticate, access will be denied [16]. In the proposed model, the secure
authentication mechanism is applied with the help of the cloud gateway and the
authentication server. Identity fraud remains a significant and well-known
problem on the internet. The most frequently exploited approach is gaining
access to accounts by obtaining reusable credentials for sites that have not
applied strong user authentication [17]. Many attacks take the form of Phishing
messages that masquerade as genuine ones: they appear to be sent by legitimate
corporations and contain URLs that point to fake web sites which have
exactly the same appearance as the legitimate ones [18]. The proposed model
concentrates on approaches that can be used to implement secure
authentication of online user identities. It examines practical remedies, the overall
architectural design, and promising developments. For secure user authentication,
the question is how to improve web-based user authentication techniques without
compromising usability and ubiquity, given that the web is used mainly through a
browser with limited access to the user's surroundings and devices [19-20]. The
most common solution strategies employed today involve, in more general terms,
various sorts of enhanced shared-secret or multifactor strong authentication. The
proposed model is provided in Figure 2.
Figure 2. User Authentication as a Service for Cloud Network
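To make the gateway flow concrete, here is a minimal sketch of central authentication followed by per-service authorization (entirely our illustration: the user store, token format, and service names are hypothetical, and a real deployment would use a salted password KDF and TLS rather than bare SHA-256).

```python
import hashlib
import hmac
import secrets
import time

SECRET = secrets.token_bytes(32)  # held only by the authentication server
# Hypothetical user store and per-user service grants (illustration only).
USERS = {"alice": hashlib.sha256(b"s3cret").hexdigest()}
GRANTS = {"alice": {"ftp", "storage"}}

def login(user, password):
    """Gateway step 1: authenticate and issue a signed, expiring token."""
    if USERS.get(user) != hashlib.sha256(password.encode()).hexdigest():
        return None  # authentication failed: access denied
    payload = "%s:%d" % (user, int(time.time()) + 300)  # valid for 5 minutes
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + ":" + sig

def authorize(token, service):
    """Gateway step 2: verify the token, then check the requested service."""
    if token is None:
        return False
    user, expiry, sig = token.split(":")
    good = hmac.new(SECRET, ("%s:%s" % (user, expiry)).encode(),
                    hashlib.sha256).hexdigest()
    if not hmac.compare_digest(good, sig) or time.time() > int(expiry):
        return False
    return service in GRANTS.get(user, set())

token = login("alice", "s3cret")
assert authorize(token, "ftp")                    # entitled service: allowed
assert not authorize(token, "mail")               # unentitled service: denied
assert not authorize(login("alice", "x"), "ftp")  # failed login: denied
```

The gateway admits a request only when the token's signature verifies, the token has not expired, and the requested service is in the user's grant set; a failed login yields no token, so access is denied, as the model requires.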
6. Implementation of the Model
For the implementation of the proposed model, the following procedure and tools
have been used.
6.1. System Requirements
An Intel Core i3 laptop with a 3.00 GHz processor and 8 GB of RAM has been used
for the implementation of the proposed model. For creating the cloud environment,
the Windows 7 (64-bit) operating system has been installed, followed by the
installation of VMware Workstation 9.
6.2. VMware Workstation
VMware Workstation is computer software that provides virtual machines with
services similar to those of a physical machine, including an operating system and
applications. We have installed VMware Workstation 9 on the 64-bit host operating
system running on the physical host machine. We have created three virtual
machines so that Windows 7, Windows Server 2008, and the ESXi operating system
can be installed, one on each machine. These virtual machines use and share the
resources (hard drive, RAM) of the actual physical machine. The user can run these
machines at the same time, so we now have three virtual machines and one physical
machine. The state of a virtual machine can be saved for backup and recovery using
the snapshot function provided by the VMware software.
A snapshot helps in recovery in case of crashes or other problems: the user can
easily restore the system to the state at which the snapshot was taken.
6.3. Creating new Virtual Machines
As per requirements, we have to create three virtual machines with the help of the
wizard window. The wizard provides options for typical and custom installation.
The typical installation selects the components automatically, so less overhead is
involved, but the extra components consume more memory. The installation process
provides an option to install a virtual machine with or without an operating system.
In the first step, three virtual machines have been created, and in the next step
Windows 7, Windows Server 2008 R2, and the ESXi Server have been installed.
6.4. Installation of VMware Tools
VMware Tools is a package of drivers and software for the better performance
of the guest machine. If VMware Tools is not installed properly, the guest
machine will not work as per user requirements and the user will miss the following
important functions and features.
1. The user cannot perform copy and paste operations between the host machine and the
guest machine.
2. The time of the guest and host machines will not be synchronized.
3. Data sharing will not be possible.
4. Unity between the guest and the host will not be achieved.
The process of installing the tools is very simple: the user has to power on the
desired virtual machine and select the VM Tools option. The installation wizard will
appear, through which the user can install the desired tools and restart the virtual
machine, which is then ready to use. The virtual machine has different states such
as Suspend, Resume, Power Off, Power On, and Restart.
6.5. Installation of Windows Server 2008 R2
The installation of the three virtual machines has been completed successfully, and
a machine is now ready for the installation of Windows Server 2008 R2. Windows
Server 2008 R2 is a Microsoft product that provides advanced features for setting up
a server. Its installation on a virtual machine requires certain system resources: a
64-bit processor with a minimum speed of 1.4 GHz. The memory selection depends
on the user's needs; the minimum option provides 512 MB of RAM, the
recommended option provides 2 GB of RAM, and the maximum option provides 8
GB of RAM. The graphics requirement is a Super VGA display of 800 x 600 or
higher. The required disk space is 32 GB, but if the RAM size is set to the maximum
then more disk space is required. The virtual machine has been created as per the
requirements of Windows Server 2008 R2, so the installation has not created
problems. Windows Server 2008 has been installed on a virtual machine with 2 GB
of RAM and 40 GB of hard disk, as shown in Figure 3; the start of the installation
can be seen in Figure 4. The edit option is still available, so if the installation of the
server creates problems the user can modify the specifications of the virtual machine.
Figure 3. Specifications for Windows Server 2008
Figure 4. Installation of Windows Server 2008
The installation of the tools has been carried out so that the user can perform
tasks with Windows Server 2008, which is now ready as shown in Figure 5. If the
user wants to access the ESXi server from Windows Server 2008, the installation of
vSphere on Windows Server 2008 is also compulsory.
Figure 5. Virtual Machine Having Windows Server 2008
6.6. Installation of ESXi Server
The installation of the ESXi server has been carried out on an already created virtual
machine with a 100 GB hard drive and 4 GB of RAM. The user cannot perform tasks
on ESXi directly, because ESXi is command-line based, just like DOS. If the user
needs to perform any task on the ESXi server, an additional component
needs to be installed on a separate client machine: the vSphere console, from which
the user can access and manage the ESXi server. Every task performed via vSphere
will physically be performed on the ESXi server.
Figure 6. Virtual Machine Having ESXi Server
Figure 7. Specifications of the ESXi Server
6.7. Installation of Windows 7 (64-bit)
The installation of Windows 7 (64-bit) has been carried out on an already created
virtual machine with a 2 GHz processor, 4 GB of RAM, and 40 GB of hard disk
space. vSphere has been installed on Windows 7 so that we can access the server.
The work performed on the Windows 7 client machine will be stored on the ESXi
server, and data or machines can also be added to or deleted from the ESXi server
using the client machine.
Figure 8. Windows 7 (64 Bit)
6.8. Installation of vSphere Client
The installation of the vSphere client is necessary to interact with the ESXi server
and Windows Server 2008, since the server cannot be accessed directly for storing
data or for any other access to data. To access the ESXi server from Windows
Server 2008, the vSphere software has been installed on Windows Server 2008.
During the installation of vSphere, a warning was received that the installation of
VJ-redist x64 is mandatory before proceeding; VJ-redist x64 is the Visual J#
redistributable, 64-bit software that works only with 64-bit operating systems and
supports all the server and client packages. After VJ-redist x64 was successfully
installed, the installation of vSphere was resumed, but another warning message
stated that .NET 3.5 x64 is also required. The installation of .NET 3.5 x64 was then
successfully carried out, after which the installation of vSphere started and
completed successfully. The vSphere client can be used by providing the IP address
of the ESXi server, a username, and the password; selecting the IP addresses of
both machines from the same series is important. Similarly, if the user wants to
access the server from the machine running Windows 7, the installation of vSphere
on that machine is also required, so we have installed the vSphere software on the
virtual machine with Windows 7 (64-bit).
7. Testing, Evaluation and Results
In the proposed model, an HTTP server, storage server, mail server, and media
server have been created so that the cloud services can be hosted. The servers
created on virtual machines work just like physical machines, and clients can use
the resources of the host machines. In the proposed cloud model, we provide secure
authentication to the servers and services of the storage provider, the HTTP
provider, and the video/audio streaming provider. The user sends a request to access
the cloud from a device that can be a laptop, PC, or tablet [21-22]. The request
reaches the cloud gateway, where the authentication server, a web interface, is
placed. Legitimate users are authenticated, and the desired services are provided
once the user has proved their identity to the authentication server; with a wrong
username or password, the services will not be provided. After successful
authentication, the user can access the FTP service by providing the username and
password, as in Figure 9. The important FTP services are thus kept secure from any
kind of loss (Figure 10).
Figure 9. User Authentication
Figure 10. Secured FTP Services
For the evaluation of the proposed model, data requests have been placed and the
response time has been calculated. The results show that the model works correctly
in the suggested environment: the response time was measured and high data
transfer was tested to evaluate the performance of the model. The model only
requires high-speed bandwidth to transfer data from the service-providing cloud
company to the authentication-providing cloud company, because the user will not
be able to access cloud services directly as in the traditional approach. The results
are graphically represented in Figures 11 and 12.
Figure 11. Accessing Cloud Services
Figure 12. Response Time of the ESXi Server
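A response-time measurement like the one evaluated here can be reproduced with a small harness (our sketch; the request function below is a stand-in that sleeps for about a millisecond rather than contacting a real ESXi server).

```python
import statistics
import time

def measure_response_time(request_fn, n=20):
    """Issue n requests and report mean and worst-case latency in milliseconds."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        request_fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(samples), max(samples)

def fake_request():
    # Stand-in for a real request through the cloud gateway (hypothetical).
    time.sleep(0.001)  # simulate ~1 ms of network plus server time

mean_ms, worst_ms = measure_response_time(fake_request, n=10)
assert mean_ms >= 1.0          # each call sleeps at least 1 ms
assert worst_ms >= mean_ms
```

Replacing `fake_request` with an authenticated call to the gateway would yield the figures plotted above, including the extra hop between the authentication provider and the service provider.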
8. Conclusion
In this paper, a model for providing authentication to the users of cloud computing has
been presented. Windows Server 2008, ESXi Server, and Windows 7 were installed to
create the cloud environment, and data protection for the resources has been provided
through a web-based authentication server. User authentication is an additional overhead
for companies, besides the management of the availability of cloud services; in this
paper, we have tried to reduce this overhead for companies using cloud computing. The
model has been implemented only as a prototype; in the future we will add more security
using signatures and bring more services into the circle of authentication [23-24].
References
[1] M. Shah and A. S. Shah, “Appraisal of the Most Prominent Attacks due to vulnerabilities in cloud
computing,” International Journal of Grid and Distributed Computing (IJGDC), vol. 9, no. 7, (2016), pp.
13-22.
[2] M. Uddin, J. Memon, R. Alsaqour, A. Shah, M. Z. A. Rozan, “Mobile Agent Based Multi-Layer
Security Framework for Cloud Data Centers”, Indian Journal of Science and Technology, vol. 8, no. 12,
(2015), pp.1-10.
[3] A. Waqas, A. W. Mahessar, N. Mahmood, Z. Bhatti, M. Karbasi and A. Shah, “Transaction
Management Techniques and Practices In Current Cloud Computing Environments: A Survey”,
International Journal of Database Management Systems, vol. 7, no. 1, (2015), pp. 41-59.
[4] M. Uddin, A. A. Rahman, A. Shah and J. Memon, “Virtualization Implementation Approach for Data
Centers to Maximize Performance”, Asian Network for Scientific Information (ANSINET), vol. 5, no. 2,
(2012), pp. 45-57.
[5] M. F. Ali, A. Bashar and A. Shah, “SmartCrowd: Novel Approach to Big Crowd Management using
Mobile Cloud Computing”, 2015 International Conference on Cloud Computing (ICCC), (2015), pp.1-4.
[6] A.G. Memon, S. Khawaja and A. Shah, “Steganography: A new Horizon for Safe Communication
Through XML”, Journal of Theoretical and Applied Information Technology, vol. 4 no.3, (2008), pp.
187-202.
[7] A. Shahzad, S. Musa, M. Irfan, A. Shah, “Key Encryption Method for SCADA Security Enhancement”,
Journal of Applied Sciences, vol. 14, no. 20, (2014), pp. 2498-2506.
[8] A. Shahzad, S. Musa, M. Irfan, A. Shah, “Deployment of New Dynamic Cryptography Buffer For
SCADA Security Enhancement”, Journal of Applied Sciences, vol. 14, no. 20, (2014), pp.2487-2497.
[9] N. Gruschka and M. Jensen, “Attack Surfaces: A Taxonomy for Attacks on Cloud Services”, IEEE 3rd
International Conference on Cloud Computing, (2010), pp.276-279.
[10] S. J. Stolfo, M. B. Salem and A. D. Keromytis, “Fog Computing: Mitigating Insider Data Theft Attacks
in the Cloud”, IEEE CS Security and Privacy Workshops, (2012), pp. 125-128.
[11] L. Yang, T. Zhang, J. Song, J.S. Wang and P. Chen, “Defence of DDoS Attack for Cloud Computing”,
2012 IEEE International Conference on Computer Science and Automation Engineering (CSAE), (2012),
pp.626-629.
[12] A. J. Duncan, S. Creese and M. GoldSmith, “Insider Attacks in Cloud Computing”, IEEE 11th
International Conference on Trust, Security and Privacy in Computing and Communication, (2012),
pp.857-862.
[13] D. Riquet, G. Grimaud and M. Hauspie, “Large-Scale Coordinated Attacks: Impact on the Cloud
Security”, Sixth International Conference on Innovative Mobile and Internet Services in Ubiquitous
Computing, (2012), pp.558-563.
[14] M. T. Khorshed, A. B. M. S. Ali and S. A. Wasimi, “Trust Issues That Create Threats for Cyber Attacks
in Cloud Computing”, IEEE 17th International Conference on Parallel and Distributed Systems, (2011),
pp. 900-905.
[15] T. Karnwal, T. Sivakumar and G. Aghila, “A Comber Approach to Protect Cloud Computing Against
XML DDoS and HTTP DDoS Attack”, IEEE Students’ Conference on Electrical, Electronics and
Computer Science, (2012), pp. 1-5.
[16] R. Liu and J. Li, “A Predictive Judgment Method for WLAN Attacking Based on Cloud Computing
Environment”, 2010 International Conference on Apperceiving Computing and Intelligence Analysis
(ICACIA), (2010), pp. 22-25.
[17] H. A. Kholidy and F. Baiardi, “CIDD: A Cloud Intrusion Detection Dataset for Cloud Computing and
Masquerade Attacks”, Ninth International Conference on Information Technology – New Generations,
(2012), pp. 397-402.
[18] M. H. Sqalli, F. Al-Haidari and K. Salah, “EDoS-Shield – A Two Steps Mitigation Technique against
EDoS Attacks in Cloud computing”, IEEE International Conference on Utility and Cloud Computing,
(2011), pp. 49-56.
[19] A. Iqbal, H. U. Rahman, M. U. Khan and M. Fayaz, “Secure Data in Cloud on the Basis of Sensitivity”,
Journal of Applied Environmental and Biological Sciences, vol. 6, no 2, (2016), pp. 102-108.
[20] A. Raza, M. Y. Koondhar, Sindhu, M. Hyder, G. D. Menghwar, B. Baloch and A. Shah, “Application
Service Delivery in the Modern Virtualized Data Center-Improving Reliability and Scalability”, Sindh
University Research Journal (Science Series), vol. 48, no. 3, (2016), pp. 579-584.
[21] A. W. Mahesar, Z. Bhatti, A. Waqas, M. Y. Koondhar, M. M. Rind, and S. Nizamani, “Efficient Link
Prediction Method in Dark Network Analysis”, Sindh University Research Journal (Science Series), vol.
48, no. 1, (2016), pp. 81-84.
[22] H. Nawaz, S. Soomro, S. H. Abbas, M. S. Ehsan and M. Y. Koondhar, “Simulation Based Analysis of
Handover Issues Affecting UMTS Performance”, Sindh University Research Journal (Science Series),
vol. 45, no. 4, (2013), pp. 689-696.
[23] A. S. Shah, M. N. A. Khan, F. Subhan, M. Fayaz and A. Shah, “An Offline Signature Verification
Technique using Pixels Intensity Levels”, International Journal of Signal Processing, Image Processing
and Pattern Recognition, vol. 9, no.8, (2016), pp. 205-222.
[24] A. S. Shah, M. N. A. Khan and A. Shah, “An Appraisal of Off-Line Signature Verification Techniques”,
International Journal of Modern Education and Computer Sciences, vol. 7, no. 4, (2015), pp. 67-75.
Authors
Masood Shah, an enthusiastic and high-achieving IT professional,
completed his MS degree in Computer Science at SZABIST,
Islamabad, Pakistan in 2016. He received his Bachelor of Information
Technology from Agricultural University, Peshawar, Pakistan in 2012.
He has completed short courses and diploma certificates in CCNA
(Cisco Certified Network Associate), MCSE (Windows Server 2008),
Cybercrime, Cyber Security, Networking. He is a young professional
having exceptional technical and analytical skills, with over 3 years’
experience of Computer System/ Network Administration,
Information System Support & Security, Network and Server support.
He has worked with Techno-ed Pvt Ltd Islamabad, Money Link
Exchange Peshawar, and Waseela-e-Taleem - Benazir Income
Support Programme. He is currently working as Lecturer with
Frontier Comprehensive School & College (FCS), Shergarh, Pakistan.
His research area includes Cloud Computing, Cyber Security, and
Cryptography.
Abdul Salam Shah is currently specializing in
Management Information Systems (MIS) at the Virtual University of
Pakistan. He completed his MS degree in Computer Science at
SZABIST, Islamabad, Pakistan in 2016. He received his BS degree in
Computer Science from Isra University Hyderabad, Sindh Pakistan in
2012. In addition to his degree, he has completed short courses and
diploma certificates in Databases, Machine Learning, Artificial
Intelligence, Cybercrime, Cybersecurity, Networking, and Software
Engineering. He has published articles in various journals of high
repute. He is a young professional and he started his career in the
Ministry of Planning, Development and Reforms, Islamabad Pakistan.
His research area includes Machine Learning, Artificial Intelligence,
Digital Image Processing and Data Mining.
Mr. Shah has contributed in a book titled "Research
Methodologies; an Islamic perspectives," International Islamic
University Malaysia, November, 2015.
Imran Ijaz is a Ph.D. scholar at SZABIST Islamabad, Pakistan.
His research areas are Cloud Security, PKI, and security services
through PKI under cloud infrastructure. He has supervised and
implemented a number of national-level network projects. He is
serving at Fatima Jinnah Women University, Rawalpindi, Pakistan.