This document discusses various technologies related to ubiquitous computing and their attributes. It compares technologies such as microelectronics, power supply, sensor technology, communication technology, localization technology, security technology, machine-to-machine communication, and human-machine interfaces. It discusses their attributes in terms of mobility, embeddedness, ability to form ad-hoc networks, and context awareness. The comparison highlights how each technology enables or limits ubiquitous computing applications.
A SURVEY ON TECHNIQUES REQUIREMENTS FOR INTEGRATING SAFETY AND SECURITY ENGI...IJCSES Journal
Nowadays, safety and security have become intertwined requirements for information systems, as a new generation of infrastructure systems is distributed across networks. This raises the question of whether such systems are safety-critical, especially since they were tested in closed, isolated environments and are now deployed in an uncontrollable environment, namely the internet, where the number of threats is enormous. It also motivates new development methods that take safety and security into consideration throughout the system development life cycle and, most importantly, during the identification of hazards, risks, and threats. We conduct a survey exploring the technical languages that scholars have created to combine safety and security requirements engineering with accident analysis techniques.
Survey on Mobile Cloud Computing [MCC], its Security & Future Research Challe...IRJET Journal
This document summarizes a research paper on mobile cloud computing (MCC). It begins with definitions of cloud computing and MCC, describing MCC as incorporating cloud computing resources and services into mobile environments. It then discusses security issues in MCC, categorizing them as mobile threats, cloud threats, and issues related to infrastructure and communication channels. The document proceeds to summarize 17 other research papers on topics like user authentication, open research issues, applications of MCC, and security challenges and solutions. It concludes by outlining the focus of the authors' own research on algorithms and cryptography for addressing security in MCC.
DEVELOPMENT OF A CONCEPTUAL MODEL OF ADAPTIVE ACCESS RIGHTS MANAGEMENT WITH U...IAEME Publication
The paper describes a conceptual model of adaptive control of cyber protection of an informatization object (IO). Petri nets were used as the mathematical apparatus to solve the problem of adaptive control of user access rights. A simulation model is proposed, and the simulation is performed in the PIPE v4.3.0 package. The possibility of automating the procedures for adjusting the user profile to minimize or neutralize cyber threats in informatization objects is shown. A model of the distribution of user tasks in the computer networks of the IO is proposed. Unlike existing models, it is based on the mathematical apparatus of Petri nets and contains variables that reduce the size of the state space. An access control method (ACM) is added. The additions address the reconciliation of the access rights requested by a task with the requirements of the security policy, and the degree of consistency between tasks and access to the IO nodes. The adjustment of rules and security metrics for new or redistributed tasks is described in Petri net notation.
This document analyzes the effect of compressive sensing theory and watermarking on verification and authentication performance in a multibiometric system. It proposes embedding sparse measurements of a watermark biometric image, generated using compressive sensing theory, into transform coefficients of a host biometric image. This provides two levels of security and authentication using the watermarked host image and reconstructed watermark image. Experimental results on different watermarking techniques show the proposed techniques do not affect the multibiometric system's verification performance or authentication accuracy while providing security against various attacks.
ARTIFICIAL INTELLIGENCE TECHNIQUES FOR THE MODELING OF A 3G MOBILE PHONE BASE...ijaia
The principal objective of this work is to use artificial intelligence techniques to design a predictive model of the performance of a third-generation (3G) mobile phone radio base station (RBS), based on the analysis of KPIs obtained from a statistical data set of the daily behaviour of an RBS. Various techniques, such as Decision Trees, Neural Networks and Random Forest, were used to build these models, allowing faster progress in the deep analysis of large amounts of statistical data and producing better results. The data describe the behaviour of a 3G radio base station of the Claro operator in Ecuador. For this practical case, several models based on different artificial intelligence techniques were generated to predict the performance results of a 3G radio base station, and after several tests a predictive model that determines the performance of a radio base station was created. As a conclusion of this work, it was determined that developing a predictive model based on artificial intelligence techniques is very useful for analysing large amounts of data in order to find or predict complex results more quickly and reliably. The data are KPIs of the daily and hourly performance of a 3G mobile telephony radio base station, obtained through the operator's remote monitoring and management tool, Sure call PRS.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
Novel framework for optimized digital forensic for mitigating complex image ...IJECEIAES
Digital image forensics is becoming significantly popular owing to the increasing use of images as a medium of information propagation. However, owing to the presence of various image editing tools and software, there is also an increasing threat to image content security. A review of the existing approaches to identifying traces or artifacts shows that there is large scope for optimization to further enhance processing. Therefore, this paper presents a novel framework that performs cost-effective optimization of a digital forensic technique, with the aim of accurately localizing the tampered area as well as offering the capability to mitigate attacks of various forms. The study outcome shows that the proposed system offers significantly better results than the existing system, proving that minor novelty in design attributes can induce better improvement with respect to accuracy as well as resilience toward all potential image threats.
MITIGATION TECHNIQUES TO OVERCOME DATA HARM IN MODEL BUILDING FOR ML ijaia
Given the impact of Machine Learning (ML) on individuals and society, understanding how harm might occur throughout the ML life cycle has become more critical than ever. By offering a framework to determine distinct potential sources of downstream harm in the ML pipeline, the paper demonstrates the importance of choices throughout the distinct phases of data collection, development, and deployment that extend far beyond model training. Relevant mitigation techniques are also suggested for use instead of merely relying on generic notions of what counts as fairness.
Automatic Insider Threat Detection in E-mail System using N-gram TechniqueIRJET Journal
This document summarizes research on automatic insider threat detection in email systems using n-gram techniques. It discusses how n-grams can be used to classify documents and detect potential data leakage through email. The system would verify emails sent outside the network using SHA, n-grams and thresholds. If a threat is detected, the user would be blocked. The document also provides a literature review on 10 other papers related to insider threat detection using methods like user profiling, activity logs, data mining and visualization techniques. It describes how n-grams work by breaking words into character sequences and creating profiles based on frequency to classify documents.
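The character n-gram mechanism described above (breaking words into character sequences and building frequency profiles to classify documents) can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's system; the function names and the distance measure are our own:

```python
from collections import Counter

def char_ngrams(text, n=3):
    """Slide an n-character window over the text to produce its n-grams."""
    text = text.lower()
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def ngram_profile(text, n=3, top_k=20):
    """Frequency profile: the most common character n-grams in a document."""
    return Counter(char_ngrams(text, n)).most_common(top_k)

def profile_distance(p1, p2):
    """Crude distance between two profiles: number of unshared top n-grams."""
    g1, g2 = {g for g, _ in p1}, {g for g, _ in p2}
    return len(g1 ^ g2)
```

An outgoing email could then be compared, via such a profile distance, against profiles of sensitive documents, and flagged or blocked when the distance falls below a chosen threshold.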
DEEP-LEARNING-BASED HUMAN INTENTION PREDICTION WITH DATA AUGMENTATIONijaia
Data augmentation has been broadly applied in training deep-learning models to increase the diversity of
data. This study investigates the effectiveness of different data augmentation methods for deep-learning-based human intention prediction when only limited training data is available. A human participant pitches
a ball to nine potential targets in our experiment. We expect to predict which target the participant pitches
the ball to. Firstly, the effectiveness of 10 data augmentation groups is evaluated on a single-participant
data set using RGB images. Secondly, the best data augmentation method (i.e., random cropping) on the
single-participant data set is further evaluated on a multi-participant data set to assess its generalization
ability. Finally, the effectiveness of random cropping on fusion data of RGB images and optical flow is
evaluated on both single- and multi-participant data sets. Experiment results show that: 1) Data
augmentation methods that crop or deform images can improve the prediction performance; 2) Random
cropping can be generalized to the multi-participant data set (prediction accuracy is improved from 50%
to 57.4%); and 3) Random cropping with fusion data of RGB images and optical flow can further improve
the prediction accuracy from 57.4% to 63.9% on the multi-participant data set.
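Random cropping, the augmentation this study found most effective, is simple to implement. A minimal sketch on plain nested lists (illustrative only; the study itself works on RGB images and optical flow):

```python
import random

def random_crop(image, crop_h, crop_w, rng=None):
    """Random-cropping augmentation: cut a crop_h x crop_w window from a
    2D image (list of rows) at a uniformly random position."""
    rng = rng or random.Random()
    h, w = len(image), len(image[0])
    top = rng.randint(0, h - crop_h)      # randint bounds are inclusive
    left = rng.randint(0, w - crop_w)
    return [row[left:left + crop_w] for row in image[top:top + crop_h]]

def augment(images, n_crops, crop_h, crop_w, seed=0):
    """Expand a small data set by taking n_crops random crops of each image."""
    rng = random.Random(seed)
    return [random_crop(img, crop_h, crop_w, rng)
            for img in images for _ in range(n_crops)]
```

Each original image yields several randomly shifted views, increasing data diversity without changing the label.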
DATA AUGMENTATION TECHNIQUES AND TRANSFER LEARNING APPROACHES APPLIED TO FACI...ijaia
The facial expression is the first thing we pay attention to when we want to understand a person's state of mind. Thus, the ability to recognize facial expressions automatically is a very interesting research field. In this paper, because of the small size of available training datasets, we propose a novel data augmentation technique that improves performance on the recognition task. We apply geometrical transformations and build GAN models from scratch that are able to generate new synthetic images for each emotion type. On the augmented datasets, we then fine-tune pretrained convolutional neural networks with different architectures. To measure the generalization ability of the models, we apply an extra-database protocol approach: we train models on the augmented versions of the training dataset and test them on two different databases. The combination of these techniques allows average accuracy values on the order of 85% to be reached for the InceptionResNetV2 model.
FEATURE EXTRACTION METHODS FOR IRIS RECOGNITION SYSTEM: A SURVEYijcsit
This document summarizes several feature extraction methods for iris recognition systems. It discusses supervised, unsupervised, and semi-supervised learning approaches for iris recognition. It also reviews related literature on iris recognition techniques, including using wavelet transforms, SVM classifiers, and other feature extraction methods. Tables in the document compare different biometric traits and traditional biometric systems, as well as summarize reviewed articles on iris recognition with their main contributions. The methodology section describes the typical four steps of an iris recognition system: image acquisition, preprocessing, feature extraction, and matching/recognition. It also discusses various iris recognition methods and their performance measures.
A USER PROFILE BASED ACCESS CONTROL MODEL AND ARCHITECTUREIJCNC
Personalization and the capability to adapt to the user profile are among the hottest issues in ensuring ambient assisted living and context awareness in today's environments. With the growing number of context-aware healthcare and wellbeing applications, modeling security policies becomes an important issue in the design of future access control models. This requires rich semantics, using ontology modeling, for the management of services provided to dependent people. However, current access control models remain unsuitable due to their lack of personalization, adaptability and smartness with respect to the handicap situation.
Progress of Machine Learning in the Field of Intrusion Detection Systemsijcisjournal
This document summarizes a research paper that proposes a new support vector machine (SVM) model for intrusion detection systems. The paper begins with background on machine learning and SVMs for intrusion detection. It then discusses related work applying SVMs and feature selection to intrusion detection. The proposed solution uses the CICIDS2017 dataset to select important features and train an SVM classifier to detect attacks. Testing shows the model achieves 97-99% precision in detecting attacks from the dataset. The results demonstrate the effectiveness of the proposed SVM model for intrusion detection.
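A linear SVM of the kind the proposed solution trains can be sketched in pure Python with Pegasos-style sub-gradient descent on the hinge loss. This is an illustration of the technique under arbitrary hyperparameters of our own, not the paper's model, and it omits the feature-selection step:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style sub-gradient descent for a linear SVM (labels are +1/-1)."""
    rng = random.Random(seed)
    dim = len(X[0])
    w = [0.0] * dim
    b = 0.0
    t = 0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            margin = y[i] * (sum(w[j] * X[i][j] for j in range(dim)) + b)
            if margin < 1:                  # hinge violation: step toward sample
                w = [(1 - eta * lam) * w[j] + eta * y[i] * X[i][j]
                     for j in range(dim)]
                b += eta * y[i]
            else:                           # no violation: only shrink (regularize)
                w = [(1 - eta * lam) * w[j] for j in range(dim)]
    return w, b

def predict(w, b, x):
    """Sign of the decision function."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

In an IDS setting the rows of X would be the selected flow features and y the attack/benign label.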
ADDRESSING IMBALANCED CLASSES PROBLEM OF INTRUSION DETECTION SYSTEM USING WEI...IJCNCJournal
The main issues with Intrusion Detection Systems (IDS) are their sensitivity to errors and the inconsistent and inequitable ways in which their evaluation processes are often performed. Most previous efforts focused on improving the overall accuracy of these models by increasing the detection rate and decreasing the false alarm rate, which is an important issue. Machine Learning (ML) algorithms can classify all or most records of the minor classes into one of the main classes with negligible impact on overall performance. The riskiness of the threats posed by the small classes, the shortcomings of previous efforts, and the need to improve the performance of IDSs were the motivations for this work. In this paper, a stratified sampling method and different cost-function schemes were combined with the Extreme Learning Machine (ELM) method, with kernels and activation functions, to build competitive intrusion detection solutions that improved the performance of these systems and reduced the occurrence of the accuracy paradox problem. The main experiments were performed using the UNB ISCX2012 dataset. The experimental results on this dataset showed that ELM models with a polynomial function outperform other models in overall accuracy, recall, and F-score. They also competed with the traditional model in the Normal, DoS and SSH classes.
REVIEWING PROCESS MINING APPLICATIONS AND TECHNIQUES IN EDUCATIONijaia
Process Mining (PM) emerged from business process management but has recently been applied to
educational data and has been found to facilitate the understanding of the educational process.
Educational Process Mining (EPM) bridges the gap between process analysis and data analysis, based on
the techniques of model discovery, conformance checking and extension of existing process models. We
present a systematic review of the recent and current status of research in the EPM domain, focusing on
application domains, techniques, tools and models, to highlight the use of EPM in comprehending and
improving educational processes.
ANALYZING AND IDENTIFYING FAKE NEWS USING ARTIFICIAL INTELLIGENCEIAEME Publication
The main reason behind the spread of fake news is the many fake and hyperpartisan sites present on the Internet. These fake sites try to manipulate the truth, which creates misunderstanding in society. Therefore, it is important to detect fake news and make people aware of the truth. This paper gives an insight into how to detect fake news using machine learning and deep learning techniques. On observing our data, we categorized it into five attributes, namely Title, Text, Subject, Date, and Labels. To develop an efficient fake news detection system, each feature, along with its degree of impact on the system, must be taken into consideration. This paper attempts to provide a detailed analysis of detecting fake news using various models such as LSTM, ANN, Naïve Bayes, SVM, Logistic Regression, XGBoost, and BERT.
Uncertainty Quantification with Unsupervised Deep learning and Multi Agent Sy...Bang Xiang Yong
Presented at MET4FOF Workshop, JULY 2020
I talk about our recent work combining Bayesian deep learning with Explainable Artificial Intelligence (XAI) methods. In particular, we look at Bayesian autoencoders.
Top Cited Articles in 2019 - International Journal of Network Security & Its ...IJNSA Journal
Recently, modern life has become hard to imagine without computer systems. Techniques for camouflaging information have acquired a vital role with the intensifying trade in multimedia content. Steganography is a technique that uses disguise in a way that prevents unauthorized parties from even suspecting the existence of confidential information exchanged over communication channels between the connected parties. In this paper, an integrated image steganographic system is designed to conceal images, messages, or both, with the main emphasis on improving embedding capacity by embedding text and an image simultaneously. For that purpose, matrix partitioning is used to partition the secret image; each partition is then embedded separately after scrambling each pixel by replacing the MSB instead of the LSB, providing a second level of security in addition to steganography. The simulation results show the better performance of the proposed algorithms.
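The LSB substitution that such systems build on is the classic baseline: each secret bit overwrites the least significant bit of a cover byte, so a pixel value changes by at most 1 and the change is visually imperceptible. A minimal sketch of that baseline (not the paper's matrix-partition scheme):

```python
def embed_lsb(cover, bits):
    """Classic LSB substitution: overwrite the least significant bit of each
    cover byte (e.g. a pixel channel value) with one message bit."""
    stego = list(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit   # clear LSB, then set it to the bit
    return stego

def extract_lsb(stego, n_bits):
    """Read the hidden bits back out of the stego bytes."""
    return [byte & 1 for byte in stego[:n_bits]]
```

Embedding in the MSB instead, as the abstract describes, trades this imperceptibility for robustness, which is why it is paired with pixel scrambling.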
Biometric Identification and Authentication Providence using Fingerprint for ...IJECEIAES
The rise in recent security incidents in cloud computing poses the challenge of securing its data. To address this problem, the integration of mobile with cloud computing, namely mobile biometric authentication in cloud computing, is presented in this paper. Since mobile cloud computing is popular among mobile users, biometric authentication is used to enhance security. This paper examines how mobile cloud computing (MCC) addresses the security issue with a fingerprint biometric authentication model. From the fingerprint biometric, a secret code is generated using an entropy value. This enables a person to request access to the data on the desktop computer. When the person requests access from the authorized user through Bluetooth on a mobile device, the authorized user grants access through the fingerprint secret code. Finally, the fingerprint is verified against the database on the desktop computer. If it matches, the requesting person can access the computer.
PREDICTIVE MAINTENANCE AND ENGINEERED PROCESSES IN MECHATRONIC INDUSTRY: AN I...ijaia
This document summarizes a case study on implementing predictive maintenance processes in a mechatronic industry using machine learning algorithms. A company installed sensors on a cutting machine to monitor blade status in real-time. A software platform was developed to analyze sensor data using k-Means clustering and LSTM algorithms to predict blade break conditions. The platform classified risk maps and predicted alert levels based on recent variable values. This approach aimed to optimize maintenance and reduce machine downtime for customers.
Cal-IT2 is an international partnership researching physically-coupled distributed embedded systems for large-scale societal applications. The goal is to develop robust and adaptable solutions for infrastructure like transportation and healthcare. Challenges include integrating highly distributed complex systems and legacy systems. Researchers from technology, policy, and industry are collaborating on projects in areas like energy, transportation, and structural health monitoring using sensors and predictive modeling. The community aims to autonomously respond to unexpected events through distributed computing, situational awareness, and adaptive resource management to address issues like hazard response.
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCEijesajournal
Automation is everywhere; without it, applications would not get developed. In the semiconductor industry, artificial intelligence has played a vital role in implementing chip design through automation. The main advantage of applying machine learning and deep learning techniques is to improve the implementation rate. The main objective of the proposed system is to apply deep learning with a data-driven approach to control the system, which leads to improvements in design, delay, speed of operation and cost. Through this system, the huge volume of data generated by the system is also kept under control.
This document summarizes Stefan Taubenberger's PhD research on using business process security requirements for IT security risk assessment. The research aims to determine if IT security risks can be reliably evaluated solely based on assessing adherence to security requirements, without using probabilities and events. The approach involves modeling business processes, identifying critical assets and security requirements, and evaluating how well security controls and processes meet the requirements. Preliminary validation using a reinsurance company's processes supports the idea that risks can be determined this way. The research seeks to address limitations of traditional risk assessment approaches.
Top cited article in 2019 - International Journal of Network Security & Its A...IJNSA Journal
The International Journal of Network Security & Its Applications (IJNSA) is a bimonthly open access peer-reviewed journal that publishes articles contributing new results in all areas of computer network security and its applications. The journal focuses on all technical and practical aspects of security and its applications for wired and wireless networks. Its goal is to bring together researchers and practitioners from academia and industry to focus on understanding modern security threats and countermeasures, and on establishing new collaborations in these areas.
Intrusion Detection System using Self Organizing Map: A SurveyIJERA Editor
Due to the use of computers in every field, network security is a major concern in today's scenario. Every year the number of users and the speed of networks increase, and along with them online fraud and security threats also increase. Every day a new attack is created to harm systems or networks. It is necessary to protect systems and networks from various threats by using an Intrusion Detection System (IDS), which can detect "known" as well as "unknown" attacks and generate alerts on any unusual behavior in the traffic. There are various approaches to IDS, but this paper's survey focuses on IDS using the Self-Organizing Map (SOM). SOM is an unsupervised, fast-converging, automatic clustering algorithm that is able to handle novelty detection. The main objective of the survey is to find and address the current challenges of SOM. Our survey shows that existing SOM-based IDS have a poor detection rate for U2R and R2L attacks; to improve it, a proper normalization technique should be used. During the survey we also found that HSOM and GHSOM are advanced SOM models, each with unique features that improve IDS performance. GHSOM is efficient due to its low computation time. This survey is useful for designing and developing efficient SOM-based IDS with less computation time and a better detection rate.
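A SOM reduces to a short update loop: for each input, find the best-matching unit (BMU) and pull it, together with its grid neighbours, toward that input. A minimal sketch with arbitrary hyperparameters (illustrative only, not any surveyed system):

```python
import math
import random

def train_som(data, grid_w, grid_h, epochs=30, lr=0.5, radius=1.0, seed=0):
    """Minimal Self-Organizing Map on a grid_w x grid_h node grid."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [(x, y) for x in range(grid_w) for y in range(grid_h)]
    weights = {n: [rng.random() for _ in range(dim)] for n in nodes}
    for _ in range(epochs):
        for v in data:
            # Best-matching unit: node whose weight vector is closest to the input
            bmu = min(nodes, key=lambda n: sum((weights[n][i] - v[i]) ** 2
                                               for i in range(dim)))
            for n in nodes:
                d2 = (n[0] - bmu[0]) ** 2 + (n[1] - bmu[1]) ** 2
                h = math.exp(-d2 / (2 * radius ** 2))  # neighbourhood kernel
                for i in range(dim):
                    weights[n][i] += lr * h * (v[i] - weights[n][i])
    return weights

def bmu_of(weights, v):
    """Map a (possibly novel) record to its closest SOM node."""
    return min(weights, key=lambda n: sum((weights[n][i] - v[i]) ** 2
                                          for i in range(len(v))))
```

In an IDS, `bmu_of` assigns each traffic record to a cluster; records landing far from every node's weight vector are candidates for novelty (unknown-attack) alerts.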
MOVIE SUCCESS PREDICTION AND PERFORMANCE COMPARISON USING VARIOUS STATISTICAL...ijaia
Movies are among the most prominent contributors to the global entertainment industry today, and filmmaking is among the biggest revenue-generating industries from a commercial standpoint. It is useful to divide films into two categories: successful and unsuccessful. To categorize the movies in this research, a variety of models were used, including regression models such as Simple Linear, Multiple Linear, and Logistic Regression, classification and clustering techniques such as SVM and K-Means, Time Series Analysis, and an Artificial Neural Network. These models were compared on a variety of factors, including their accuracy on the training, validation, and testing datasets, the availability of new movie characteristics, and several other statistical metrics. The study found that certain characteristics have a greater impact on the likelihood of a film's success than others; for example, the presence of the action genre may significantly affect the forecasts, while another genre, such as sport, may not. The testing dataset for the models and classifiers was taken from the IMDb website for the year 2020. The Artificial Neural Network, with an accuracy of 86 percent, is the best performing model of all those discussed.
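To make one of the compared models concrete, here is a hedged sketch of logistic regression trained by plain batch gradient descent; the feature names (normalized budget, action-genre flag) and the toy data are illustrative assumptions, not the paper's dataset:

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    """Batch gradient descent on the logistic loss.
    xs: feature vectors, ys: 0/1 labels (flop/success)."""
    w = [0.0] * len(xs[0])
    b = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for x, y in zip(xs, ys):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = p - y                       # gradient of the log loss
            for i, xi in enumerate(x):
                gw[i] += err * xi
            gb += err
        w = [wi - lr * gi / n for wi, gi in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# hypothetical films: ([normalized budget, is_action_genre], success label)
films = [([0.9, 1], 1), ([0.8, 1], 1), ([0.7, 0], 1),
         ([0.2, 0], 0), ([0.3, 0], 0), ([0.1, 1], 0)]
w, b = train_logistic([f for f, _ in films], [y for _, y in films])
```

The learned weights show the kind of per-feature impact the abstract describes: a feature that separates the classes well ends up with a larger weight than one that does not.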
Evidence of applying a physical education lesson planAniela Padilla
This document is the report of Aniela María Padilla Camacho on her participation in Physical Education activities at the José María Morelos elementary school, together with her classmate Katya Michelle Solis Martínez, for the class of Professor Celia Leticia Juárez Camacho. The report describes activities that promoted collaboration and human quality among the students, with the goal of having them respect the opinions and particularities of others in order to improve relationships within the group.
The document provides a list of indicators for evaluating a teacher's performance in 5 key areas: 1) Student motivation, 2) Instructional planning, 3) Structure of the teaching-learning process, 4) Monitoring of the process, and 5) Evaluation. Each indicator is scored from 5 to 10, and improvement proposals are requested. The goal is to provide constructive feedback that allows the teacher to optimize their pedagogical practice.
The document discusses the benefits of exercise for mental health. Regular physical activity can help reduce anxiety and depression and improve mood and cognitive functioning. Exercise boosts blood flow, releases endorphins, and promotes changes in the brain which help regulate emotions and stress levels.
Automatic Insider Threat Detection in E-mail System using N-gram TechniqueIRJET Journal
This document summarizes research on automatic insider threat detection in email systems using n-gram techniques. It discusses how n-grams can be used to classify documents and detect potential data leakage through email. The system would verify emails sent outside the network using SHA, n-grams and thresholds. If a threat is detected, the user would be blocked. The document also provides a literature review on 10 other papers related to insider threat detection using methods like user profiling, activity logs, data mining and visualization techniques. It describes how n-grams work by breaking words into character sequences and creating profiles based on frequency to classify documents.
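The character n-gram profiling described above can be sketched in a few lines: build a ranked profile of a document's most frequent n-grams and compare profiles with an out-of-place distance (the example texts are invented for illustration; the actual system also uses SHA hashing and thresholds):

```python
from collections import Counter

def ngram_profile(text, n=3, top=20):
    """Ranked profile of a document's most common character n-grams."""
    text = text.lower()
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    return [g for g, _ in grams.most_common(top)]

def profile_distance(p1, p2):
    """Out-of-place distance: sum of rank differences; n-grams missing
    from the other profile incur the maximum penalty."""
    dist = 0
    for rank, g in enumerate(p1):
        dist += abs(rank - p2.index(g)) if g in p2 else len(p2)
    return dist

# hypothetical documents: an internal file, a related outgoing email,
# and an unrelated outgoing email
internal = ngram_profile("quarterly revenue figures and client account numbers")
outgoing = ngram_profile("quarterly revenue figures attached for your review")
unrelated = ngram_profile("the weather is lovely today, see you at lunch")
```

An outgoing email whose profile sits unusually close to a sensitive internal document's profile would fall under the detection threshold and be flagged.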
DEEP-LEARNING-BASED HUMAN INTENTION PREDICTION WITH DATA AUGMENTATIONijaia
Data augmentation has been broadly applied in training deep-learning models to increase the diversity of data. This study investigates the effectiveness of different data augmentation methods for deep-learning-based human intention prediction when only limited training data is available. In our experiment, a human participant pitches a ball to one of nine potential targets, and we aim to predict which target the participant pitches the ball to. First, the effectiveness of 10 data augmentation groups is evaluated on a single-participant data set using RGB images. Second, the best data augmentation method on the single-participant data set (random cropping) is further evaluated on a multi-participant data set to assess its generalization ability. Finally, the effectiveness of random cropping on fused RGB-image and optical-flow data is evaluated on both single- and multi-participant data sets. Experiment results show that: 1) data augmentation methods that crop or deform images can improve prediction performance; 2) random cropping generalizes to the multi-participant data set (prediction accuracy improves from 50% to 57.4%); and 3) random cropping on fused RGB-image and optical-flow data further improves prediction accuracy from 57.4% to 63.9% on the multi-participant data set.
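Random cropping, the augmentation that worked best above, amounts to sampling a window position and slicing the image; a minimal sketch on a list-of-rows "image" (a stand-in for the RGB frames, not the authors' pipeline):

```python
import random

def random_crop(image, crop_h, crop_w, seed=None):
    """Randomly crop a crop_h x crop_w window from an image given as a
    2-D list of pixel rows; each call yields one augmentation sample."""
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    top = rng.randrange(h - crop_h + 1)
    left = rng.randrange(w - crop_w + 1)
    return [row[left:left + crop_w] for row in image[top:top + crop_h]]

# a 4x4 toy image with distinct pixel values 0..15
img = [[r * 4 + c for c in range(4)] for r in range(4)]
crops = [random_crop(img, 3, 3, seed=s) for s in range(5)]
```

Each crop keeps the subject but shifts its framing, which is why crop/deform augmentations help the model tolerate viewpoint variation across participants.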
DATA AUGMENTATION TECHNIQUES AND TRANSFER LEARNING APPROACHES APPLIED TO FACI...ijaia
The facial expression is the first thing we attend to when we want to understand a person's state of mind, so the ability to recognize facial expressions automatically is a very interesting research field. In this paper, because of the small size of the available training datasets, we propose a novel data augmentation technique that improves performance on the recognition task. We apply geometric transformations and build GAN models from scratch that can generate new synthetic images for each emotion type. On the augmented datasets we then fine-tune pretrained convolutional neural networks with different architectures. To measure the generalization ability of the models, we apply an extra-database protocol: we train models on the augmented versions of the training dataset and test them on two different databases. The combination of these techniques reaches average accuracy values on the order of 85% for the InceptionResNetV2 model.
FEATURE EXTRACTION METHODS FOR IRIS RECOGNITION SYSTEM: A SURVEYijcsit
This document summarizes several feature extraction methods for iris recognition systems. It discusses supervised, unsupervised, and semi-supervised learning approaches for iris recognition. It also reviews related literature on iris recognition techniques, including using wavelet transforms, SVM classifiers, and other feature extraction methods. Tables in the document compare different biometric traits and traditional biometric systems, as well as summarize reviewed articles on iris recognition with their main contributions. The methodology section describes the typical four steps of an iris recognition system: image acquisition, preprocessing, feature extraction, and matching/recognition. It also discusses various iris recognition methods and their performance measures.
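The matching/recognition step of the four-stage pipeline described above is commonly implemented as a fractional Hamming distance between binary iris codes; a toy sketch (the 8-bit codes and the 0.32 threshold, a value commonly cited in the iris literature, are illustrative):

```python
def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length binary iris codes."""
    assert len(code_a) == len(code_b)
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

# toy binary iris codes (real codes are thousands of bits)
enrolled = [1, 0, 1, 1, 0, 0, 1, 0]
probe_same = [1, 0, 1, 0, 0, 0, 1, 0]   # same eye, 1 bit of noise
probe_other = [0, 1, 0, 1, 1, 0, 0, 1]  # different eye, mostly differs

THRESHOLD = 0.32  # accept if distance below this, reject otherwise
```

A genuine comparison yields a small fractional distance (here 1/8), while an impostor comparison hovers near random disagreement, which is what makes a fixed threshold workable.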
A USER PROFILE BASED ACCESS CONTROL MODEL AND ARCHITECTUREIJCNC
Personalization and the ability to adapt to the user profile are central issues in ensuring ambient assisted living and context awareness in today's environments. With the growth of context-aware healthcare and wellbeing applications, modeling security policies becomes an important issue in the design of future access control models. This requires rich semantics, using ontology modeling, to manage the services provided to dependent people. However, current access control models remain unsuitable due to their lack of personalization, adaptability, and awareness of the user's handicap situation.
Progress of Machine Learning in the Field of Intrusion Detection Systemsijcisjournal
This document summarizes a research paper that proposes a new support vector machine (SVM) model for intrusion detection systems. The paper begins with background on machine learning and SVMs for intrusion detection. It then discusses related work applying SVMs and feature selection to intrusion detection. The proposed solution uses the CICIDS2017 dataset to select important features and train an SVM classifier to detect attacks. Testing shows the model achieves 97-99% precision in detecting attacks from the dataset. The results demonstrate the effectiveness of the proposed SVM model for intrusion detection.
ADDRESSING IMBALANCED CLASSES PROBLEM OF INTRUSION DETECTION SYSTEM USING WEI...IJCNCJournal
The main issues with Intrusion Detection Systems (IDS) are their sensitivity to errors and the inconsistent and inequitable ways in which they are often evaluated. Most previous efforts were concerned with improving the overall accuracy of these models by increasing the detection rate and decreasing the false alarm rate, which is important; however, machine learning (ML) algorithms can assign all or most records of the minor classes to one of the main classes with negligible impact on overall performance. The riskiness of the threats arising from these small classes, the shortcomings of previous efforts, and the need to improve IDS performance were the motivations for this work. In this paper, a stratified sampling method and different cost-function schemes are combined with the Extreme Learning Machine (ELM) method, with kernels and activation functions, to build competitive intrusion detection solutions that improve system performance and reduce the occurrence of the accuracy paradox. The main experiments were performed on the UNB ISCX2012 dataset, and the results show that ELM models with a polynomial function outperform other models in overall accuracy, recall, and F-score, and compete with the traditional model on the Normal, DoS, and SSH classes.
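The cost-function idea above rests on making rare classes expensive to misclassify; one standard scheme (a generic sketch, not necessarily the exact weighting the paper uses) sets each class's cost inversely proportional to its frequency:

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Cost weights inversely proportional to class frequency, so that
    rare attack classes cost more to misclassify than the majority class."""
    counts = Counter(labels)
    n = len(labels)
    # n / (num_classes * count): a balanced dataset gives every class weight 1
    return {cls: n / (len(counts) * c) for cls, c in counts.items()}

# hypothetical class distribution loosely echoing an imbalanced IDS dataset
labels = ["Normal"] * 90 + ["DoS"] * 8 + ["SSH"] * 2
weights = inverse_frequency_weights(labels)
```

Plugging such weights into the learner's loss penalizes collapsing the minority attack classes into "Normal", which is exactly the accuracy-paradox failure mode the paper targets.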
REVIEWING PROCESS MINING APPLICATIONS AND TECHNIQUES IN EDUCATIONijaia
Process Mining (PM) emerged from business process management but has recently been applied to
educational data and has been found to facilitate the understanding of the educational process.
Educational Process Mining (EPM) bridges the gap between process analysis and data analysis, based on
the techniques of model discovery, conformance checking and extension of existing process models. We
present a systematic review of the recent and current status of research in the EPM domain, focusing on
application domains, techniques, tools and models, to highlight the use of EPM in comprehending and
improving educational processes.
ANALYZING AND IDENTIFYING FAKE NEWS USING ARTIFICIAL INTELLIGENCEIAEME Publication
The main reason behind the spread of fake news is the many fake and hyperpartisan sites on the Internet. These sites try to manipulate the truth, creating misunderstanding in society; it is therefore important to detect fake news and make people aware of the truth. This paper gives an insight into detecting fake news using machine learning and deep learning techniques. On inspecting our data, we categorized it into five attributes: Title, Text, Subject, Date, and Labels. To develop an efficient fake news detection system, each feature and its degree of impact on the system must be taken into consideration. This paper provides a detailed analysis of detecting fake news using various models such as LSTM, ANN, Naïve Bayes, SVM, Logistic Regression, XGBoost, and BERT.
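Of the listed models, Naïve Bayes is compact enough to sketch end to end; the headlines below are invented stand-ins for the Text attribute, and this is a generic multinomial Naïve Bayes, not the paper's tuned system:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Multinomial Naive Bayes: per-class document counts and word counts."""
    class_docs = defaultdict(int)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in docs:
        class_docs[label] += 1
        words = text.lower().split()
        word_counts[label].update(words)
        vocab.update(words)
    return class_docs, word_counts, vocab

def classify(model, text):
    class_docs, word_counts, vocab = model
    total_docs = sum(class_docs.values())
    best, best_lp = None, -math.inf
    for label, ndocs in class_docs.items():
        lp = math.log(ndocs / total_docs)  # class prior
        total_words = sum(word_counts[label].values())
        for w in text.lower().split():
            # add-one smoothing keeps unseen words from zeroing the product
            lp += math.log((word_counts[label][w] + 1) / (total_words + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# invented training headlines labeled fake/real
docs = [("shocking miracle cure doctors hate", "fake"),
        ("celebrity secretly an alien shocking", "fake"),
        ("parliament passes budget bill today", "real"),
        ("central bank holds interest rates", "real")]
model = train_nb(docs)
```

Even this tiny model illustrates the feature-impact point from the abstract: words strongly associated with one label dominate the log-probability sum.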
Uncertainty Quantification with Unsupervised Deep learning and Multi Agent Sy...Bang Xiang Yong
Presented at MET4FOF Workshop, JULY 2020
I talk about our recent work of combining Bayesian Deep learning with Explainable Artificial Intelligence (XAI) methods. In particular, we look at Bayesian Autoencoders.
Top Cited Articles in 2019 - International Journal of Network Security & Its ...IJNSA Journal
Modern life has become hard to imagine without computer systems, and techniques for concealing information have acquired a vital role with the intensifying exchange of multimedia content. Steganography is the technique of disguising confidential information so that unauthorized parties do not even suspect its existence in the communication channels between connected parties. In this paper, an integrated image steganographic system is designed to conceal images, messages, or both, with particular attention to improving embedding capacity by embedding text and image simultaneously. For that purpose, matrix partitioning is used to split the secret image, and each partition is embedded separately after scrambling each pixel by substituting the MSB instead of the LSB, providing a second level of security on top of the steganography. Simulation results demonstrate the improved performance of the proposed algorithms.
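The paper's scheme adds MSB-based scrambling on top of the embedding; the underlying embedding idea is easiest to see in the classic LSB form, sketched here on a list of grayscale pixel values (a simplified illustration, not the proposed algorithm):

```python
def embed_lsb(cover_pixels, bits):
    """Replace each pixel's least significant bit with one message bit."""
    assert len(bits) <= len(cover_pixels)
    stego = list(cover_pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # clear LSB, then set it to the bit
    return stego

def extract_lsb(stego_pixels, nbits):
    return [p & 1 for p in stego_pixels[:nbits]]

def text_to_bits(text):
    return [int(b) for ch in text.encode() for b in format(ch, "08b")]

def bits_to_text(bits):
    chars = [int("".join(map(str, bits[i:i + 8])), 2)
             for i in range(0, len(bits), 8)]
    return bytes(chars).decode()

cover = list(range(50, 130))            # 80 toy grayscale pixel values
secret = "hi"
stego = embed_lsb(cover, text_to_bits(secret))
```

Each pixel changes by at most 1 intensity level, which is why the embedding is visually imperceptible; capacity grows with the number of cover pixels, the property the paper works to improve.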
Biometric Identification and Authentication Providence using Fingerprint for ...IJECEIAES
The rise in recent cloud computing security incidents makes securing data a pressing challenge. To address this problem, this paper presents mobile biometric authentication in cloud computing, integrating mobile devices with the cloud. Since mobile cloud computing is popular among mobile users, biometric authentication is used to enhance security. This paper examines how mobile cloud computing (MCC) handles this security issue with a fingerprint biometric authentication model. From the fingerprint biometric, a secret code is generated from its entropy value, enabling a person to request access to data on a desktop computer. When the person requests access from the authorized user over Bluetooth on a mobile device, the authorized user grants access through the fingerprint secret code. Finally, the fingerprint is verified against the database on the desktop computer; if it matches, the requesting person can access the computer.
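The abstract derives the secret code from an entropy value of the fingerprint data; the entropy computation itself can be sketched as Shannon entropy over the biometric's byte representation (the minutiae bytes and the code derivation are hypothetical illustrations):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy in bits per symbol of a byte or feature sequence."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# hypothetical fingerprint minutiae bytes
minutiae = b"\x12\x7f\x03\x55\x9a\xc4\x10\x7f"
# illustrative derivation of a numeric secret code from the entropy value
code = int(shannon_entropy(minutiae) * 1000)
```

Because the same fingerprint data always yields the same entropy, both sides can derive the same code without transmitting the raw biometric.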
PREDICTIVE MAINTENANCE AND ENGINEERED PROCESSES IN MECHATRONIC INDUSTRY: AN I...ijaia
This document summarizes a case study on implementing predictive maintenance processes in a mechatronic industry using machine learning algorithms. A company installed sensors on a cutting machine to monitor blade status in real-time. A software platform was developed to analyze sensor data using k-Means clustering and LSTM algorithms to predict blade break conditions. The platform classified risk maps and predicted alert levels based on recent variable values. This approach aimed to optimize maintenance and reduce machine downtime for customers.
Cal-IT2 is an international partnership researching physically-coupled distributed embedded systems for large-scale societal applications. The goal is to develop robust and adaptable solutions for infrastructure like transportation and healthcare. Challenges include integrating highly distributed complex systems and legacy systems. Researchers from technology, policy, and industry are collaborating on projects in areas like energy, transportation, and structural health monitoring using sensors and predictive modeling. The community aims to autonomously respond to unexpected events through distributed computing, situational awareness, and adaptive resource management to address issues like hazard response.
ANALYSIS OF SYSTEM ON CHIP DESIGN USING ARTIFICIAL INTELLIGENCEijesajournal
Automation is everywhere, and modern applications are rarely developed without it. In the semiconductor industry, artificial intelligence plays a vital role in implementing chip-based designs through automation. The main advantage of applying machine learning and deep learning techniques is to improve the implementation rate. The main objective of the proposed system is to apply deep learning with a data-driven approach to controlling the system, leading to improvements in design, delay, speed of operation, and cost. Through this system, the huge volume of data the system generates is also brought under control.
This document summarizes Stefan Taubenberger's PhD research on using business process security requirements for IT security risk assessment. The research aims to determine if IT security risks can be reliably evaluated solely based on assessing adherence to security requirements, without using probabilities and events. The approach involves modeling business processes, identifying critical assets and security requirements, and evaluating how well security controls and processes meet the requirements. Preliminary validation using a reinsurance company's processes supports the idea that risks can be determined this way. The research seeks to address limitations of traditional risk assessment approaches.
Top cited article in 2019 - International Journal of Network Security & Its A...IJNSA Journal
The document is a reflection by Ben Huyton on a project to create a 12-page booklet on the vegan diet and its health impacts. Ben stuck to his original plan and aims, creating two infographics, a multi-page article, and a fact file. He is satisfied with the end result but notes some areas for improvement, like being more creative on the contents page and altering the color scheme of the fact file. Ben effectively managed his time, creating a page every two lessons and reviewing work hourly with his tutor.
The document is a curriculum vitae for Philip Clark, who has over 20 years of experience training athletes. It outlines his education and experience as a coach and owner of several successful fitness businesses in Philadelphia. It details his experience coaching teams to championships and developing elite athletes. It also lists his qualifications, awards, publications, and appearances.
Areas of knowledge according to university degree programsfernanda lopez
The document presents a list of the main areas of knowledge according to university degree programs, including architecture, arts, biological sciences, health sciences, physical-mathematical sciences, social sciences, economics and administration, education, humanities, and engineering. For each area, the associated university degree programs are listed.
The document describes the origins and elements of rap. It emerged in the mid-20th century within the Black community of the U.S., combining African American and Puerto Rican influences in New York. Its main elements are the MC, who recites rhymes, and the DJ, who mixes records. It also includes break dancing and graffiti, and was influenced by disco music and reactions against it. In the 1980s it expanded internationally, with local scenes emerging in other countries.
This document describes the basic concepts of computer networks, including the types of networks by geographic reach, the components of a local network such as workstations, servers, and connection devices, and concepts such as IP addresses, communication protocols, and physical and logical topologies. It explains how networks allow connected computers to share resources, with examples such as Internet access and shared storage.
A computer network connects computing equipment to share information and resources through devices that send and receive data. There are local area networks (LANs) covering a building or small area, metropolitan area networks (MANs) covering a larger geographic area, and wide area networks (WANs) extending across a large geographic region. Network topologies include bus, ring, star, mesh, and tree.
Robotics is a multidisciplinary technology that draws on other sciences such as mechanics, electronics, and computing. Today, robots are used for tasks that are dangerous, repetitive, or difficult for humans, as on industrial production lines. Robotics is also applied in fields such as space exploration, mining, and search and rescue. In the future, creativity and imagination will be important for advancing this area.
This document summarizes the novel Siddhartha by Hermann Hesse. It explains that it is about a young man, Siddhartha, who seeks inner happiness through different stages of his life, including following the teachings of the samanas and living as a rich man. It also analyzes the characters Govinda and Kamala and explains how the work relates to concepts such as nirvana, om, brahman, and atman.
Pope Francis reminds us that when we give to those in need, God's mercy becomes present. When we encounter people in need on the street, there is no longer any distance between them and us, and we must help them. Faith without works is dead, so we must share the little we have through works of charity such as feeding the hungry.
This document presents the basic concepts of first-order predicate logic, including terms, atomic formulas, logical connectives, quantifiers, and rules for putting formulas into normal form. It explains how statements can be represented, and inferences derived, in this formal system.
MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACHijitcs
In the past few years it has become clear that mobile cloud computing, established by integrating mobile computing with cloud computing, adds both storage space and processing speed. Integrating healthcare applications and services is one of the data-intensive approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach integrates all of the required services and overcomes the barriers by providing both privacy and security.
Insights of effectivity analysis of learning-based approaches towards softwar...IJECEIAES
Software defect prediction is an essential part of mitigating risk in software development and is known to contribute to enhancing software quality. Various methodologies have evolved to address this problem, with learning-based methodologies the dominant contributors. The problem is that many questions remain unanswered about the practical viability of adopting such learning-based approaches in software quality management. The approaches discussed in this paper address this challenge by providing a simplified, compact, and crisp analysis of the effectiveness of learning-based schemes. The paper presents its major findings on the effectiveness of machine learning, deep learning, hybrid, and other miscellaneous approaches deployed for fault prediction, followed by a discussion of research trends. The major findings indicate that feature selection, data imbalance, interpretability, and inadequate use of context are the prime gaps in existing methods. The paper also identifies the research gaps and the essential learning outcomes of this review.
UBIQUITOUS COMPUTING Its Paradigm, Systems & Middlewarevivatechijri
This paper offers a survey of ubiquitous computing research, a developing field that weaves communication technologies into everyday activities. It provides a taxonomy of the studies within the ubiquitous computing paradigm, presents common structural principles of ubiquitous systems, and examines important developments in context-aware ubiquitous systems. In addition, this work provides a novel architecture for a ubiquitous computing system and an evaluation of the sensors needed for applications in ubiquitous computing. The goals of this work are threefold: i) to serve as a guide for researchers who are new to ubiquitous computing and want to contribute to this research area, ii) to provide a novel system architecture for ubiquitous computing, and iii) to offer further research directions necessary for quality-of-service assurance in ubiquitous computing.
Review of Business Information Systems – Fourth Quarter 2013 V.docxmichael591
This document discusses security threats in cloud computing based on a case study interview with an IT manager. The interviewee's company uses both private and public clouds. The document identifies 41 security threats from literature and classifies them from technical and business perspectives. Based on the interview, the major drivers for using cloud computing were improving business continuity, reducing costs through virtualization and disaster recovery, and utilizing high bandwidth. The interview helped explore the dimensions of security threats in cloud computing beyond what is described in existing research.
Analysis of Homomorphic Technique and Secure Hash Technique for Multimedia Co...IJERA Editor
This document discusses security techniques for multimedia content systems stored in the cloud. It analyzes homomorphic encryption and secure hash techniques. The document provides an overview of homomorphic encryption, describing how it allows computation on encrypted data while preserving privacy. It also reviews related work applying homomorphic encryption and searchable encryption to securely store and process multimedia files in cloud systems. The goal is to push workloads in a fully homomorphic encrypted form to cloud storage while maintaining privacy and flexibility.
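The homomorphic property the document reviews, computing on data without decrypting it, can be illustrated with textbook RSA, which is multiplicatively homomorphic; the tiny parameters below are insecure toy values for demonstration only, not the surveyed fully homomorphic scheme:

```python
# Textbook RSA with tiny, insecure parameters (demonstration only).
p, q, e = 61, 53, 17
n = p * q                  # modulus, 3233
phi = (p - 1) * (q - 1)    # 3120
d = pow(e, -1, phi)        # private exponent (modular inverse of e)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

# multiplicative homomorphism: Enc(m1) * Enc(m2) mod n == Enc(m1 * m2 mod n),
# so the product is computed entirely on ciphertexts
m1, m2 = 7, 11
product_cipher = (enc(m1) * enc(m2)) % n
```

Decrypting `product_cipher` yields `m1 * m2` even though the multiplication happened on encrypted values; fully homomorphic schemes extend this idea to both addition and multiplication, which is what lets a cloud process multimedia workloads it cannot read.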
SECURE CLOUD COMPUTING MECHANISM FOR ENHANCING: MTBACijistjournal
With the development of cloud systems, a large number of vendors can serve their users on the same platform, directing their focus to the software rather than the underlying infrastructure. This requires the distribution, storage, and analysis of data on the cloud through virtualized, scalable web services; with the broad application of the cloud, data security and access control become a major concern. Access to the cloud requires authorization as well as data-access permissions. Verifying and updating those permissions and the data itself must be done with proper knowledge, which requires identifying legitimate updates and blacklisting intruders who would introduce false data into the system. In this paper we build a mutual trust relationship between users and the cloud for an access control method in the cloud computing environment, focusing on system integrity and security. The proposed approach is executed as a stepwise procedure to establish a user's credibility in the cloud network.
This document provides a survey of mobile cloud computing. It discusses the definitions and models of cloud computing including infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS). Mobile cloud computing is defined as using cloud infrastructure for data storage and processing instead of on the mobile device. The document reviews several research papers on mobile cloud computing and discusses challenges like network latency, limited bandwidth, security, and privacy. It proposes architectures and techniques to address these challenges and better utilize mobile cloud computing.
A PRIVACY PROTECTION SCHEME TO TRANSMIT MEDICAL DATA FROM WEARABLE DEVICES TO...AM Publications
Demand for wearable devices has grown sharply with the development of cloud and cloudlet technology, creating a broad need to offer better medical care. Processing patient medical information across systems involves several phases, such as data collection, data storage, and data sharing. A traditional healthcare system transfers medical data, including the user's sensitive information, to the cloud, which also incurs communication energy consumption. Medical data sharing is therefore one of the most challenging issues, and this research provides a solution to it. A novel healthcare system is developed that exploits the flexibility of cloudlets, which provide privacy protection, data sharing, and intrusion detection. The NTRU (Number Theory Research Unit) method first encrypts the user's information gathered through wearable devices; this information is then transmitted to the nearest cloudlet in an energy-efficient way. The research also proposes a new trust model that helps users choose trustworthy partners with whom to share the information stored in the cloudlet; this model mainly helps patients suffering from similar health problems to communicate with each other. The user's medical information stored in the hospital's remote cloud is divided into three parts, each secured separately. To protect the healthcare system from malicious attacks, the research develops a novel collaborative IDS (Intrusion Detection System) based on a cloudlet mesh, which can secure the remote healthcare big-data cloud from a variety of attacks. Finally, the NTRU and AES-Rijndael algorithms are combined to attain more robust functionality.
The scheme was implemented with Java technologies to show that it remains effective.
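The encrypt-then-transmit flow described above can be pictured with a minimal sketch. NTRU and AES-Rijndael are not in the Python standard library, so a SHA-256-based XOR keystream stands in for the real ciphers here; this toy construction is for illustration only and is not secure for production use.

```python
import hashlib, json, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key || nonce || counter blocks.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_reading(key: bytes, reading: dict) -> dict:
    # Serialize the wearable reading, XOR it against the keystream, and
    # package nonce + ciphertext for transmission to the nearest cloudlet.
    plaintext = json.dumps(reading).encode()
    nonce = os.urandom(16)
    cipher = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return {"nonce": nonce.hex(), "data": cipher.hex()}

def decrypt_reading(key: bytes, packet: dict) -> dict:
    # Inverse operation, performed by the cloudlet holding the same key.
    nonce = bytes.fromhex(packet["nonce"])
    cipher = bytes.fromhex(packet["data"])
    plain = bytes(c ^ k for c, k in zip(cipher, keystream(key, nonce, len(cipher))))
    return json.loads(plain)
```

The `encrypt_reading`/`decrypt_reading` names and the packet layout are illustrative, not from the paper.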
The advancements in cloud computing, and the benefits it offers service providers, have increased the deployment of traditional applications to the cloud. Once deployed, applications often need migration for various reasons: from development infrastructure to operational infrastructure, or from one operational instance to another for load balancing, and the cycle continues where DevOps is used as the development strategy for cloud computing applications. Advocates of hybrid and public clouds observe that cloud computing makes it possible for organizations to avert or minimize upfront IT infrastructure expenses. Proponents also assert that cloud computing gets business software up and running faster, with improved manageability and less maintenance, empowering IT teams to rapidly adapt tools to meet varying and unpredictable requirements. DevOps is a set of practices that automates the processes between software development and IT teams so that they can build, test, and release software faster and more reliably. The idea of DevOps is founded on building a culture of collaboration between teams that traditionally worked in relative silos; its promised benefits include increased trust, faster software releases, the ability to resolve critical issues quickly, and better management of unplanned work. Accordingly, this work identifies the need for multiple security protocols during the complete life cycle of cloud application development and deployment, and proposes a novel framework for the automatic selection and deployment of security protocols during cloud service deployments. The framework identifies the required security aspects and selects appropriate security algorithms for virtual machines. The proposed framework demonstrates nearly 80% improvement in security policy deployment time.
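The selection step, choosing security algorithms per virtual machine from identified needs, can be sketched as a declarative rule table. The profile keys and protocol names below are illustrative assumptions, not the paper's actual framework.

```python
def select_protocols(vm_profile: dict) -> list:
    """Pick security protocols for a VM from simple declarative rules.

    Each rule pairs a predicate over the VM profile with the protocol
    to deploy when the predicate holds. Rules and keys are illustrative.
    """
    rules = [
        (lambda p: p.get("handles_pii", False), "AES-256 storage encryption"),
        (lambda p: p.get("public_endpoint", False), "TLS 1.3 termination"),
        (lambda p: p.get("inter_vm_traffic", False), "mutual TLS between services"),
        (lambda p: p.get("compliance") == "PCI", "90-day key rotation"),
    ]
    return [protocol for predicate, protocol in rules if predicate(vm_profile)]
```

A real framework would attach deployment actions to each rule; the table form keeps selection auditable and easy to extend.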
Building a Distributed Secure System on Multi-Agent Platform Depending on the...CSCJournals
Today, applications in mobile multi-agent systems require a high degree of confidence that code running inside the system will not be malicious, and any malicious agents must be identified and contained. Since the inception of mobile agents, intrusion has been addressed with a multitude of techniques, but many implementations have only addressed concerns from the position of either the platform or the agents; very few approaches have tackled mobile agent security from both perspectives simultaneously. Furthermore, no middleware exists that provisions the required security qualities of mobile agent software while extensively focusing on easing the software development burden. The aim is to build a distributed secure system using multi-agents by applying the principles of software engineering. The objective of this paper is to introduce a multi-agent system that enforces security rules through access rights, building a distributed secure system integrated with the software engineering system life cycle, and satisfying the security access rights of both platform and agents to improve the three agent characteristics of adaptivity, mobility, and flexibility. The project is based on PHP and MySQL (database) and can be presented as a website. Implementation and testing were carried out on both Linux and Windows platforms, including Linux Red Hat 8, Linux Ubuntu 6.06 LTS, and Microsoft Windows XP Professional; since PHP and MySQL are available in almost all operating systems, the result can be tested on any platform where PHP and MySQL are configured. PHP5 and MySQL are used to build a secure website, multiple security and authentication techniques are used by the multi-agent system, and stored credentials are hashed using MD5.
The system also satisfies the key security requirements: confidentiality (protection from disclosure to unauthorized persons), integrity (maintaining data consistency), and authentication (assurance of the identity of the person or originator of data).
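The credential-storage step can be sketched as follows. The paper stores plain MD5 digests; this sketch shows the salted PBKDF2 form that serves the same authentication purpose more safely (function names are illustrative).

```python
import hashlib, hmac, os

def hash_password(password: str, salt: bytes = None) -> tuple:
    # Salted, iterated hash; the salt defeats rainbow tables, the
    # iteration count slows brute force. MD5 (used in the paper) has
    # neither property and is considered broken for this purpose.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Constant-time comparison prevents timing side channels.
    return hmac.compare_digest(hash_password(password, salt)[1], digest)
```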
This document discusses an ontology-based context-sensitive software security knowledge management modeling approach. It begins with an introduction describing the need for secure software development practices and security management systems. It then reviews related work incorporating ontologies and context modeling for software security. The proposed method involves an ontology-based context model with two parts: a software security domain model and an application context model. It describes the components of each model and establishes a hierarchical relationship between them. Finally, it discusses criteria for context-driven security modeling, including usability and quality. The overall aim is to develop a framework that assists practitioners in software security analysis and decision making based on application context.
In the software industry, two software engineering development best practices coexist: open-source and closed-source software. The former has a shared code base to which anyone can contribute, whereas the latter has a proprietary code base that only the owner can access. Software reliability is crucial in the industry when a new product or update is released. Applying meta-heuristic optimization algorithms to closed-source software reliability prediction has produced significant and accurate results. Now that open-source software dominates the landscape of cloud-based systems, providing results on open-source software reliability, as a quality indicator, would greatly help solve the open-source software reliability growth-modelling problem.
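Reliability growth modelling of the kind discussed above typically fits a model such as Goel-Okumoto, m(t) = a(1 - e^(-bt)), to observed cumulative failures. As a minimal stand-in for the meta-heuristic optimizers mentioned, the sketch below fits the two parameters by random search (all names and ranges are illustrative assumptions):

```python
import math, random

def goel_okumoto(t, a, b):
    # Expected cumulative failures by time t: a = total expected
    # failures, b = per-fault detection rate.
    return a * (1.0 - math.exp(-b * t))

def fit_random_search(times, failures, iters=20000, seed=1):
    # Random search over (a, b) minimizing squared error; a crude
    # stand-in for genetic algorithms, PSO, and similar meta-heuristics.
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    hi = max(failures)
    for _ in range(iters):
        a = rng.uniform(hi, 3 * hi)
        b = rng.uniform(1e-3, 1.0)
        err = sum((goel_okumoto(t, a, b) - f) ** 2 for t, f in zip(times, failures))
        if err < best_err:
            best, best_err = (a, b), err
    return best
```

Real studies would use hold-out data and goodness-of-fit measures rather than raw squared error alone.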
BRAIN. Broad Research in Artificial Intelligence and NeuroscienceVannaSchrader3
ISSN: 2068-0473 | e-ISSN: 2067-3957
Covered in: Web of Science (WOS); PubMed.gov; IndexCopernicus; The Linguist List; Google Academic; Ulrichs; getCITED;
Genamics JournalSeek; J-Gate; SHERPA/RoMEO; Dayang Journal System; Public Knowledge Project; BIUM; NewJour;
ArticleReach Direct; Link+; CSB; CiteSeerX; Socolar; KVK; WorldCat; CrossRef; Ideas RePeC; Econpapers; Socionet.
2020, Volume 11, Issue 4, pages: 149-167 | https://doi.org/10.18662/brain/11.4/146
Adaptation and Effects of Cloud Computing on Small Businesses
Damla KARAGOZLU¹, Janet AJAMU², Anne Bakupa MBOMBO³
1 Assistant Professor, Near East University, Department of Computer Information Systems, Mersin 10, Turkey, [email protected]
2 Near East University, Department of Computer Information Systems, Mersin 10, Turkey, [email protected]
3 Near East University, Department of Computer Information Systems, Mersin 10, Turkey, [email protected]
Abstract: Cloud computing is one of the most popular technologies in industry today, and its use in small businesses is now fashionable; it has a remarkable impact on them by bringing a new way of working, accessing their data, and storing it. The aim of this study is to examine the impacts and effects of migration to cloud computing on small businesses. The methodology is based on a review of past studies, which addresses each research question of this study in broad terms. The results show that cloud computing offers small businesses several benefits if adopted, including flexibility, cost reduction, and automatic software/hardware upgrades. Cloud computing's large storage capacity helps small businesses store and access large amounts of data quickly and easily, and a number of cloud data recovery techniques help recover lost or damaged data in the cloud. As with any technology, security is a major concern for data stored in the cloud; techniques such as encryption and cryptography are mostly used to ensure that the key areas of data security (data confidentiality, data integrity, and data availability) are protected. The results of this study help small businesses understand the impact of cloud computing on their operations and the security measures to take while adopting cloud computing technology.
Keywords: Cloud computing; security in cloud computing; small businesses; cloud computing in business; adaptation.
How to cite: Karagozlu, D., Ajamu, J., & Mbombo, A.B.
(2020). Adaptation and Effects of Cloud Computing on Small
Businesses. BRAIN. Broad Research in Artificial Intelligence and
Neuroscience, 11(4), 149-167.
https://doi.org/10.18662/brain/11.4/146
Maintaining Secure Cloud by Continuous Auditingijtsrd
Increases in cloud computing capacity, as well as decreases in the cost of processing, are moving at a fast pace. These patterns make it incumbent upon organizations to keep pace with changes in technology that significantly influence security. Cloud security auditing depends upon the environment, and the rapid growth of cloud computing is an important new context in world economics. The small price of entry, bandwidth, and processing power capability means that individuals and organizations of all sizes have more capacity and agility to exercise shifts in computation and to disrupt industry in cyberspace than in more traditional domains of business economics worldwide. An analysis of prevalent cloud security issues and the utilization of cloud audit methods can mitigate security concerns. This verification methodology indicates how to use frameworks to review cloud service providers (CSPs). The key barrier to widespread uptake of cloud computing is the lack of trust in clouds by potential customers. While preventive controls for security and privacy are actively researched, there is still little focus on detective controls related to cloud accountability and auditability. The complexity resulting from large-scale virtualization and data distribution carried out in current clouds has revealed an urgent research agenda for cloud accountability, as has the shift in focus of customer concerns from servers to data.
M. Kanimozhi, A. Aishwarya, S. Triumal, "Maintaining Secure Cloud by Continuous Auditing", International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume-2, Issue-3, April 2018. URL: http://www.ijtsrd.com/papers/ijtsrd10829.pdf http://www.ijtsrd.com/engineering/computer-engineering/10829/maintaining-secure-cloud-by-continuous-auditing/m-kanimozhi
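The detective-control idea behind continuous auditing can be sketched as a recurring check of observed resource settings against a declared policy. The resource and policy shapes below are illustrative assumptions, not from the paper.

```python
def audit(resources: list, policy: dict) -> list:
    """One audit pass: flag every resource setting that deviates
    from policy. In continuous auditing this runs on a schedule
    and findings feed an accountability trail."""
    findings = []
    for res in resources:
        for key, required in policy.items():
            observed = res.get(key)
            if observed != required:
                findings.append((res["id"], key, observed, required))
    return findings
```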
This document proposes a multi-agent system architecture for reacting to security alerts in heterogeneous distributed networks. The architecture has three layers - a low level that interfaces with the target infrastructure, an intermediate level that correlates alerts from different domains and deploys reaction actions, and a high level global view. It uses an ontology and Bayesian network based decision support system to help agents make decisions according to preferences and influence diagrams. The approach is illustrated using a case study of a medical application distributed across buildings, campuses and metropolitan areas.
A security decision reaction architecture for heterogeneous distributed networkchristophefeltus
This document proposes a multi-agent system architecture for reacting to security alerts in heterogeneous distributed networks. The architecture has three layers - low, intermediate, and high - and consists of agents that perform alert correlation, reaction decision making, and policy deployment. The agents communicate by exchanging messages. The architecture is intended to allow for quick and efficient reaction to security attacks while ensuring coordinated configuration changes across network components. It was developed and illustrated using a case study of a medical application distributed across buildings, campuses, and metropolitan areas.
A Systematic Literature Review On Cloud Computing Security Threats And Mitig...Claire Webber
This systematic literature review examines research on cloud computing security threats and mitigation strategies published between 2010 and 2020. The review identified 7 major security threats to cloud services, including data tampering, data leakage, and issues with data storage and intrusion. Data tampering and leakage were highly discussed topics. The findings also indicated that outsourcing data remains a challenge and suggested blockchain as a technology that could help address security issues. The review revealed needs to improve data confidentiality, integrity, and availability in future work.
A survey of predicting software reliability using machine learning methodsIAESIJAI
In light of technical and technological progress, software has become an urgent need in every aspect of human life, including the medicine sector and industrial control. Therefore, it is imperative that the software always works flawlessly. The information technology sector has witnessed a rapid expansion in recent years, as software companies can no longer rely only on cost advantages to stay competitive in the market, but programmers must provide reliable and high-quality software, and in order to estimate and predict software reliability using machine learning and deep learning, it was introduced A brief overview of the important scientific contributions to the subject of software reliability, and the researchers' findings of highly efficient methods and techniques for predicting software reliability.
The document presents a proposed framework for user authentication in mobile cloud environments called Dynamic Key Based User Authentication (DKBUA). The framework uses a dynamic key generation algorithm with six phases: registration, communication, key generation, key sending, encryption/decryption, and authentication. The algorithm is designed to be lightweight to reduce computation load. It also uses encryption/decryption to securely transmit communications. An analysis of existing authentication mechanisms is provided and the proposed framework is claimed to be resilient against denial of service attacks, known plaintext attacks, masquerading attacks, and insider attacks.
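The key-generation and authentication phases outlined above can be sketched with standard-library HMAC as a stand-in for DKBUA's unspecified primitives; all names, message layouts, and phase mappings here are illustrative assumptions.

```python
import hashlib, hmac

def derive_session_key(shared_secret: bytes, nonce: bytes) -> bytes:
    # Key-generation phase: a fresh session key per login derived from
    # a pre-registered secret and a per-session nonce, so no static key
    # ever crosses the channel.
    return hmac.new(shared_secret, b"session" + nonce, hashlib.sha256).digest()

def prove(session_key: bytes, challenge: bytes) -> bytes:
    # Authentication phase: the client answers the server's random
    # challenge with a MAC only the session-key holder can compute.
    return hmac.new(session_key, challenge, hashlib.sha256).digest()

def verify(shared_secret: bytes, nonce: bytes, challenge: bytes, response: bytes) -> bool:
    # Server-side check; constant-time compare resists timing attacks.
    expected = prove(derive_session_key(shared_secret, nonce), challenge)
    return hmac.compare_digest(expected, response)
```

Because the challenge is random per session, a captured response cannot be replayed, which is one way such schemes resist known-plaintext and masquerading attacks.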
A systems engineering methodology for wide area network selectionAlexander Decker
This document describes a study that applies the Analytic Hierarchy Process (AHP) to help a company select the best wide area network (WAN) solution based on their requirements. The document provides background on AHP and reviews related literature on using multi-criteria decision making for selection problems. It then outlines the steps of AHP, including constructing a hierarchy, performing pairwise comparisons, and calculating weights and consistency. Finally, it describes how AHP could be applied to help the hypothetical company evaluate WAN alternatives and select the optimal solution.
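The AHP steps described, pairwise comparison, weight calculation, and consistency checking, can be sketched as follows. The row geometric-mean method is a standard approximation of the principal eigenvector, and the random-index values are the usual Saaty constants; the WAN criteria themselves are left abstract.

```python
import math

def ahp_weights(matrix):
    # Approximate priority weights via the row geometric-mean method,
    # then normalize so the weights sum to 1.
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(matrix, weights):
    # lambda_max from A·w ≈ lambda·w; CI = (lambda_max - n)/(n - 1);
    # CR = CI / RI, where RI is Saaty's random index for size n.
    n = len(matrix)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri[n] if ri[n] else 0.0
```

A CR below 0.1 is conventionally taken to mean the pairwise judgments are consistent enough to trust the resulting weights.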
Similar to UBIQUITOUS COMPUTING AND SCRUM SOFTWARE ANALYSIS FOR COMMUNITY SOFTWARE
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTjpsjournal1
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon
reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been
referred to as the "New Great Game." This research centres on the power struggle, considering
geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil
politics, and conventional and nontraditional security are all explored and explained by the researcher.
Using Mackinder's Heartland, Spykman Rimland, and Hegemonic Stability theories, examines China's role
in Central Asia. This study adheres to the empirical epistemological method and has taken care of
objectivity. This study analyze primary and secondary research documents critically to elaborate role of
china’s geo economic outreach in central Asian countries and its future prospect. China is thriving in trade,
pipeline politics, and winning states, according to this study, thanks to important instruments like the
Shanghai Cooperation Organisation and the Belt and Road Economic Initiative. According to this study,
China is seeing significant success in commerce, pipeline politics, and gaining influence on other
governments. This success may be attributed to the effective utilisation of key tools such as the Shanghai
Cooperation Organisation and the Belt and Road Economic Initiative.
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...shadow0702a
This document serves as a comprehensive step-by-step guide on how to effectively use PyCharm for remote debugging of the Windows Subsystem for Linux (WSL) on a local Windows machine. It meticulously outlines several critical steps in the process, starting with the crucial task of enabling permissions, followed by the installation and configuration of WSL.
The guide then proceeds to explain how to set up the SSH service within the WSL environment, an integral part of the process. Alongside this, it also provides detailed instructions on how to modify the inbound rules of the Windows firewall to facilitate the process, ensuring that there are no connectivity issues that could potentially hinder the debugging process.
The document further emphasizes the importance of checking the connection between the Windows and WSL environments, providing instructions on how to ensure that the connection is optimal and ready for remote debugging.
It also offers an in-depth guide on how to configure the WSL interpreter and files within the PyCharm environment. This is essential for ensuring that the debugging process is set up correctly and that the program can be run effectively within the WSL terminal.
Additionally, the document provides guidance on how to set up breakpoints for debugging, a fundamental aspect of the debugging process which allows the developer to stop the execution of their code at certain points and inspect their program at those stages.
Finally, the document concludes by providing a link to a reference blog. This blog offers additional information and guidance on configuring the remote Python interpreter in PyCharm, providing the reader with a well-rounded understanding of the process.
Advanced control scheme of doubly fed induction generator for wind turbine us...IJECEIAES
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024Sinan KOZAK
Sinan from the Delivery Hero mobile infrastructure engineering team shares a deep dive into performance acceleration with Gradle build cache optimizations. Sinan shares their journey into solving complex build-cache problems that affect Gradle builds. By understanding the challenges and solutions found in our journey, we aim to demonstrate the possibilities for faster builds. The case study reveals how overlapping outputs and cache misconfigurations led to significant increases in build times, especially as the project scaled up with numerous modules using Paparazzi tests. The journey from diagnosing to defeating cache issues offers invaluable lessons on maintaining cache integrity without sacrificing functionality.
Discover the latest insights on Data Driven Maintenance with our comprehensive webinar presentation. Learn about traditional maintenance challenges, the right approach to utilizing data, and the benefits of adopting a Data Driven Maintenance strategy. Explore real-world examples, industry best practices, and innovative solutions like FMECA and the D3M model. This presentation, led by expert Jules Oudmans, is essential for asset owners looking to optimize their maintenance processes and leverage digital technologies for improved efficiency and performance. Download now to stay ahead in the evolving maintenance landscape.
artificial intelligence and data science contents.pptxGauravCar
What is artificial intelligence? Artificial intelligence is the ability of a computer or computer-controlled robot to perform tasks that are commonly associated with the intellectual processes characteristic of humans, such as the ability to reason.
Software Engineering and Project Management - Introduction, Modeling Concepts...Prakhyath Rai
Introduction, Modeling Concepts and Class Modeling: What is Object orientation? What is OO development? OO Themes; Evidence for usefulness of OO development; OO modeling history. Modeling
as Design technique: Modeling, abstraction, The Three models. Class Modeling: Object and Class Concept, Link and associations concepts, Generalization and Inheritance, A sample class model, Navigation of class models, and UML diagrams
Building the Analysis Models: Requirement Analysis, Analysis Model Approaches, Data modeling Concepts, Object Oriented Analysis, Scenario-Based Modeling, Flow-Oriented Modeling, class Based Modeling, Creating a Behavioral Model.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet has forced the United Nations and governments to promote green energies and electric transportation. The deployment of photovoltaic (PV) and electric vehicle (EV) systems has gained strong momentum due to their numerous advantages over fossil fuel types, advantages that go beyond sustainability to include financial support and stability. The work in this paper introduces a hybrid PV-EV system to support industrial and commercial plants. The paper covers the theoretical framework of the proposed hybrid system, including the equations required to complete the cost analysis when PV and EV are present, and presents the proposed design diagram that sets the priorities and requirements of the system. The proposed approach allows setups to improve their power stability, especially during power outages, and the presented information supports researchers and plant owners in completing the necessary analysis while promoting the deployment of clean energy. The result of a case study representing a dairy milk farm supports the theoretical work and highlights the approach's benefits for existing plants. The short return on investment supports the paper's novel approach to a sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line, which enhances the safety of the electrical network.
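The return-on-investment claim above rests on a standard payback computation, which can be sketched as follows; the function name and all figures in the test are illustrative, not the paper's case-study data.

```python
def simple_payback_years(capex: float, annual_energy_savings: float,
                         annual_om_cost: float = 0.0) -> float:
    # Years until cumulative net savings cover the upfront PV + EV
    # investment. Ignores discounting; a full analysis would use NPV.
    net = annual_energy_savings - annual_om_cost
    if net <= 0:
        raise ValueError("project never pays back at these figures")
    return capex / net
```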
Batteries -Introduction – Types of Batteries – discharging and charging of battery - characteristics of battery –battery rating- various tests on battery- – Primary battery: silver button cell- Secondary battery :Ni-Cd battery-modern battery: lithium ion battery-maintenance of batteries-choices of batteries for electric vehicle applications.
Fuel Cells: Introduction- importance and classification of fuel cells - description, principle, components, applications of fuel cells: H2-O2 fuel cell, alkaline fuel cell, molten carbonate fuel cell and direct methanol fuel cells.
UBIQUITOUS COMPUTING AND SCRUM SOFTWARE ANALYSIS FOR COMMUNITY SOFTWARE
International Journal of Software Engineering & Applications (IJSEA), Vol.7, No.6, November 2016
DOI : 10.5121/ijsea.2016.7608
UBIQUITOUS COMPUTING AND SCRUM SOFTWARE
ANALYSIS FOR COMMUNITY SOFTWARE
Atif Bilal1, Zubair Nabi2, Muhammad Awais3, Muhammad Yahya Saeed4 and Maliha Chaudhary5
1,2,3,4 Department of Software Engineering, Government College University, Faisalabad,
5 University of Agriculture, Faisalabad.
ABSTRACT
Ubiquitous computing has special properties compared to traditional desktop computing systems. In ubiquitous computing, more computational power resides in the environment: using any device, everything is available and accessible. Many machines provide an interface to a single user by hiding all of the devices in the background. Software engineering has great significance in the domain of pervasive computing, yet there is no common base-level architecture that is used and adequate for all kinds of ubiquitous computing applications. Because few software engineering approaches have been identified, the fundamental problem is to propose a general-level architecture. Toward this goal, an initial step was taken to seek out all the plausible software engineering challenges in the many ubiquitous computing applications developed so far, and to highlight all the approaches that have ever been used to build ubiquitous applications. This survey will be very useful to future researchers in constructing a general-level architecture for ubiquitous applications.
KEYWORDS
Ubiquitous, methodologies, engineering
1. INTRODUCTION
1.1 Review of Literature
Whang et al. (2007) propose the use of a framework for the validation of AmI-based UbiCom applications. The methodology relies on the use and development of Multi-Agent Based Simulations (MABS). The proposal is motivated by AmI-based UbiCom applications that involve huge numbers of users or dangerous environments; such applications are not suitable for validation with real tests. A review of the related work concludes that neither real tests nor earlier simulation approaches can handle applications with very large numbers of users. The paper aims to show that MABS is an ideal technology for validating UbiCom designs before their deployment. The study also concludes that there is an inherent difficulty in the analysis of MABS that can be eased with technologies such as data mining or social network analysis; forensic analysis is proposed as a support for these technologies to aid MABS analysis. The main contribution presented is a methodology to develop MABS in order to validate AmI-based UbiCom applications [1].
Mutahdi et al. (2010) introduced a context-aware access control mechanism that utilizes threshold
cryptography and multilayer encryption to provide a dynamic and truly distributed method for
access control. We simulate our access control scheme and show that access control decisions can
be made in a timely manner even as we increase key and file sizes. This mechanism is closely
coupled with the context-capturing services and security policy service resulting in a fully
context-aware and seamless access control mechanism for typical ubiquitous computing
scenarios. In addition, trust is distributed throughout the ubiquitous computing infrastructure. We
believe that contextual awareness can enrich traditional security mechanisms with greater
flexibility and expressiveness power and enable a variety of security services [2].
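The threshold cryptography that underpins the access control mechanism above can be illustrated with a minimal Shamir secret-sharing sketch: a secret (e.g. a decryption key) is split so that any k of n context services can jointly reconstruct it, distributing trust throughout the infrastructure. This is a generic illustration, not the authors' implementation; the field prime and parameters are assumptions.

```python
import random

PRIME = 2**127 - 1  # Mersenne prime defining the share field

def make_shares(secret: int, k: int, n: int, rng=random.Random(0)):
    # Encode the secret as the constant term of a random degree-(k-1)
    # polynomial; each share is a point on that polynomial.
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(k - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term;
    # any k distinct shares suffice, fewer reveal nothing.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret
```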
Serrano et al. (2010) introduced a new methodology based on the use of Multi-Agent Based
Simulations (MABS) for testing and validation of Ambient Intelligence based Ubiquitous
Computing (UbiCom) systems. An ambient intelligence based UbiCom is a pervasive system in
which services have some intelligence in order to smoothly interact with users immersed in the
environment. The motivation for this methodology is its application in UbiCom large-scale
systems where large numbers of users are involved and in applications which deal with dangerous
environments. This paper also proposes two techniques for the analysis of general complex
MABSs: forensic analysis and the use of simpler simulations [3].
Kim et al. (2010) derived the security requirements of a child-care and safety service and
establish a conceptual model satisfying the requirements. Based on the system model, we propose
a privacy-preserving location supporting protocol for a child-care and safety service using
wireless sensor networks. While addressing the above problems, our protocol can be operated
over various networks (e.g., Wi-Fi and UWB) providing an RSSI (received signal strength
indication) without any modification. Through performance and security analysis of our protocol,
we show that our protocol is efficient and secure [4].
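The protocol above works with any network providing an RSSI. As a rough illustration of how such readings support location, an RSSI value can be mapped to a distance estimate with the log-distance path-loss model; the reference RSSI at 1 m and the path-loss exponent below are assumed values that would be calibrated per environment, and this sketch is not part of the authors' protocol.

```python
def rssi_to_distance(rssi_dbm: float, rssi_at_1m: float = -45.0,
                     path_loss_exp: float = 2.7) -> float:
    # Log-distance model: RSSI(d) = RSSI(1 m) - 10 * n * log10(d),
    # solved for d. n ~ 2 in free space, higher indoors.
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))
```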
Emilio et al. (2010) present another methodology based on the use of Multi-Agent Based Simulations (MABS) for testing and validation of Ambient Intelligence based Ubiquitous Computing (UbiCom) systems. An ambient-intelligence-based UbiCom is a pervasive system in which services have some intelligence so as to interact smoothly with users immersed in the environment. The motivation for this methodology is its application in large-scale UbiCom systems, where vast numbers of users are involved, and in applications that deal with dangerous situations. The paper additionally proposes two techniques for the analysis of general complex MABSs: forensic analysis and the use of simpler simulations. Furthermore, the methodology proposes to inject components of the real UbiCom system into the simulated world to increase confidence in the validation process. The proposal is illustrated with a detailed case study that considers a building on a campus and an AmI service for evacuation in case of fire [5].
Ville (2011) describes the negotiation and struggle over the meaning of privacy in the context of the proposed rise of the "ubiquitous computing society," which refers to a vision of a society in which computer technology, such as cheap microchips and wireless networks, has been seamlessly integrated into everyday objects and activities. As an illustration of the renegotiation of the concept of privacy that develops with ubiquity, the news coverage of the 2011 Apple location-tracking scandal was analyzed from a discourse-analytical point of view. Using the concept of a mediated scandal, the articulation of privacy was examined in relation to the media as the site of the social negotiation concerning privacy. Two competing discourses concerning privacy were identified. In the social discourse, privacy was understood as negotiable under the changing conditions that technological development produces. In a fundamental discourse, technological development was articulated in relation to the fundamental and universal right to privacy. The study suggests two contrasting understandings of how privacy would be renegotiated in this process of change as the ubiquitous computing society develops [6].
Hallsteinsen et al. (2012) present the MUSIC project, which has provided a coherent set of
solutions that ease the task of developing context-aware, self-adaptive systems for mobile and
ubiquitous computing environments. While most of the individual aspects and pieces of building
self-adaptive systems have been addressed before in other research projects, MUSIC stands out
because it has looked at the complete picture and has delivered a comprehensive and practical
development framework, methodology, and execution platform. The results of MUSIC have
already been adopted by follow-on projects, and the authors expect that the development of
self-adaptive applications and corresponding frameworks will grow in various directions in the
future. Extending the MUSIC framework for intelligent multimedia content adaptation is one
challenging research avenue, while other efforts will seek to ensure that the adaptations are as
dependable as possible in order to bring autonomic computing to operation-critical applications.
The current version of the MUSIC middleware offers only basic support for security because the
project deliberately centred on the adaptation aspects described [7].
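Self-adaptive middleware of the kind MUSIC provides is often described as a loop that monitors context, evaluates candidate application variants with a utility function, and switches to the best one. The following sketch illustrates only that general idea; the names (`utility`, `select_variant`) and the bandwidth-based rule are invented for illustration and are not the MUSIC API.

```python
# Minimal sketch of a context-aware adaptation loop.
# All names and rules here are hypothetical, not the MUSIC middleware API.

def utility(variant, context):
    """Score an application variant against the current context."""
    if context["bandwidth_kbps"] < 100:
        return 1.0 if variant == "text_only" else 0.2
    return 1.0 if variant == "rich_media" else 0.5

def select_variant(variants, context):
    """Plan step: pick the variant with the highest utility."""
    return max(variants, key=lambda v: utility(v, context))

variants = ["rich_media", "text_only"]
print(select_variant(variants, {"bandwidth_kbps": 50}))    # low bandwidth
print(select_variant(variants, {"bandwidth_kbps": 2000}))  # high bandwidth
```

In a real middleware the loop would run continuously, re-evaluating utilities whenever the sensed context changes.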
Aysegul et al. (2013) address how important the success of configuration decisions is. In order to
meet the challenges of the problem, they propose an approach based on machine learning, in
which the behaviour of the data mining algorithm in varying circumstances is modelled and then
used to configure the algorithm. In their approach, the achieved data mining quality is part of the
behaviour model, so that it can be assessed whether the configuration quality goals are reached or
not. In particular, adapting to changing conditions by building a new behaviour model of the data
mining process is possible whenever the current behaviour model falls short of achieving the
configuration quality goals [8].
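The core idea of a learned behaviour model can be sketched with a deliberately simple stand-in: predict, from past (configuration, achieved-quality) observations, whether a new configuration is likely to meet a quality goal. The 1-nearest-neighbour predictor, the feature pair, and the numbers below are invented for illustration and are not the authors' actual method.

```python
# Sketch: assess whether a configuration will meet a quality goal,
# using past (configuration, achieved-quality) observations.
# Features, data, and the 1-NN choice are illustrative assumptions.

def predict_quality(history, config):
    """1-nearest-neighbour estimate of achieved quality for `config`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, quality = min(history, key=lambda item: dist(item[0], config))
    return quality

# Past runs: ((window_size, min_support), achieved accuracy).
history = [((10, 0.1), 0.91), ((50, 0.3), 0.72), ((20, 0.15), 0.88)]

goal = 0.85
meets_goal = predict_quality(history, (15, 0.12)) >= goal
print(meets_goal)
```

Whenever predictions stop matching observed quality, the history (and thus the model) would be rebuilt, mirroring the adaptation step described above.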
Ezequiel et al. (2015) propose that Scrum in undergraduate software engineering courses is an
effective way to update, and add value to, professional training. In their paper, they verify that
students' knowledge of Scrum improved when students were given suitable instructional
strategies according to the processing dimension of the students' learning styles. The experiments
were carried out in the context of an undergraduate software engineering course and provide
evidence to support the matching hypothesis. In particular, the experiments revealed that students
who were given learning opportunities in line with their learning preferences achieved better
educational results [9].
Florin et al. (2015) describe how the recent expansion of cloud systems has prompted resource
management solutions to be adapted to large numbers of widely distributed and heterogeneous
datacentres. Their special issue presents advances in virtual machine assignment and placement,
multi-objective and multi-constraint job scheduling, resource management in federated clouds
and in heterogeneous environments, dynamic topologies for data distribution, workflow execution
improvement, energy-efficiency techniques, and assurance of Service Level Agreements [10].
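Constrained VM placement of the kind surveyed there can be illustrated at toy scale with a first-fit-decreasing heuristic over a single CPU-capacity constraint. The hosts, capacities, and VM sizes below are invented, and this sketch is not any particular algorithm from the surveyed work.

```python
# Sketch: first-fit-decreasing placement of VMs onto hosts under a
# single CPU-capacity constraint. All sizes are illustrative.

def place(vms, capacities):
    """Return {vm_name: host_index}, or None if some VM does not fit."""
    free = list(capacities)
    plan = {}
    # Placing the largest VMs first reduces fragmentation.
    for name, cpu in sorted(vms.items(), key=lambda kv: -kv[1]):
        for host, cap in enumerate(free):
            if cpu <= cap:
                free[host] -= cpu
                plan[name] = host
                break
        else:
            return None  # constraint violated: VM fits on no host
    return plan

vms = {"web": 2, "db": 4, "cache": 1}
print(place(vms, [4, 4]))
```

Real schedulers handle several resources (CPU, memory, network) and objectives (energy, SLA risk) at once, but the bin-packing core is the same shape.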
1.2 Materials and Methods
Ubiquitous computing has unique properties when compared with traditional desktop computing
systems. In ubiquitous computing, more computational power is available and accessible in the
environment through any device. Many machines provide an interface to a single user by hiding
all of the devices in the background. Software engineering has particular significance in the field
of ubiquitous computing. There is no common-level architecture in use that is adequate for all
kinds of ubiquitous computing applications. Because few software engineering approaches have
been identified, this is the fundamental issue at the root of proposing a general-level architecture.
To reach this goal, a first step was taken to survey all of the software engineering challenges in
the many ubiquitous computing applications that have been developed so far, and to highlight all
of the approaches that have ever been used to build ubiquitous applications. This research should
prove very helpful for future researchers building a general-level architecture for such
applications.
Here we compare all of these technologies from different perspectives.
2. CONTRAST BETWEEN ATTRIBUTES OF TOOLS OR PRODUCTS
The attributes of the different technologies are contrasted below:
Mobility

Microelectronics: Microelectronics deals with the miniaturization, development, production and
use of integrated circuits. These circuits are small and can easily be moved from one place to
another.

Power Supply: Supplying energy to electronic systems is a fundamental precondition for using
ubiquitous computing applications. Advances in chip technology and hardware development have
therefore consistently increased mobility.

Sensor Technology: Sensors are used as electronic components in ubiquitous computing. A
sensor is a tiny device, so the devices in which sensors are used are also small and easy to move
from one place to another.

Communication Technology: Technology and market-trend experts recognize communication
technology as a key driver for ubiquitous computing. This technology has increased mobility and
is easy to change and move.

Localization Technology: One attractive feature of ubiquitous computing is that equipping smart
objects with suitable transmitters and receivers enables precise localization. This technology has
increased mobility.

Security Technology: An important element of ubiquitous computing is that almost all smart
objects can exchange data. Security is therefore key in ubiquitous computing, and is generally
provided using wireless devices.

Machine-to-Machine Communication: Ubiquitous computing systems will be highly distributed
systems with thousands or millions of spontaneously interacting components. These components
are small and robust.

Human-Machine Interface: The smart objects of ubiquitous computing require engineers who
design user interfaces that move beyond the formerly dominant screen/keyboard paradigm. This
technology has a strong mobility feature.
Embeddedness

Microelectronics: On the whole, microelectronics is a mature and widely available technology
and does not pose any bottlenecks for ubiquitous computing. Its devices are embedded and can be
used in many directions.

Power Supply: Even so, for most applications the power supply is the largest and heaviest
component, and the biggest limitation on use. Devices can nevertheless be embedded in it and
used in different ways and in different directions.

Sensor Technology: The key elements of sensor development today include reducing the size and
weight of sensors and sensor systems, integration of sensors into complex semiconductor
systems, miniaturization, embeddedness and power consumption.

Communication Technology: In this technology, different technologies are embedded and used
in different ways, such as satellite technology, telephony, radio engineering, communication
engineering and switching technology.

Localization Technology: There are at present three kinds of localization systems:
satellite-supported, cellular-supported and indoor localization systems. These are merged or
embedded with other devices for better operation and understanding, which leads to powerful
working of the technology.

Security Technology: In security technology, different devices are used and embedded to achieve
high security and to make security more powerful and safe. By using different powerful devices,
data can be protected from unauthorized access and used safely.

Machine-to-Machine Communication: Different standards are used in this technology, and
standardization is very important. Under different standards, different devices are connected to
each other and embedded in different ways to achieve good performance. Embeddedness is very
important in this technology.

Human-Machine Interface: Technology experts believe that human-machine interfaces will play
a fairly central role in the development of ubiquitous computing. Different devices are merged in
this technology, which has strong embeddedness.
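Localization systems of the kinds listed above (satellite-supported, cellular and indoor) all reduce, at their core, to estimating a position from distances to known anchors. A minimal 2-D trilateration sketch follows; the anchor positions and measured ranges are invented for illustration.

```python
# Sketch: 2-D trilateration from three anchors with known positions
# and measured distances (all values are illustrative).

def trilaterate(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting pairs of circle equations gives two linear
    # equations in (x, y), solved here by Cramer's rule.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Anchors at three corners; the true position is (1, 1).
print(trilaterate((0, 0), (4, 0), (0, 4), 2**0.5, 10**0.5, 10**0.5))  # ≈ (1.0, 1.0)
```

Real systems must additionally cope with noisy ranges (hence least-squares over more than three anchors), which this sketch ignores.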
Ad Hoc Networks

Microelectronics: Microelectronics can be used in ad-hoc networks: many devices, in varying
numbers, can be connected for different types of communication. This technology gives us the
benefit of small-scale networking.

Power Supply: Power supply technology also supports ad-hoc networking: many devices, in
varying numbers, can be connected for different types of communication, again with the benefit
of small-scale networking.

Sensor Technology: Sensor technology is one of the most powerful technologies today and can
be used in every field. It can be used with ad-hoc networks, and can itself play the role of an
ad-hoc network by connecting different devices.

Communication Technology: Communication technology is the basis of networking. Different
networking standards are used in this technology, and it can serve ad-hoc networking by
connecting and merging different devices.

Localization Technology: Many technology experts feel that localization technology is the least
relevant for ubiquitous computing among all of these fields. It can nevertheless be used with
ad-hoc networks very efficiently and can also operate as small, self-organized networks intended
for small-scale use.

Security Technology: When in doubt, it is difficult to create systems with high security levels in
ubiquitous computing because of the high complexity and the networking of huge numbers of
dissimilar units. By using security technology, however, such networks can be made secure.

Machine-to-Machine Communication: As the networking of smart objects becomes progressively
more complex, the potential for threats also rises sharply. It is no longer possible to explicitly
program all interactions among the objects, because there are simply too many plausible
combinations.

Human-Machine Interface: The human-machine interface is among the most powerful
technologies today and can be used in every field. It can be used with ad-hoc networks, and can
play the role of an ad-hoc network by connecting different devices.
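Ad-hoc networking among such devices is often bootstrapped by controlled flooding: each node rebroadcasts a message to its radio neighbours exactly once. The toy topology below is invented; this is only an illustration of the principle, not any particular routing protocol.

```python
# Sketch: controlled flooding in an ad-hoc network, modelled as an
# undirected graph of radio neighbours (topology is illustrative).
from collections import deque

def flood(neighbours, source):
    """Return the set of nodes a broadcast from `source` reaches."""
    reached, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        for peer in neighbours.get(node, ()):
            if peer not in reached:      # each node rebroadcasts only once
                reached.add(peer)
                queue.append(peer)
    return reached

topology = {"A": ["B"], "B": ["A", "C"], "C": ["B"], "D": []}
print(sorted(flood(topology, "A")))  # D is out of radio range
```

The "rebroadcast only once" check is what keeps the flood from looping forever, which is exactly the kind of interaction rule that, as noted above, cannot be hand-programmed per device pair at scale.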
Context Awareness

Microelectronics: Context-aware systems are a component of a pervasive or ubiquitous
computing environment. Microelectronics also makes use of context awareness and is capable of
supporting context-aware systems.

Power Supply: Mobile and wireless systems with power autonomy have become important
recently. Power supply technology also makes use of context awareness and is capable of
supporting context-aware systems.

Sensor Technology: Mobile and isolated systems with power autonomy have become critical
only recently, and context awareness is very important for this purpose. The technology is used in
sensor devices, which make good use of it.

Communication Technology: Communication technology is mostly combined with information
technology and referred to collectively as ICT to highlight the overlap between the two fields.
The two fields are combined and used with context-awareness technology for better performance.

Localization Technology: One exciting fact of ubiquitous computing is that equipping smart
objects with suitable transmitters and receivers enables precise localization. Localization
technology also uses context awareness for better communication and results.

Security Technology: The basic objectives of a security mechanism are to guarantee
confidentiality, integrity, non-repudiation, availability, anonymity and authenticity.
Context-awareness technology is used to reach this goal in an efficient and secure way.

Machine-to-Machine Communication: The standardization of proper machine-machine interfaces
and their development are therefore especially important for ubiquitous computing.
Context-awareness technology is used in this communication for better results.

Human-Machine Interface: The smart objects of ubiquitous computing require engineers who
design UIs that move beyond the formerly dominant screen/keyboard paradigm.
Context-awareness technology is used to achieve this.
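A context-aware component, in the sense used throughout this comparison, reacts to sensed context rather than explicit user input. A minimal rule-based sketch makes the idea concrete; the sensor names, profiles and thresholds below are invented for illustration.

```python
# Sketch: rule-based context awareness — choose a device behaviour
# from sensed context. Sensor names and thresholds are illustrative.

def choose_profile(context):
    """Map a dictionary of sensed context values to a behaviour."""
    if context.get("location") == "meeting_room":
        return "silent"
    if context.get("ambient_light_lux", 0) < 10:
        return "night"
    return "normal"

print(choose_profile({"location": "meeting_room"}))
print(choose_profile({"location": "office", "ambient_light_lux": 3}))
```

Production systems usually replace such hand-written rules with learned models, but the interface is the same: context in, adapted behaviour out.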
Autonomy

Microelectronics: Context-aware systems are a component of a pervasive or ubiquitous
computing environment. Microelectronics also makes use of context awareness and is capable of
supporting context-aware systems.

Power Supply: Mobile and wireless systems with power autonomy have become vital recently.
Power supply technology also makes use of context awareness and is capable of supporting
context-aware systems.

Sensor Technology: Capturing and breaking down the real world is one of the central
characteristics of ubiquitous computing, and context awareness is very important for this purpose.
The technology is used in sensor devices, which make good use of it.

Communication Technology: Communication technology is mostly combined with information
technology and referred to collectively as ICT in order to stress the overlap between the two
fields. The two fields are combined and used with context-awareness technology for better
performance.

Localization Technology: One intriguing fact of ubiquitous computing is that furnishing smart
objects with proper transmitters and receivers enables precise localization. Localization
technology also uses context-awareness technology for better communication and results.

Security Technology: The principal objectives of a security mechanism are to guarantee
confidentiality, integrity, non-repudiation, availability, anonymity and authenticity.
Context-awareness technology is used to reach this goal in an efficient and secure way.

Machine-to-Machine Communication: The standardization of proper machine-machine interfaces
and their development are thus extremely important for ubiquitous computing.
Context-awareness technology is used in this communication for better results.

Human-Machine Interface: The smart objects of ubiquitous computing require engineers who
design UIs that move beyond the formerly dominant screen/keyboard paradigm.
Context-awareness technology is used to achieve this.
3. RESULTS AND DISCUSSIONS
Today, ubiquitous computing is still a vision of the future. Broad development work will be
necessary to realize most of its traits, for instance autarkic power supply, machine-machine
communication, human-machine interfaces and security technologies. Besides RFID-based
logistics and security systems, not many ubiquitous computing applications are currently
available. However, the spread and use of the Internet and mobile phones over the past decade
suggests how quickly ICT can make an impact and even change broad parts of society.
In the short run, ubiquitous computing is the continuation of the Internet. The development of
ubiquitous computing is characterized by two qualities that may appear contradictory at first
glance. On the one hand, only a few ubiquitous computing applications exist at present. On the
other hand, many experts assume that various applications will be realized within the next one to
five years. Most likely, these early smart things will offer an integration of different capabilities,
which will combine, in particular, certain recognition capabilities and data exchange by means of
mobile broadband, enabling connection to the Internet.
As a logical consequence, the first ubiquitous computing applications will most likely draw
heavily on what is already established on the Internet. Audiovisual and data communication will
converge, existing media breaks will be overcome, and the possibility of digital communication
will become ubiquitous. Ubiquitous computing services will be called up by means of a large
number of everyday electronic devices, while the services themselves will be provided by a
central, Internet-based IT infrastructure. The close connection between web services and the goals
of ubiquitous computing is likewise reflected in what are expected to be its early uses. Since
mobility is a central characteristic in the early phase of ubiquitous computing, one can expect that
enabling Internet connectivity from any device will be the main focus. Smart objects will thereby
represent an extension of available online services. In the coming years, ubiquitous computing
will be embodied not by the refrigerator or stove automatically connecting to recipes, but by
ubiquitous access to information and services available over the Internet. Over the long haul,
ubiquitous computing will noticeably change processes in the personal, economic and public
domains. In the long term, as ubiquitous computing is integrated into everyday objects and they
become networked, information and control systems that are currently centralized in industry,
transport, the service sector and the public sector can be decentralized and coordinated. In the
personal domain, ubiquitous ICT will bring novel capabilities to everyday objects and more
opportunities to communicate anytime and anywhere. To gain a sense of the potential and the
limits of imminent ubiquitous computing, its central precursor, the Internet, can serve as a model.
Through the spread of the Internet in society and the economy, various social and economic
processes have indeed changed substantially or even fundamentally. In economic terms, however,
one cannot speak of a purely Internet-based culture and economy.
Still, there are individual industries that are undergoing significant change and that find
themselves in crisis because of digitalization and the Internet. The music and film industries
furnish the most striking example, since their products, which are already available in digital
form anyway (CD or DVD), can easily be copied to other digital media.
Based on experience with the Internet and its steady development and expansion, as well as on
the attributes of ubiquitous computing, one can identify central fields that will shape ubiquitous
computing in the future. The often implicit actions and the high degree of networking in
ubiquitous computing make it hard to explicitly perceive and control system activities. This
makes it all the more important for users to be able to thoroughly trust the services and content of
ubiquitous computing.
This trust involves:
• The non-repudiation and correct content of services
• The safety of information in ubiquitous computing, and
• The careful treatment of personal data
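Of the trust requirements above, the integrity and authenticity of content can be illustrated with a message authentication code; Python's standard `hmac` module suffices for a sketch (the key and message are placeholders).

```python
# Sketch: detecting tampering with an HMAC (key/message are placeholders).
import hmac, hashlib

key = b"shared-secret"     # placeholder; a real system needs key management
message = b"meter-reading=42"

tag = hmac.new(key, message, hashlib.sha256).digest()

# The receiver recomputes the tag and compares in constant time.
print(hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest()))
print(hmac.compare_digest(tag, hmac.new(key, b"meter-reading=99",
                                        hashlib.sha256).digest()))
```

Note that an HMAC covers integrity and authenticity only; non-repudiation requires asymmetric signatures, since anyone holding the shared key can forge a valid tag.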
Implementation of these standards should rely equally on technological, organizational and
regulatory instruments.
On the technological side, there is still a great need for research and development in many areas.
A wide range of challenges awaits industry, science and research funding, particularly with
respect to autarkic power supply, the human-machine interface and security technology.
The formulation and implementation of technical standards will exert a decisive influence on the
introduction and impact of ubiquitous computing. Open standards, like PC architecture in the
1980s and the Internet in the 1990s, appear to be especially well suited to separating the
development of the infrastructure from that of the products built on it.
This will make it possible to create a wide range of freely combinable ubiquitous computing
products that compete intensively with each other. Technology suppliers for the ubiquitous
computing infrastructure, service providers, user associations and standards organizations are
therefore called upon to agree on suitable open standards, in order to counter tendencies toward
monopoly.
Like the Internet, ubiquitous computing is not beyond the law. User acceptance of a system will
depend heavily on rules that guarantee the necessary trust in its services and content. These
include ensuring that business and legal transactions are legally binding, defining liability for
services and faulty content that cause damages, and securing the privacy of data. Even though
ubiquitous computing does not, in principle, create any new requirements beyond those of other
distributed information and communication systems, its invisibility and omnipresence will
generally require that current rules be adapted. This is especially clear with respect to data
protection. It remains to be seen whether the current instruments for data protection will prove
adequate in the long run, given the complexity of data processing and networking in ubiquitous
computing. Here, service providers must create transparency mechanisms, for instance, so that the
individual can anticipate the long-term consequences of accepting or rejecting an exchange of
data.
Because of its functional logic, ubiquitous computing cannot be confined to individual countries.
Globally uniform rules are therefore a key prerequisite for the widespread introduction of the
technology. At present, however, very different regulatory approaches hinder this in such
domains as consumer law, data protection, freedom of speech and of the press, and intellectual
property law. On top of this come very varied approaches to how such rules ought to be
implemented, for instance as voluntary commitments, certifications or laws. Globally uniform
standards are urgently required here, with effective instruments for verifying and enforcing
compliance. Otherwise, certain ubiquitous computing services with specific access barriers may
be accessible or usable only in certain countries, while they are banned entirely in others, as is
presently the case in China with restrictions on Internet access and search engines.
A key question in building up a ubiquitous computing infrastructure is how to guarantee simple,
cost-effective and unhindered access to services. This is especially critical when certain public
and commercial services are offered exclusively, or at least preferentially, by means of ubiquitous
computing systems. This could affect purchasing age-restricted products like alcohol and tobacco,
for instance, or accessing healthcare services. Especially with regard to public services, care must
be taken that society is not split into participants and non-participants in ubiquitous computing.
Ubiquitous computing can potentially change production and business processes significantly.
This can mean not only that processes will become increasingly decentralized and flexible, but
also that newly available information will increase market transparency. This would tend to
favour small and flexible organizations. On the other hand, large organizations are better
positioned to make the significant investments initially required for ubiquitous computing and to
set standards for cross-organization ubiquitous computing infrastructures that benefit themselves
most of all. Companies, their associations and economic policymakers are called upon to ensure
that the opportunities of ubiquitous computing are equally open to all organizations.
Considering that ubiquitous computing is in its infancy and the underlying technologies are at
best partly available, or not available at all, it is hard to predict most of its social ramifications. In
any case, the technological vision of ubiquitous computing is clearly recognizable in its outlines,
and its technical implementation and application within the following ten years appears largely
feasible; this will involve fundamental economic challenges, as discussed above. Clearly,
ubiquitous computing is a profoundly dynamic paradigm that harbours enormous potential. The
challenge lies in following its development attentively, and shaping it systematically to exploit its
beneficial outcomes while avoiding its possible harmful impacts as much as can reasonably be
expected.
REFERENCES
[1] M. Weiser. 1993. Hot topics: ubiquitous computing, Computer 26 (10) 71–72.
[2] ISO. 1996. Security frameworks for open systems: Access control framework, ISO/IEC 10181-3.
[3] S. Jajodia, P. Samarati, V. Subrahmanian, E. Bertino. 1997. A unified framework for enforcing
multiple access control policies, in: Proceedings of the 1997 ACM SIGMOD International
Conference on Management of Data, ACM Press, pp. 474–485.
[4] R. Sandhu. 1998. Role activation hierarchies, in: Third ACM Workshop on Role-Based Access
Control, ACM Press, pp. 33–40.
[5] D. Goldschlag, M. Reed, P. Syverson. 1999. Onion routing for anonymous and private Internet
connections, Communications of the ACM 42 (2) 39–41.
[6] E. Bertino, S. Castano, E. Ferrari, M. Mesiti. 1999. Controlled access and dissemination of XML
documents, in: Proceedings of the Second International Workshop on Web Information and Data
Management, ACM Press, pp. 22–27.
[7] E. Damiani, S. De Capitani di Vimercati, S. Paraboschi, P. Samarati. 2001. Fine grained access
control for SOAP e-services, in: Proceedings of the Tenth International Conference on World Wide
Web, ACM Press, pp. 504–513.
[8] P. Viswanathan, B. Gill, R. Campbell. 2001. Security architecture in Gaia, Tech. Rep., Champaign,
IL, USA.
[9] W. Edwards, M. Newman, J. Sedivy. 2001. Building the ubiquitous computing user experience,
in: CHI ’01 Extended Abstracts on Human Factors in Computing Systems, ACM Press, New York,
NY, USA, pp. 501–502.
[10] J. Park, R. Sandhu. 2002. Towards usage control models: beyond traditional access control, in:
Proceedings of the Seventh ACM Symposium on Access Control Models and Technologies, ACM
Press, pp. 57–64.
[11] J. Al-Muhtadi, R. Campbell, A. Kapadia, M. Mickunas, S. Yi. 2002. Routing through the mist:
privacy preserving communication in ubiquitous computing environments, in: ICDCS ’02:
Proceedings of the 22nd International Conference on Distributed Computing Systems, IEEE
Computer Society, Washington, DC, USA, p. 74.
[12] G. Sampemane, P. Naldurg, R. Campbell. 2002. Access control for active spaces, in: ACSAC ’02:
Proceedings of the 18th Annual Computer Security Applications Conference, IEEE Computer
Society, Washington, DC, USA.