The evolution of the Internet of Things (IoT) has brought about several challenges for existing hardware, network, and application development. Some of these are handling real-time streaming and batch big data, real-time event handling, dynamic cluster resource allocation for computation, and wired and wireless networks of things. To combat these technicalities, many new technologies and strategies are being developed. Tiarrah Computing integrates the concepts of Cloud Computing, Fog Computing, and Edge Computing. Its main objectives are to decouple application deployment and achieve high performance, flexible application development, high availability, ease of development, and ease of maintenance. Tiarrah Computing focuses on using existing open-source technologies to overcome the challenges that evolve along with IoT. This paper gives an overview of these technologies, shows how to design your application, and elaborates how to overcome most of the existing challenges.
Fog Computing: Issues, Challenges and Future Directions (IJECEIAES)
In cloud computing, all processing of the data collected by a node is done in the central server. This takes considerable time, as data has to be transferred from the node to the central server before it can be processed there. It is also not practical to stream terabytes of data from the node to the cloud and back. To overcome these disadvantages, an extension of cloud computing known as fog computing was introduced. Here, data is processed entirely in the node if it does not require high computing power, and partially if it does, after which the data is transferred to the central server for the remaining computations. This greatly reduces the time involved and is more efficient, as the central server is not overloaded. Fog is quite useful in geographically dispersed areas where connectivity can be irregular. The ideal use case requires intelligence near the edge, where ultra-low latency is critical, and this is what fog computing promises. The concepts of cloud computing and fog computing are explored and their features contrasted to understand which is more efficient and better suited for real-time applications.
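The node-versus-cloud split described in this abstract can be sketched as a simple offload policy. This is a hypothetical illustration of the general idea, not the paper's method; the capacity figure and cost units are assumptions:

```python
def route(task_cost, node_capacity):
    """Decide where a sensor-data task runs under the fog model:
    fully on the node when it fits, otherwise split node + cloud."""
    if task_cost <= node_capacity:
        return "node"          # full processing at the fog node
    return "node+cloud"        # partial local pre-processing, rest in the cloud

def process(task_costs, node_capacity=1_000):
    """Tag each task with its placement; costs are arbitrary work units."""
    return [(cost, route(cost, node_capacity)) for cost in task_costs]
```

For example, `process([200, 5_000])` keeps the light task on the node and splits the heavy one between node and cloud, which is exactly the latency-saving behavior the abstract attributes to fog computing.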
In the field of the Internet of Things (IoT), devices can recognize their environment and perform certain functions by themselves. IoT devices largely consist of sensors. Cloud computing based on sensor networks manages huge amounts of data, including transfer and processing, which delays service response time. As sensor networks grow, the demand to control and process data on the IoT devices themselves also increases. Vikas Vashisth, Harshit Gupta, and Dr. Deepak Chahal, "Fog Computing: An Empirical Study", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume 4, Issue 3, April 2020. URL: https://www.ijtsrd.com/papers/ijtsrd30675.pdf Paper URL: https://www.ijtsrd.com/computer-science/realtime-computing/30675/fog-computing-an-empirical-study/vikas-vashisth
Improve HLA based Encryption Process using fixed Size Aggregate Key generation (Editor IJMTER)
Cloud computing is an innovative idea for IT industries which provides several services to users. In cloud computing, secure authentication and data integrity are major challenges due to internal and external threats. Various techniques are used to improve data security over the cloud. MAC-based authentication is one of them, but it suffers from undesirable systematic demerits: bounded usage and insecure verification, which may pose an additional online load on users in a public auditing setting. Reliable and secure auditing is also challenging in the cloud. Existing cloud audit systems are based on the aggregate-key HLA algorithm. This algorithm relies on variable-size, differing aggregate key generation, which runs into security issues at the decryption level. The current scheme generates a long decryption key, which leads to a space-complexity problem. To overcome these issues, we improve the HLA algorithm with an improved aggregate key generation based on a fixed key size. This algorithm generates a constant-size aggregate key, which overcomes the problems of key sharing, security issues, and space complexity.
Enhancing Data Storage Security in Cloud Computing Through Steganography (IDES Editor)
In cloud computing, data storage is a significant issue because the entire data resides over a set of interconnected resource pools that enables the data to be accessed through virtual machines. It moves the application software and databases to large data centers, where the management of data is actually done. As the resource pools are situated in various corners of the world, the management of data and services may not be fully trustworthy, so various issues need to be addressed with respect to the management, service, privacy, and security of data. The privacy and security of data are especially challenging. To ensure the privacy and security of data-at-rest in cloud computing, we have proposed an effective and novel approach to data security in the cloud: hiding data within images, following the concept of steganography. The main objective of this paper is to prevent unauthorized users from accessing data in cloud storage centers. The scheme stores data at cloud data storage centers and retrieves it when needed.
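The image-hiding idea above is commonly realized with least-significant-bit (LSB) embedding. The following is a minimal sketch of that general technique over raw pixel bytes, not the paper's actual scheme; the 16-bit length header is an assumption added for illustration:

```python
def embed(pixels: bytearray, message: bytes) -> bytearray:
    """Hide message bits in the least significant bit of each pixel byte."""
    # Prefix a 16-bit length header so extraction knows where to stop.
    bits = []
    for byte in len(message).to_bytes(2, "big") + message:
        for i in range(7, -1, -1):          # most significant bit first
            bits.append((byte >> i) & 1)
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit      # overwrite only the LSB
    return out

def extract(pixels: bytearray) -> bytes:
    """Recover a message hidden by embed()."""
    bits = [p & 1 for p in pixels]

    def read_bytes(start: int, n: int) -> bytes:
        value = bytearray()
        for b in range(n):
            byte = 0
            for i in range(8):
                byte = (byte << 1) | bits[start + b * 8 + i]
            value.append(byte)
        return bytes(value)

    length = int.from_bytes(read_bytes(0, 2), "big")
    return read_bytes(16, length)
```

Since each carrier byte changes by at most 1, the embedding is visually imperceptible in an 8-bit image channel, which is what makes the approach attractive for data-at-rest in cloud storage.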
Cloud Computing IT-703 covers the attractive features of cloud computing along with its driving technology, virtualization, as per the RGPV syllabus.
A Proposed Solution: Data Availability and Error Correction in Cloud Computing (CSCJournals)
Cloud computing is one of the hottest technologies in the market these days, used to make storage of huge amounts of data and information easier for organizations. Maintaining servers to store all that information is quite expensive for individuals and organizations. Cloud computing allows data to be stored and maintained on remote servers that are managed by Cloud Service Providers (CSPs) like Yahoo and Google. This data can then be accessed throughout the globe. But as more and more information of individuals and companies is placed in the cloud, concerns are beginning to grow about just how safe an environment it is. In this paper we discuss security issues and requirements in the cloud and possible solutions to some of the problems. We develop an architecture model for cloud computing to solve the data availability and error correction problem.
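The paper's architecture is not reproduced here, but one standard way to get both availability and single-failure error correction is an XOR parity block across equal-size data blocks, as in RAID-4. A minimal sketch of that generic technique (the block layout is a hypothetical illustration, not the paper's model):

```python
def make_parity(blocks):
    """Compute an XOR parity block across equal-length data blocks."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

def recover(blocks, parity, lost_index):
    """Reconstruct a single lost block by XOR-ing the survivors with parity."""
    rec = bytearray(parity)
    for i, block in enumerate(blocks):
        if i == lost_index:
            continue
        for j, b in enumerate(block):
            rec[j] ^= b
    return bytes(rec)
```

With the parity stored on a separate server, any one of the data blocks can be lost and rebuilt from the rest, which is the availability-plus-correction property the abstract targets.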
Security Issues in Cloud Computing and its Solutions (IJCERT Journal)
Cloud computing is a set of IT services that are provided to a customer over a network on a leased basis, with the ability to scale service requirements up or down. Usually cloud computing services are delivered by a third-party provider who owns the infrastructure. Its advantages, to mention but a few, include scalability, resilience, flexibility, efficiency, and the outsourcing of non-core activities. Cloud computing offers an innovative business model for organizations to adopt IT services without upfront investment. Despite the potential gains, organizations are slow in accepting it due to the security issues and challenges associated with it. Security is one of the major issues hampering the growth of the cloud. The idea of handing over important data to another company is worrisome, so consumers need to be vigilant in understanding the risks of data breaches in this new environment. This paper introduces a detailed analysis of cloud computing security issues and challenges, focusing on the cloud computing types and the service delivery types.
Study on Secure Cryptographic Techniques in Cloud (IJTSRD)
Cloud computing is becoming more and more popular day by day. If the security parameters are taken care of properly, many organizations and government corporations will move to cloud technology. One use of cloud computing is data storage. The cloud provides considerable storage capacity for cloud users. It is more reliable and flexible for users to store and retrieve their data at any time and from anywhere. It is an increasingly growing technology. Mariam Fatima, M Ganeshan, and Saif Ulla Shariff, "Study on Secure Cryptographic Techniques in Cloud", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume 5, Issue 4, June 2021. URL: https://www.ijtsrd.com/papers/ijtsrd42363.pdf Paper URL: https://www.ijtsrd.com/computer-science/computer-network/42363/study-on-secure-cryptographic-techniques-in-cloud/mariam-fatima
Security & Privacy Issues of Cloud & Grid Computing Networks (IJCSA)
Cloud computing is a new field in Internet computing that provides novel perspectives in internetworking technologies. Cloud computing has become a significant technology in the field of information technology. The security of confidential data is a very important area of concern, as it can lead to very big problems if unauthorized users get access to it. Cloud computing should have proper techniques whereby data is segregated properly for data security and confidentiality. This paper strives to compare and contrast cloud computing with grid computing, along with tools, simulation environments, and tips to store data and files safely in the cloud.
Conceptual Model of Real Time Infrastructure Within Cloud Computing Environment (CSCJournals)
Cloud computing is a new and highly in-demand technology in the communication environment, in which computing resources such as hardware and/or software are provided as services over networks. SCADA implementation within a cloud environment is relatively new and in demand for real-time (industrial) infrastructure. Moving a SCADA system (applications and resources) onto cloud-based infrastructure meaningfully reduces cost and improves the reliability and performance of the whole system. Cloud computing provides on-demand network access and a batch of computing services for the SCADA system. This paper takes two conceptual approaches to implementing a SCADA system within a cloud computing (hybrid cloud) environment. In the first, SCADA applications are processed entirely inside the hybrid cloud. In the second, SCADA applications run on a separate application server directly connected to devices in a SCADA network. The rest of the paper discusses the security of SCADA and cloud computing communication.
EFFECTIVE METHOD FOR MANAGING AUTOMATION AND MONITORING IN MULTI-CLOUD COMPUT... (IJNSA Journal)
Multi-cloud is an advanced version of cloud computing that allows its users to utilize different cloud systems from several Cloud Service Providers (CSPs) remotely. Although it is a very efficient computing facility, threat detection, data protection, and vendor lock-in are the major security drawbacks of this infrastructure. These factors act as a catalyst in promoting serious cyber-crimes in the virtual world. This research paper overviews the privacy and safety issues of a multi-cloud environment. Its objective is to analyze some logical automation and monitoring provisions, such as monitoring Cyber-Physical Systems (CPS), home automation, automation in Big Data Infrastructure (BDI), Disaster Recovery (DR), and secret protection. The results of this investigation indicate that it is possible to avoid the security snags of a multi-cloud interface by adopting these scientific solutions methodically.
A Guide to Edge Computing Technology For Business Operations (Cerebrum Infotech)
Edge computing services enable us to generate more data at a faster rate and distribute it to a range of networks and devices located at or near the consumer.
Ant Colony Optimization: A Solution of Load Balancing in Cloud (dannyijwest)
Cloud computing is a new style of computing over the Internet. It has many advantages, along with some crucial issues that must be resolved to improve the reliability of the cloud environment. These issues relate to load management, fault tolerance, and various security concerns in the cloud environment. In this paper the main concern is load balancing in cloud computing. The load can be CPU load, memory capacity, delay, or network load. Load balancing is the process of distributing load among the various nodes of a distributed system to improve both resource utilization and job response time, while avoiding a situation where some nodes are heavily loaded while others are idle or doing very little work. Load balancing ensures that every processor in the system, or every node in the network, does approximately the same amount of work at any instant. Many methods have been proposed to solve this problem, such as Particle Swarm Optimization, hashing methods, genetic algorithms, and several scheduling-based algorithms. In this paper we propose a method based on Ant Colony Optimization to solve the load balancing problem in the cloud environment.
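As a rough illustration of how ant colony optimization can assign tasks to nodes, here is a minimal sketch of the general technique. The pheromone update rule, load heuristic, and parameter values are simplified textbook choices and assumptions for illustration, not the paper's algorithm:

```python
import random

def aco_balance(task_sizes, n_nodes, n_ants=20, n_iters=50,
                alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Assign tasks to nodes, minimizing makespan, with a simple ant colony."""
    rng = random.Random(seed)
    # pheromone[t][n]: learned desirability of placing task t on node n
    pher = [[1.0] * n_nodes for _ in task_sizes]
    best, best_span = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            loads = [0.0] * n_nodes
            assign = []
            for t, size in enumerate(task_sizes):
                # heuristic term favours lightly loaded nodes
                weights = [pher[t][n] ** alpha * (1.0 / (1.0 + loads[n])) ** beta
                           for n in range(n_nodes)]
                node = rng.choices(range(n_nodes), weights=weights)[0]
                assign.append(node)
                loads[node] += size
            span = max(loads)           # makespan of this ant's assignment
            if span < best_span:
                best, best_span = assign, span
        # evaporate pheromone, then reinforce the best-so-far assignment
        for t in range(len(task_sizes)):
            for n in range(n_nodes):
                pher[t][n] *= (1 - rho)
            pher[t][best[t]] += 1.0 / best_span
    return best, best_span
```

For four equal tasks of size 5 on two nodes, the colony converges on a balanced split with makespan 10, which is the "approximately equal work per node" property the abstract describes.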
A Comparison of Cloud Execution Mechanisms: Fog, Edge, and Clone Cloud Computing (IJECEIAES)
Cloud computing is a technology that was developed a decade ago to provide uninterrupted, scalable services to users and organizations. Cloud computing has also become an attractive feature for mobile users due to the limited capabilities of mobile devices. The combination of cloud technologies with mobile technologies resulted in a new area of computing called mobile cloud computing, used to augment the resources existing in smart devices. In recent times, fog computing, edge computing, and clone cloud computing have become the latest trends after mobile cloud computing, all developed to address the limitations of cloud computing. This paper reviews these recent technologies in detail, provides a comparative study of them, and addresses how each is effective for organizations and developers.
A SCRUTINY TO ATTACK ISSUES AND SECURITY CHALLENGES IN CLOUD COMPUTING (IJASA)
Cloud computing is a paradigm in which one or more computers are connected in a network; it is a confluence of grid computing, autonomic computing, and utility computing. The cloud provides on-demand services, and a great number of users access it to utilize its resources. Security is one of the major problems in cloud computing, and providing it is a major requirement. This study covers the security issues and attack issues in cloud computing.
Bibliometric analysis highlighting the role of women in addressing climate ch... (IJECEIAES)
Fossil fuel consumption has increased quickly, contributing to climate change that is evident in unusual flooding, droughts, and global warming. Over the past ten years, women's involvement in society has grown dramatically, and they have succeeded in playing a noticeable role in reducing climate change. A bibliometric analysis of data from the last ten years has been carried out to examine the role of women in addressing climate change. The analysis's findings are discussed in relation to the Sustainable Development Goals (SDGs), particularly SDG 7 and SDG 13. The results consider contributions made by women in various sectors while taking geographic dispersion into account. The bibliometric analysis delves into topics including women's leadership in environmental groups, their involvement in policymaking, their contributions to sustainable development projects, and the influence of gender diversity on attempts to mitigate climate change. The study's results highlight how women have influenced policies and actions related to climate change, point out areas of research deficiency, and give recommendations on how to increase the role of women in addressing climate change and achieving sustainability. To achieve more successful results, this initiative aims to highlight the significance of gender equality and encourage inclusivity in climate change decision-making processes.
Voltage and frequency control of microgrid in presence of micro-turbine inter... (IJECEIAES)
The active and reactive load changes have a significant impact on voltage
and frequency. In this paper, in order to stabilize the microgrid (MG) against
load variations in islanding mode, the active and reactive power of all
distributed generators (DGs), including energy storage (battery), diesel
generator, and micro-turbine, are controlled. The micro-turbine generator is
connected to MG through a three-phase to three-phase matrix converter, and
the droop control method is applied for controlling the voltage and
frequency of MG. In addition, a method is introduced for voltage and
frequency control of micro-turbines in the transition state from gridconnected mode to islanding mode. A novel switching strategy of the matrix
converter is used for converting the high-frequency output voltage of the
micro-turbine to the grid-side frequency of the utility system. Moreover,
using the switching strategy, the low-order harmonics in the output current
and voltage are not produced, and consequently, the size of the output filter
would be reduced. In fact, the suggested control strategy is load-independent
and has no frequency conversion restrictions. The proposed approach for
voltage and frequency regulation demonstrates exceptional performance and
favorable response across various load alteration scenarios. The suggested
strategy is examined in several scenarios in the MG test systems, and the
simulation results are addressed.
Enhancing battery system identification: nonlinear autoregressive modeling fo...IJECEIAES
Precisely characterizing Li-ion batteries is essential for optimizing their
performance, enhancing safety, and prolonging their lifespan across various
applications, such as electric vehicles and renewable energy systems. This
article introduces an innovative nonlinear methodology for system
identification of a Li-ion battery, employing a nonlinear autoregressive with
exogenous inputs (NARX) model. The proposed approach integrates the
benefits of nonlinear modeling with the adaptability of the NARX structure,
facilitating a more comprehensive representation of the intricate
electrochemical processes within the battery. Experimental data collected
from a Li-ion battery operating under diverse scenarios are employed to
validate the effectiveness of the proposed methodology. The identified
NARX model exhibits superior accuracy in predicting the battery's behavior
compared to traditional linear models. This study underscores the
importance of accounting for nonlinearities in battery modeling, providing
insights into the intricate relationships between state-of-charge, voltage, and
current under dynamic conditions.
Smart grid deployment: from a bibliometric analysis to a surveyIJECEIAES
Smart grids are one of the last decades' innovations in electrical energy.
They bring relevant advantages compared to the traditional grid and
significant interest from the research community. Assessing the field's
evolution is essential to propose guidelines for facing new and future smart
grid challenges. In addition, knowing the main technologies involved in the
deployment of smart grids (SGs) is important to highlight possible
shortcomings that can be mitigated by developing new tools. This paper
contributes to the research trends mentioned above by focusing on two
objectives. First, a bibliometric analysis is presented to give an overview of
the current research level about smart grid deployment. Second, a survey of
the main technological approaches used for smart grid implementation and
their contributions are highlighted. To that effect, we searched the Web of
Science (WoS), and the Scopus databases. We obtained 5,663 documents
from WoS and 7,215 from Scopus on smart grid implementation or
deployment. With the extraction limitation in the Scopus database, 5,872 of
the 7,215 documents were extracted using a multi-step process. These two
datasets have been analyzed using a bibliometric tool called bibliometrix.
The main outputs are presented with some recommendations for future
research.
Use of analytical hierarchy process for selecting and prioritizing islanding ...IJECEIAES
One of the problems that are associated to power systems is islanding
condition, which must be rapidly and properly detected to prevent any
negative consequences on the system's protection, stability, and security.
This paper offers a thorough overview of several islanding detection
strategies, which are divided into two categories: classic approaches,
including local and remote approaches, and modern techniques, including
techniques based on signal processing and computational intelligence.
Additionally, each approach is compared and assessed based on several
factors, including implementation costs, non-detected zones, declining
power quality, and response times using the analytical hierarchy process
(AHP). The multi-criteria decision-making analysis shows that the overall
weight of passive methods (24.7%), active methods (7.8%), hybrid methods
(5.6%), remote methods (14.5%), signal processing-based methods (26.6%),
and computational intelligent-based methods (20.8%) based on the
comparison of all criteria together. Thus, it can be seen from the total weight
that hybrid approaches are the least suitable to be chosen, while signal
processing-based methods are the most appropriate islanding detection
method to be selected and implemented in power system with respect to the
aforementioned factors. Using Expert Choice software, the proposed
hierarchy model is studied and examined.
Enhancing of single-stage grid-connected photovoltaic system using fuzzy logi...IJECEIAES
The power generated by photovoltaic (PV) systems is influenced by
environmental factors. This variability hampers the control and utilization of
solar cells' peak output. In this study, a single-stage grid-connected PV
system is designed to enhance power quality. Our approach employs fuzzy
logic in the direct power control (DPC) of a three-phase voltage source
inverter (VSI), enabling seamless integration of the PV connected to the
grid. Additionally, a fuzzy logic-based maximum power point tracking
(MPPT) controller is adopted, which outperforms traditional methods like
incremental conductance (INC) in enhancing solar cell efficiency and
minimizing the response time. Moreover, the inverter's real-time active and
reactive power is directly managed to achieve a unity power factor (UPF).
The system's performance is assessed through MATLAB/Simulink
implementation, showing marked improvement over conventional methods,
particularly in steady-state and varying weather conditions. For solar
irradiances of 500 and 1,000 W/m2
, the results show that the proposed
method reduces the total harmonic distortion (THD) of the injected current
to the grid by approximately 46% and 38% compared to conventional
methods, respectively. Furthermore, we compare the simulation results with
IEEE standards to evaluate the system's grid compatibility.
Enhancing photovoltaic system maximum power point tracking with fuzzy logic-b...IJECEIAES
Photovoltaic systems have emerged as a promising energy resource that
caters to the future needs of society, owing to their renewable, inexhaustible,
and cost-free nature. The power output of these systems relies on solar cell
radiation and temperature. In order to mitigate the dependence on
atmospheric conditions and enhance power tracking, a conventional
approach has been improved by integrating various methods. To optimize
the generation of electricity from solar systems, the maximum power point
tracking (MPPT) technique is employed. To overcome limitations such as
steady-state voltage oscillations and improve transient response, two
traditional MPPT methods, namely fuzzy logic controller (FLC) and perturb
and observe (P&O), have been modified. This research paper aims to
simulate and validate the step size of the proposed modified P&O and FLC
techniques within the MPPT algorithm using MATLAB/Simulink for
efficient power tracking in photovoltaic systems.
Adaptive synchronous sliding control for a robot manipulator based on neural ...IJECEIAES
Robot manipulators have become important equipment in production lines, medical fields, and transportation. Improving the quality of trajectory tracking for
robot hands is always an attractive topic in the research community. This is a
challenging problem because robot manipulators are complex nonlinear systems
and are often subject to fluctuations in loads and external disturbances. This
article proposes an adaptive synchronous sliding control scheme to improve trajectory tracking performance for a robot manipulator. The proposed controller
ensures that the positions of the joints track the desired trajectory, synchronize
the errors, and significantly reduces chattering. First, the synchronous tracking
errors and synchronous sliding surfaces are presented. Second, the synchronous
tracking error dynamics are determined. Third, a robust adaptive control law is
designed,the unknown components of the model are estimated online by the neural network, and the parameters of the switching elements are selected by fuzzy
logic. The built algorithm ensures that the tracking and approximation errors
are ultimately uniformly bounded (UUB). Finally, the effectiveness of the constructed algorithm is demonstrated through simulation and experimental results.
Simulation and experimental results show that the proposed controller is effective with small synchronous tracking errors, and the chattering phenomenon is
significantly reduced.
Remote field-programmable gate array laboratory for signal acquisition and de...IJECEIAES
A remote laboratory utilizing field-programmable gate array (FPGA) technologies enhances students’ learning experience anywhere and anytime in embedded system design. Existing remote laboratories prioritize hardware access and visual feedback for observing board behavior after programming, neglecting comprehensive debugging tools to resolve errors that require internal signal acquisition. This paper proposes a novel remote embeddedsystem design approach targeting FPGA technologies that are fully interactive via a web-based platform. Our solution provides FPGA board access and debugging capabilities beyond the visual feedback provided by existing remote laboratories. We implemented a lab module that allows users to seamlessly incorporate into their FPGA design. The module minimizes hardware resource utilization while enabling the acquisition of a large number of data samples from the signal during the experiments by adaptively compressing the signal prior to data transmission. The results demonstrate an average compression ratio of 2.90 across three benchmark signals, indicating efficient signal acquisition and effective debugging and analysis. This method allows users to acquire more data samples than conventional methods. The proposed lab allows students to remotely test and debug their designs, bridging the gap between theory and practice in embedded system design.
Detecting and resolving feature envy through automated machine learning and m...IJECEIAES
Efficiently identifying and resolving code smells enhances software project quality. This paper presents a novel solution, utilizing automated machine learning (AutoML) techniques, to detect code smells and apply move method refactoring. By evaluating code metrics before and after refactoring, we assessed its impact on coupling, complexity, and cohesion. Key contributions of this research include a unique dataset for code smell classification and the development of models using AutoGluon for optimal performance. Furthermore, the study identifies the top 20 influential features in classifying feature envy, a well-known code smell, stemming from excessive reliance on external classes. We also explored how move method refactoring addresses feature envy, revealing reduced coupling and complexity, and improved cohesion, ultimately enhancing code quality. In summary, this research offers an empirical, data-driven approach, integrating AutoML and move method refactoring to optimize software project quality. Insights gained shed light on the benefits of refactoring on code quality and the significance of specific features in detecting feature envy. Future research can expand to explore additional refactoring techniques and a broader range of code metrics, advancing software engineering practices and standards.
Smart monitoring technique for solar cell systems using internet of things ba...IJECEIAES
Rapidly and remotely monitoring and receiving the solar cell systems status parameters, solar irradiance, temperature, and humidity, are critical issues in enhancement their efficiency. Hence, in the present article an improved smart prototype of internet of things (IoT) technique based on embedded system through NodeMCU ESP8266 (ESP-12E) was carried out experimentally. Three different regions at Egypt; Luxor, Cairo, and El-Beheira cities were chosen to study their solar irradiance profile, temperature, and humidity by the proposed IoT system. The monitoring data of solar irradiance, temperature, and humidity were live visualized directly by Ubidots through hypertext transfer protocol (HTTP) protocol. The measured solar power radiation in Luxor, Cairo, and El-Beheira ranged between 216-1000, 245-958, and 187-692 W/m 2 respectively during the solar day. The accuracy and rapidity of obtaining monitoring results using the proposed IoT system made it a strong candidate for application in monitoring solar cell systems. On the other hand, the obtained solar power radiation results of the three considered regions strongly candidate Luxor and Cairo as suitable places to build up a solar cells system station rather than El-Beheira.
An efficient security framework for intrusion detection and prevention in int...IJECEIAES
Over the past few years, the internet of things (IoT) has advanced to connect billions of smart devices to improve quality of life. However, anomalies or malicious intrusions pose several security loopholes, leading to performance degradation and threat to data security in IoT operations. Thereby, IoT security systems must keep an eye on and restrict unwanted events from occurring in the IoT network. Recently, various technical solutions based on machine learning (ML) models have been derived towards identifying and restricting unwanted events in IoT. However, most ML-based approaches are prone to miss-classification due to inappropriate feature selection. Additionally, most ML approaches applied to intrusion detection and prevention consider supervised learning, which requires a large amount of labeled data to be trained. Consequently, such complex datasets are impossible to source in a large network like IoT. To address this problem, this proposed study introduces an efficient learning mechanism to strengthen the IoT security aspects. The proposed algorithm incorporates supervised and unsupervised approaches to improve the learning models for intrusion detection and mitigation. Compared with the related works, the experimental outcome shows that the model performs well in a benchmark dataset. It accomplishes an improved detection accuracy of approximately 99.21%.
Developing a smart system for infant incubators using the internet of things ...IJECEIAES
This research is developing an incubator system that integrates the internet of things and artificial intelligence to improve care for premature babies. The system workflow starts with sensors that collect data from the incubator. Then, the data is sent in real-time to the internet of things (IoT) broker eclipse mosquito using the message queue telemetry transport (MQTT) protocol version 5.0. After that, the data is stored in a database for analysis using the long short-term memory network (LSTM) method and displayed in a web application using an application programming interface (API) service. Furthermore, the experimental results produce as many as 2,880 rows of data stored in the database. The correlation coefficient between the target attribute and other attributes ranges from 0.23 to 0.48. Next, several experiments were conducted to evaluate the model-predicted value on the test data. The best results are obtained using a two-layer LSTM configuration model, each with 60 neurons and a lookback setting 6. This model produces an R 2 value of 0.934, with a root mean square error (RMSE) value of 0.015 and a mean absolute error (MAE) of 0.008. In addition, the R 2 value was also evaluated for each attribute used as input, with a result of values between 0.590 and 0.845.
A review on internet of things-based stingless bee's honey production with im...IJECEIAES
Honey is produced exclusively by honeybees and stingless bees which both are well adapted to tropical and subtropical regions such as Malaysia. Stingless bees are known for producing small amounts of honey and are known for having a unique flavor profile. Problem identified that many stingless bees collapsed due to weather, temperature and environment. It is critical to understand the relationship between the production of stingless bee honey and environmental conditions to improve honey production. Thus, this paper presents a review on stingless bee's honey production and prediction modeling. About 54 previous research has been analyzed and compared in identifying the research gaps. A framework on modeling the prediction of stingless bee honey is derived. The result presents the comparison and analysis on the internet of things (IoT) monitoring systems, honey production estimation, convolution neural networks (CNNs), and automatic identification methods on bee species. It is identified based on image detection method the top best three efficiency presents CNN is at 98.67%, densely connected convolutional networks with YOLO v3 is 97.7%, and DenseNet201 convolutional networks 99.81%. This study is significant to assist the researcher in developing a model for predicting stingless honey produced by bee's output, which is important for a stable economy and food security.
A trust based secure access control using authentication mechanism for intero...IJECEIAES
The internet of things (IoT) is a revolutionary innovation in many aspects of our society including interactions, financial activity, and global security such as the military and battlefield internet. Due to the limited energy and processing capacity of network devices, security, energy consumption, compatibility, and device heterogeneity are the long-term IoT problems. As a result, energy and security are critical for data transmission across edge and IoT networks. Existing IoT interoperability techniques need more computation time, have unreliable authentication mechanisms that break easily, lose data easily, and have low confidentiality. In this paper, a key agreement protocol-based authentication mechanism for IoT devices is offered as a solution to this issue. This system makes use of information exchange, which must be secured to prevent access by unauthorized users. Using a compact contiki/cooja simulator, the performance and design of the suggested framework are validated. The simulation findings are evaluated based on detection of malicious nodes after 60 minutes of simulation. The suggested trust method, which is based on privacy access control, reduced packet loss ratio to 0.32%, consumed 0.39% power, and had the greatest average residual energy of 0.99 mJoules at 10 nodes.
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbersIJECEIAES
In real world applications, data are subject to ambiguity due to several factors; fuzzy sets and fuzzy numbers propose a great tool to model such ambiguity. In case of hesitation, the complement of a membership value in fuzzy numbers can be different from the non-membership value, in which case we can model using intuitionistic fuzzy numbers as they provide flexibility by defining both a membership and a non-membership functions. In this article, we consider the intuitionistic fuzzy linear programming problem with intuitionistic polygonal fuzzy numbers, which is a generalization of the previous polygonal fuzzy numbers found in the literature. We present a modification of the simplex method that can be used to solve any general intuitionistic fuzzy linear programming problem after approximating the problem by an intuitionistic polygonal fuzzy number with n edges. This method is given in a simple tableau formulation, and then applied on numerical examples for clarity.
The performance of artificial intelligence in prostate magnetic resonance im...IJECEIAES
Prostate cancer is the predominant form of cancer observed in men worldwide. The application of magnetic resonance imaging (MRI) as a guidance tool for conducting biopsies has been established as a reliable and well-established approach in the diagnosis of prostate cancer. The diagnostic performance of MRI-guided prostate cancer diagnosis exhibits significant heterogeneity due to the intricate and multi-step nature of the diagnostic pathway. The development of artificial intelligence (AI) models, specifically through the utilization of machine learning techniques such as deep learning, is assuming an increasingly significant role in the field of radiology. In the realm of prostate MRI, a considerable body of literature has been dedicated to the development of various AI algorithms. These algorithms have been specifically designed for tasks such as prostate segmentation, lesion identification, and classification. The overarching objective of these endeavors is to enhance diagnostic performance and foster greater agreement among different observers within MRI scans for the prostate. This review article aims to provide a concise overview of the application of AI in the field of radiology, with a specific focus on its utilization in prostate MRI.
Seizure stage detection of epileptic seizure using convolutional neural networksIJECEIAES
According to the World Health Organization (WHO), seventy million individuals worldwide suffer from epilepsy, a neurological disorder. While electroencephalography (EEG) is crucial for diagnosing epilepsy and monitoring the brain activity of epilepsy patients, it requires a specialist to examine all EEG recordings to find epileptic behavior. This procedure needs an experienced doctor, and a precise epilepsy diagnosis is crucial for appropriate treatment. To identify epileptic seizures, this study employed a convolutional neural network (CNN) based on raw scalp EEG signals to discriminate between preictal, ictal, postictal, and interictal segments. The possibility of these characteristics is explored by examining how well timedomain signals work in the detection of epileptic signals using intracranial Freiburg Hospital (FH), scalp Children's Hospital Boston-Massachusetts Institute of Technology (CHB-MIT) databases, and Temple University Hospital (TUH) EEG. To test the viability of this approach, two types of experiments were carried out. Firstly, binary class classification (preictal, ictal, postictal each versus interictal) and four-class classification (interictal versus preictal versus ictal versus postictal). The average accuracy for stage detection using CHB-MIT database was 84.4%, while the Freiburg database's time-domain signals had an accuracy of 79.7% and the highest accuracy of 94.02% for classification in the TUH EEG database when comparing interictal stage to preictal stage.
Analysis of driving style using self-organizing maps to analyze driver behaviorIJECEIAES
Modern life is strongly associated with the use of cars, but the increase in acceleration speeds and their maneuverability leads to a dangerous driving style for some drivers. In these conditions, the development of a method that allows you to track the behavior of the driver is relevant. The article provides an overview of existing methods and models for assessing the functioning of motor vehicles and driver behavior. Based on this, a combined algorithm for recognizing driving style is proposed. To do this, a set of input data was formed, including 20 descriptive features: About the environment, the driver's behavior and the characteristics of the functioning of the car, collected using OBD II. The generated data set is sent to the Kohonen network, where clustering is performed according to driving style and degree of danger. Getting the driving characteristics into a particular cluster allows you to switch to the private indicators of an individual driver and considering individual driving characteristics. The application of the method allows you to identify potentially dangerous driving styles that can prevent accidents.
Hyperspectral object classification using hybrid spectral-spatial fusion and ...IJECEIAES
Because of its spectral-spatial and temporal resolution of greater areas, hyperspectral imaging (HSI) has found widespread application in the field of object classification. The HSI is typically used to accurately determine an object's physical characteristics as well as to locate related objects with appropriate spectral fingerprints. As a result, the HSI has been extensively applied to object identification in several fields, including surveillance, agricultural monitoring, environmental research, and precision agriculture. However, because of their enormous size, objects require a lot of time to classify; for this reason, both spectral and spatial feature fusion have been completed. The existing classification strategy leads to increased misclassification, and the feature fusion method is unable to preserve semantic object inherent features; This study addresses the research difficulties by introducing a hybrid spectral-spatial fusion (HSSF) technique to minimize feature size while maintaining object intrinsic qualities; Lastly, a soft-margins kernel is proposed for multi-layer deep support vector machine (MLDSVM) to reduce misclassification. The standard Indian pines dataset is used for the experiment, and the outcome demonstrates that the HSSF-MLDSVM model performs substantially better in terms of accuracy and Kappa coefficient.
Final project report on grocery store management system..pdfKamal Acharya
In today’s fast-changing business environment, it’s extremely important to be able to respond to client needs in the most effective and timely manner. If your customers wish to see your business online and have instant access to your products or services.
Online Grocery Store is an e-commerce website, which retails various grocery products. This project allows viewing various products available enables registered users to purchase desired products instantly using Paytm, UPI payment processor (Instant Pay) and also can place order by using Cash on Delivery (Pay Later) option. This project provides an easy access to Administrators and Managers to view orders placed using Pay Later and Instant Pay options.
In order to develop an e-commerce website, a number of Technologies must be studied and understood. These include multi-tiered architecture, server and client-side scripting techniques, implementation technologies, programming language (such as PHP, HTML, CSS, JavaScript) and MySQL relational databases. This is a project with the objective to develop a basic website where a consumer is provided with a shopping cart website and also to know about the technologies used to develop such a website.
This document will discuss each of the underlying technologies to create and implement an e- commerce website.
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdffxintegritypublishin
Advancements in technology unveil a myriad of electrical and electronic breakthroughs geared towards efficiently harnessing limited resources to meet human energy demands. The optimization of hybrid solar PV panels and pumped hydro energy supply systems plays a pivotal role in utilizing natural resources effectively. This initiative not only benefits humanity but also fosters environmental sustainability. The study investigated the design optimization of these hybrid systems, focusing on understanding solar radiation patterns, identifying geographical influences on solar radiation, formulating a mathematical model for system optimization, and determining the optimal configuration of PV panels and pumped hydro storage. Through a comparative analysis approach and eight weeks of data collection, the study addressed key research questions related to solar radiation patterns and optimal system design. The findings highlighted regions with heightened solar radiation levels, showcasing substantial potential for power generation and emphasizing the system's efficiency. Optimizing system design significantly boosted power generation, promoted renewable energy utilization, and enhanced energy storage capacity. The study underscored the benefits of optimizing hybrid solar PV panels and pumped hydro energy supply systems for sustainable energy usage. Optimizing the design of solar PV panels and pumped hydro energy supply systems as examined across diverse climatic conditions in a developing country, not only enhances power generation but also improves the integration of renewable energy sources and boosts energy storage capacities, particularly beneficial for less economically prosperous regions. Additionally, the study provides valuable insights for advancing energy research in economically viable areas. 
Recommendations included conducting site-specific assessments, utilizing advanced modeling tools, implementing regular maintenance protocols, and enhancing communication among system components.
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Dr.Costas Sachpazis
Int J Elec & Comp Eng, Vol. 8, No. 2, April 2018: 1247–1255 (ISSN: 2088-8708)
cloud services as well as local server services to build efficient, reliable, secure, high-performance applications. The microservices enable high-performance processing, optimized analytics, and heterogeneous services to be hosted as close to the control and monitoring system as possible.
This system can be designed to address complex event processing (CEP) by writing rules in a powerful and expressive domain-specific language (DSL) over the multitude of incoming sensor data streams. These rules can then be used to prevent costly machine failures and downtime, as well as to improve the efficiency and safety of industrial operations and processes in real time. The services are deployed on a cloud server as well as on a fog server located in a DMZ network. The services deployed in these two tiers communicate with each other as required. Users perform monitoring and controlling operations in which data are exchanged between services deployed in different layers. This paper focuses on explaining the details of application development and deployment with existing technology, using a block diagram with technology details, an architectural diagram covering the wired and wireless network of things with tier details, and a flow chart describing the flow of the system.
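To make the idea of CEP rules written in a DSL concrete, the following toy sketch compiles a rule expressed as a readable string into a predicate applied to each incoming sensor reading. The one-line grammar (`field op threshold`) and the field names are illustrative assumptions, not a DSL from the paper.

```python
import operator

OPS = {">": operator.gt, "<": operator.lt, ">=": operator.ge, "<=": operator.le}

def compile_rule(rule: str):
    """Compile a tiny DSL rule like 'vibration > 7.0' into a predicate
    over a single sensor reading (a dict of field -> value)."""
    field, op, threshold = rule.split()
    return lambda reading: OPS[op](reading[field], float(threshold))

# Apply one rule to a stream of readings and collect the alerts.
overload = compile_rule("vibration > 7.0")
stream = [{"vibration": 3.2}, {"vibration": 7.9}, {"vibration": 5.1}]
alerts = [r for r in stream if overload(r)]
print(alerts)   # [{'vibration': 7.9}]
```

A production rule engine would add time/buffer windows and rule chaining; the point here is only that rules stay declarative while the matching logic stays generic.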
2. LITERATURE SURVEY
2.1. Edge Computing
Edge computing is an approach that pushes the processing of certain data to the edge of the network. An application that must handle and process real-time data cannot tolerate network latency. Processing such data in the cloud increases response time, which is not acceptable in practice. Processing such data locally reduces the amount of data that must be sent elsewhere for processing, which decreases network traffic and increases performance. Hardware devices in the edge network have very limited memory and processing capacity; therefore, it is recommended to handle only dedicated processes in the edge computing network.
2.2. Fog Computing
Data generated by IoT devices have increased exponentially in volume and velocity. The old data-warehouse model cannot meet the low-latency response times that users demand. The cloud used to be the only option for sending data for storage, analysis, etc., which could lead to data bottlenecks. Business models need data-analytics responses as soon as possible, but responses are delayed when data must travel to the cloud, be processed, and be sent back. Fog computing helps to overcome these challenges. The fog computing architecture is almost the same as the cloud computing architecture, but fog computing is located in the edge network. Characteristics of fog computing include low latency, location awareness, real-time analytics, and security. Because fog computing is located in the edge network, edge devices can be monitored, controlled, and maintained with ease. This gives more flexibility in enhancing edge computing; with fog computing, edge computing avoids many of the circumstances and challenges it might otherwise face.
2.3. Cloud Computing
Cloud computing is a platform that exposes multiple services such as Platform as a Service (PaaS), Infrastructure as a Service (IaaS), and Software as a Service (SaaS). Cloud computing is an approach to using services over the Internet via lightweight protocols. It provides resources such as processing, data storage, and other devices on demand [4]. These resources can be used to build your own SaaS in the cloud, such as the big data analytics described in [14]. These resources can be controlled and monitored using lightweight protocols. One can use services available in the cloud through a lightweight protocol and avoid developing one's own services. Overall, cloud computing is a platform that exposes many services for developing applications or business logic with ease.
3. TIARRAH COMPUTING SYSTEM COMPONENT
The components of Tiarrah Computing are shown in Figure 1. They are explained below:
3.1. Hardware, Network
To deploy and communicate with an application there is always a need for hardware and networks. The most widely used hardware and network components in a data centre are blade servers, racks, routers, switches, cable wiring, etc.
Tiarrah Computing: The Next Generation of Computing (Yanish Pradhananga)
3.2. Virtualization & Cloud Computing
Besides hardware, virtualization technology is used for the management and control of resources such as memory, storage, network, and processors. Using virtualization, one can create a virtual instance from a virtual image on the desired hardware, with a configuration that can be vertically scaled up or down as required. Popular cloud computing tools such as OpenStack [12], [13] and CloudStack are used to create and destroy virtual instances; with them, one can build a private or public cloud. If these technologies are not required, or one does not want to invest in the hardware needed to build a data centre or private cloud, one can run a simple server in a private or local network and deploy the rest of the application in a public cloud such as Amazon AWS, Microsoft Azure, or Rackspace.
3.3. Cluster Management
Tiarrah Computing promotes cluster management when the application is large and the cluster must keep running without any downtime, since it is quite complicated to monitor and update every machine individually. To meet these challenges there are open-source frameworks and tools for cluster management, and a lot of research is ongoing in this area [15]. Since they are open source, one can use the available features, create new ones, or modify them as required. There are also many forums and groups offering support, documentation, tutorials, and so on for dealing with the challenges one faces. The benefits of using cluster management tools are as follows:
a. Cluster size can be scaled up or down easily, or an automated design can scale the cluster up or down.
b. Zero downtime through automatic failover or self-healing.
c. Controlling, monitoring, and managing a group of clusters through a graphical user interface or the command line.
d. Dynamic load balancing.
e. Easy deployment of clusters, and of applications, using DevOps.
f. Efficient use of resources.
For a Small or Medium Enterprise (SME) there may be no need for cluster management, since the whole workload can be handled by a single server; it can be introduced later if required as the application grows.
3.4. DevOps
DevOps focuses on collaboration and communication between software developers and other information-technology roles. Automating software delivery and infrastructure changes are concepts and processes that come under DevOps practice. DevOps aims to create an environment for building, testing, and releasing software rapidly, frequently, and reliably [16], [17]. Some DevOps tools widely used in industry for software development and delivery are Git, Docker, Jenkins, Puppet, Vagrant, etc. Some of the most popular and widely used open-source frameworks and tools for cluster management are Docker Swarm, Fleet, Kubernetes, Apache Mesos, Apache ZooKeeper, Marathon, etc.
3.5. Microservices and Monolithic Architecture
Microservices [11], [18], [19] and the monolithic architecture are two architectures used for developing applications. Most applications are developed using the monolithic architecture; it is the traditional way of application development. Software development demands a continuous stream of new features, which increases the size of the application. As time passes the application becomes complicated and huge, which degrades performance and testing. This means that application development speed is inversely proportional to the application's size and feature count. To overcome this limitation, the microservices-based application development approach emerged.
Microservices-based application development is an approach to building an application composed of small services. Each service runs in its own process. The services are independent of each other, which gives flexibility in developing, testing, and deploying each service separately. Communication between services is accomplished through lightweight mechanisms. Microservices give more flexibility in developing applications with features such as non-blocking behaviour, event-driven design, concurrency, scalability, polyglot development, etc.
3.6. Things
The Internet of Things is the internetworking of physical devices, vehicles, buildings, etc. These things are embedded with electronics, software, sensors, actuators, etc. The network connection enables data to be collected from and exchanged with these things. IoT allows things to be monitored and/or controlled remotely, and allows physical-world things to be connected directly with computer-based systems. Things connected to the Internet yield improved efficiency, accuracy, and economic benefit. Technologies such as smart grids, smart homes, intelligent transportation, and smart cities are platforms based on things. Experts estimate that the IoT will consist of almost 50 billion things by 2020.
Figure 1. Block Diagram of Tiarrah Computing
3.7. End User
The most widely used end-user devices for interacting with applications are Android devices, iOS devices, desktops, laptops, etc. The client side of the application, whether an Android or iOS app, a user interface, or a browser, communicates with the server using WebSocket, REST APIs, SOAP, etc.
4. APPLICATION FLOW IN TIARRAH COMPUTING
Figure 2 shows the overall application flow. Each layer of the application flow is explained below.
Figure 2. Application Flow of Tiarrah Computing
4.1. IoT Devices
IoT devices or things can be connected to the message broker via a wired or wireless connection. An IoT device can fetch data from things using GPIO, TCP/IP, or the Modbus protocol. It can fetch data from a single thing that is directly connected, or from a group of things connected via a Remote Terminal Unit (RTU), a PLC, or a Supervisory Control and Data Acquisition (SCADA) system. The IoT device sends the data fetched from the things to a message broker, which conveys the detailed status of the things. IoT devices may support wireless communication protocols such as Zigbee [20-22], Wi-Fi [23], Bluetooth [24], etc., and wired connections such as RS-232, RS-485, RJ-45, RJ-11, USB, etc. Most IoT devices are able to communicate using the Modbus and TCP/IP protocols. The data from these connected things are then transferred to the cloud through the message broker. Some examples of sensors connected to things are temperature sensors, humidity sensors, and light sensors.
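As an illustration of the Modbus data-fetch step, the snippet below decodes a 32-bit sensor value delivered as two 16-bit holding registers, the usual way Modbus carries floats. The big-endian word order is an assumption; real devices vary and often need the word pair swapped.

```python
import struct

def registers_to_float(hi: int, lo: int) -> float:
    """Combine two 16-bit Modbus holding registers (big-endian word
    order assumed) into one IEEE-754 single-precision float."""
    return struct.unpack(">f", struct.pack(">HH", hi, lo))[0]

# Example: a temperature of 25.5 °C arriving as a register pair.
hi, lo = struct.unpack(">HH", struct.pack(">f", 25.5))
print(registers_to_float(hi, lo))   # 25.5
```

The decoded value is what the IoT device would publish to the message broker.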
4.2. Message Broker
Services in an application can interact with one another through middleware called a message broker, which decouples the services of an application. This decoupling gives more flexibility in application development. Messages and workload may queue up when multiple receivers are connected; these queues are handled by the message broker, which ensures that messages are delivered, transactions are managed, and reliable storage is always available. Some popular and widely used open-source message brokers are Kafka, RabbitMQ, ActiveMQ, Mosquitto, etc.
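The decoupling a broker provides can be sketched in a few lines: publishers address a topic, never a receiver, and each subscriber owns its own queue, so a slow consumer never blocks the publisher. This toy in-memory broker is only an illustration of the pattern, not a substitute for Kafka or RabbitMQ.

```python
from collections import defaultdict
from queue import Queue

class MiniBroker:
    """Toy publish/subscribe broker: each subscriber gets its own
    queue per topic, decoupling producers from consumers."""
    def __init__(self):
        self._subs = defaultdict(list)   # topic -> list of subscriber queues

    def subscribe(self, topic: str) -> Queue:
        q = Queue()
        self._subs[topic].append(q)
        return q

    def publish(self, topic: str, message) -> None:
        # Fan the message out to every queue subscribed to the topic.
        for q in self._subs[topic]:
            q.put(message)

broker = MiniBroker()
inbox = broker.subscribe("sensors/temperature")
broker.publish("sensors/temperature", {"celsius": 25.5})
print(inbox.get_nowait())   # {'celsius': 25.5}
```

A real broker adds what this sketch omits: persistence, delivery guarantees, and transaction management.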
4.3. Messaging Protocol
In order to select the best possible messaging-protocol solution for IoT and IIoT, in-depth knowledge of the architecture, as well as of the messaging or data-sharing requirements of every target system, is essential. IoT applications of the future will be supported by many important technologies that are currently maturing. To connect devices, messaging technologies such as Data Distribution Service (DDS), Constrained Application Protocol (CoAP), Message Queuing Telemetry Transport (MQTT), eXtensible Messaging and Presence Protocol (XMPP), Advanced Message Queuing Protocol (AMQP), and Representational State Transfer (REST) may be put to use. However, when one considers fundamental system requirements such as performance, interoperability, security, fault tolerance, and quality of service, these messaging technologies may fall short of expectations, as no single one supports device-to-device communication, device-to-cloud communication, and communication within a data centre equally well. To interchange data, the WebSocket protocol can also be used to establish a full-duplex connection with the message broker as well as with the services.
4.4. Data Processing
The process of gathering and managing data components to produce meaningful information is called data processing. It is a subset of information processing. Most applications gather information and save it in memory or in a database; these data are then processed as a batch job. Real-time or streaming data are becoming increasingly relevant: it is no longer sufficient to process big volumes of data; operations on real-time data also add value. Real-time data processing is developing fast and is currently one of the most popular research topics. To overcome the challenges of real-time data processing, two different architectures have emerged: the kappa architecture and the lambda architecture. Data processing is performed using two different mechanisms:
4.4.1. Stream Processing
Stream processing makes it possible to analyse large quantities of data, and to act on them, even as they are being collected. For this purpose, a continuous series of queries (i.e., SQL-type queries that operate over time and buffer windows) is applied to the streaming data. Streaming analytics, an important component of stream processing, is the ability to perform mathematical and statistical calculations on streaming data; it is made possible by a scalable, highly available, and fault-tolerant architecture.
A conventional database model first stores and classifies data before they are processed by queries. In stream processing, however, data can be analysed even while they are en route. Since stream processing can be performed even on external data sources, it enables selection of data into the application flow and the update of external databases with processed data.
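The buffer-window queries described above can be reduced to a minimal sketch: a rolling mean over the last N readings, updated as each value arrives rather than after the data are stored. The window size and readings are illustrative.

```python
from collections import deque

class WindowAverage:
    """Rolling mean over the last `size` readings — a minimal stand-in
    for a buffer-window query in a streaming engine."""
    def __init__(self, size: int):
        self.window = deque(maxlen=size)   # old readings fall off automatically

    def push(self, value: float) -> float:
        self.window.append(value)
        return sum(self.window) / len(self.window)

avg = WindowAverage(size=3)
for reading in [10.0, 20.0, 30.0, 40.0]:
    print(avg.push(reading))   # 10.0, 15.0, 20.0, 30.0
```

Note that each result is available the moment the reading arrives, which is exactly what distinguishes this from the store-then-query model.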
4.4.2. Batch Processing
When transactions are processed in bulk, as a batch, this is called batch processing. Unlike transaction processing, which handles one transaction at a time and may require user intervention, batch processing requires no external interaction once it has begun. It is best suited for end-of-cycle processing, such as producing a bank's reports at the end of a day or generating monthly or bi-weekly payrolls, although it can be performed whenever desired.
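The end-of-day bank report mentioned above can be sketched as a single pass over the accumulated transactions, with no user interaction once the job starts. The account names and amounts are made-up illustration data.

```python
from collections import defaultdict

def end_of_day_report(transactions):
    """Aggregate one day's (account, amount) transactions per account
    in a single batch pass."""
    totals = defaultdict(float)
    for account, amount in transactions:
        totals[account] += amount
    return dict(totals)

day = [("acc-1", 100.0), ("acc-2", -25.0), ("acc-1", 40.0)]
print(end_of_day_report(day))   # {'acc-1': 140.0, 'acc-2': -25.0}
```

The same shape scales to payroll runs: collect all records for the cycle, then process the batch in one uninterrupted job.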
4.5. Data Mining
Data mining is the process of discovering patterns and extracting useful knowledge from large volumes of data. In this system, the knowledge extracted from processed data is strengthened by fusing data and information from multiple sensors, sources, and domains, as described in the following subsections.
4.5.1. Information Fusion
Information fusion is the process of combining, or fusing, information or data about the same object or scene in order to obtain a clearer view of the complexity, reliability, and accuracy of the situation [5].
4.5.2. Multi-Sensor Data/Information Fusion
Developing any intelligent and smart application or system requires multi-sensor data/information fusion. Fusion from a single source or thing can easily be bypassed and has many limitations in intelligence, logic design, accuracy, etc. Fusing data/information from multiple sources, in multiple ways and at multiple levels, gives more flexibility to achieve a unified picture [6].
A good example that illustrates real-time multi-sensor data/information fusion is a fire-alarm system. A smoke detector alone does not give exact situational information about whether a fire has occurred. Effective use of multiple sensors, such as a temperature sensor, a smoke detector, and a humidity sensor, and fusing their information/data in real time, yields the exact situation and makes the security threat much harder to bypass.
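The fire-alarm example can be sketched as a simple fusion rule: each sensor contributes one cue, and the alarm fires only when at least two independent cues agree, so a single faulty or spoofed sensor can neither trip nor suppress it. The thresholds below are illustrative assumptions, not values from the paper.

```python
def fire_alarm(temp_c: float, smoke: bool, humidity_pct: float) -> bool:
    """Fuse three sensor readings into one alarm decision by
    majority vote over independent cues."""
    cues = [
        temp_c > 60.0,          # unusually hot (assumed threshold)
        smoke,                  # smoke detected
        humidity_pct < 20.0,    # hot fires dry the surrounding air
    ]
    return sum(cues) >= 2       # require agreement of at least two cues

print(fire_alarm(temp_c=75.0, smoke=True, humidity_pct=35.0))  # True
print(fire_alarm(temp_c=25.0, smoke=True, humidity_pct=45.0))  # False
```

A real system would weight the cues and fuse over time windows, but the voting structure is the essence of the multi-sensor argument.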
4.5.3. Cross Domain Data Fusion
As big data evolves, it brings many challenges in handling data and extracting insight from it. Big data opens a new approach to visualizing data from a high level with a concrete vision. Data from multiple domains cover multiple scenarios, and fusing the data from these domains gives a new way to solve complex challenges. Data from multiple domains obviously come in multiple formats; each data set is handled separately, and the information from these domains is then fused to obtain a high-level, concrete visualization of the data. Three categories of data-fusion methodologies for the cross-domain data-fusion approach are summarized in [7], which also elaborates the approach to unlocking knowledge from disparate datasets of different domains in big data research.
4.5.4. Big Data Fusion
Interacting with billions of rows of data from a single source is no longer sufficient; given the rapid development of technologies and requirements, it is necessary to integrate multiple sources and perform fusion. Data from multiple sources, real-time streams, and historical data may all be involved when dealing with big data fusion [8]. Fusing these data across multiple sources, and combining the fusion results from each source, yields the right insights.
4.6. Decision Making
Decision making is not an easy task. Appropriate decisions are required everywhere: in business, enterprise, the stock market, gambling, etc. Decisions are made by satisfying conditions, and are taken using message/event-driven architecture, process-driven architecture, service-driven architecture, etc. When complex processing is not required and mission-critical operations must be performed, a message/event-driven architecture is better implemented, or programmed, on a microcontroller, an ARM controller, or any device that supports programmable automation controllers. Implementing such logic near the device gives real-time triggering for the action that needs to be performed. The triggered event/action could be an email, an SMS, controlling a device, calling another event, starting a new process, etc. To make a complex decision, predictive analytics, machine learning, aggregation, summarization, etc. are not sufficient on their own; these techniques need to be used together with the appropriate data-fusion techniques described in Section 4.5.
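The event-driven triggering described above amounts to a rule table: each rule pairs a condition on an incoming event with an action (email, SMS, device control, and so on). The event fields and action names below are illustrative assumptions; on a microcontroller the same table would be compiled into the firmware.

```python
# Rule table: (condition on the event, name of the action to trigger).
RULES = [
    (lambda e: e["type"] == "overheat" and e["value"] > 90, "send_sms"),
    (lambda e: e["type"] == "overheat", "send_email"),
    (lambda e: e["type"] == "door_open", "start_recording"),
]

def dispatch(event):
    """Return the names of every action whose condition the event satisfies."""
    return [action for cond, action in RULES if cond(event)]

print(dispatch({"type": "overheat", "value": 95}))  # ['send_sms', 'send_email']
print(dispatch({"type": "door_open"}))              # ['start_recording']
```

Keeping conditions separate from actions means new triggers can be added without touching the dispatch logic.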
5. ARCHITECTURE OF TIARRAH COMPUTING
Figure 3 explains Tiarrah Computing in detail. The architecture is described through the flow of data and the way it is handled; the following description of Figure 3 gives a clear vision of the application.
5.1. First Tier
The first tier lies under edge computing. Hardware is deployed with an appropriate network connection using standard protocols as required. Figure 3 shows two widely used networks, one wired and one wireless. The widely adopted industrial-automation design pattern, with devices such as PLC and SCADA, is assumed, so that industries can adopt Tiarrah Computing with their existing hardware and network with little or no modification.
The architecture and working mechanism of the first tier, or edge computing, are almost the same as those of automation industries (with minor modification). Figure 3 illustrates that things can be connected directly to an ARM controller using RS-232 or RS-485 connections. Most automation industries use an RTU or a PLC to connect things or devices; a PLC is used where mission-critical events must be handled in real time. The logic and algorithms are designed with a message/event-driven architecture and embedded in the PLC. The widely used automation-industry communication protocol known as Modbus can be employed. The ARM controller performs data acquisition from the things and sends these data to the message broker. As shown in the figure, the monitoring and controlling section can be built with a traditional SCADA monitoring and controlling system, or this system itself can be used for monitoring and controlling. The advantage this system has over SCADA is that monitoring and control can be performed from anywhere, at any time, with ease. An FPGA design pattern can also be used in this tier via the Modbus protocol [25]. The ARM controller can be connected to the router using Wi-Fi or a wired connection; the figure shows a wired Ethernet (RJ-45) connection.
Figure 3 also shows another very popular and emerging wireless connection protocol used in IoT applications, the Zigbee protocol. Devices or things can be connected directly to Zigbee devices using GPIO. A Zigbee device can have multiple types of input connection, such as RJ-45, RS-232, and Wi-Fi, and supports multiple protocols such as TCP/IP and Modbus; data can therefore be fetched from a PLC or an RTU over Modbus. Zigbee supports popular topologies such as mesh, star, and tree, which is one reason it is becoming more popular. Zigbee devices consume little power and exchange data over a wireless network between Zigbee slaves and the Zigbee master. The Zigbee master is connected to the router for a WAN connection over which data are exchanged.
5.2. Second Tier
This tier lies under fog computing. Fog computing is evolving to distribute cloud responsibilities, or services, as close to the data source as possible for quick and immediate interaction. The services are deployed inside a DMZ network, which is separate from both the LAN and the WAN; the LAN lies in the first tier and the WAN in the third tier. Since the application is microservices based, microservices are deployed on a standard server or in a private cloud [10], depending on the type of services hosted in this tier; microservices run in the public cloud as well as in the private cloud. Microservices are deployed in this tier to achieve low latency, real-time interaction, application distribution, event handling, process handling, etc. Services deployed for real-time interaction perform faster in this layer than services deployed in the cloud, so this tier hosts the services that need quick interaction or immediate response. It is recommended that real-time data be published/subscribed from this tier; real-time monitoring and controlling are more efficient and faster here. The services in this tier interact with the services in the cloud tier using REST APIs, while real-time communication between users is made possible using WebSocket, SockJS, etc.
5.3. Third Tier
This tier lies in cloud computing and consists of services deployed on cloud servers, databases, cloud storage, etc. The microservices in the cloud communicate with the microservices on the DMZ-network server using APIs to remotely configure, control, and manage the system's things. The cloud microservices include a management UI for developing and deploying analytics expressions; maintain details of users and devices in a database for monitoring and management; and manage the integration of the services with the customer's identity-access-management and persistence solutions. This system addresses big data analytics in the cloud, where huge resources are required for batch processing. The services deployed in the cloud interact with the services in the private cloud using APIs. Deploying the services that require high computational resources in the cloud gives the flexibility to perform big data analytics in the desired time by allocating resources dynamically [9].
Figure 3. System Architecture of Tiarrah Computing
5.4. Fourth Tier
The fourth tier is the user interface, where users interact with the application through an iOS, Android, or web-based application. The devices can be mobiles, tablets, laptops, desktops, etc. Users can interact with the application from anywhere at any time; remote monitoring and controlling give users more flexibility and control.
6. CONCLUSION
Tiarrah Computing is a computing approach for next-generation application deployment. Most automation-industry applications rely on edge computing alone and are bound by many constraints, while most IoT-based applications are deployed using only the first and third tiers of Tiarrah Computing. The four-tier computing architecture gives a lot of opportunity and flexibility, as well as the ability to deal with upcoming challenges. This computing approach is flexible enough to handle real-time and batch big data challenges, IoT application development and deployment challenges, flexible application deployment, dynamic resource allocation, a self-healing approach, etc. Overall, it is a platform, or computing approach, where one can work with existing cloud computing and its features, design one's own fog computing approach, or adopt any software or services that might become available to an application in the future. One can use the evolving edge computing design approach to connect things, or deploy things through an edge computing service provider where available. Designing the deployment, real-time event handling, and the overall design are a concept and an idea for building an application from existing technologies. One can take the benefits of existing technologies, applied with the right logic, to overcome the challenges created by IoT, IIoT, big data, real-time big data, event handling, data fusion, the design of wired and wireless networks of things, dynamic scaling, and zero-downtime clusters. Overall, it is the practice of using technologies to build an application with features that are scalable, durable, available, consistent, etc. It is a way to take a high-level view in tackling IoT, big data, and cloud computing challenges.
REFERENCES
[1] J. A. Stankovic, “Research directions for the Internet of Things”, IEEE Internet of Things J., vol. 1, no. 1, pp. 3-9,
Feb. 2014.
[2] Amir Vahid Dastjerdi, Rajkumar Buyya, “Fog Computing: Helping the Internet of Things Realize Its Potential”,
IEEE Computer Society, vol. 49, no. 8, pp. 112-116, Aug. 2016.
[3] Dusit Niyato, Lu Xiao, Ping Wang, “Machine-to-machine communications for home energy management system in
smart grid”, IEEE Communications Magazine, vol. 49, no. 4, pp.53-59, 2011.
[4] Alexandru Iosup, Simon Ostermann, M. Nezih Yigitbasi, “Performance Analysis of Cloud Computing Services for
Many-Tasks Scientific Computing”, IEEE Transactions on Parallel and Distributed System, vol. 22, no. 6,
pp. 931-945, Jun. 2011.
[5] Bingwei Liu, Yu Chen, Ari Hadiks, Erik Blasch, Alex Aved, Dan Shen, Genshe Chen, “Information fusion in a
cloud computing era: A systems level perspective”, IEEE Aerospace and Electronic Systems Magazine, vol. 29,
no. 10, pp. 16-24, Oct. 2014.
[6] David L. Hall, James Llinas, “An Introduction to Multisensor Data Fusion”, Proc. of IEEE, vol. 85, pp. 6-23, Jan.
1997.
[7] Yu Zheng, “Methodologies for Cross-Domain Data Fusion: An Overview”, IEEE Transactions on Big Data, vol. 1,
no. 1, pp. 16-33, Mar. 2015.
[8] George Suciu, Alexandru Vulpe, Razvan Craciunescu, Cristina Butca, Victor Suciu, “Big Data Fusion for eHealth
and Ambient Assisted Living Cloud Applications”, Proc. of IEEE International Black Sea Conference on
Communication and Networking (BlackSeaCom), pp. 102-106, 2015.
[9] Yanish Pradhananga, Shridevi Karande, Chandraprakash Karande, “High Performance Analytics of Bigdata with
Dynamic and Optimized Hadoop Cluster”, Proc. IEEE International Conference on Advanced Communication
Control and Computing Technologies(ICACCCT), pp. 715-720, Jan. 2017.
[10] Robert Birke, Andrej Podzimek, Lydia Y. Chen, Evgenia Smirni, “Virtualization in the Private Cloud: State of the
Practice”, IEEE Transactions on Network and Service Management, vol. 13, no. 3, pp. 608-621, Aug. 2016.
[11] David S. Linthicum, “Practical Use of Microservices in Moving Workloads to the Cloud”, IEEE Cloud Computing,
vol.3, no. 5, pp. 6-9, Nov. 2016.
[12] Thasviya Haroon, S Neena, K K Krishnaprasad, Rejoice Wilson, Sanjo Simon, John Paul Martin, “Convivial
private cloud implementation system using OpenStack”, Proc. IEEE International Conference on Electrical,
Electronics, and Optimization Techniques(ICEEOT), Nov. 2016.
[13] OpenStack, http://www.openstack.org, 2016 (accessed 29.09.2016).
[14] Yanish Pradhananga, Shridevi Karande, Chandraprakash Karande, “CBA: Cloud-based Bigdata Analytics”, Proc.
IEEE International Conference on Computing Communication Control and Automation(ICCUBEA), pp. 47-51, Jul.
2015.
[15] Dmitry Duplyakin, Matthew Haney, Henry Tufo, “Highly Available Cloud-Based Cluster Management”, Proc.
IEEE 15th International Conference on Cluster, Cloud and Grid Computing(CCGrid), Jul. 2015.
[16] Daniel Sun, Min Fu, Liming Zhu, Guoqiang Li, Qinghua Lu, “Non-Intrusive Anomaly Detection with Streaming
Performance Metrics and Logs for DevOps in Public Clouds: A Case Study in AWS”, IEEE Transactions on
Emerging Topics in Computing, vol. 4, no. 2, pp. 278-289, Jun. 2016.
[17] Lianping Chen, “Continuous Delivery: Huge Benefits, but Challenges Too”, IEEE Software, vol. 32, pp. 50-54,
no. 2, Apr. 2015.
[18] Alan Sill, “The Design and Architecture of Microservices”, IEEE Cloud Computing, vol. 3, no. 5, pp. 76-80,
Nov. 2016.
[19] Christian Esposito, Aniello Castiglione, Kim-Kwang Raymond Choo, “Challenges in Delivering Software in the
Cloud as Microservices”, IEEE Cloud Computing, vol. 3, no. 5, pp. 10-14, Nov. 2016.
[20] Yuri Alvarez, Fernando Las Heras, “ZigBee-based Sensor Network for Indoor Location and Tracking
Applications”, IEEE Latin America Transactions, vol. 14, no. 7, pp. 3208-3214, Jul. 2016.
[21] Zuochen Shi, Yintang Yang, Di Li, Yang Liu, “A Fully-Integrated Low-Power Analog Front-End for ZigBee
Transmitter Applications”, IEEE Chinese Journal of Electronics, vol. 25, no. 3, pp. 424-431, Aug. 2016.
[22] Eugene David Ngangue Ndih, Soumaya Cherkaoui, “On Enhancing Technology Coexistence in the IOT Era:
ZigBee and 802.11 Case”, IEEE Access, vol. 4, pp. 1835-1844, Apr. 2016.
[23] Mikhail Afanasyev, Tsuwei Chen, Geoffrey M. Voelker, Alex C. Snoeren, “Usage Patterns in an Urban WiFi
Network”, IEEE/ACM Transactions on Networking, vol. 18, no. 5, pp. 1359-1372, Oct. 2010.
[24] Bluetooth, https://www.bluetooth.org/, 2016 (accessed 17.10.2016).
[25] Jaimeen N. Chhatrawala, Nandish Jasani, Vidita Tilva, “FPGA based Data Acquisition with Modbus Protocol”,
Proc. International Conference on Communication and Signal Processing (ICCSP), Nov. 2016.