In cloud computing, all processing of the data collected by a node is done in a central server. This takes considerable time, since the data must be transferred from the node to the central server before it can be processed, and it is impractical to stream terabytes of data from the node to the cloud and back. To overcome these disadvantages, an extension of cloud computing known as fog computing was introduced. Here, data that does not require high computing power is processed entirely at the node; data that does require high computing power is processed partially at the node and then transferred to the central server for the remaining computation. This greatly reduces the time involved and is more efficient, since the central server is not overloaded. Fog is especially useful in geographically dispersed areas where connectivity can be irregular, and its ideal use case is intelligence near the edge, where ultra-low latency is critical. This work explores the concepts of cloud computing and fog computing and contrasts their features to determine which is more efficient and better suited for real-time applications.
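The split described above, full local processing for light workloads and partial processing plus cloud offload for heavy ones, can be sketched as follows (the capacity threshold and data-reduction factor are hypothetical illustration values, not from the text):

```python
# Sketch of a fog node's offload decision: light jobs complete at the
# node; heavy jobs are pre-processed locally, then the (much smaller)
# intermediate result is sent to the central server.
LOCAL_CAPACITY = 100  # hypothetical compute budget per job (arbitrary units)

def process(job_cost: int, raw_bytes: int) -> dict:
    if job_cost <= LOCAL_CAPACITY:
        # Entire job fits at the edge: nothing crosses the network.
        return {"where": "node", "uploaded_bytes": 0}
    # Partial processing at the node typically shrinks the payload
    # (e.g. feature extraction), so far less data crosses the network.
    reduced = raw_bytes // 100
    return {"where": "node+cloud", "uploaded_bytes": reduced}

assert process(40, 10**9) == {"where": "node", "uploaded_bytes": 0}
assert process(500, 10**9) == {"where": "node+cloud", "uploaded_bytes": 10**7}
```

The sketch captures the claimed efficiency gain: the central server only ever sees the reduced intermediate result, never the raw terabyte-scale stream.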
Security and Privacy Issues of Fog Computing: A Survey (HarshitParkar6677)
Abstract. Fog computing is a promising computing paradigm that extends cloud computing to the edge of networks. Similar to cloud computing but with distinct characteristics, fog computing faces new security and privacy challenges besides those inherited from cloud computing. In this paper, we briefly survey these challenges and the corresponding solutions.
Tiarrah Computing: The Next Generation of Computing (IJECEIAES)
The evolution of the Internet of Things (IoT) has brought about several challenges for existing hardware, network, and application development, including handling real-time streaming and batch big data, real-time event handling, dynamic cluster resource allocation for computation, and wired and wireless networks of things. To address these difficulties, many new technologies and strategies are being developed. Tiarrah Computing integrates the concepts of cloud computing, fog computing, and edge computing. Its main objectives are to decouple application deployment and to achieve high performance, flexible application development, high availability, ease of development, and ease of maintenance. Tiarrah Computing focuses on using existing open-source technologies to overcome the challenges that evolve along with IoT. This paper gives an overview of these technologies and of application design, and elaborates how to overcome most of the existing challenges.
A Comparison of Cloud Execution Mechanisms: Fog, Edge, and Clone Cloud Computing (IJECEIAES)
Cloud computing is a technology that was developed a decade ago to provide uninterrupted, scalable services to users and organizations. Cloud computing has also become an attractive option for mobile users due to the limited capabilities of mobile devices. The combination of cloud technologies with mobile technologies resulted in a new area of computing called mobile cloud computing, which is used to augment the resources of smart devices. In recent times, fog computing, edge computing, and clone cloud computing have become the latest trends after mobile cloud computing; all were developed to address the limitations of cloud computing. This paper reviews these recent technologies in detail and provides a comparative study of them. It also addresses the differences between these technologies and how each of them is effective for organizations and developers.
A Study on Cloud and Fog Computing Security Issues and Solutions (AM Publications)
Cloud computing is a significant part of the data world, yet the security level in the cloud is not well defined. Fog computing is the new buzzword in the technical world; the term "fog" was coined by Cisco. The motivation for fog computing is security and bringing data closer to the end user. Fog computing is not going to replace cloud computing; it acts as an intermediate layer for securing the data stored inside the cloud. The principal idea of this paper is to provide data safety measures for cloud storage through fog computing. Fog computing will play a vital role in future technology: the Internet of Things (IoT) will use fog computing to implement the smart-world concept, so in the future we will have to handle huge amounts of data and provide security for that data. This study presents the security solutions available for the different issues.
Latest development of cloud computing technolo... (Sushil kumar Choudhary)
Cloud computing is a network-based environment that focuses on sharing computation: cloud computing networks access a shared pool of configurable networks, servers, storage, services, applications, and other important computing resources. In the modern era of information technology, access to information underpins the important activities of all related fields. This paper discusses the advantages, disadvantages, characteristics, challenges, deployment models, cloud service models, cloud service providers, and various application areas of cloud computing, such as small- and large-scale industry (manufacturing, automation, television broadcast, construction), geographical information systems (GIS), military intelligence fusion, business management, banking, education, healthcare, agriculture, e-governance, project planning, cloud computing in the family, etc.
In the field of the Internet of Things (IoT), devices can recognize their environment and carry out certain functions by themselves. IoT devices mainly consist of sensors. Cloud computing based on sensor networks manages huge amounts of data, including transfer and processing, which delays service response time. As sensor networks grow, the demand to control and process data on the IoT devices themselves is also increasing. Vikas Vashisth | Harshit Gupta | Dr. Deepak Chahal, "Fog Computing: An Empirical Study", International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN: 2456-6470, Volume 4, Issue 3, April 2020. URL: https://www.ijtsrd.com/papers/ijtsrd30675.pdf Paper URL: https://www.ijtsrd.com/computer-science/realtime-computing/30675/fog-computing-an-empirical-study/vikas-vashisth
AES Based Secured Framework for Cloud Databases (IJARIIT)
A cloud database management system is a distributed database that delivers computing as a service (CaaS) instead of a product. Improving the confidentiality of information stored in a cloud database is a very important contribution to the cloud database. Data encryption is the optimum solution for achieving confidentiality. Some conventional methods encrypt the whole database with a standard encryption algorithm, which does not allow SQL operations directly on the cloud; such a solution, burdened by workload and cost, would make the cloud database service inconvenient. I propose a novel architecture for adaptive encryption of public cloud databases. Adaptive encryption allows any SQL operation over encrypted data, and the proposed cloud database architecture uses this technique with no intermediate servers. This scheme provides the cloud provider with the best level of confidentiality for any database workload. From a research point of view, the encryption and adaptive encryption costs of data confidentiality can be determined. Index Terms: adaptive encryption technique, AES (Advanced Encryption Standard), metadata.
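The idea of running SQL over encrypted data can be illustrated with a minimal sketch using deterministic HMAC tokens for equality predicates. This is only one layer of what an adaptive-encryption scheme provides (the paper's actual scheme is AES-based); the key, column name, and values below are hypothetical:

```python
import hmac
import hashlib

KEY = b"demo-key"  # hypothetical key; real systems need proper key management

def eq_token(value: str) -> str:
    # Deterministic HMAC token: equal plaintexts map to equal tokens,
    # so the server can evaluate WHERE name = ? without seeing plaintext.
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

# Rows stored server-side: the cloud only ever sees opaque tokens.
rows = [{"name_tok": eq_token(n)} for n in ["alice", "bob", "alice"]]

# The client rewrites SELECT ... WHERE name = 'alice' into a token lookup.
query_tok = eq_token("alice")
matches = [r for r in rows if r["name_tok"] == query_tok]
assert len(matches) == 2
```

Deterministic tokens leak equality patterns, which is exactly why adaptive schemes adjust the encryption layer per column to the operations the workload actually needs.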
Fog computing or fog networking, also known as fogging, is an architecture that uses edge devices to carry out a substantial amount of computation, storage, and communication locally, routed over the internet backbone.
Cloud computing and security issues in the... (IJNSA Journal)
Cloud computing has formed the conceptual and infrastructural basis for tomorrow's computing. The global computing infrastructure is rapidly moving towards cloud-based architecture. While it is important to take advantage of cloud-based computing by deploying it in diversified sectors, the security aspects of a cloud-based computing environment remain at the core of interest. Cloud-based services and service providers are evolving, which has resulted in a new business trend based on cloud technology. With the introduction of numerous cloud-based services and geographically dispersed cloud service providers, sensitive information of different entities is normally stored in remote servers and locations, with the possibility of being exposed to unwanted parties if the cloud servers storing that information are compromised. If security is not robust and consistent, the flexibility and advantages that cloud computing has to offer will have little credibility. This paper presents a review of cloud computing concepts as well as the security issues inherent within the context of cloud computing and cloud infrastructure.
Enhancing Data Storage Security in Cloud Computing Through Steganography (IDES Editor)
In cloud computing, data storage is a significant issue because all the data resides over a set of interconnected resource pools that enable it to be accessed through virtual machines. Cloud computing moves application software and databases to large data centers, where the management of data is actually done. As the resource pools are situated in various corners of the world, the management of data and services may not be fully trustworthy, so various issues need to be addressed with respect to data management, data services, data privacy, data security, and so on; the privacy and security of data are especially challenging. To ensure the privacy and security of data-at-rest in cloud computing, we propose an effective and novel approach to data security: hiding data within images, following the concept of steganography. The main objective of this paper is to prevent unauthorized users from accessing data in cloud data storage centers. The scheme stores data at cloud data storage centers and retrieves it when needed.
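Hiding data within images is commonly done with least-significant-bit (LSB) embedding, sketched below. The abstract does not specify its exact embedding algorithm, so this is an illustrative sketch; the raw byte string stands in for grayscale pixel data:

```python
def embed(pixels: bytes, payload: bytes) -> bytes:
    # Hide each payload bit in the least significant bit of one pixel byte;
    # flipping an LSB changes a pixel's brightness imperceptibly.
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def extract(pixels: bytes, n_bytes: int) -> bytes:
    # Read the LSBs back and reassemble them into bytes.
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(
        sum(bits[b * 8 + i] << i for i in range(8)) for b in range(n_bytes)
    )

cover = bytes(range(256))        # stand-in for raw grayscale pixel data
stego = embed(cover, b"key")     # visually near-identical to the cover
assert extract(stego, 3) == b"key"
```

In a real deployment the payload would also be encrypted before embedding, so the stego image protects both the existence and the content of the data.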
Fog computing is defined as a decentralized infrastructure that places storage and processing components at the edge of the cloud, where data sources such as application users and sensors exist. It is an architecture that uses edge devices to carry out a substantial amount of computation (edge computing), storage, and communication locally, routed over the Internet backbone. To achieve real-time automation, data capture and analysis must be done in real time, without the high-latency and low-bandwidth issues that occur during the processing of network data. In 2012, Cisco introduced the term fog computing for dispersed cloud infrastructures. In 2015, Cisco partnered with Microsoft, Dell, Intel, Arm, and Princeton University to form the OpenFog Consortium, whose primary goals were to promote and standardize fog computing. These concepts brought computing resources closer to data sources. Fog computing also differentiates between relevant and irrelevant data: relevant data is sent to the cloud for storage, while irrelevant data is either deleted or transmitted to the appropriate local platform. As such, edge computing and fog computing work in unison to minimize latency and maximize the efficiency of cloud-enabled enterprise systems. Fog computing consists of various components such as fog nodes, independent devices that pick up the generated information. Fog nodes fall into three categories: fog devices, fog servers, and gateways. Fog devices store necessary data, while fog servers also compute this data to decide a course of action; fog devices are usually linked to fog servers. Fog gateways redirect information between the various fog devices and servers. With fog computing, local data storage and scrutiny of time-sensitive data become easier.
With this, both the amount of data passed to the cloud and the distance it travels are reduced, which in turn reduces the security challenges. Fog computing enables data processing based on application demands and the available networking and computing resources. This reduces the amount of data that must be transferred to the cloud, ultimately saving network bandwidth. Fog computing can run independently and ensure uninterrupted services even with fluctuating network connectivity to the cloud, and it performs all time-sensitive actions close to end users, meeting the latency constraints of IoT applications.
IoT applications where data is generated in terabytes or more, where quick processing of large amounts of data is required, and where sending data back and forth to the cloud is not feasible are good candidates for fog computing. Fog computing provides real-time processing and event responses, which are critical in healthcare, and it also addresses the network connectivity and traffic issues involved in remote storage, processing, and medical record retrieval from the cloud.
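The relevant-versus-irrelevant split described above can be sketched as a fog node that handles time-sensitive readings locally and forwards only cloud-worthy data upstream (the threshold and field names are hypothetical, chosen for illustration):

```python
# Minimal fog-node sketch: act on time-sensitive readings locally,
# forward only relevant data to the cloud, and drop the rest.
def route(reading: dict) -> str:
    if reading["urgent"]:
        return "local"           # act at the edge, meeting latency constraints
    if reading["value"] > 50:    # hypothetical relevance threshold
        return "cloud"           # long-term storage / heavy analytics
    return "drop"                # irrelevant: never consumes backbone bandwidth

readings = [
    {"value": 72, "urgent": True},
    {"value": 80, "urgent": False},
    {"value": 10, "urgent": False},
]
routed = [route(r) for r in readings]
assert routed == ["local", "cloud", "drop"]
# Only one of three readings crosses the backbone, saving bandwidth.
```

In a healthcare deployment, the "urgent" branch would correspond to alarms that must fire within milliseconds, while routine vitals are batched to the cloud for record-keeping.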
The Future of Fog Computing and IoT: Revolutionizing Data Processing (FredReynolds2)
Sending a business e-mail, watching a YouTube video, holding an online video call, or playing a video game online requires considerable data flow, and it directs that massive flow toward servers in data centers. Cloud computing relies on remote data processing and substantial storage systems to power the online apps we use daily, but other, decentralized cloud computing systems exist. Fog computing technology is growing wildly in popularity: according to fog technology experts, a global fog technology market that stood at $196.7 million at the end of 2022 will reach nearly $2.3 billion by the end of 2032.
A Review on Orchestration Distributed Systems for IoT Smart Services in Fog Computing (IJECEIAES)
This paper provides a review of orchestration distributed systems for IoT smart services in fog computing. The cloud infrastructure alone cannot handle the flow of information given the abundance of data, devices, and interactions; thus, fog computing has become a new paradigm to overcome the problem. One of the first challenges is to build orchestration systems that activate the clouds and execute tasks throughout the whole system, taking into account large geographical distances, heterogeneity, and the low latency needed to offset the limitations of cloud computing. Open problems for distributed orchestration in fog computing include meeting the high-reliability and low-delay requirements of IoT application systems and forming a larger computer network, such as a fog network, across different geographic sites. This paper reviewed approximately 68 articles on orchestration distributed systems for fog computing. The result presents the orchestration distributed systems and evaluation criteria for fog computing, compared in terms of Borg, Kubernetes, Swarm, Mesos, Aurora, heterogeneity, QoS management, scalability, mobility, federation, and interoperability. The significance of this study is to support researchers in developing orchestration distributed systems for IoT smart services in fog computing, focusing on the IR4.0 national agenda.
A Comprehensive Exploration of Fog Computing (Enterprise Wired)
This article delves into the intricacies of Fog computing, exploring its definition, key components, benefits, and its transformative impact on various industries.
Efficient ECC-Based Authentication Scheme for Fog-Based IoT Environment (IJCNCJournal)
The rapid growth of cloud computing and Internet of Things (IoT) applications faces several threats, such as latency, security, network failure, and performance issues. These issues are addressed by fog computing, which brings storage and computation closer to IoT devices; however, security designers, engineers, and researchers face several challenges in securing this environment. To ensure the confidentiality of data passing between connected devices, digital signature protocols have been applied to the authentication of identities and messages. In the traditional method, however, a user's private key is stored directly on IoT devices, so the private key may be disclosed under various malicious attacks; furthermore, these methods require a lot of energy, which drains the resources of IoT devices. This paper proposes a signature scheme based on the elliptic curve digital signature algorithm (ECDSA) to improve the security of the private key and the time taken for key-pair generation. ECDSA security rests on the intractability of the elliptic curve discrete logarithm problem (ECDLP), which allows much smaller groups to be used. Smaller group sizes translate directly into shorter signatures, a crucial feature in settings where communication bandwidth is limited or data transfer consumes a large amount of energy. In this paper, we have chosen safe elliptic-curve cryptography (ECC) curve types such as M221, SECP256r1, Curve25519, Brainpool P256t1, and M-551. These are among the most secure ECC curves, as their security is based on the complexity of the curve's ECDLP. A valid signature can be generated without re-establishing the whole private key. ECDSA ensures data security and successfully reduces man-in-the-middle attacks.
The efficiency and effectiveness of ECDSA in the IoT environment are validated by experimental evaluation and comparative analysis. The results indicate that, compared to two-party ECDSA and RSA, the proposed ECDSA decreases computation time by 65% and 87%, respectively, and reduces energy consumption by 77% and 82%, respectively.
Bibliometric analysis highlighting the role of women in addressing climate ch...IJECEIAES
Fossil fuel consumption has increased quickly, contributing to climate change that is evident in unusual flooding, droughts, and global warming. Over the past ten years, women's involvement in society has grown dramatically, and they have succeeded in playing a noticeable role in reducing climate change. A bibliometric analysis of data from the last ten years has been carried out to examine the role of women in addressing climate change. The analysis's findings are discussed in relation to the sustainable development goals (SDGs), particularly SDG 7 and SDG 13. The results consider contributions made by women in various sectors while taking geographic dispersion into account. The bibliometric analysis delves into topics including women's leadership in environmental groups, their involvement in policymaking, their contributions to sustainable development projects, and the influence of gender diversity on attempts to mitigate climate change. This study's results highlight how women have influenced policies and actions related to climate change, point out areas of research deficiency, and offer recommendations on how to increase the role of women in addressing climate change and achieving sustainability. To achieve more successful results, this initiative aims to highlight the significance of gender equality and encourage inclusivity in climate change decision-making processes.
Voltage and frequency control of microgrid in presence of micro-turbine inter...IJECEIAES
The active and reactive load changes have a significant impact on voltage
and frequency. In this paper, in order to stabilize the microgrid (MG) against
load variations in islanding mode, the active and reactive power of all
distributed generators (DGs), including energy storage (battery), diesel
generator, and micro-turbine, are controlled. The micro-turbine generator is
connected to MG through a three-phase to three-phase matrix converter, and
the droop control method is applied for controlling the voltage and
frequency of MG. In addition, a method is introduced for voltage and
frequency control of micro-turbines in the transition from grid-connected mode to islanding mode. A novel switching strategy of the matrix
converter is used for converting the high-frequency output voltage of the
micro-turbine to the grid-side frequency of the utility system. Moreover,
using the switching strategy, the low-order harmonics in the output current
and voltage are not produced, and consequently, the size of the output filter
would be reduced. In fact, the suggested control strategy is load-independent
and has no frequency conversion restrictions. The proposed approach for
voltage and frequency regulation demonstrates exceptional performance and
favorable response across various load alteration scenarios. The suggested
strategy is examined in several scenarios in the MG test systems, and the
simulation results are addressed.
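The droop control method applied above follows the conventional P–f and Q–V characteristics: each distributed generator lowers its frequency reference as it picks up active power and its voltage reference as it picks up reactive power. A minimal sketch (the gains and setpoints below are illustrative placeholders, not the paper's tuned values):

```python
def droop_references(P, Q, f0=50.0, V0=1.0, kp=0.05, kq=0.04, P0=0.0, Q0=0.0):
    """Conventional droop law (per-unit power quantities):
    frequency falls with active power, voltage falls with reactive power."""
    f_ref = f0 - kp * (P - P0)   # P-f droop
    V_ref = V0 - kq * (Q - Q0)   # Q-V droop
    return f_ref, V_ref

# A DG unit picking up 1.0 pu active and 0.5 pu reactive load
f_ref, V_ref = droop_references(1.0, 0.5)
```

Because every unit shares the same characteristic slopes, load changes are divided among DGs in proportion to their droop gains without explicit communication, which is what makes the scheme attractive for islanded microgrids.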
Enhancing battery system identification: nonlinear autoregressive modeling fo...IJECEIAES
Precisely characterizing Li-ion batteries is essential for optimizing their
performance, enhancing safety, and prolonging their lifespan across various
applications, such as electric vehicles and renewable energy systems. This
article introduces an innovative nonlinear methodology for system
identification of a Li-ion battery, employing a nonlinear autoregressive with
exogenous inputs (NARX) model. The proposed approach integrates the
benefits of nonlinear modeling with the adaptability of the NARX structure,
facilitating a more comprehensive representation of the intricate
electrochemical processes within the battery. Experimental data collected
from a Li-ion battery operating under diverse scenarios are employed to
validate the effectiveness of the proposed methodology. The identified
NARX model exhibits superior accuracy in predicting the battery's behavior
compared to traditional linear models. This study underscores the
importance of accounting for nonlinearities in battery modeling, providing
insights into the intricate relationships between state-of-charge, voltage, and
current under dynamic conditions.
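The NARX structure summarized above regresses the current output on past outputs and exogenous inputs. As an illustrative sketch (the toy dynamics and regressor set below are assumptions for demonstration, not the paper's identified battery model), a polynomial NARX predictor that is linear in its parameters can be fitted by ordinary least squares:

```python
import random

def fit_narx(u, y):
    """Fit y[t] ~ w . [1, y[t-1], y[t-1]^2, u[t], u[t-1]] by least squares,
    solving the normal equations with Gaussian elimination."""
    feats = lambda t: [1.0, y[t-1], y[t-1]**2, u[t], u[t-1]]
    X = [feats(t) for t in range(1, len(y))]
    T = [y[t] for t in range(1, len(y))]
    n = len(X[0])
    # Normal equations A w = b with A = X^T X, b = X^T y
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * t for r, t in zip(X, T)) for i in range(n)]
    for i in range(n):  # forward elimination with partial pivoting
        piv = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    w = [0.0] * n
    for i in reversed(range(n)):  # back substitution
        w[i] = (b[i] - sum(A[i][c] * w[c] for c in range(i + 1, n))) / A[i][i]
    return w

# Toy nonlinear system (hypothetical): y[t] = 0.6*y[t-1] - 0.1*y[t-1]^2 + 0.5*u[t]
random.seed(1)
u = [random.uniform(-1, 1) for _ in range(400)]
y = [0.0]
for t in range(1, 400):
    y.append(0.6 * y[t - 1] - 0.1 * y[t - 1] ** 2 + 0.5 * u[t])
w = fit_narx(u, y)
```

On this noiseless toy system the estimator recovers the true coefficients; a real battery model would use measured current as the exogenous input and voltage (or state of charge) as the output, typically with deeper lag structure.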
Smart grid deployment: from a bibliometric analysis to a surveyIJECEIAES
Smart grids are one of the last decades' innovations in electrical energy.
They bring relevant advantages compared to the traditional grid and have attracted significant interest from the research community. Assessing the field's
evolution is essential to propose guidelines for facing new and future smart
grid challenges. In addition, knowing the main technologies involved in the
deployment of smart grids (SGs) is important to highlight possible
shortcomings that can be mitigated by developing new tools. This paper
contributes to the research trends mentioned above by focusing on two
objectives. First, a bibliometric analysis is presented to give an overview of
the current research level about smart grid deployment. Second, a survey of
the main technological approaches used for smart grid implementation and
their contributions are highlighted. To that effect, we searched the Web of
Science (WoS), and the Scopus databases. We obtained 5,663 documents
from WoS and 7,215 from Scopus on smart grid implementation or
deployment. With the extraction limitation in the Scopus database, 5,872 of
the 7,215 documents were extracted using a multi-step process. These two
datasets have been analyzed using a bibliometric tool called bibliometrix.
The main outputs are presented with some recommendations for future
research.
Use of analytical hierarchy process for selecting and prioritizing islanding ...IJECEIAES
One of the problems associated with power systems is the islanding condition, which must be rapidly and properly detected to prevent any
negative consequences on the system's protection, stability, and security.
This paper offers a thorough overview of several islanding detection
strategies, which are divided into two categories: classic approaches,
including local and remote approaches, and modern techniques, including
techniques based on signal processing and computational intelligence.
Additionally, each approach is compared and assessed based on several
factors, including implementation costs, non-detected zones, declining
power quality, and response times using the analytical hierarchy process
(AHP). The multi-criteria decision-making analysis yields overall weights of 24.7% for passive methods, 7.8% for active methods, 5.6% for hybrid methods, 14.5% for remote methods, 26.6% for signal processing-based methods, and 20.8% for computational intelligence-based methods when all criteria are compared together. Thus, it can be seen from the total weights that hybrid approaches are the least suitable choice, while signal processing-based methods are the most appropriate islanding detection methods to select and implement in power systems with respect to the
aforementioned factors. Using Expert Choice software, the proposed
hierarchy model is studied and examined.
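The AHP weighting step described above reduces a pairwise comparison matrix to priority weights via its principal eigenvector. A minimal sketch using power iteration (the 3-criterion matrix below is a hypothetical example, not the paper's comparison data):

```python
def ahp_weights(M, iters=100):
    """Priority weights of an AHP pairwise-comparison matrix:
    power iteration approximates the principal eigenvector,
    normalized so the weights sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical comparison: criterion A is 2x as important as B, 4x as C
M = [[1,   2,   4],
     [1/2, 1,   2],
     [1/4, 1/2, 1]]
weights = ahp_weights(M)  # approx [0.571, 0.286, 0.143]
```

For a perfectly consistent matrix like this one the weights are exact ratios (4/7, 2/7, 1/7); for real judgment matrices, AHP practice also computes a consistency ratio to check that the pairwise judgments do not contradict each other.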
Enhancing of single-stage grid-connected photovoltaic system using fuzzy logi...IJECEIAES
The power generated by photovoltaic (PV) systems is influenced by
environmental factors. This variability hampers the control and utilization of
solar cells' peak output. In this study, a single-stage grid-connected PV
system is designed to enhance power quality. Our approach employs fuzzy
logic in the direct power control (DPC) of a three-phase voltage source
inverter (VSI), enabling seamless integration of the PV connected to the
grid. Additionally, a fuzzy logic-based maximum power point tracking
(MPPT) controller is adopted, which outperforms traditional methods like
incremental conductance (INC) in enhancing solar cell efficiency and
minimizing the response time. Moreover, the inverter's real-time active and
reactive power is directly managed to achieve a unity power factor (UPF).
The system's performance is assessed through MATLAB/Simulink
implementation, showing marked improvement over conventional methods,
particularly in steady-state and varying weather conditions. For solar
irradiances of 500 and 1,000 W/m², the results show that the proposed
method reduces the total harmonic distortion (THD) of the injected current
to the grid by approximately 46% and 38% compared to conventional
methods, respectively. Furthermore, we compare the simulation results with
IEEE standards to evaluate the system's grid compatibility.
Enhancing photovoltaic system maximum power point tracking with fuzzy logic-b...IJECEIAES
Photovoltaic systems have emerged as a promising energy resource that
caters to the future needs of society, owing to their renewable, inexhaustible,
and cost-free nature. The power output of these systems relies on solar cell
radiation and temperature. In order to mitigate the dependence on
atmospheric conditions and enhance power tracking, a conventional
approach has been improved by integrating various methods. To optimize
the generation of electricity from solar systems, the maximum power point
tracking (MPPT) technique is employed. To overcome limitations such as
steady-state voltage oscillations and improve transient response, two
traditional MPPT methods, namely fuzzy logic controller (FLC) and perturb
and observe (P&O), have been modified. This research paper aims to
simulate and validate the step size of the proposed modified P&O and FLC
techniques within the MPPT algorithm using MATLAB/Simulink for
efficient power tracking in photovoltaic systems.
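The perturb and observe (P&O) logic modified above works by perturbing the operating voltage and continuing in whichever direction increased power, reversing otherwise. A minimal sketch of one P&O iteration (the toy P–V curve and fixed step size below are illustrative assumptions, not the paper's modified variable-step scheme):

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.1):
    """One P&O iteration: return the next reference voltage."""
    dp, dv = p - p_prev, v - v_prev
    if dp == 0:
        return v
    # Keep moving in the direction that increased power, reverse otherwise
    if (dp > 0) == (dv > 0):
        return v + step
    return v - step

# Toy P-V curve with its maximum power point at 17 V (hypothetical)
pv = lambda v: max(0.0, 100.0 - (v - 17.0) ** 2)

v_prev, v = 12.0, 12.1
for _ in range(200):
    v_next = perturb_and_observe(v, pv(v), v_prev, pv(v_prev))
    v_prev, v = v, v_next
# v now oscillates around the maximum power point
```

The steady-state oscillation around the peak is exactly the limitation the abstract mentions; variable-step P&O and fuzzy logic controllers shrink the step near the peak to reduce it.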
Adaptive synchronous sliding control for a robot manipulator based on neural ...IJECEIAES
Robot manipulators have become important equipment in production lines, medical fields, and transportation. Improving the quality of trajectory tracking for
robot hands is always an attractive topic in the research community. This is a
challenging problem because robot manipulators are complex nonlinear systems
and are often subject to fluctuations in loads and external disturbances. This
article proposes an adaptive synchronous sliding control scheme to improve trajectory tracking performance for a robot manipulator. The proposed controller
ensures that the positions of the joints track the desired trajectory, synchronize
the errors, and significantly reduces chattering. First, the synchronous tracking
errors and synchronous sliding surfaces are presented. Second, the synchronous
tracking error dynamics are determined. Third, a robust adaptive control law is
designed, the unknown components of the model are estimated online by the neural network, and the parameters of the switching elements are selected by fuzzy
logic. The built algorithm ensures that the tracking and approximation errors
are ultimately uniformly bounded (UUB). Finally, the effectiveness of the constructed algorithm is demonstrated through simulation and experimental results.
Simulation and experimental results show that the proposed controller is effective with small synchronous tracking errors, and the chattering phenomenon is
significantly reduced.
Remote field-programmable gate array laboratory for signal acquisition and de...IJECEIAES
A remote laboratory utilizing field-programmable gate array (FPGA) technologies enhances students’ learning experience anywhere and anytime in embedded system design. Existing remote laboratories prioritize hardware access and visual feedback for observing board behavior after programming, neglecting comprehensive debugging tools to resolve errors that require internal signal acquisition. This paper proposes a novel remote embedded-system design approach targeting FPGA technologies that is fully interactive via a web-based platform. Our solution provides FPGA board access and debugging capabilities beyond the visual feedback provided by existing remote laboratories. We implemented a lab module that users can seamlessly incorporate into their FPGA designs. The module minimizes hardware resource utilization while enabling the acquisition of a large number of data samples from the signal during experiments by adaptively compressing the signal prior to data transmission. The results demonstrate an average compression ratio of 2.90 across three benchmark signals, indicating efficient signal acquisition and effective debugging and analysis. This method allows users to acquire more data samples than conventional methods. The proposed lab allows students to remotely test and debug their designs, bridging the gap between theory and practice in embedded system design.
Detecting and resolving feature envy through automated machine learning and m...IJECEIAES
Efficiently identifying and resolving code smells enhances software project quality. This paper presents a novel solution, utilizing automated machine learning (AutoML) techniques, to detect code smells and apply move method refactoring. By evaluating code metrics before and after refactoring, we assessed its impact on coupling, complexity, and cohesion. Key contributions of this research include a unique dataset for code smell classification and the development of models using AutoGluon for optimal performance. Furthermore, the study identifies the top 20 influential features in classifying feature envy, a well-known code smell, stemming from excessive reliance on external classes. We also explored how move method refactoring addresses feature envy, revealing reduced coupling and complexity, and improved cohesion, ultimately enhancing code quality. In summary, this research offers an empirical, data-driven approach, integrating AutoML and move method refactoring to optimize software project quality. Insights gained shed light on the benefits of refactoring on code quality and the significance of specific features in detecting feature envy. Future research can expand to explore additional refactoring techniques and a broader range of code metrics, advancing software engineering practices and standards.
Smart monitoring technique for solar cell systems using internet of things ba...IJECEIAES
Rapid and remote monitoring of solar cell system status parameters (solar irradiance, temperature, and humidity) is critical to enhancing their efficiency. Hence, in the present article an improved smart prototype of an internet of things (IoT) technique based on an embedded system using the NodeMCU ESP8266 (ESP-12E) was realized experimentally. Three regions of Egypt (the cities of Luxor, Cairo, and El-Beheira) were chosen to study their solar irradiance profile, temperature, and humidity with the proposed IoT system. The monitored solar irradiance, temperature, and humidity data were visualized live on Ubidots via the hypertext transfer protocol (HTTP). The measured solar power radiation in Luxor, Cairo, and El-Beheira ranged between 216-1000, 245-958, and 187-692 W/m² respectively during the solar day. The accuracy and rapidity of the monitoring results obtained with the proposed IoT system make it a strong candidate for application in monitoring solar cell systems. On the other hand, the obtained solar power radiation results for the three considered regions strongly favor Luxor and Cairo over El-Beheira as suitable places to build a solar cell system station.
An efficient security framework for intrusion detection and prevention in int...IJECEIAES
Over the past few years, the internet of things (IoT) has advanced to connect billions of smart devices to improve quality of life. However, anomalies or malicious intrusions pose several security loopholes, leading to performance degradation and threats to data security in IoT operations. Thereby, IoT security systems must monitor and restrict unwanted events from occurring in the IoT network. Recently, various technical solutions based on machine learning (ML) models have been derived for identifying and restricting unwanted events in IoT. However, most ML-based approaches are prone to misclassification due to inappropriate feature selection. Additionally, most ML approaches applied to intrusion detection and prevention consider supervised learning, which requires a large amount of labeled data for training. Consequently, such complex datasets are impossible to source in a large network like IoT. To address this problem, this study introduces an efficient learning mechanism to strengthen IoT security. The proposed algorithm incorporates supervised and unsupervised approaches to improve the learning models for intrusion detection and mitigation. Compared with related works, the experimental outcome shows that the model performs well on a benchmark dataset, accomplishing an improved detection accuracy of approximately 99.21%.
Developing a smart system for infant incubators using the internet of things ...IJECEIAES
This research develops an incubator system that integrates the internet of things and artificial intelligence to improve care for premature babies. The system workflow starts with sensors that collect data from the incubator. The data is then sent in real time to the internet of things (IoT) broker Eclipse Mosquitto using the message queue telemetry transport (MQTT) protocol version 5.0. After that, the data is stored in a database for analysis using the long short-term memory (LSTM) network method and displayed in a web application through an application programming interface (API) service. The experiments produced 2,880 rows of data stored in the database. The correlation coefficient between the target attribute and the other attributes ranges from 0.23 to 0.48. Several experiments were then conducted to evaluate the model's predicted values on the test data. The best results are obtained using a two-layer LSTM configuration, each layer with 60 neurons and a lookback setting of 6. This model produces an R² value of 0.934, with a root mean square error (RMSE) of 0.015 and a mean absolute error (MAE) of 0.008. In addition, the R² value was also evaluated for each attribute used as input, with values between 0.590 and 0.845.
A review on internet of things-based stingless bee's honey production with im...IJECEIAES
Honey is produced exclusively by honeybees and stingless bees, both of which are well adapted to tropical and subtropical regions such as Malaysia. Stingless bees are known for producing small amounts of honey and for having a unique flavor profile. A key problem identified is that many stingless bee colonies collapse due to weather, temperature, and environmental conditions. It is critical to understand the relationship between stingless bee honey production and environmental conditions in order to improve honey production. Thus, this paper presents a review of stingless bee honey production and prediction modeling. About 54 previous studies have been analyzed and compared to identify the research gaps. A framework for modeling the prediction of stingless bee honey is derived. The result presents a comparison and analysis of internet of things (IoT) monitoring systems, honey production estimation, convolutional neural networks (CNNs), and automatic identification methods for bee species. Based on image detection methods, the three most efficient approaches are CNN at 98.67%, densely connected convolutional networks with YOLO v3 at 97.7%, and DenseNet201 convolutional networks at 99.81%. This study is significant in assisting researchers in developing a model for predicting stingless bee honey output, which is important for a stable economy and food security.
A trust based secure access control using authentication mechanism for intero...IJECEIAES
The internet of things (IoT) is a revolutionary innovation in many aspects of our society, including interactions, financial activity, and global security applications such as the military and battlefield internet. Due to the limited energy and processing capacity of network devices, security, energy consumption, compatibility, and device heterogeneity are long-term IoT problems. As a result, energy and security are critical for data transmission across edge and IoT networks. Existing IoT interoperability techniques need more computation time, have unreliable authentication mechanisms that break easily, lose data easily, and have low confidentiality. In this paper, a key agreement protocol-based authentication mechanism for IoT devices is offered as a solution to this issue. This system makes use of information exchange, which must be secured to prevent access by unauthorized users. Using the compact Contiki/Cooja simulator, the performance and design of the suggested framework are validated. The simulation findings are evaluated based on the detection of malicious nodes after 60 minutes of simulation. The suggested trust method, which is based on privacy access control, reduced the packet loss ratio to 0.32%, consumed 0.39% power, and had the greatest average residual energy of 0.99 mJ at 10 nodes.
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbersIJECEIAES
In real-world applications, data are subject to ambiguity due to several factors; fuzzy sets and fuzzy numbers offer a great tool to model such ambiguity. In case of hesitation, the complement of a membership value in fuzzy numbers can differ from the non-membership value, in which case we can model using intuitionistic fuzzy numbers, as they provide flexibility by defining both a membership and a non-membership function. In this article, we consider the intuitionistic fuzzy linear programming problem with intuitionistic polygonal fuzzy numbers, which is a generalization of the polygonal fuzzy numbers previously found in the literature. We present a modification of the simplex method that can be used to solve any general intuitionistic fuzzy linear programming problem after approximating the problem by intuitionistic polygonal fuzzy numbers with n edges. This method is given in a simple tableau formulation and then applied to numerical examples for clarity.
The performance of artificial intelligence in prostate magnetic resonance im...IJECEIAES
Prostate cancer is the predominant form of cancer observed in men worldwide. The application of magnetic resonance imaging (MRI) as a guidance tool for conducting biopsies has been established as a reliable and well-established approach in the diagnosis of prostate cancer. The diagnostic performance of MRI-guided prostate cancer diagnosis exhibits significant heterogeneity due to the intricate and multi-step nature of the diagnostic pathway. The development of artificial intelligence (AI) models, specifically through the utilization of machine learning techniques such as deep learning, is assuming an increasingly significant role in the field of radiology. In the realm of prostate MRI, a considerable body of literature has been dedicated to the development of various AI algorithms. These algorithms have been specifically designed for tasks such as prostate segmentation, lesion identification, and classification. The overarching objective of these endeavors is to enhance diagnostic performance and foster greater agreement among different observers within MRI scans for the prostate. This review article aims to provide a concise overview of the application of AI in the field of radiology, with a specific focus on its utilization in prostate MRI.
Seizure stage detection of epileptic seizure using convolutional neural networksIJECEIAES
According to the World Health Organization (WHO), seventy million individuals worldwide suffer from epilepsy, a neurological disorder. While electroencephalography (EEG) is crucial for diagnosing epilepsy and monitoring the brain activity of epilepsy patients, it requires a specialist to examine all EEG recordings to find epileptic behavior. This procedure needs an experienced doctor, and a precise epilepsy diagnosis is crucial for appropriate treatment. To identify epileptic seizures, this study employed a convolutional neural network (CNN) based on raw scalp EEG signals to discriminate between preictal, ictal, postictal, and interictal segments. The potential of these characteristics is explored by examining how well time-domain signals work in the detection of epileptic signals using the intracranial Freiburg Hospital (FH) database, the scalp Children's Hospital Boston-Massachusetts Institute of Technology (CHB-MIT) database, and the Temple University Hospital (TUH) EEG corpus. To test the viability of this approach, two types of experiments were carried out: binary classification (preictal, ictal, and postictal, each versus interictal) and four-class classification (interictal versus preictal versus ictal versus postictal). The average accuracy for stage detection using the CHB-MIT database was 84.4%, the Freiburg database's time-domain signals yielded an accuracy of 79.7%, and the highest accuracy of 94.02% was achieved in the TUH EEG database when classifying the interictal stage versus the preictal stage.
Analysis of driving style using self-organizing maps to analyze driver behaviorIJECEIAES
Modern life is strongly associated with the use of cars, but increases in acceleration and maneuverability lead to a dangerous driving style in some drivers. Under these conditions, developing a method for tracking driver behavior is relevant. The article provides an overview of existing methods and models for assessing the functioning of motor vehicles and driver behavior. Based on this, a combined algorithm for recognizing driving style is proposed. To do this, a set of input data was formed comprising 20 descriptive features about the environment, the driver's behavior, and the operating characteristics of the car, collected using OBD-II. The generated data set is fed to a Kohonen network, where clustering is performed according to driving style and degree of danger. Assigning driving characteristics to a particular cluster makes it possible to switch to the individual indicators of a given driver and to take individual driving characteristics into account. Applying the method makes it possible to identify potentially dangerous driving styles and thus prevent accidents.
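The Kohonen-network clustering used above can be sketched with a minimal one-dimensional self-organizing map (the two-feature driving samples below are hypothetical stand-ins, not the OBD II dataset):

```python
import math
import random

def train_som(data, n_units=2, epochs=100, lr0=0.5, sigma0=1.0, seed=0):
    """Minimal 1-D Kohonen self-organizing map: for each sample, the
    best-matching unit and its neighbors are pulled toward the sample;
    the learning rate and neighborhood width shrink over time."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = max(0.3, sigma0 * (1 - t / epochs))
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((units[i][k] - x[k]) ** 2
                                        for k in range(dim)))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                for k in range(dim):
                    units[i][k] += lr * h * (x[k] - units[i][k])
    return units

# Hypothetical features: (normalized speed, normalized acceleration)
calm       = [[0.10, 0.12], [0.12, 0.10], [0.15, 0.14], [0.08, 0.11]]
aggressive = [[0.90, 0.88], [0.88, 0.90], [0.85, 0.92], [0.93, 0.87]]
units = train_som(calm + aggressive)
```

After training, each unit's prototype vector sits near one driving-style cluster, so mapping a new trip's features to its best-matching unit labels the trip as calm or aggressive.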
Hyperspectral object classification using hybrid spectral-spatial fusion and ...IJECEIAES
Because of its spectral-spatial and temporal resolution over large areas, hyperspectral imaging (HSI) has found widespread application in the field of object classification. HSI is typically used to accurately determine an object's physical characteristics as well as to locate related objects with appropriate spectral fingerprints. As a result, HSI has been extensively applied to object identification in several fields, including surveillance, agricultural monitoring, environmental research, and precision agriculture. However, because of their enormous size, objects require a lot of time to classify; for this reason, both spectral and spatial feature fusion have been applied. The existing classification strategy leads to increased misclassification, and the feature fusion method is unable to preserve semantic object-inherent features. This study addresses these research difficulties by introducing a hybrid spectral-spatial fusion (HSSF) technique to minimize feature size while maintaining objects' intrinsic qualities. Lastly, a soft-margins kernel is proposed for a multi-layer deep support vector machine (MLDSVM) to reduce misclassification. The standard Indian Pines dataset is used for the experiment, and the outcome demonstrates that the HSSF-MLDSVM model performs substantially better in terms of accuracy and Kappa coefficient.
Democratizing Fuzzing at Scale by Abhishek Aryaabh.arya
Presented at NUS: Fuzzing and Software Security Summer School 2024
This keynote talks about the democratization of fuzzing at scale, highlighting the collaboration between open source communities, academia, and industry to advance the field of fuzzing. It delves into the history of fuzzing, the development of scalable fuzzing platforms, and the empowerment of community-driven research. The talk will further discuss recent advancements leveraging AI/ML and offer insights into the future evolution of the fuzzing landscape.
Water scarcity is the lack of fresh water resources to meet the standard water demand. There are two types of water scarcity: physical and economic.
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdffxintegritypublishin
Advancements in technology unveil a myriad of electrical and electronic breakthroughs geared towards efficiently harnessing limited resources to meet human energy demands. The optimization of hybrid solar PV panels and pumped hydro energy supply systems plays a pivotal role in utilizing natural resources effectively. This initiative not only benefits humanity but also fosters environmental sustainability. The study investigated the design optimization of these hybrid systems, focusing on understanding solar radiation patterns, identifying geographical influences on solar radiation, formulating a mathematical model for system optimization, and determining the optimal configuration of PV panels and pumped hydro storage. Through a comparative analysis approach and eight weeks of data collection, the study addressed key research questions related to solar radiation patterns and optimal system design. The findings highlighted regions with heightened solar radiation levels, showcasing substantial potential for power generation and emphasizing the system's efficiency. Optimizing system design significantly boosted power generation, promoted renewable energy utilization, and enhanced energy storage capacity. The study underscored the benefits of optimizing hybrid solar PV panels and pumped hydro energy supply systems for sustainable energy usage. Optimizing the design of solar PV panels and pumped hydro energy supply systems, as examined across diverse climatic conditions in a developing country, not only enhances power generation but also improves the integration of renewable energy sources and boosts energy storage capacities, particularly benefiting less economically prosperous regions. Additionally, the study provides valuable insights for advancing energy research in economically viable areas.
Recommendations included conducting site-specific assessments, utilizing advanced modeling tools, implementing regular maintenance protocols, and enhancing communication among system components.
Immunizing Image Classifiers Against Localized Adversary Attacksgerogepatton
This paper addresses the vulnerability of deep learning models, particularly convolutional neural networks
(CNN)s, to adversarial attacks and presents a proactive training technique designed to counter them. We
introduce a novel volumization algorithm, which transforms 2D images into 3D volumetric representations.
When combined with 3D convolution and deep curriculum learning optimization (CLO), itsignificantly improves
the immunity of models against localized universal attacks by up to 40%. We evaluate our proposed approach
using contemporary CNN architectures and the modified Canadian Institute for Advanced Research (CIFAR-10
and CIFAR-100) and ImageNet Large Scale Visual Recognition Challenge (ILSVRC12) datasets, showcasing
accuracy improvements over previous techniques. The results indicate that the combination of the volumetric
input and curriculum learning holds significant promise for mitigating adversarial attacks without necessitating
adversary training.
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptxR&R Consult
CFD analysis is incredibly effective at solving mysteries and improving the performance of complex systems!
Here's a great example: At a large natural gas-fired power plant, where they use waste heat to generate steam and energy, they were puzzled that their boiler wasn't producing as much steam as expected.
R&R and Tetra Engineering Group Inc. were asked to solve the issue with reduced steam production.
An inspection had shown that a significant amount of hot flue gas was bypassing the boiler tubes, where the heat was supposed to be transferred.
R&R Consult conducted a CFD analysis, which revealed that 6.3% of the flue gas was bypassing the boiler tubes without transferring heat. The analysis also showed that the flue gas was instead being directed along the sides of the boiler and between the modules that were supposed to capture the heat. This was the cause of the reduced performance.
Based on our results, Tetra Engineering installed covering plates to reduce the bypass flow. This improved the boiler's performance and increased electricity production.
It is always satisfying when we can help solve complex challenges like this. Do your systems also need a check-up or optimization? Give us a call!
Work done in cooperation with James Malloy and David Moelling from Tetra Engineering.
More examples of our work https://www.r-r-consult.dk/en/cases-en/
Forklift Classes Overview by Intella PartsIntella Parts
Discover the different forklift classes and their specific applications. Learn how to choose the right forklift for your needs to ensure safety, efficiency, and compliance in your operations.
For more technical information, visit our website https://intellaparts.com
Courier management system project report.pdfKamal Acharya
It is now-a-days very important for the people to send or receive articles like imported furniture, electronic items, gifts, business goods and the like. People depend vastly on different transport systems which mostly use the manual way of receiving and delivering the articles. There is no way to track the articles till they are received and there is no way to let the customer know what happened in transit, once he booked some articles. In such a situation, we need a system which completely computerizes the cargo activities including time to time tracking of the articles sent. This need is fulfilled by Courier Management System software which is online software for the cargo management people that enables them to receive the goods from a source and send them to a required destination and track their status from time to time.
IJECE, Vol. 7, No. 6, December 2017: 3669–3673. ISSN: 2088-8708
services [5]. It can be centralized or distributed over regional areas. Fog servers have localized data storage centers that avoid delays in computing.
2. OVERVIEW
Fog computing is defined as a distributed computing paradigm that fundamentally extends the services provided by the cloud to the edge of the network [6]. Cisco defines fog computing as an extension of the cloud computing paradigm from the core of the network to its edge [7]. It facilitates computation, storage and networking between the end devices and the traditional cloud servers. Instead of running applications only in the cloud, fog computing involves both the cloud and the edge devices that sit between the end devices and the cloud servers. Fog computing takes advantage of both edge and cloud computing: it benefits from the edge devices' close proximity to the endpoints, and it also leverages the on-demand scalability of cloud resources [8]. It reduces the load on the cloud server by using the resources available in the edge nodes for partial computation, and it reduces traffic to the cloud server by performing filtering operations in the nodes.
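As a minimal sketch of the filtering idea just described, the fragment below shows a fog node processing raw readings locally and forwarding only a small summary to the central server. All function names, field names and the threshold are hypothetical illustrations, not part of any specific fog platform.

```python
# Minimal sketch of edge-side filtering in fog computing.
# Names and the threshold are invented for illustration only.

def filter_and_aggregate(readings, threshold=50.0):
    """Keep only significant readings and summarize them locally,
    so far less raw data has to travel to the cloud server."""
    significant = [r for r in readings if r >= threshold]
    if not significant:
        return None  # nothing worth sending upstream
    return {
        "count": len(significant),
        "max": max(significant),
        "mean": sum(significant) / len(significant),
    }

# A fog node would call this on each local batch and send only the
# small summary dict (or nothing at all) to the central server.
summary = filter_and_aggregate([12.0, 55.5, 61.2, 48.9])
```

The point of the sketch is the ratio: four raw readings in, one three-field summary out, which is exactly how partial computation at the edge reduces upstream traffic.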
Two concepts are commonly confused with fog computing: Mobile Edge Computing (MEC) and Mobile Cloud Computing (MCC). In MCC, both data storage and data processing are done outside the mobile device, in a cloud; it moves data and computing power from the individual device to the cloud. MEC is similar to a cloudlet: it can be seen as a more concentrated form of the cloud computing paradigm, like a cloud server running at the edge of a mobile network [9]. Fog computing mixes these two concepts with features of its own, making it more reliable and useful.
3. ISSUES
Fog computing extends cloud computing and acts on the Internet of Things. These devices, called fog nodes, can be deployed in any environment with a network connection. Fog computing has additional storage resources at the edges to process requirements. The fog server therefore needs to adapt its services, which leads to management and maintenance costs. In addition, the operator needs to address the following issues:
3.1. Privacy
Since fog computing is dominated primarily by wireless links, network privacy is a major concern. The network operator generates configurations manually, and with fog nodes deployed at the edge of the Internet, a massive maintenance cost is involved [10]. The leakage of private data over these networks is gaining attention. End users are closer to the fog nodes, so fog nodes collect more sensitive information than a remote cloud would. Encrypting sensitive data before it leaves the local network, for example within a Home-Area Network (HAN), can be used to counter these issues.
3.2. Security
The main security issue is the authentication of the devices involved in fog computing at the different gateways. Each appliance has its own IP address, and a malicious user may use a fake IP address to access information stored on a particular fog node. To overcome this, access control and intrusion detection have to be applied at all layers of the platform [11].
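One common way to make spoofed addresses insufficient, sketched below under assumptions not taken from this paper, is to authenticate each device with a provisioned shared secret rather than its IP address alone. The registry contents and device names are invented for illustration; real fog platforms use their own provisioning and key-management schemes.

```python
# Hypothetical sketch of device authentication at a fog gateway:
# a request is accepted only when its HMAC tag matches the secret
# registered for the claimed device, so a faked IP address alone
# is not enough. The registry below is an invented example.
import hashlib
import hmac

REGISTRY = {"device-42": b"shared-secret"}  # assumed provisioning step

def authenticate(device_id, message, tag):
    """Return True only if the tag was produced with the device's key."""
    key = REGISTRY.get(device_id)
    if key is None:
        return False
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

msg = b"pressure=42"
good_tag = hmac.new(b"shared-secret", msg, hashlib.sha256).hexdigest()
ok = authenticate("device-42", msg, good_tag)        # accepted
spoofed = authenticate("device-42", msg, "bad-tag")  # rejected
```

`hmac.compare_digest` is used instead of `==` so that the comparison takes constant time, which avoids leaking information through timing.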
3.3. Network Management
Because fog nodes are connected to heterogeneous devices, managing the nodes, the network, and the connections between nodes will be a burden unless SDN and NFV techniques are applied [12].
3.4. Placement of Fog Servers
Placing a group of fog servers so that they deliver maximum service to the local requirements is an open issue. Analyzing the work done by each node in the server before placing them reduces the maintenance cost [13].
3.5. Delay in Computing
Delays due to data aggregation and resource over-usage reduce the effectiveness of the services provided by the fog servers, causing delays in computing. Data aggregation should take place before data processing, and scheduling on resource-limited fog nodes should be designed using priority and mobility models.
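The priority-based scheduling just mentioned can be sketched as a simple priority queue on a fog node, so that latency-critical work always runs before batch work. This is a minimal illustration of the idea, not the paper's method; the task names and priority values are invented.

```python
# Hypothetical sketch of priority scheduling on a resource-limited
# fog node: lower priority numbers run first, and a counter breaks
# ties so equal-priority tasks keep their submission order.
import heapq

class FogScheduler:
    def __init__(self):
        self._queue = []
        self._counter = 0

    def submit(self, priority, task_name):
        heapq.heappush(self._queue, (priority, self._counter, task_name))
        self._counter += 1

    def run_next(self):
        """Pop and return the highest-priority task, or None if idle."""
        if not self._queue:
            return None
        _, _, task_name = heapq.heappop(self._queue)
        return task_name

sched = FogScheduler()
sched.submit(2, "batch-log-upload")
sched.submit(0, "pressure-alert")      # latency-critical, runs first
sched.submit(1, "sensor-aggregation")
order = [sched.run_next() for _ in range(3)]
```

Under this scheme the alert is dispatched before the aggregation and upload tasks regardless of arrival order, which is the behavior a priority model is meant to guarantee.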
Fog Computing: Issues, Challenges and Future Directions (Prakash P)
3.6. Energy Consumption
Since fog environments use a large number of fog nodes, the computation is distributed and can be less energy-efficient. Reducing energy consumption in fog computing is therefore essential [14].
4. APPLICATIONS
Fog computing has huge benefits for real-time applications and is broadly used in IoT applications that involve real-time data. It acts as an extended version of cloud computing, serving as an intermediary between the cloud and the end users (closer to the end users). It can be used for both machine-to-machine and human-to-machine interaction.
4.1. Mobile Big Data Analytics
In the Internet of Things (IoT), data is collected in bulk and cannot be stored efficiently in the cloud. In such situations it is beneficial to use fog computing instead of cloud computing, as fog nodes are much closer to the end systems. This also mitigates other problems such as delay, traffic, processing speed, delivery time, response time, data storage, data transportation and data processing. Fog computing could be the future of IoT applications.
4.2. Water Pressure at Dams
Sensors installed in dams send data to the cloud, where the data is analyzed and officials are alerted in case of any discrepancy. The problem here is the delay of information, which could be dangerous. To solve this, fog is used: since it is near the end systems, it is easier to transmit data, analyze it and give instantaneous feedback.
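The instantaneous feedback described above amounts to a local threshold check at the fog node, which the sketch below illustrates. The pressure limit and units are invented assumptions, not values from the paper.

```python
# Hypothetical sketch of the dam-monitoring scenario: the fog node
# checks each pressure reading locally and raises an alert at once,
# instead of waiting for a round trip to the cloud.

SAFE_LIMIT = 75.0  # assumed maximum safe pressure, arbitrary units

def check_pressure(reading, limit=SAFE_LIMIT):
    """Return an immediate local alert string, or None if normal."""
    if reading > limit:
        return f"ALERT: pressure {reading} exceeds limit {limit}"
    return None

# Only the out-of-range reading produces an alert; normal readings
# could still be batched and sent to the cloud for later analysis.
alerts = [a for a in map(check_pressure, [60.0, 80.5, 70.1]) if a]
```

Because the check runs on the node next to the sensors, the alert latency is bounded by local processing time rather than by network conditions on the path to the cloud.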
4.3. Smart Utility Service
Here the main aim is to conserve time, money and energy. The analysis of data needs to run every minute to stay updated. This mostly involves the end users, so the cloud might not serve the purpose. These applications inform users every day about which appliances consume more energy. IoT also creates a lot of traffic in the network, making it difficult to send other data, so fog is a good alternative.
4.4. Health Data
When data needs to be transferred from one hospital to another, high security and data integrity are a must. Fog can provide this, since the data is transferred locally. Fog nodes can be used by labs to update a patient's lab records, which nearby hospitals can then access easily. Hard copies of a patient's medical history and health issues need not be carried, as these unified records can be accessed by any doctor.
5. ARCHITECTURE
One major problem faced with cloud computing is bandwidth, especially on wireless networks. The problem only increases as the Internet of Things continues to expand, with a large number of physical objects connected wirelessly. Fog computing addresses this by storing data in local computers and devices, generally referred to as fog nodes. Any device with computing, storage and network connectivity can be utilized as a fog node, e.g. hand-held devices, tablets, PCs, routers etc. These fog nodes are managed by a fog data service, which serves purposes such as control and security of data, data reduction, data virtualization and edge analytics. Data can also be sent to the cloud for long-term analytics.
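The split between edge handling and long-term cloud analytics can be sketched as a simple placement rule driven by each task's latency requirement. The task schema, names and the 100 ms cutoff are invented for illustration; an actual fog data service would use richer policies.

```python
# Hypothetical sketch of the fog architecture described above: a fog
# node keeps latency-sensitive work at the edge and forwards the rest
# to the cloud for long-term analytics.

def route(task):
    """Decide where a task runs, based on its latency requirement.

    task: dict with 'name' and 'max_latency_ms' keys (assumed schema).
    """
    if task["max_latency_ms"] < 100:
        return "fog-node"  # process at the edge, near the devices
    return "cloud"         # batch / long-term analytics upstream

placements = {t["name"]: route(t) for t in [
    {"name": "alarm-detection", "max_latency_ms": 20},
    {"name": "monthly-report", "max_latency_ms": 60_000},
]}
```

The alarm task stays on the fog node while the report generation goes to the cloud, mirroring the division of labor the architecture section describes.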
Figure 1. Fog computing architecture
6. COMPARISON BETWEEN CLOUD COMPUTING AND FOG COMPUTING
Cloud computing is a great solution when there is uninterrupted access to a cloud server capable of processing and transmitting data quickly to the end device. Fog computing is mainly an architecture of heterogeneous devices in which certain applications and services are managed at the node by a smart device, while the actual management is by the cloud. Fog computing primarily targets mobile users, while the cloud targets general Internet users. The service type is localized in fog computing, whereas in the cloud it is globalized. Although storage is limited in fog computing compared to cloud computing, the distance to the users is so small that they can communicate over wireless connections, whereas in cloud computing communication is through IP networks.
Comparing the parameters of the two models: mobility is supported in fog computing, whereas in the cloud it is limited. The number of service nodes in fog is far larger than in the cloud. Real-time interactions are well supported in fog, while the cloud is less suited to them. Providing local security is tedious in the cloud, while it is possible in fog computing.
Table 1. Comparison of the parameters of cloud and fog computing [15]

Parameter                    Cloud computing            Fog computing
Server node location         Within the Internet        At the edge of the network
Client-server distance       Multiple hops              Single/multiple hops
Latency                      High                       Low
Delay jitter                 High                       Very low
Security                     Less secure, undefined     More secure, can be defined
Location awareness           No                         Yes
Vulnerability                High probability           Very low probability
Geographical distribution    Centralized                Dense and distributed
Number of server nodes       Few                        Very large
Real-time interactions       Supported                  Supported
Last-mile connectivity       Leased line                Wireless
Mobility                     Limited support            Supported
7. CONCLUSION
Fog computing and its applications have been discussed. Fog computing has the ability to handle the flood of data created by the Internet of Things at the edge of the network. Owing to characteristics such as mobility, proximity to end users, low latency, location awareness and heterogeneity, and to its real-time applications, the fog computing platform is considered an appropriate platform for the Internet of Things. Fog computing is entering an exciting time, where it can positively affect operational costs. It resolves problems related to congestion, space and Internet traffic, and it provides an intelligent platform to manage the distributed, real-time nature of emerging IoT infrastructures. Developing these services at the edge through fog computing will lead to new business models and opportunities for network operators.
REFERENCES
[1] Sangeetha K. S., Prakash P., "Big Data and Cloud: A Survey", Advances in Intelligent Systems and Computing, vol. 325, pp. 773-778, 2015.
[2] Shiva Jegan R. D., Vasudevan S. K., Abarna K., Prakash P., Srivathsan S., Gangothri V., "Cloud Computing: A Technical Gawk", International Journal of Applied Engineering Research, 9(14), pp. 2539-2554, 2014.
[3] Viraj G. Mandlekar, VireshKumar Mahale, Sanket S. Sancheti, Maaz S. Rais, "Survey on Fog Computing Mitigating Data Theft Attacks in Cloud", International Journal of Innovative Research in Computer Science & Technology, Nov. 2014.
[4] Ramin Elahi, "Fog Computing and its Ecosystem", SNIA Global Education, 2015.
[5] Tom H. Luan, Longxiang Gao, Zhi Li, Yang Xiang, Guiyi We, and Limin Sun, "Fog Computing: Focusing on Mobile Users at the Edge", Mar. 2016.
[6] Shanhe Yi, Zhengrui Qin, and Qun Li, "Security and Privacy Issues of Fog Computing: A Survey", International Conference on Wireless Algorithms, Systems, and Applications, pp. 685-695, 2015.
[7] Shanhe Yi, Zijiang Hao, Zhengrui Qin, and Qun Li, "Fog Computing: Platform and Applications", Dept. of Computer Science, College of William and Mary, IEEE, 2015.
[8] Harshit Gupta, Amir Vahid Dastjerdi, Soumya K. Ghosh, and Rajkumar Buyya, "iFogSim: A Toolkit for Modeling and Simulation of Resource Management Techniques in Internet of Things, Edge and Fog Computing Environments", Cloud Computing and Distributed Systems Laboratory, The University of Melbourne, June 6, 2016.
[9] Shanhe Yi, Zijiang Hao, Zhengrui Qin, and Qun Li, "Fog Computing: Platform and Applications", Dept. of Computer Science, College of William and Mary, IEEE, 2015.
[10] Shanhe Yi, Zhengrui Qin, and Qun Li, "Security and Privacy Issues of Fog Computing: A Survey", College of William and Mary.
[11] Shanhe Yi, Zijiang Hao, Zhengrui Qin, and Qun Li, "Fog Computing: Platform and Applications", Dept. of Computer Science, College of William and Mary, IEEE, 2015.
[12] Shanhe Yi, Zijiang Hao, Zhengrui Qin, and Qun Li, "Fog Computing: Platform and Applications", Dept. of Computer Science, College of William and Mary, IEEE, 2015.
[13] Tom H. Luan, Longxiang Gao, Zhi Li, Yang Xiang, Guiyi We, and Limin Sun, "Fog Computing: Focusing on Mobile Users at the Edge", Mar. 2016.
[14] Amir Vahid Dastjerdi, Harshit Gupta, Rodrigo N. Calheiros, Soumya K. Ghosh, and Rajkumar Buyya, "Fog Computing: Principles, Architectures, and Applications", IEEE, 2016.
[15] Anuj Kumar, K. P. Saharan, "Fog in Comparison to Cloud: A Survey", International Journal of Computer Applications, July 2015.