This document surveys fog computing, a decentralized computing infrastructure that processes data close to its source, between IoT devices and cloud servers, to address the limitations of cloud computing. It defines fog computing, compares it with cloud computing, and outlines its characteristics, its three-layer architecture (terminal, fog, and cloud), and how fog computing works. It also covers advantages such as low latency, applications such as connected vehicles, and the future potential of fog computing.
The Future of Fog Computing and IoT: Revolutionizing Data Processing (FredReynolds2)
Sending a business e-mail, watching a YouTube video, joining an online video call, or playing a game online all require considerable data flow toward servers in data centers. Cloud computing relies on remote data processing and large storage systems to power the online apps we use daily, but decentralized alternatives exist, and fog computing is growing rapidly in popularity. Industry analysts project that the global fog computing market will reach nearly $2.3 billion by 2032, up from $196.7 million at the end of 2022.
Fog computing is defined as a decentralized infrastructure that places storage and processing components at the edge of the cloud, close to data sources such as application users and sensors. It is an architecture that uses edge devices to carry out a substantial amount of computation (edge computing), storage, and communication locally, with the remainder routed over the Internet backbone. Real-time automation requires that data capture and analysis happen in real time, without the high-latency and low-bandwidth problems that arise when data must cross the network. Cisco introduced the term fog computing in 2012 for dispersed cloud infrastructures, and in 2015 it partnered with Microsoft, Dell, Intel, Arm, and Princeton University to form the OpenFog Consortium, whose primary goals were to promote and standardize fog computing. These efforts brought computing resources closer to data sources.

Fog computing also differentiates between relevant and irrelevant data: relevant data is sent to the cloud for storage, while irrelevant data is either deleted or handled on the appropriate local platform. In this way, edge computing and fog computing work in unison to minimize latency and maximize the efficiency of cloud-enabled enterprise systems.

Fog computing is built from components called fog nodes, independent devices that pick up generated data. Fog nodes fall into three categories: fog devices, fog servers, and fog gateways. Fog devices store the necessary data; fog servers also compute on that data to decide a course of action; and fog gateways redirect information between the various fog devices and servers. Fog devices are usually linked to fog servers. With fog computing, local data storage and scrutiny of time-sensitive data become easier.
This reduces both the amount of data sent to the cloud and the distance it travels, which in turn reduces the security exposure. Fog computing processes data according to application demands and the available networking and computing resources, cutting the volume of data that must be transferred to the cloud and saving network bandwidth. A fog deployment can run independently and keep services uninterrupted even when connectivity to the cloud fluctuates, and it performs time-sensitive actions close to end users, meeting the latency constraints of IoT applications.
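The triage described above (time-sensitive data acted on locally, relevant changes forwarded to the cloud, irrelevant data dropped to save bandwidth) can be sketched in a few lines. The thresholds and record fields below are illustrative assumptions, not part of any fog standard:

```python
# Minimal sketch of fog-node data triage for a stream of sensor readings.
# ALERT_THRESHOLD, RELEVANCE_DELTA, and the "value" field are invented for
# illustration; a real deployment would define its own policies.

ALERT_THRESHOLD = 80.0    # readings this high need an immediate local action
RELEVANCE_DELTA = 0.5     # changes smaller than this are treated as irrelevant

def triage(readings):
    """Split readings into local actions, cloud uploads, and dropped records."""
    local_actions, cloud_batch, dropped = [], [], 0
    last_value = None
    for r in readings:
        if r["value"] >= ALERT_THRESHOLD:
            local_actions.append(r)            # time-sensitive: handle at the fog node
        elif last_value is None or abs(r["value"] - last_value) >= RELEVANCE_DELTA:
            cloud_batch.append(r)              # relevant: forward to the cloud for storage
            last_value = r["value"]
        else:
            dropped += 1                       # irrelevant: discard, saving bandwidth

    return local_actions, cloud_batch, dropped

readings = [{"value": v} for v in (20.0, 20.1, 20.2, 25.0, 85.0, 25.1)]
acts, batch, dropped = triage(readings)
print(len(acts), len(batch), dropped)   # → 1 2 3
```

Of six readings, only two reach the cloud, one triggers a local action, and three are discarded, which is exactly the bandwidth saving the text describes.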
IoT applications that generate terabytes of data or more, need fast processing of large volumes, and cannot feasibly ship data to the cloud and back are good candidates for fog computing. Fog computing provides the real-time processing and event responses that are critical in healthcare, and it also eases the network connectivity and traffic demands of remote storage, processing, and medical record retrieval from the cloud.
Fog Computing: Issues, Challenges and Future Directions (IJECEIAES)
In cloud computing, all processing of the data collected by a node is done in a central server. This takes considerable time, because data must be transferred from the node to the central server before processing can begin, and it is not practical to stream terabytes of data from the node to the cloud and back. To overcome these disadvantages, an extension of cloud computing known as fog computing was introduced. Here, data is processed entirely in the node when it does not require much computing power, and partially when it does, after which the data is transferred to the central server for the remaining computation. This greatly reduces processing time and is more efficient because the central server is not overloaded. Fog is especially useful in geographically dispersed areas where connectivity can be irregular. The ideal use case requires intelligence near the edge, where ultra-low latency is critical, and this is what fog computing promises. This paper explores the concepts of cloud computing and fog computing and contrasts their features to understand which is more efficient and better suited to real-time applications.
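The placement rule above (run a job entirely on the node when it fits, otherwise reduce it locally and ship only the remainder upstream) can be sketched as a small decision function. The capacity figure and job fields are illustrative assumptions:

```python
# Sketch of fog placement: a job finishes at the fog node when it fits the
# node's capacity; otherwise the node does a first pass (e.g. filtering or
# aggregation) and sends only the reduced data to the cloud for the rest.
# NODE_CAPACITY and the job fields are invented for illustration.

NODE_CAPACITY = 10.0   # compute units available at the fog node (arbitrary scale)

def place(job):
    """Return (where_the_job_finishes, bytes_sent_to_cloud)."""
    if job["compute_units"] <= NODE_CAPACITY:
        return "fog", 0                          # processed completely at the node
    # partial processing: reduce locally, forward the remainder upstream
    reduced = int(job["raw_bytes"] * job["reduction"])
    return "cloud", reduced

jobs = [
    {"compute_units": 4,  "raw_bytes": 1_000_000, "reduction": 0.1},   # light job
    {"compute_units": 50, "raw_bytes": 1_000_000, "reduction": 0.1},   # heavy job
]
for job in jobs:
    print(place(job))
# → ('fog', 0)
# → ('cloud', 100000)
```

Even the heavy job sends only 10% of its raw bytes to the central server, which is why the server is not overloaded and transfer time shrinks.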
A Comprehensive Exploration of Fog Computing (Enterprise Wired)
This article delves into the intricacies of Fog computing, exploring its definition, key components, benefits, and its transformative impact on various industries.
A Review: Fog Computing and Its Role in the Internet of Things (IJERA Editor)
Fog computing extends the cloud computing paradigm to the edge of the network, thus enabling a new breed of applications and services. Defining characteristics of the Fog are: a) low latency and location awareness; b) wide-spread geographical distribution; c) mobility; d) a very large number of nodes; e) a predominant role of wireless access; f) a strong presence of streaming and real-time applications; g) heterogeneity. In this paper we argue that the above characteristics make the Fog the appropriate platform for a number of critical Internet of Things (IoT) services and applications, namely connected vehicles, the smart grid, smart cities, and, in general, wireless sensor and actuator networks (WSANs).
Extends cloud computing services to the edge of the network.
Similar to cloud, Fog provides:
Data
Computation
Storage
Application Services to end users.
Motivations for Fog Computing:
Smart Grid, Smart Traffic Lights in vehicular networks and Software Defined Networks.
Fog Computing and Its Role in the Internet of Things (HarshitParkar6677)
Fog computing or fog networking, also known as fogging, is an architecture that uses edge devices to carry out a substantial amount of computation, storage, and communication locally and routed over the internet backbone.
Internet of Things (IoT) represents a remarkable transformation of the way in which our world will soon interact. Much like the World Wide Web connected computers to networks, and the next evolution connected people to the Internet and other people, IoT looks poised to interconnect devices, people, environments, virtual objects and machines in ways that only science fiction writers could have imagined.
The proliferation of Internet of Things (IoT) and the success of rich cloud services have pushed the horizon of a new computing paradigm, edge computing, which calls for processing the data at the edge of the network. Edge computing has the potential to address the concerns of response time requirement, battery life constraint, bandwidth cost saving, as well as data safety and privacy. In this paper, we introduce the definition of edge computing, followed by several case studies, ranging from cloud offloading to smart home and city, as well as collaborative edge to materialize the concept of edge computing. Finally, we present several challenges and opportunities in the field of edge computing, and hope this paper will gain attention from the community and inspire more research in this direction.
Edge computing refers to the enabling technologies allowing computation to be performed at the edge of the network, on downstream data on behalf of cloud services and upstream data on behalf of IoT services. Here we define "edge" as any computing and network resources along the path between data sources and cloud data centers. For example, a smartphone is the edge between body things and the cloud, a gateway in a smart home is the edge between home things and the cloud, and a micro data center or cloudlet is the edge between a mobile device and the cloud. The rationale of edge computing is that computing should happen in the proximity of data sources. From our point of view, edge computing is interchangeable with fog computing, but edge computing focuses more on the things side, while fog computing focuses more on the infrastructure side. Edge computing could have as big an impact on our society as cloud computing has.
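The intuition that computing "should happen in the proximity of data sources" is ultimately a latency argument, and it can be checked with back-of-the-envelope arithmetic. The hop delays below are illustrative assumptions, not measurements:

```python
# Rough comparison of the two placements discussed above: answering a request
# at a nearby edge/fog hop versus backhauling it to a cloud data center.
# All millisecond figures are invented for illustration.

def round_trip(hops_ms, processing_ms):
    """Total response time = round trips over each hop on the path + compute time."""
    return 2 * sum(hops_ms) + processing_ms

edge_path  = [2]            # device -> nearby gateway/cloudlet
cloud_path = [2, 10, 40]    # device -> gateway -> regional ISP -> data center

print(round_trip(edge_path, processing_ms=5))    # → 9
print(round_trip(cloud_path, processing_ms=1))   # → 105
```

Even granting the cloud a faster processor (5 ms vs 1 ms of compute), the edge placement wins by an order of magnitude once network round trips are counted, which is why ultra-low-latency IoT services favor the fog.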
Cloud and Edge Computing Systems
IoT Systems
IoT and 5G Applications
Mobile Cloud Computing
Edge and Fog Computing
Mobile Ad hoc Cloud
Private Edge Cloud
Efficient ECC-Based Authentication Scheme for Fog-Based IoT Environment (IJCNCJournal)
The rapid growth of cloud computing and Internet of Things (IoT) applications faces several threats, such as latency, security, network failure, and performance problems. These issues are addressed by fog computing, which brings storage and computation closer to IoT devices. However, securing this environment poses several challenges for security designers, engineers, and researchers. To protect the data that passes between connected devices, digital signature protocols are applied to authenticate identities and messages. In the traditional method, however, a user's private key is stored directly on the IoT device, so it may be disclosed under various malicious attacks; moreover, these methods consume a lot of energy, draining the resources of IoT devices. This paper proposes a signature scheme based on the elliptic curve digital signature algorithm (ECDSA) to improve the security of the private key and the time taken for key-pair generation. ECDSA's security rests on the intractability of the elliptic curve discrete logarithm problem (ECDLP), which allows much smaller groups to be used. Smaller group sizes translate directly into shorter signatures, a crucial feature in settings where communication bandwidth is limited or data transfer consumes a large amount of energy. The paper works with safe elliptic-curve cryptography (ECC) curves such as M-221, secp256r1, Curve25519, Brainpool P256t1, and M-511, chosen because the ECDLP on these curves is considered hard. A valid signature can be generated without re-establishing the whole private key, and ECDSA ensures data security and successfully mitigates man-in-the-middle attacks.
The efficiency and effectiveness of ECDSA in the IoT environment are validated by experimental evaluation and comparative analysis. The results indicate that the proposed ECDSA decreases computation time by 65% and 87% compared with two-party ECDSA and RSA, respectively, and reduces energy consumption by 77% and 82%.
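The abstract's central point, that ECDSA's small groups yield short signatures suitable for constrained fog and IoT devices, can be made concrete with a toy implementation. This is not the paper's scheme or its benchmarked curves; it is an educational sketch over secp256k1 (a different, well-known curve) using textbook affine arithmetic, and it is neither constant-time nor side-channel safe:

```python
# Toy ECDSA over secp256k1: a 256-bit curve gives ~128-bit security with a
# two-integer (r, s) signature of about 64 bytes, versus ~384-byte signatures
# for RSA-3072 at comparable strength -- the bandwidth/energy argument above.
# Educational sketch only; real deployments should use a vetted library.
import hashlib, secrets

# secp256k1 domain parameters (curve y^2 = x^3 + 7 over GF(P), base point G of order N)
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(p1, p2):
    """Affine group law; None represents the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        s = 3 * x1 * x1 * pow(2 * y1, -1, P) % P       # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P        # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    """Double-and-add computation of k * point."""
    result = None
    while k:
        if k & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        k >>= 1
    return result

def sign(priv, msg):
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    while True:
        k = secrets.randbelow(N - 1) + 1   # per-signature nonce; must never repeat
        r = scalar_mult(k, G)[0] % N
        if r == 0:
            continue
        s = pow(k, -1, N) * (z + r * priv) % N
        if s:
            return (r, s)

def verify(pub, msg, sig):
    r, s = sig
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    w = pow(s, -1, N)
    pt = point_add(scalar_mult(z * w % N, G), scalar_mult(r * w % N, pub))
    return pt is not None and pt[0] % N == r

priv = secrets.randbelow(N - 1) + 1
pub = scalar_mult(priv, G)
sig = sign(priv, b"fog node telemetry")
print(verify(pub, b"fog node telemetry", sig))   # True
print(verify(pub, b"tampered", sig))             # False
```

The signature is just two 256-bit integers, which is what makes ECDSA attractive when every transmitted byte costs an IoT device energy.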
Security and Privacy Issues of Fog Computing: A Survey (HarshitParkar6677)
Abstract. Fog computing is a promising computing paradigm that extends cloud computing to the edge of networks. Similar to cloud computing but with distinct characteristics, fog computing faces new security and privacy challenges besides those inherited from cloud computing. In this paper, we survey these challenges and the corresponding solutions in a brief manner.
Cosmetic shop management system project report.pdfKamal Acharya
Buying new cosmetic products is difficult. It can even be scary for those who have sensitive skin and are prone to skin trouble. The information needed to alleviate this problem is on the back of each product, but it's thought to interpret those ingredient lists unless you have a background in chemistry.
Instead of buying and hoping for the best, we can use data science to help us predict which products may be good fits for us. It includes various function programs to do the above mentioned tasks.
Data file handling has been effectively used in the program.
The automated cosmetic shop management system should deal with the automation of general workflow and administration process of the shop. The main processes of the system focus on customer's request where the system is able to search the most appropriate products and deliver it to the customers. It should help the employees to quickly identify the list of cosmetic product that have reached the minimum quantity and also keep a track of expired date for each cosmetic product. It should help the employees to find the rack number in which the product is placed.It is also Faster and more efficient way.
Immunizing Image Classifiers Against Localized Adversary Attacksgerogepatton
This paper addresses the vulnerability of deep learning models, particularly convolutional neural networks
(CNN)s, to adversarial attacks and presents a proactive training technique designed to counter them. We
introduce a novel volumization algorithm, which transforms 2D images into 3D volumetric representations.
When combined with 3D convolution and deep curriculum learning optimization (CLO), itsignificantly improves
the immunity of models against localized universal attacks by up to 40%. We evaluate our proposed approach
using contemporary CNN architectures and the modified Canadian Institute for Advanced Research (CIFAR-10
and CIFAR-100) and ImageNet Large Scale Visual Recognition Challenge (ILSVRC12) datasets, showcasing
accuracy improvements over previous techniques. The results indicate that the combination of the volumetric
input and curriculum learning holds significant promise for mitigating adversarial attacks without necessitating
adversary training.
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
1. Seminar on Fog Computing
Presented by:
Miss Chaitali Gajanan Panvalkar
Roll no.: 2201955
PRN no.: 2030408246001
Department of Information Technology
Institute of Petrochemical Engineering
Dr. Babasaheb Ambedkar Technological University
Under the guidance of:
Prof. V. S. Bhonkar
2. Index
o What is IoT and Cloud?
o Fog computing
o Limitations of cloud computing
o How fog computing overcomes them
o Characteristics of fog computing
o Fog computing architecture
o How fog computing works?
o Features of fog computing
o Advantages of fog computing
o Limitations of fog computing
o Why fog computing?
o When to use fog computing?
o Applications
o Conclusion
o Future scope
o References
3. What is IoT and Cloud?
o Cloud computing is defined as the storing and accessing of data and computing services over the internet.
o The Internet of Things can be described as connecting everyday objects to the internet.
o IoT applications generate a large amount of data. Transferring all of this data to the cloud leads to a number of issues and challenges.
o To overcome the challenges faced by IoT applications in the cloud environment, the term fog computing was introduced by Cisco in 2012.
4. Fog Computing
It is also known as fogging/edge computing.
It is a decentralized computing infrastructure in which data, compute, storage and
applications are located somewhere between the data source and the cloud.
Rather than sending all of data to cloud-based servers to be processed, many of these
devices will create large amounts of raw data and then it will get forwarded to reduce
bandwidth.
Fog is nothing but cloud near end user.
5. Limitations of cloud computing
o Vulnerability
o High latency
o Dependence on the internet connection
o Level of security
o Power consumption
o Low connectivity
o Technical problems
o Downtimes
6. How Fog Computing overcomes them
o No reliance on a single centralized processing site.
o The ability to process data in real time.
o Increased distributed storage capacity.
o The ability to adapt more quickly than cloud computing when an outage occurs.
o The ability to work better with IoT devices and big data sets.
o Data locality.
o Faster processing of data and lower power consumption.
7. Characteristics of Fog Computing
o Geographical distribution
o Real-time interaction
o Close to the end user
o Edge location
o Massive sensor networks
8. Fog Computing Architecture
The hierarchical fog architecture comprises the following three layers:
1. Terminal Layer
o This layer includes devices such as mobile phones, sensors, smart vehicles, readers, smart cards, etc.
o Devices are distributed across many widely separated locations.
o The layer mostly deals with data sensing and capturing.
9. 2. Fog Layer
o The fog layer includes devices such as routers, gateways, access points, base stations, specific fog servers, etc., known as fog nodes.
o Fog nodes are located at the edge of a network, situated between end devices and cloud data centers.
o Fog nodes can compute, transfer and store data temporarily.
10. 3. Cloud Layer
o This layer consists of devices that provide large-scale storage and high-performance servers.
o The data centers are scalable and provide compute resources on demand.
o It acts as a backup as well as providing permanent storage for data in a fog architecture.
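The three-layer hierarchy above can be sketched as a minimal simulation. The class names, the threshold rule, and the sample values below are illustrative assumptions for the sketch, not a standard fog API:

```python
# Minimal sketch of the hierarchical fog architecture.
# All class and method names are illustrative, not a standard API.

class TerminalLayer:
    """Sensors and end devices: data sensing and capturing."""
    def capture(self):
        return [{"sensor": i, "value": 20 + i} for i in range(3)]

class FogLayer:
    """Fog node at the network edge: temporary storage and filtering."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.buffer = []          # short-term local storage

    def process(self, readings):
        self.buffer.extend(readings)
        # Only readings above the threshold are relevant to the cloud.
        return [r for r in readings if r["value"] > self.threshold]

class CloudLayer:
    """Data center: permanent storage and heavy computation."""
    def __init__(self):
        self.archive = []

    def store(self, readings):
        self.archive.extend(readings)

terminal, fog, cloud = TerminalLayer(), FogLayer(threshold=21), CloudLayer()
cloud.store(fog.process(terminal.capture()))
# All 3 readings stay buffered at the edge; only 1 reaches the cloud.
print(len(fog.buffer), len(cloud.archive))
```

The point of the sketch is the asymmetry: the fog layer holds everything briefly, while the cloud layer receives only the filtered subset.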
12. How Fog Computing works?
1. Works by utilizing fog nodes and edge devices.
2. Raw data is captured by IoT beacons.
3. This data is sent to a fog node close to the data source.
4. Data is analyzed locally, filtered, and then sent to the cloud for long-term storage if necessary.
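The capture, local analysis and selective upload steps can be sketched as a small pipeline. The temperature readings, the alarm threshold and the function names are assumptions made for this illustration:

```python
# Sketch of the fog data flow: IoT beacons capture raw readings, a nearby
# fog node analyzes and filters them, and only alarms plus a summary are
# forwarded to the cloud. Names and numbers are illustrative.

def capture_raw():
    """IoT beacons produce raw data."""
    return [{"id": i, "temp_c": t} for i, t in enumerate([20, 21, 35, 22, 40])]

def fog_filter(readings, alarm_at=30):
    """Local analysis at the fog node: keep anomalies, summarize the rest."""
    alarms = [r for r in readings if r["temp_c"] >= alarm_at]
    summary = {"count": len(readings),
               "mean_temp": sum(r["temp_c"] for r in readings) / len(readings)}
    return alarms, summary

def send_to_cloud(alarms, summary):
    """Long-term storage in the cloud, only if necessary."""
    return {"alarms": alarms, "summary": summary}

raw = capture_raw()
payload = send_to_cloud(*fog_filter(raw))
# Only the alarm readings plus one summary record reach the cloud,
# instead of all five raw readings.
print(len(payload["alarms"]), payload["summary"]["count"])
```

This is where the bandwidth saving comes from: the cloud sees a compact payload while the full raw stream never leaves the edge.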
13. Features of Fog
1. Aims to place the data close to the end user.
2. Enhances the cloud experience.
3. Big data and analytics can be processed faster, with better results.
4. Administrators are able to support location-based mobility demands.
14. Advantages of Fog Computing
o Low latency
o Bandwidth savings
o Security
o Privacy
o Productivity
16. Why Fog Computing?
o Reduction of network traffic
o Suitable for IoT tasks and queries
o Scalability
o Low-latency requirements
o Monitoring
o Raw data management
o Resource provisioning
17. When to use Fog Computing?
1. When only selected data is required to be sent to the cloud.
2. When the data should be analyzed within a fraction of a second.
3. Whenever a large number of services need to be provided over a large area at different geographical locations.
4. For devices that are subjected to rigorous computation and processing.
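The cases above can be condensed into a simple routing rule. The thresholds (100 ms, 20%) are assumptions for the sketch, not standards:

```python
# Illustrative rule for choosing where to process a workload: a fog node
# when decisions are latency-critical or most data can be filtered locally,
# the cloud for heavy, non-urgent computation. Thresholds are assumed.

def choose_tier(max_latency_ms, fraction_needed_in_cloud, heavy_compute):
    if max_latency_ms < 100:            # sub-second decision required
        return "fog"
    if fraction_needed_in_cloud < 0.2:  # only selected data goes upstream
        return "fog"
    if heavy_compute:                   # rigorous computation, no deadline
        return "cloud"
    return "cloud"

print(choose_tier(10, 0.5, False))    # connected vehicle: "fog"
print(choose_tier(5000, 0.9, True))   # batch analytics: "cloud"
```

In practice the decision is rarely binary; real deployments split a workload across tiers, but the rule captures the deck's criteria.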
18. Applications of Fog Computing
o Connected vehicles
o Real-time analytics
o Smart grids and smart cities
19. Connected cars
o Fog computing is ideal for connected vehicles (CV) because of its real-time interactions.
o The connected car will help save lives by reducing automobile accidents.
20. Decentralized smart building control
o Wireless sensors are installed to measure temperature, humidity, or the levels of various gaseous components in the building atmosphere.
o Information can be exchanged among all sensors on a floor.
21. Smart cities
o Fog computing would be able to obtain sensor data at all levels and integrate the otherwise independent network entities within the city.
22. Conclusion
o The key advantages of fog computing are greater than those of cloud computing.
o Fog provides unprecedented levels of security in the cloud and in social networks.
o The usage of fog computing can accelerate the innovation process in ways that have never been seen before.
23. Future Scope
Fog computing is the future for organizations; it has several advantages over cloud computing.
Cloud computing for IoT may fade away, but fog computing will take over.
Fog computing is the key to accomplishing this critical work. It can boost usability and accessibility in various computing environments.
26. References
1. Bonomi, Flavio (September 19–23, 2011). "Connected Vehicles, the Internet of Things, and Fog Computing." The 8th ACM International Workshop on VehiculAr Inter-NETworking (VANET 2011), Las Vegas, NV, USA. www.sigmobile.org. Retrieved 2019-08-07.
2. "What Is Fog Computing? Webopedia Definition." www.webopedia.com. Retrieved 2017-04-07.
3. Bonomi, F., Milito, R., Zhu, J., and Addepalli, S. "Fog Computing and its Role in the Internet of Things." In Proc. of MCC (2012), pp. 13-16.
4. "Fog brings the cloud closer to the ground: Cisco innovates in fog computing." newsroom.cisco.com. Retrieved 2019-01-24.
Editor's Notes
It offers better security.
Fog nodes are mobile in nature; they can join and leave the network at any time.
It is easy to develop fog applications with the right tools, which can drive machines according to customers' needs.
Fog nodes, such as those in trucks, cars and on factory floors, can survive harsh environmental conditions.
Fog computing offers a reduction in latency because data are analyzed locally. This means less round-trip time and less data bandwidth consumed.
It reduces the latency requirement, so quick decisions can be made. This helps in avoiding accidents.
It processes selected data locally instead of sending it to the cloud for processing.
This computing offers better privacy for users' data, which is analyzed locally instead of being sent to the cloud. The IT team can manage everything and control the devices.