Congresso Sociedade Brasileira de Computação CSBC2016 Porto Alegre (Brazil)
Workshop on Cloud Networks & Cloudscape Brazil
Sergio Takeo Kofuji, Assistant Professor at the University of São Paulo and Coordinator of the FIWARE Lab at the University of São Paulo, Brazil
The European Commission, in a recent communication (April 19th), has identified 5G and the Internet of Things (IoT) among the ICT standardisation priorities for the Digital Single Market (DSM). This session will discuss the emergence of the mobile edge computing paradigm, which reduces latency by processing large quantities of data near their source, and the need for the emerging 5G technology to satisfy the requirements of different verticals. Mobile Edge Clouds have the potential to provide an enormous amount of resources, but they raise several research challenges related to resilience, security, data portability and usage due to the presence of multiple trusted domains, as well as the energy consumption of battery-powered devices. Large, centralized clouds have been deployed and have shown how this paradigm can greatly improve performance and flexibility while reducing costs. However, many issues require solutions that are user- and context-aware, dynamic, and capable of handling heterogeneous demands and systems. This challenge is triggered by the Internet of Things (IoT) scenario, which strongly requires cloud-based solutions that can be dynamically located and managed, on demand and with self-organization capabilities, to serve the purposes of different verticals.
A talk presented at IEEE ComSoc workshop on Evolution of Data-centers in the context of 5G.
Discusses what edge computing is and the management issues in edge computing.
Edge computing allows data produced by internet of things (IoT) devices to be processed closer to where it is created instead of sending it across long routes to data centers or clouds.
Doing this computing closer to the edge of the network lets organizations analyze important data in near real-time – a need of organizations across many industries, including manufacturing, health care, telecommunications and finance. Edge computing deployments are ideal in a variety of circumstances. One is when IoT devices have poor connectivity and it is not efficient for them to be constantly connected to a central cloud.
Other use cases have to do with latency-sensitive processing of information. Edge computing reduces latency because data does not have to traverse over a network to a data center or cloud for processing. This is ideal for situations where latencies of milliseconds can be untenable, such as in financial services or manufacturing.
The term “fog computing” or “edge computing” means that rather than hosting and working from a centralized cloud, fog systems operate on network ends. It is a term for placing some processes and resources at the edge of the cloud, instead of establishing channels for cloud storage and utilization.
Through this presentation, you will get to know about Edge computing and explore the fields where it is needed.
You can start exploring the technical knowledge by seeing what industries are working on nowadays.
Fog Computing is a paradigm that extends Cloud computing and services to the edge of the network. Similar to Cloud, Fog provides data, compute, storage, and application services to end-users. The motivation of Fog computing lies in a series of real scenarios, such as Smart Grid, smart traffic lights in vehicular networks and software defined networks.
Module I
Introduction to Distributed systems - Examples of distributed systems, resource sharing and the web, challenges - System model - introduction - architectural models - fundamental models - Introduction to inter-process communications - API for Internet protocol - external data.
Edge Computing: An Extension to Cloud Computing, by Ramneek Kalra
This presentation was shared by Shally Gupta (PhD Research Scholar | IEEE Graduate Member) & Ramneek Kalra (IEEE Impact Creator) at IEEE MRU Student Branch, Faridabad, Haryana, India.
Fog computing, also known as fogging or edge computing, is a model in which data, processing and applications are concentrated in devices at the network edge rather than existing almost entirely in the cloud.
The term "Fog Computing" was introduced by Cisco Systems.
It is an extension of cloud computing.
ABSTRACT
Cloud computing promises to significantly change the way we use computers and access and store our personal and business information. With these new computing and communications paradigms arise new data security challenges. Existing data protection mechanisms such as encryption have failed to prevent data theft attacks, especially those perpetrated by an insider at the cloud provider. To secure user data from such attacks, a new paradigm called fog computing can be used. Fog computing is a paradigm that extends cloud computing and services to the edge of the network. Similar to the Cloud, Fog provides data, compute, storage, and application services to end-users. The motivation for fog computing lies in a series of real scenarios, such as the Smart Grid, smart traffic lights in vehicular networks and software defined networks. This technique can monitor user activity to verify legitimacy and prevent unauthorized access. Here we discuss this paradigm for preventing misuse of user data and securing information.
Fog Computing is a paradigm that extends Cloud computing and services to the edge of the network. Similar to Cloud, Fog provides data, compute, storage, and application services to end-users. The motivation of Fog computing lies in a series of real scenarios, such as Smart Grid, smart traffic lights in vehicular networks and software defined networks.
Fog computing is a term created by Cisco that refers to extending cloud computing to the edge of an enterprise's network.
Cisco introduced its fog computing vision in January 2014 as a way of bringing cloud computing capabilities to the edge of the network. As a result, processing moves closer to the rapidly growing number of connected devices and applications that consume cloud services and generate increasingly massive amounts of data.
Cloud Computing Technology
Cloud Architecture
Cloud Modeling and Design
Foundation Grid
Cloud and Virtualization
Virtualization and Cloud Computing.
Cloud Lifecycle model
In this presentation, I briefly talk about what the cloud is and highlight the various types of cloud service (IaaS, PaaS, SaaS). The bulk of the talk is about using the fog gem with IaaS. I discuss fog concepts (collections, models, requests, services, providers) and support these with actual examples using fog.
Improving Web Site Performance Using Edge Services in Fog Computing Architec..., by Jiang Zhu
We consider web optimization within the Fog Computing context. We apply existing methods for web optimization in a novel manner, such that these methods can be combined with unique knowledge that is only available at the edge (Fog) nodes. More dynamic adaptation to the user's conditions (e.g. network status and the device's computing load) can also be accomplished with network-edge-specific knowledge. As a result, a user's webpage rendering performance is improved beyond that achieved by simply applying those methods at the webserver or CDNs.
Presentation at IoT World, May 2016 in Santa Clara, CA. Session "Manage your IoT Sensor Data at the Edge! Control your IoT sensor data at the most appropriate spot" (Thursday, 12 May 2016. IoT & the Cloud Track)
Analyzing data and driving business decisions at the edge of the Internet of Things (IoT) is rapidly becoming critical for any IoT solution. Real-time analysis of the data as it streams in is vital to many business processes. Informix, as the data management system of choice for IoT solutions, delivers a significant value proposition for businesses across all industry segments looking to deploy IoT solutions. And with Apache Edgent/Quarks integration, you get real-time analysis of streaming IoT data.
IBM IoT Architecture and Capabilities at the Edge and Cloud, by Pradeep Natarajan
This slide deck answers the following questions:
1) What does the generalized IoT architecture look like?
2) What is the need for an IoT gateway or IoT edge solution?
3) Why use a database solution in the IoT gateway?
4) Why IBM Informix is the perfect data management solution for IoT gateways at the edge?
E3: Edge and Cloud Connectivity (Predix Transform 2016), by Predix
http://predixtransform.com
The edge is where the Industrial Internet starts (and ends). Understand the roles Predix Machine and Connectivity play for your app architecture. Then use the essential tool kits to build your own edge-connected apps. We'll cover edge management (enrollment and security), edge analytics, and data ingestion (e.g., HTTP and MQTT).
Towards the extinction of mega data centres? To which extent should the Clou..., by Thierry Coupaye
Keynote by Thierry Coupaye at the IEEE International Conference on Cloud Networking, Niagara Falls, Canada, October 2015.
Summary: Cloud computing emerged, a decade or so ago, from underused computing and storage resources in Internet players' mega data centres that were then offered "as a service". As a result of this inception, Cloud is often considered a synonym for massive data center, which somehow fuels a very centralised vision of (cloud) computing and storage provision. However, we might be at a time at which the pendulum begins to swing back. Indeed, several initiatives are emerging around a vision of more geographically distributed clouds where computing and storage resources are made available at the edge of the network, close to users, as a complement to or replacement for massive remote data centres. This presentation discusses, through some examples, the evolution of cloud architectures towards more distribution, and the signs and stakes of these mutations.
The data streaming paradigm and its use in Fog architectures, by Vincenzo Gulisano
These are the slides for the lecture I gave at the EBSIS Summer School about data streaming and its challenges and trade-offs for data analysis in Fog architectures.
For the full video of this presentation, please visit:
https://www.embedded-vision.com/platinum-members/embedded-vision-alliance/embedded-vision-training/videos/pages/may-2017-embedded-vision-summit-maslan
For more information about embedded vision, please visit:
http://www.embedded-vision.com
Carter Maslan, CEO of Camio, presents the "Blending Cloud and Edge Machine Learning to Deliver Real-time Video Monitoring" tutorial at the May 2017 Embedded Vision Summit.
Network cameras and other edge devices are collecting ever-more video – far more than can be economically transported to the cloud. This argues for putting intelligence in edge devices. But the cloud offers unique, valuable capabilities, such as aggregating information from multiple cameras, applying state-of-the-art algorithms, and providing users with access to their data anywhere, any time.
Camio uses a combination of machine learning at the edge (in network cameras and network video recorders) and in the cloud to generate alerts, highlight the most significant events captured by a camera, and to let users search for events of interest. In this talk, Maslan explores the trade-offs between edge and cloud processing for systems that extract meaning from video, and explains how the two approaches can be combined to create big opportunities.
Federated HPC Clouds Applied to Radiation Therapy, by Andrés Gómez
Presentation delivered in the Research Track at ISC CLOUD'13 at Heidelberg (Germany) on Sep. 24th 2013.
It describes the Virtual Cluster Architecture developed during the BonFIRE project and the reasons for building it. Some proof-of-concept experiments are also presented.
From Cloud to Fog: the Tao of IT Infrastructure Decentralization, by the FogGuru MSCA Project
Keynote by Dr. Guillaume Pierre, Professor of Computer Science at the University of Rennes 1 (France), at the IEEE CloudNet conference, 4th November 2019.
Time and resource constrained offloading with multi-task in a mobile edge co..., by IJECEIAES
In recent years, the importance of the mobile edge computing (MEC) paradigm, along with 5G, the Internet of Things (IoT) and the virtualization of network functions, has been well noticed. However, the implementation of computation-intensive applications at the mobile device level is limited by battery capacity, processing capabilities and execution time. To increase battery life and improve the quality of experience for computationally intensive and latency-sensitive applications, offloading some parts of these applications to the MEC is proposed. This paper presents a solution to a hard decision problem that jointly optimizes processing time and computing resources in a mobile edge computing node. Hence, we consider a mobile device with a list of offloadable heavy tasks, and we jointly optimize the offloading decisions and the allocation of IT resources to reduce the latency of task processing. We developed a heuristic solution based on the simulated annealing algorithm, which can improve the offloading rate and reduce the total task latency while meeting short decision times. We performed a series of experiments to show its efficiency. The obtained results in terms of total processing time are very encouraging. In addition, our solution makes offloading decisions within acceptable and achievable deadlines.
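The simulated-annealing approach described above can be illustrated with a minimal sketch. The task latencies, the MEC capacity limit, and the cooling schedule below are all made-up illustrative values, not the paper's actual model or implementation:

```python
import math
import random

# Hypothetical per-task latencies: (local_ms, offloaded_ms).
TASKS = [(120, 40), (80, 95), (200, 60), (50, 55), (150, 70)]
MEC_CAPACITY = 3  # assumed limit on concurrently offloaded tasks

def total_latency(decisions):
    """decisions[i] == 1 means task i is offloaded to the MEC node."""
    if sum(decisions) > MEC_CAPACITY:
        return float("inf")  # infeasible: MEC resources exceeded
    return sum(off if d else loc
               for d, (loc, off) in zip(decisions, TASKS))

def anneal(steps=5000, temp=100.0, cooling=0.999):
    current = [0] * len(TASKS)  # start fully local (always feasible)
    best = current[:]
    for _ in range(steps):
        candidate = current[:]
        candidate[random.randrange(len(TASKS))] ^= 1  # flip one decision
        delta = total_latency(candidate) - total_latency(current)
        # Accept improvements always; worse moves with prob. exp(-delta/T).
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current = candidate
        if total_latency(current) < total_latency(best):
            best = current[:]
        temp *= cooling
    return best, total_latency(best)
```

The temperature lets the search escape local minima early on, while cooling gradually turns it into a greedy descent, which is why such heuristics can meet short decision deadlines.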
Fog computing and iFogSim for sustainable smart cities, by sindhuRashmi1
This gives an overview of what fog computing is and how it differs from cloud computing for developing efficient and sustainable smart cities. It also gives basic knowledge about simulating the fog layer and introduces iFogSim, a toolkit that helps with the simulation.
Deadline Monotonic Scheduling to Reduce Overhead of Superframe in ISA100.11a, by Oka Danil
In this paper, we develop a new method to reduce communication overhead in ISA100.11a wireless industrial networks. A deadline monotonic scheduling scheme is proposed for the analysis and simulation of the system. The results show that our proposed approach requires fewer beacons to transmit data in the network.
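Deadline monotonic scheduling itself is simple to state: tasks with shorter relative deadlines get higher priority, and schedulability can be checked with standard response-time analysis. A generic sketch follows, using an illustrative task set rather than the paper's ISA100.11a superframe model:

```python
import math

# Illustrative task set: (C, T, D) = worst-case execution time,
# period, relative deadline. Hypothetical numbers.
TASKS = [(3, 20, 20), (1, 10, 4), (2, 12, 8)]

def dm_order(tasks):
    """Deadline monotonic: shorter relative deadline => higher priority."""
    return sorted(tasks, key=lambda t: t[2])

def response_time(ordered, i):
    """Standard response-time analysis for the i-th highest-priority task."""
    C, T, D = ordered[i]
    r = C
    while True:
        # Interference from all higher-priority tasks released before r.
        r_next = C + sum(math.ceil(r / Tj) * Cj for Cj, Tj, _ in ordered[:i])
        if r_next == r:
            return r
        if r_next > D:
            return None  # misses its deadline: unschedulable
        r = r_next

def schedulable(tasks):
    ordered = dm_order(tasks)
    return all(response_time(ordered, i) is not None
               for i in range(len(ordered)))
```

The fixed-point iteration converges because the interference term is monotone in r; the task set is declared unschedulable as soon as any response time exceeds its deadline.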
In today's world, the growing demand for knowledge has made cloud computing a center of attraction. Cloud computing provides utility-based services to users worldwide. It enables the provisioning of applications from the consumer, scientific and business domains. However, data centers created for cloud computing applications consume huge amounts of energy, contributing to high operational costs and a large amount of carbon dioxide emitted into the environment. As data centers grow, power consumption is increasing at such a rate that it has become a key concern, ultimately leading to energy shortages and global climate change. Therefore, we need green cloud computing solutions that not only save energy but also reduce operational costs.
Stochastic Computing Correlation Utilization in Convolutional Neural Network ..., by TELKOMNIKA JOURNAL
In recent years, many applications have been implemented in embedded systems and mobile Internet of Things (IoT) devices that typically have constrained resources, smaller power budgets, and exhibit "smartness" or intelligence. To implement computation-intensive and resource-hungry Convolutional Neural Networks (CNN) in this class of devices, many research groups have developed specialized parallel accelerators using Graphical Processing Units (GPU), Field-Programmable Gate Arrays (FPGA), or Application-Specific Integrated Circuits (ASIC). An alternative computing paradigm called Stochastic Computing (SC) can implement CNNs with a low hardware footprint and power consumption. To enable building more efficient SC CNNs, this work incorporates the CNN basic functions in SC that exploit correlation, share Random Number Generators (RNG), and are more robust to rounding error. Experimental results show our proposed solution provides significant savings in hardware footprint and increased accuracy for the SC CNN basic function circuits compared to previous work.
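The core SC trick is easy to demonstrate in software: a value in [0, 1] is encoded as the fraction of 1s in a bitstream, so multiplication reduces to a bitwise AND of two independent streams, while deliberately correlated streams turn the same AND gate into a min function. This is a toy sketch under those textbook assumptions; real designs use hardware RNGs and much longer streams:

```python
import random

def to_stream(p, n, rng):
    """Encode a probability p in [0, 1] as an n-bit stochastic bitstream."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def from_stream(bits):
    """Decode: the represented value is the fraction of 1s."""
    return sum(bits) / len(bits)

def sc_multiply(a, b, n=10000, seed=42):
    """Independent streams: bitwise AND approximates the product a*b."""
    rng = random.Random(seed)
    sa = to_stream(a, n, rng)
    sb = to_stream(b, n, rng)
    return from_stream([x & y for x, y in zip(sa, sb)])

def sc_min(a, b, n=10000, seed=42):
    """Maximally correlated streams (shared random source): bitwise AND
    computes min(a, b) -- the kind of behavior correlation-aware SC
    designs exploit while also sharing one RNG."""
    rng = random.Random(seed)
    u = [rng.random() for _ in range(n)]
    sa = [1 if x < a else 0 for x in u]
    sb = [1 if x < b else 0 for x in u]
    return from_stream([x & y for x, y in zip(sa, sb)])
```

Sharing the random source in `sc_min` is also what lets hardware implementations share RNGs instead of duplicating them per input.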
Optimization of Time Restriction in Construction Project Management Using Lin..., by IJERA Editor
This study is an attempt to identify the minimum time of a construction project using the critical path method and a linear programming model. A systematic analysis is attempted by developing a work breakdown structure for the entire project to establish work elements for quantifying various resources against time and cost. A network is established taking into consideration all the predecessor and successor activities. The network is then optimized through crashing of activities so as to obtain an optimal solution, which serves as a base for optimizing total project cost. Finally, a linear programming model is used to formulate the crashing network for minimum time using a LINGO model and Microsoft Excel. These models take many project considerations into account, thus reducing the duration of the project. Ultimately, the outputs of both software tools are compared with the manual calculations and the best verifier is determined.
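The critical path computation at the heart of this method can be sketched in a few lines. The activity network below is hypothetical, not the study's case project:

```python
# Hypothetical activity network: name -> (duration, predecessors).
ACTIVITIES = {
    "A": (3, []),
    "B": (5, ["A"]),
    "C": (2, ["A"]),
    "D": (4, ["B", "C"]),
    "E": (1, ["D"]),
}

def critical_path(acts):
    """Forward pass for earliest finish times, then trace one critical path."""
    finish = {}
    def ef(name):
        if name not in finish:
            dur, preds = acts[name]
            finish[name] = dur + max((ef(p) for p in preds), default=0)
        return finish[name]
    duration = max(ef(a) for a in acts)
    # Walk back from the latest-finishing activity along its
    # latest-finishing predecessors.
    path, node = [], max(acts, key=ef)
    while node is not None:
        path.append(node)
        preds = acts[node][1]
        node = max(preds, key=ef) if preds else None
    return duration, list(reversed(path))
```

Crashing then means paying to shorten activities on this path; a linear program, as in the study, chooses which activities to crash and by how much at minimum cost.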
The computing continuum extends the high-performance cloud data centers with energy-efficient and low-latency devices close to the data sources located at the edge of the network. However, the heterogeneity of the computing continuum raises multiple challenges related to application and data management. These include (i) how to efficiently provision compute and storage resources across multiple control domains of the computing continuum, (ii) how to decompose and schedule an application, and (iii) where to store an application source and the related data. To support these decisions, we explore in this thesis novel approaches for (i) resource characterization and provisioning with detailed performance, mobility, and carbon footprint analysis, (ii) application and data decomposition with increased reliability, and (iii) optimization of application storage repositories. We validate our approaches on a selection of use case applications with complementary resource requirements across the computing continuum, using a real-life evaluation testbed.
The UberCloud - From Project to Product - From HPC Experiment to HPC Marketpl..., by Wolfgang Gentzsch
The UberCloud is an online marketplace for engineers and scientists to discover, try, and buy compute power on demand, in the cloud. It starts with free experiments in the cloud, including application software, cloud hardware, and expertise, teaching you by doing how to use your application in the cloud.
Pruning Edge Research with Latency Shears. Nitinder Mohan, Lorenzo Corneo, Aleksandr Zavodovski, Suzan Bayhan, Walter Wong, and Jussi Kangasharju.
In Proceedings of the 19th ACM Workshop on Hot Topics in Networks (HotNets '20).
DOI: https://doi.org/10.1145/3422604.3425943
Edge Computing Platforms and Protocols - Ph.D. thesis, by Nitinder Mohan
Introductory presentation for Ph.D. thesis of Nitinder Mohan titled "Edge Computing Platforms and Protocols". The defense took place at the University of Helsinki, Finland on 8th November 2019.
The video of the presentation is available at https://youtu.be/dDVZozTwreE
The thesis can be found on https://helda.helsinki.fi/handle/10138/306041
DeCloud: Truthful Decentralized Double Auction for Edge Clouds, by Nitinder Mohan
DeCloud: Truthful Decentralized Double Auction for Edge Clouds presented at International Conference on Distributed Computing Systems (ICDCS) 2019 in Texas, USA
Open Infrastructure for Edge: A Distributed Ledger Outlook, by Nitinder Mohan
Open Infrastructure for Edge: A Distributed Ledger Outlook presented at 2nd USENIX Workshop on Hot Topics in Edge Computing (HotEdge) in Renton, Washington, USA
ExEC: Elastic Extensible Edge Cloud presented at 2nd ACM Workshop on Edge Systems, Analytics and Networking (EdgeSys) 2019 co-located with EUROSYS 2019 in Dresden, Germany.
Slides owned and prepared by Aleksandr Zavodovski
ICON: Intelligent Container Overlays presented at 17th ACM Workshop on Hot Topics in Networks (HotNets) 2018 in Redmond, Washington
Slides owned and prepared by Aleksandr Zavodovski
Anveshak: Placing Edge Servers In The Wild, by Nitinder Mohan
Published in MECOMM workshop colocated with SIGCOMM 2018 held in Budapest, Hungary.
Paper PDF is available at: https://dl.acm.org/citation.cfm?id=3229560
ABSTRACT: In this paper, we present Anveshak, a framework that solves the problem of placing edge servers in a geographical topology and provides the optimal solution for edge providers. Our proposed solution considers both end-user application requirements and the deployment and operating costs incurred by edge platform providers. The placement optimization metric of Anveshak considers the request pattern of users and existing user-established edge servers.
Paper PDF is available at: https://dl.acm.org/citation.cfm?id=3195871
Accepted and presented at 5th Workshop on CrossCloud Infrastructures & Platforms, EuroSys Conference, April 2018
Edge-Fog Cloud
1. Edge-Fog Cloud: A Distributed Cloud for Internet of Things Computations
Nitinder Mohan, Jussi Kangasharju
Department of Computer Science, University of Helsinki, Finland
{firstname.lastname@cs.helsinki.fi}
Conference on Cloudification of the Internet of Things (CIoT) 2016, Paris
2. Rise of connected IoT devices
[Charts: projected number of IoT devices; average cost of a sensor]
Broadband by the numbers (NCTA), https://www.ncta.com/broadband-by-the-numbers
4. Problem: Network!
• High transport cost
• High data volume
• High network latency
[Map: computational data center locations, https://cloud.google.com/about/locations/]
5. Fog Cloud Computing
[Diagram: Cloud, Fog, and Devices layers]
Hong, K., Lillethun, D., Ramachandran, U., Ottenwälder, B., & Koldehofe, B. (2013). Mobile fog. Proceedings of the Second ACM SIGCOMM Workshop on Mobile Cloud Computing - MCC '13
Processing-capable network resources augment the cloud
6. Edge Cloud Computing
Processing-capable, voluntary, user-controlled devices augment the cloud
Lopez, P. G., Montresor, A., Epema, D., Iamnitchi, A., Felber, P., & Riviere, E. (2015). Edge-centric Computing: Vision and Challenges. ACM CCR, 45(5), 37–42.
7. Edge & Fog Cloud: Problem
Computation requires routing data to a central cloud!
[Diagram: Cloud, Fog, and Devices layers]
9. Architecture
[Diagram: Data Store, Fog, and Edge layers]
Edge:
• Collection of devices that are: i. loosely coupled, ii. voluntary, iii. human-operated
• 1-2 hops away from sensors & clients
• Ad-hoc device-to-device connectivity within the layer
• Varying processing capability
• e.g. desktops, laptops, workstations, nano data centers, etc.
10. Architecture
[Diagram: Data Store, Fog, and Edge layers]
Fog:
• Network devices with high compute capability
• Manufactured, managed, and deployed by cloud vendors such as CISCO*
• Lies farther from sensors but closer to the core
• Dense connectivity within the layer
• Reliable connectivity to the Edge
• e.g. routers, switches, etc.
*CISCO, "Cisco fog computing solutions: Unleash the power of the Internet of Things (whitepaper)," 2015
14. Network Only Cost Assignment*
[Figure: Edge-Fog Cloud device graph (D1-D5) with link costs, alongside a job graph (J1-J5)]
*Haubenwaller, Andreas Moregård, and Konstantinos Vandikas. "Computations on the Edge in the Internet of Things." Procedia Computer Science 52 (2015)
20. Least Processing Cost First (LPCF)
D_proc[i] = [3, 2, 2, 5, 6]
J_size[i] = [4, 2, 5, 4, 2]
I. Optimize Processing Cost
Minimize: C = Σ_(i,j)∈A (J_size(i) / D_proc(j)) · x_ij
Linear Assignment Problem
• Solved using the Kuhn-Munkres (Hungarian) algorithm
• Optimal solution guaranteed in O(n³)
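Step I is a standard linear assignment: choose a one-to-one job-to-device mapping that minimizes the sum of J_size(i) / D_proc(j). A minimal sketch using the example values above, with brute-force enumeration standing in for the Kuhn-Munkres algorithm (feasible here only because n = 5):

```python
from itertools import permutations

D_proc = [3, 2, 2, 5, 6]   # processing power of devices D1..D5
J_size = [4, 2, 5, 4, 2]   # sizes of jobs J1..J5

def processing_cost(job_to_dev):
    # Cost of running job i on device job_to_dev[i]: J_size(i) / D_proc(j)
    return sum(J_size[i] / D_proc[j] for i, j in enumerate(job_to_dev))

# Exhaustive O(n!) search over all one-to-one job->device mappings;
# LPCF solves the same linear assignment problem with the
# Kuhn-Munkres (Hungarian) algorithm in O(n^3) instead.
best = min(permutations(range(len(D_proc))), key=processing_cost)

print(round(processing_cost(best), 3))  # 4.967, i.e. the slide's 4.966
```

The enumeration is illustrative only; Kuhn-Munkres reaches the same optimum without visiting all n! mappings.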
21. Least Processing Cost First (LPCF)
I. Optimize Processing Cost
Minimize: C = Σ_(i,j)∈A (J_size(i) / D_proc(j)) · x_ij
Linear Assignment Problem
• Solved using the Kuhn-Munkres (Hungarian) algorithm
• Optimal solution guaranteed in O(n³)
[Figure: optimal assignment on the example graph — D1:J1, D2:J2, D3:J5, D4:J4, D5:J3]
Least Processing Cost: 4.966
22. Least Processing Cost First (LPCF)
II. Create Sub-problem Space
The Edge-Fog Cloud is composed of several homogeneous devices running homogeneous jobs.
New assignment calculation:
1. Same processing power → interchange jobs
2. Same job size → interchange devices
[Figure: the optimal assignment (D1:J1, D2:J2, D3:J5, D4:J4, D5:J3) and equivalent assignments obtained by swapping J2↔J5 and J1↔J4; Least Processing Cost: 4.966]
23. Least Processing Cost First (LPCF)
     D1 D2 D3 D4 D5
1.   J1 J2 J5 J4 J3
2.   J1 J5 J2 J4 J3
3.   J4 J5 J2 J1 J3
4.   J4 J2 J5 J1 J3
Least Processing Cost: 4.966
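Because jobs of equal size (and devices of equal processing power) are interchangeable without affecting the processing cost, the sub-problem space can be enumerated mechanically. A sketch, assuming the example's optimal assignment (D1:J1, D2:J2, D3:J5, D4:J4, D5:J3); it recovers the four equal-cost assignments listed above:

```python
from itertools import permutations, product

D_proc = [3, 2, 2, 5, 6]      # processing power of D1..D5
J_size = [4, 2, 5, 4, 2]      # sizes of jobs J1..J5
# Optimal assignment from the example, as device index -> job index:
# D1:J1, D2:J2, D3:J5, D4:J4, D5:J3
optimal = [0, 1, 4, 3, 2]

def processing_cost(dev_to_job):
    return sum(J_size[j] / D_proc[d] for d, j in enumerate(dev_to_job))

# Group the assigned jobs by size: permuting jobs of equal size among
# the devices they occupy cannot change the processing cost.
groups = {}
for job in optimal:
    groups.setdefault(J_size[job], []).append(job)

equivalent = set()
for choice in product(*(permutations(g) for g in groups.values())):
    relabel = {}
    for old, new in zip(groups.values(), choice):
        relabel.update(zip(old, new))
    equivalent.add(tuple(relabel[j] for j in optimal))

print(len(equivalent))  # 4 equal-cost assignments, as on the slide
```

Here the equal-power devices D2 and D3 yield no additional assignments beyond the job swaps, so the space stays at four.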
24. Least Processing Cost First (LPCF)
III. Account for Network Cost
1. Compute the network cost of each assignment
2. Choose the assignment with the least network cost
Network cost = Σ J_conn(i, j) · D_conn(f(i), f(j))
     D1 D2 D3 D4 D5    N/W
1.   J1 J2 J5 J4 J3    20
2.   J1 J5 J2 J4 J3    27
3.   J4 J5 J2 J1 J3    19
4.   J4 J2 J5 J1 J3    28
Least Processing Cost: 4.966
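Step III scores each equal-processing-cost assignment by Σ J_conn(i, j) · D_conn(f(i), f(j)) and keeps the cheapest. A sketch with made-up J_conn and D_conn matrices (the slide does not show the actual connectivity values), scoring the four candidate assignments:

```python
# Hypothetical connectivity matrices, for illustration only:
# J_conn[i][j] = data exchanged between jobs i and j,
# D_conn[a][b] = network cost between devices a and b.
J_conn = [
    [0, 2, 0, 1, 0],
    [2, 0, 1, 0, 0],
    [0, 1, 0, 0, 2],
    [1, 0, 0, 0, 1],
    [0, 0, 2, 1, 0],
]
D_conn = [
    [0, 1, 4, 3, 4],
    [1, 0, 1, 2, 3],
    [4, 1, 0, 2, 1],
    [3, 2, 2, 0, 1],
    [4, 3, 1, 1, 0],
]

def network_cost(job_to_dev):
    # Sum of J_conn(i, j) * D_conn(f(i), f(j)) over all job pairs i < j
    n = len(job_to_dev)
    return sum(J_conn[i][j] * D_conn[job_to_dev[i]][job_to_dev[j]]
               for i in range(n) for j in range(i + 1, n))

# The four equal-processing-cost assignments from the slide,
# rewritten as job index -> device index:
candidates = [
    (0, 1, 4, 3, 2),  # 1. D1:J1 D2:J2 D3:J5 D4:J4 D5:J3
    (0, 2, 4, 3, 1),  # 2. D1:J1 D2:J5 D3:J2 D4:J4 D5:J3
    (3, 2, 4, 0, 1),  # 3. D1:J4 D2:J5 D3:J2 D4:J1 D5:J3
    (3, 1, 4, 0, 2),  # 4. D1:J4 D2:J2 D3:J5 D4:J1 D5:J3
]
best = min(candidates, key=network_cost)
print(best, network_cost(best))  # with these matrices, assignment 1 wins with cost 12
```

The slide's N/W values (20, 27, 19, 28) come from its own connectivity data, which these hypothetical matrices do not reproduce.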
25. Least Processing Cost First (LPCF)
Advantages
1. The computed assignment has the least processing cost and an almost-optimal network cost
2. Task assignment accounts for the processing cost of task deployment
3. The assignment solution is guaranteed in polynomial time
27. Edge-Fog Cloud Simulator
Python-based Edge-Fog Cloud Simulator
1. Generates:
   i. Edge and Fog node graphs with device processing and network costs
   ii. Job node graphs with variable job sizes
2. Incorporates LPCF for assignment computation
3. Open source
28. LPCF vs NOC
Least Processing Cost First: Edge-Fog Cloud Simulator + LPCF Solver
Network Only Cost: Edge-Fog Cloud Simulator + Kuhn-Munkres Solver*
*solver available from QAPLIB, http://anjos.mgi.polymtl.ca/qaplib/
33. Q. How well connected should EF nodes be?
[Plot: effect of EF node connectivity on overall cost; observed reductions of ~21%, ~17%, and ~9%]
34. Q. How does the deployed job impact overall cost?
35. Conclusion
Our contributions in this work are:
1. A formal architecture of the Edge-Fog cloud
2. The LPCF algorithm for assigning tasks on the EF cloud
3. An open-source Edge-Fog cloud simulator & LPCF solver
4. A deployment analysis of the Edge-Fog cloud
Source code available at: www.github.com/nitinder-mohan/EdgeFogSimulator
37. LPCF Search Space Reduction
Topology size     5    10   15    30    60    100   150
Original space    5!   10!  15!   30!   60!   100!  150!
LPCF space        1!   3!   >4!   >5!   >7!   >8!   >9!
38. EF Cloud Simulator Parameters
Property                                              Value
Total number of devices/jobs                          Experiment-specific
Number of Edge devices                                60% of total
Number of Fog devices                                 40% of total
Processing power of an Edge device                    2-5
Processing power of a Fog device                      7-9
Connection density of Edge layer (0-1)                0.2
Connection density of Fog layer (0-1)                 0.6
Connection density between Edge and Fog layers (0-1)  0.5
Lowest job size in pool                               2
Highest job size in pool                              6
Inter-dependence density between jobs (0-1)           0.2
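The defaults above can be turned into a topology generator. A minimal sketch, assuming illustrative names that do not reproduce the actual simulator's API (the real code is at github.com/nitinder-mohan/EdgeFogSimulator); the link-cost range of 1-10 is an assumption, not a value from the table:

```python
import random

def generate_topology(n_devices, seed=42):
    rng = random.Random(seed)
    n_edge = int(n_devices * 0.6)                         # 60% Edge devices
    n_fog = n_devices - n_edge                            # 40% Fog devices
    proc = ([rng.randint(2, 5) for _ in range(n_edge)] +  # Edge power: 2-5
            [rng.randint(7, 9) for _ in range(n_fog)])    # Fog power: 7-9
    links = {}
    for a in range(n_devices):
        for b in range(a + 1, n_devices):
            both_edge = a < n_edge and b < n_edge
            both_fog = a >= n_edge and b >= n_edge
            # Connection densities: Edge 0.2, Fog 0.6, Edge-Fog 0.5
            density = 0.2 if both_edge else 0.6 if both_fog else 0.5
            if rng.random() < density:
                links[(a, b)] = rng.randint(1, 10)        # assumed cost range
    return n_edge, proc, links

n_edge, proc, links = generate_topology(10)
print(n_edge, len(proc))  # 6 10
```

Job-graph generation (sizes 2-6, inter-dependence density 0.2) would follow the same pattern.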
Editor's Notes
Projected number of devices, including sensors, connected by networks.
With time, sensor deployment may become location-independent, e.g. vehicles, drones, mobiles, embedded biometrics.
Google Cloud locations: US (central, east, west), EU (Belgium), Asia (Taiwan, Tokyo).
Network resources running cloud application logic, developed and deployed by a cloud vendor. Aggregation/computation happens while routing data to the cloud; heavy computation stays in the central cloud.
Edge devices lie in 1-hop proximity to sensors. Pre-processing computation happens on the edge; heavy computation in the cloud.
Semi-dependence does not work well for applications that generate large amounts of distributable data.
The EF cloud is a completely decentralized architecture which decouples processing time from network delays and combines the benefits of the Edge and Fog clouds:
- Handles data close to its generators and consumers
- Some edge devices support mobility natively
- The edge can combine data from sensors, providing location/application context
- Completely decentralized
Deployment must map one job to one device node.
Find a deployment without impacting the overall processing time.
Transformation 1: J2 and J5
Transformation 2: J4 and J3
f(i) signifies the constraint of deploying a job to a particular device.
1. Computing the optimal deployment for a problem space of 30 nodes using QAP may take up to a week on a computational grid comprising 2500 machines.
C is the overall cost function; x_ij is the binary job assignment variable.
Based on a property of the EF cloud.
As the algorithm computes network cost iteratively, a branch-and-bound version of LPCF could be used for large search spaces.
For the list of parameters used by the simulator, check the paper or the simulator code available on Git.
Full name of NOC.
Corresponds to Table 3 in the paper.
For nodes < 40, LPCF finds an assignment within 1 second, whereas both the QAP and naïve solvers reach the time limit.
For nodes = 150, LPCF reaches the limit, as the search space is approximately 9!.
Graph 1:
Maxima and minima are bounds obtained by choosing the N smallest/largest link costs; they might not be valid assignments.
NOC QAP requires a long time to complete; the 100-node topology finished after 71 hours.
Graph 2:
Branch-and-bound QAP limited to the time taken by LPCF.
LPCF always outperforms NOC QAP.
The Edge has device-to-device connections, so its density cannot be drastically increased.
Edge-Fog connection density can greatly reduce the overall network cost.