Standards-based protocols make it possible to support and manage the multitude of technologies and devices required across the smart grid, increasing interoperability and reducing operational cost. When evaluating new system solutions to upgrade existing infrastructure, the primary attributes to assess are network management capabilities, ease of transition, and support for existing and evolving technology.
Grid changes are coming, and a growing number of vendors will participate in the process and add further innovations. Interoperability will serve as the prime enabler of these changes. Utilities will be able to serve their constituents well only if they choose an upgrade path based on a common platform for change and find solutions that accommodate upgrades across the whole spectrum of the grid.
Woodside Capital Partners SDN Seminar 6.19.13 | Woodside Capital
As networks demand more programmability, flexibility and agility, Software-defined networking is emerging as an important step in transforming data centers and service provider wide area networks. This presentation covers how SDN is emerging as a viable network architecture and includes a market outlook, long-term implications and recent M&A activity in the SDN space.
The digital transformation underway is accelerating, enabling new business opportunities both for telecom operators and for enterprises from other industries. The main drivers are the need for increased efficiency, flexibility and new business models enabled by the introduction of 5G and increased adoption of cloud technologies. New services can be expected to be deployed at an unprecedented pace.
When we get water, electricity, or gas delivered to our home or place of work we expect it to have predictable quality. Why isn't this also true of broadband? The answer is we don't (yet) have the "glue" to integrate performance in digital supply chains.
Applications in the power sector have been a major source of revenue for blockchain-in-energy stakeholders. According to the International Renewable Energy Agency, the majority of blockchain-in-energy projects are being implemented for peer-to-peer trading. Most countries today are trying to maximize energy generation, and as electricity consumers become producers, also known as prosumers, the trading of energy is the major application area for blockchain in the power sector.
Are you ready for IoT disruption? by Ana Seliškar | Bosnia Agile
Digitization is redefining industries. Every business, city and country will become digital. Trends show that by 2020 50 billion devices will be connected. This growth of IoT requires compute and storage close to the end devices for several reasons – data reduction, resiliency, latency and scale. All those can be addressed by fog computing uniquely enabled by Cisco as one of six pillars of Cisco IoT System which provides the technologies and software you need to deploy, accelerate, and innovate in the era of IoT.
Towards the extinction of mega data centres? To which extent should the Clou... | Thierry Coupaye
Keynote by Thierry Coupaye at the IEEE International Conference on Cloud Networking, Niagara Falls, Canada, October 2015.
Summary: Cloud computing emerged, a decade or so ago, from underused computing and storage resources in Internet players' mega data centres that were offered "as a service". As a result of this inception, Cloud is often considered a synonym for the massive data centre, which fuels a very centralised vision of (cloud) computing and storage provision. However, we may be at a time when the pendulum begins to swing back. Indeed, several initiatives are emerging around a vision of more geographically distributed clouds in which computing and storage resources are made available at the edge of the network, close to users, complementing or replacing massive remote data centres. This presentation discusses, through examples, the evolution of cloud architectures towards greater distribution, and the signs and stakes of these mutations.
2021 Predictions and Trends for the SD-WAN and Edge Market | QOS Networks
Looking at the new year with a refreshed understanding of what the IT team is looking for, what CIOs are being tasked with, and how to drive a relevant conversation can make the difference with your customer. Join us for our 2021 market insight and trends that can help target the conversation around the edge network and solutions that complement those needs!
It’s called data center in a box, unified computing and dynamic computing. Cisco, our technology partner, calls its offering Unified Computing System (UCS), which is what Peak 10 has standardized on for its data centers. Whatever you call it, converged infrastructure (CI) is getting bigger by the day. The global CI market is expected to grow to nearly $34 billion by 2019, a CAGR of 24.1 percent.
Structure 2014 - Disrupting the data center - Intel sponsor workshop | Gigaom
Presentation from Gigaom's Structure 2014 conference, June 21-22 in San Francisco
Intel sponsor workshop: Disrupting the data center
#gigaomlive
More at http://events.gigaom.com/structure-2014/
Launchpad company profiles from Gigaom's Structure 2014 conference, June 21-22 in San Francisco
#gigaomlive
More at http://events.gigaom.com/structure-2014/
A fresh approach to remote IoT connectivity | by Podsystem | Kira Ugai
There are a huge number of IoT devices, often roaming across countries and continents, that are located outside urban areas.
This poses significant challenges to both the design and connectivity of the device, the biggest concern being that there is no room for error, as troubleshooting and maintenance of remote and roaming devices is complicated and costly.
As part of the Internet of Things North America conference in Chicago, Illinois (April 13th – 14th, 2016), Podsystem Inc. CEO Sam Colley will present ‘A Fresh Approach to Remote IoT Connectivity’ at 11:30 on April 14th.
Sam will address the challenges faced by remote IoT applications developers and discuss ways of overcoming them.
His presentation is centered around an infographic which outlines the main issues involved in developing remote IoT applications and explains how to make the correct choices in terms of device design, connectivity and future proofing to prolong the lifespan of the application and avoid costly mistakes.
Ericsson Review: Software-Defined Networking | Ericsson
An architecture based on software-defined networking (SDN) techniques gives operators greater freedom to balance operational and business parameters – such as network resilience, service performance and QoE – against opex and capex. With its beginnings in data-center technology, SDN has developed to the point where it can offer significant opportunities to service providers.
The traditional way of describing network architecture and how a network behaves is through the fixed designs and behaviors of its various elements. The concept of software-defined networking (SDN) describes networks and how they behave in a more flexible way – through software tools that describe network elements in terms of programmable network states.
To maximize the potential benefits and deliver superior user experience, software-defined networking (SDN) needs to be implemented outside the sphere of the data center across the entire network. This can be achieved through enabling network programmability based on open APIs. Service Provider SDN will help operators to scale networks and take advantage of new revenue-generating possibilities.
For more from Ericsson Review visit: http://www.ericsson.com/thinkingahead/technology_insights
Array Networks’ Application Delivery Solutions Now Available Through Promark ... | Array Networks
Array Networks Inc., a global leader in application delivery networking, today announced that it has entered into a distribution agreement with Promark Technology, a premier U.S.-focused value-added distributor (VAD) and wholly-owned subsidiary of Ingram Micro Inc. Under the terms of the agreement, Promark will offer Array’s application delivery networking products and solutions, including load balancing, SSL VPN and WAN optimization, as well as Array’s line of next-generation virtualized appliances.
Turtles all the Way Up – From OSGi bundles to Fog Computing - Tim Ward (Paremus) | mfrancis
OSGi Community Event 2018 Presentation by Tim Ward (Paremus)
Abstract: The model of centralized cloud compute is changing. As large-scale IoT deployments become real, organizations are realizing that a single central cloud cannot cope with the data security, data volumes, latency or robustness their businesses need. Centralizing in a single cloud also carries a huge operational risk: if the cloud fails, the business must still continue!
This talk will introduce BRAIN-IoT, an EU Horizon 2020 funded project. BRAIN-IoT uses the latest OSGi R7 specifications to create an adaptive modular “Fog” environment with decentralized data processing and decision making. We’ll review the current design decisions made by the BRAIN-IoT team, including the issues concerning generic Edge Device discovery & integration, and see how they can be applied across different IoT use cases, including Smart Utilities and Industry 4.0 Factories of the Future.
How Far Can You Go with Agile for Embedded Software? | TechWell
With the proliferation of IoT and consumer demand for smarter homes, appliances, automobiles, and wearables, many traditional product-based manufacturing companies are now becoming embedded software companies. This means that the design and manufacturing of physical products is becoming more complex since it now requires the integration of the physical components of the product, the firmware, and the myriad software components these products contain. Historically, embedded software developers have lagged behind IT in the adoption of agile development practices, largely due to the requirement of developing for the target hardware. Anders Wallgren shares concrete tips and best practices used by some of the largest embedded and IoT manufacturers to adopt and scale agile methodologies to transform their business—in product design, development, test, and manufacturing. Learn how to uncover and remove bottlenecks to agile velocity downstream as well as how multi-domain continuous delivery helps accelerate innovation and product delivery.
IEEE Blockchain in Energy P2418.5 WG Standards (October 2019, Claudio Lima) | crlima10
The IEEE Blockchain in Energy standard, P2418.5, provides an open, common, and interoperable reference framework model for blockchain in the energy sector. It covers three aspects: 1) Serve as a guideline for blockchain use cases in the electrical power, oil & gas, and renewable energy industries and their related services. 2) Create standards on reference architecture, interoperability, terminology, and system interfaces for blockchain applications in the energy sector by building an open, protocol- and technology-agnostic layered framework. 3) Evaluate and provide guidelines on scalability, performance, security, and interoperability through evaluation of consensus algorithms, smart contracts, the type of blockchain implementation, etc. for the energy sector.
Introduction to developing or migrating models to be compliant with the OpenMI Standard. OpenMI is an open standard that allows dynamic linking of numerical models, such as river models, rainfall-runoff models, and so on. See also:
http://www.lictek.com
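The dynamic linking of numerical models that OpenMI describes can be pictured as one model pulling computed values from another at each timestep. The sketch below is a toy illustration of that idea only; the class and method names are simplified stand-ins, not the actual OpenMI interfaces.

```python
# Toy illustration of OpenMI-style model linking. Names are illustrative
# stand-ins, not the real OpenMI API (which defines ILinkableComponent
# and exchange items).
class RainfallRunoffModel:
    def __init__(self, rainfall_mm):
        self.rainfall = rainfall_mm

    def get_values(self, step):
        # Crude runoff assumption: 40% of rainfall becomes river inflow.
        return self.rainfall[step] * 0.4


class RiverModel:
    def __init__(self, baseflow=10.0):
        self.baseflow = baseflow
        self.flow = baseflow

    def set_inflow(self, inflow):
        self.flow = self.baseflow + inflow


runoff = RainfallRunoffModel([5.0, 20.0, 0.0])
river = RiverModel()
flows = []
for step in range(3):
    # The "link": at each timestep the river model pulls the value
    # computed by the rainfall-runoff model for that step.
    river.set_inflow(runoff.get_values(step))
    flows.append(river.flow)
```

The point of the standard is that the two models stay independent; only the exchange of values at the link is standardized.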
Presented at the fifth Girl Geek Dinner Milano, 24 October 2008, by Sara Rosso with contributions from Bruna Gardella. An introduction to Open Source, women and Open Source, Girl Geeks and Open Source, and many ways to get more involved in the Open Source world.
The F5 Networks Application Services Reference Architecture (White Paper) | F5 Networks
Build elastic, flexible application delivery fabrics that are ready to meet the challenges of optimizing and securing applications in a constantly evolving environment.
IoT / Data press review, 26/03/2017 | Romain Bochet
Contents:
- From the Edge To the Enterprise
- The Internet of Energy: Smart Sockets
- Google's big data calculates US rooftop solar potential
- Energy management: Oracle Utilities launches smart grid and IoT device management solution in the cloud
- Are vehicles the mobile sensor beds of the future?
Load Balance in Data Center SDN Networks | IJECEIAES
In the last two decades, networks have changed in line with rapidly evolving requirements. Current Data Center Networks (DCNs) contain large numbers of hosts (tens of thousands) with special bandwidth needs as cloud networking and multimedia content computing increase. Conventional DCNs are strained by the growing number of users and bandwidth requirements, which imposes many implementation limitations. Current networking devices, with their control and forwarding planes coupled, result in network architectures unsuited to dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes. With the rapid increase in the number of applications, websites, and storage demands, some network resources are underutilized due to static routing mechanisms. To overcome these limitations, an SDN-based OpenFlow data center network architecture is used to obtain better performance parameters and implement a traffic load-balancing function. The load balancer distributes traffic requests over the connected servers to diminish network congestion and reduce server underutilization. As a result, SDN affords more effective configuration, enhanced performance, and more flexibility to deal with huge network designs.
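The server-selection step such an SDN load balancer performs before installing a flow rule can be sketched as follows. This is a minimal round-robin model, not the paper's actual controller code; the class and field names are illustrative assumptions.

```python
from itertools import cycle


class RoundRobinBalancer:
    """Toy model of the server-selection logic an SDN controller might
    run before installing an OpenFlow rule (names are illustrative)."""

    def __init__(self, servers):
        self._pool = cycle(servers)   # rotate over the backend servers
        self.flows = {}               # (src, vip) -> chosen server

    def assign(self, src, vip):
        # Reuse the existing assignment so an established flow keeps
        # hitting the same server; only new flows advance the rotation.
        key = (src, vip)
        if key not in self.flows:
            self.flows[key] = next(self._pool)
        return self.flows[key]


balancer = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
first = balancer.assign("192.168.1.5", "vip")    # new flow -> server 1
second = balancer.assign("192.168.1.6", "vip")   # new flow -> server 2
repeat = balancer.assign("192.168.1.5", "vip")   # same flow -> server 1
```

In a real deployment the mapping would be pushed to switches as match-action flow entries, so subsequent packets of a flow never revisit the controller.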
There are five disruptive forces shaping IT today, but none has a more wide-ranging impact on all enterprises than the emergence of cloud as a preferred means of service delivery. This article discusses the cloud industry and how WGroup can help give clients a competitive advantage using a service delivery strategy and new IT operating models.
Modeling the Grid for De-Centralized EnergyTon De Vries
Utilities are facing massive changes that affect all aspects of their business, from planning through operations. Once characterized as technology-risk averse, utilities have been shifting to more agile approaches with a higher tolerance for risk. Modeling the grid to accommodate these changes requires new approaches and closer relationships with trusted technology partners. This paper examines what methodologies have driven the acceleration of grid decentralization and what technologies still need to be applied for smooth integration and success.
Network performance - skilled craft to hard scienceMartin Geddes
This document describes the technical and business journey for network operators wanting to turn network performance from a skilled craft into hard science.
2016 IDC Pan-European Utilities Summit: Open for Business | OMNETRIC
The OMNETRIC Group's CEO, Maikel van Verseveld, presented at the IDC Pan-European Utilities Summit 2016 in Italy, where he introduced attendees to the concept of “Open for Business.” For the utility sector these days, that refers to open platforms, open ecosystems, open architectures, and of course, open minds. After all, a lot is changing - and quickly - in the business of powering the planet.
GraphRAG is All You Need? LLM & Knowledge Graph | Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
UiPath Test Automation using UiPath Test Suite series, part 4 | DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Neuro-symbolic is not enough, we need neuro-*semantic* | Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
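The "predictable inference" reading of semantics can be made concrete with a tiny link-prediction example over a toy knowledge graph: because `subClassOf` has a defined (transitive) semantics, the links it licenses are predictable by rule application. This is my own minimal illustration of the idea, not code from the talk.

```python
# Rule-based link prediction over a toy knowledge graph: the transitive
# semantics of "subClassOf" makes the inferred links predictable.
triples = {
    ("cat", "subClassOf", "mammal"),
    ("mammal", "subClassOf", "animal"),
}


def predict_links(kb):
    """Apply subClassOf transitivity until no new triple is derived."""
    inferred = set(kb)
    changed = True
    while changed:
        changed = False
        for (a, _, b) in list(inferred):
            for (c, _, d) in list(inferred):
                if b == c and (a, "subClassOf", d) not in inferred:
                    inferred.add((a, "subClassOf", d))
                    changed = True
    return inferred


closure = predict_links(triples)
```

A purely neural link predictor might also score ("cat", "subClassOf", "animal") highly, but only the semantic rule guarantees it; that gap is the talk's argument.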
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... | UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
State of ICS and IoT Cyber Threat Landscape Report 2024 preview | Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Epistemic Interaction - tuning interfaces to provide information for AI support | Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, Companies that adapt and embrace new ideas often need help to keep up with the competition. However, fostering a culture of innovation takes much work. It takes vision, leadership and willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
Key Trends Shaping the Future of Infrastructure.pdf
SMART GRID INTEROPERABILITY: THE GREAT ENABLER
FierceMarkets Custom Publishing
April 2015
CORRESPONDING PAIN POINT: HOW TO MAKE GRID TECHNOLOGY UPGRADES FUTURE-FRIENDLY
SIMPLIFY THE UTILITY JOURNEY
When the journey is long, the landscape tends to change in meaningful ways. Whether dealing with
system upgrades or regulatory game changers, the utility journey has faced many twists and turns,
with increasing urgency. So how do utilities move successfully into the next stage? The short answer
is through system interoperability. Until now, the journey has been a true balancing act, especially
considering weather-related system shocks, new sources of distributed energy that must be integrated,
evolving regulations, and demands for enhanced security and reliability. But the next phase of the
journey must be as future-friendly as possible, and the concept of system interoperability is integral to
the future.
WHAT CAN INTEROPERABILITY DO FOR YOU?
Interoperability is the ability for grid components to speak the same language through common
protocols and APIs. New systems must be interoperable with one another and must also communicate
with the legacy systems that have served as the backbone of operations for decades. A survey of smart
grid project managers by the Joint Research Centre Institute for Energy and Transport reveals that the
lack of interoperability between grid components is the number one obstacle to smart grid projects—
even more so than the technical feasibility of the projects themselves. Yet the stakes are high: according to the Electric Power Research Institute, the potential economic benefits of a fully deployed smart grid would be in the range of $1.3 trillion to $2 trillion.
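As a concrete (and purely hypothetical; the vendor names below are invented, not from the article) illustration of grid components "speaking the same language": when every meter exposes a common interface, head-end software can mix equipment from different vendors without special-casing each one.

```python
from abc import ABC, abstractmethod

class Meter(ABC):
    """Common, vendor-neutral meter interface (hypothetical)."""
    @abstractmethod
    def read_kwh(self) -> float: ...

class VendorAMeter(Meter):
    """Adapter wrapping one vendor's proprietary reading call."""
    def read_kwh(self) -> float:
        return 12.5  # stand-in for a proprietary protocol exchange

class VendorBMeter(Meter):
    """A different vendor, but the same contract."""
    def read_kwh(self) -> float:
        return 7.5

def total_load(meters: list[Meter]) -> float:
    """Head-end code depends only on the shared interface."""
    return sum(m.read_kwh() for m in meters)

print(total_load([VendorAMeter(), VendorBMeter()]))  # 20.0
```

The same adapter pattern applies to legacy equipment: one wrapper per legacy system brings it behind the common interface, rather than one bespoke integration per application.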
One major hurdle faced by utilities is that standards
tend to change quickly. Also, the standards
established by the National Institute of Standards
and Technology are not always adopted by the
Federal Energy Regulatory Commission, the
primary energy regulatory body in the United
States. Moreover, even if these standards are
adopted, they’re not always enforceable.
When standards are both uncertain and
unenforceable, the entire ecosystem has less
incentive to invest in grid changes and make costly
capital expenditures. No wonder industry leaders at
a recent Edison Foundation conference emphasized
“regulation”of the future, even more than the
concept of the “utility of the future/utility 2.0”– all
the more reason to address interoperability with
a flexible toolkit.The benefits of doing so include
reducing stranded assets in the installed base and
building a bridge between advanced metering
infrastructure and outage management systems to
more accurately delivery outage and restoration
status. Even the most advanced distribution
management system can’t live up to its potential if
it can’t communicate with entrenched operational
technology.
Grid interoperability can seem like a moving target, but it is a strong enabler for everything else. The pragmatic approach is for utilities to plan and assess for interoperability based on their current systems, to choose the most future-friendly platforms, and to build coalitions of vendors for future growth.
PLAN AND ASSESS
Jeff McCullough, VP of System Design and Development at Elster, describes the process of factoring interoperability into a company’s plans. “When looking at new system solutions to upgrade existing infrastructure, some of the primary attributes to assess are the network management capabilities, ability to transition, and support for existing and evolving technology,” says McCullough. “To meet this need, it will be imperative that the end devices -- both communications as well as device operating systems -- can be remotely upgraded securely over the air. This capability will not only help future-proof your investment, but also reduce OpEx.”
Lay a strong foundation for interoperability, and future changes can be addressed more easily. This also cuts down on the need to try to predict exactly which components and applications will be needed, and when. Adds McCullough, “Nobody can predict what the future will require, but with remotely upgradeable and configurable devices, new applications can be created and downloaded to the device level of the solution to meet these needs. This enables utilities to continually leverage their investments to obtain the maximum return.”
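The “securely over the air” requirement hinges on the device verifying an update before applying it. A minimal sketch, assuming a symmetric shared-key HMAC scheme (production firmware signing more commonly uses asymmetric code-signing keys; the key and image bytes here are illustrative):

```python
import hashlib
import hmac

DEVICE_KEY = b"shared-secret-provisioned-at-manufacture"  # illustrative only

def verify_firmware(image: bytes, signature: bytes) -> bool:
    """Accept an over-the-air image only if its MAC matches."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking match position via timing.
    return hmac.compare_digest(expected, signature)

image = b"\x7fELF...new-radio-firmware"
good_sig = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

print(verify_firmware(image, good_sig))      # True: image is applied
print(verify_firmware(image, b"\x00" * 32))  # False: image is rejected
```

A rejected image would simply be discarded and the device would keep running its current firmware, which is what makes remote upgrades safe enough to rely on for future-proofing.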
In instances where a utility plans to take the
implementation in phases, possibly to reduce
CapEx, an interoperable solution is key. For
example, a utility may initially implement a walk-
by/drive-by smart meter solution, but with future
plans to migrate to a fixed network offering.
When the transition time comes, an interoperable
offering will enable an easy switch from the mobile
system to a fixed network system, according to
McCullough.
Planning and assessing means understanding that
core systems should be upgraded in concert with
one another. It also means that those systems
(AMS, OMS, DMS, etc.) will need to be agile
enough to smoothly provide future analytics and
reporting data so utilities can quickly meet user
needs and take corrective action when necessary.
CHOOSE A FLEXIBLE SOLUTION
The future tends to be friendlier when you keep the past close in mind. With the advent of more open protocols, legacy and core systems can easily be integrated into the big picture. Preeminent energy evangelist Daniel Yergin once said, “Combining three or four dozen different technologies for a smart grid system is far more difficult and time consuming than coming up with a new iPhone app.” But combining systems becomes more seamless when open protocols are involved.
“With Elster’s Connexo platform, we have a single offering which is designed to provide integrated solutions for the smart grid,” says McCullough, referencing the company’s recently introduced interoperable smart grid software suite. “For large IOUs, the need is to provide the ability to integrate with enterprise systems. For smaller utilities that don’t have many of the enterprise systems an IOU may have, Connexo provides integrated options such as outage management, transformer monitoring, theft detection, and so on, meeting many of the small utilities’ operational needs.”
So much of what makes a smart grid software platform effective today is the use of standards-based application programming interfaces (APIs). Standards-based APIs unite applications and components more successfully than proprietary APIs. This applies both at the broader application level and at the more specific device level.
“Standards-based APIs reduce costs,” says McCullough. “With proprietary interfaces, interoperability is greatly reduced without added expenses. The use of standards-based protocols enables the ability to support and manage the multitude of technology and devices required across the smart grid, providing increased interoperability and reduced operational cost.”
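A back-of-envelope calculation shows why standards reduce cost: if every pair of components needs its own proprietary integration, glue code grows quadratically with the number of components, while a shared standard keeps it linear. (The counts below are illustrative, not figures from the article.)

```python
def pairwise_integrations(n: int) -> int:
    """Proprietary interfaces: every component pair needs its own glue."""
    return n * (n - 1) // 2

def standard_integrations(n: int) -> int:
    """One shared standard: each component implements it exactly once."""
    return n

for n in (5, 10, 40):
    print(n, pairwise_integrations(n), standard_integrations(n))
# 5 components: 10 vs 5; 10 components: 45 vs 10; 40 components: 780 vs 40
```

The gap widens precisely as the grid accumulates more device types, which is why the savings compound over a utility's upgrade path.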
Flexible APIs take into account where a utility has been and where it’s going. IPv6 technology is very much an example of that. While the standard may not be used extensively yet, its importance is growing. According to McCullough, IPv6 is the type of future-proof communications technology that utilities will need. “The communication will change and morph over time, switching from existing mesh technology to standardized IPv6 mesh technology. IPv6 has been requested in some markets, and it’s coming. Right now, the existing backhaul does not support IPv6. Given the limitations of existing public and private networks to support IPv6 over IPv4, the Elster solution uses standards-based protocols to provide end-to-end IPv6 over a standards-based IPv4 tunnel routed through the backhaul. That’s how the use of standards enables true system flexibility and interoperability.”
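The IPv4 tunnel routing McCullough describes is conceptually similar to 6in4 encapsulation (RFC 4213), where an entire IPv6 packet rides inside an IPv4 packet whose protocol field is 41. The sketch below builds such a wrapper by hand purely for illustration; it is not Elster's implementation, and a real header would also carry a computed checksum.

```python
import struct

IPPROTO_IPV6 = 41  # protocol number for encapsulated IPv6, per RFC 4213

def encapsulate_6in4(ipv6_packet: bytes, src: bytes, dst: bytes) -> bytes:
    """Wrap an IPv6 packet in a minimal 20-byte IPv4 header (checksum omitted)."""
    total_len = 20 + len(ipv6_packet)
    header = struct.pack(
        "!BBHHHBBH4s4s",
        0x45,          # version 4, IHL 5 (20-byte header)
        0,             # DSCP/ECN
        total_len,     # total length of the outer packet
        0, 0,          # identification, flags/fragment offset
        64,            # TTL
        IPPROTO_IPV6,  # protocol 41 marks the tunneled IPv6 payload
        0,             # header checksum (left zero in this sketch)
        src, dst,      # tunnel endpoint IPv4 addresses
    )
    return header + ipv6_packet

# A 40-byte stub standing in for a real IPv6 header (version nibble = 6).
pkt = encapsulate_6in4(b"\x60" + b"\x00" * 39,
                       src=bytes([192, 0, 2, 1]),
                       dst=bytes([198, 51, 100, 7]))
print(pkt[9])  # 41: any IPv4 router can forward this without understanding IPv6
```

The point of the technique is exactly what the quote claims: the IPv4 backhaul only needs to route ordinary IPv4 packets, while the endpoints exchange IPv6 end to end.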
BUILD A COALITION
Utilities can also build technology and regulatory
coalitions with the very vendors that supply
the grid. In contrast to state-mandated grid
modernization plans, these types of coalitions
are highly flexible in their choice of the most
relevant technology for interoperability.They also
highlight the fact that regulators don’t always
emphasize the same technology as utilities. In
2014, Massachusetts-based utilities argued against
a state-mandated proposal for a ten-year grid
modernization plan because they felt it reduced the
overall flexibility of their technology upgrades.
The coalition approach, exemplified by Duke Energy’s “Coalition of the Willing,” focuses on true interoperability. With 25 coalition partners covering everything from communication to grid control, the goal is to standardize the way grid technologies integrate. In the way that Android and iOS have opened up the world’s smartphones, Duke’s approach is to get participating vendors to open up their own systems. Other utilities can adopt this approach, or at least benefit from participating in conversations around protocol standardization with their vendors. Elster is a participant in Duke’s coalition and believes that such standardization is important for justifying future utility investments.
“It’s about innovation. The collaboration with utilities not only helps us better understand what they are trying to achieve, but also enables discussion to generate new ideas to meet their business needs,” says McCullough. “One good example is applying the technology for distribution automation -- how to use communication management to monitor transformers and perform outage management, which spreads across residential and distribution, linking the two together to increase the intelligence obtained from the network. That’s what we like to talk to utilities, and Duke, about.”
The more closely utilities work with their vendors, the better the chances that interoperability can work as a cohesive strategy through the entire system.
CONCLUSION
Grid changes are coming, and a growing number
of vendors will participate in the process and add
further innovations. Interoperability will serve as
the prime enabler of these changes. Utilities will
be able to serve their constituents well only if
they choose an upgrade path based on a common
platform for change and find solutions that
accommodate upgrades across the whole spectrum
of the grid.