In this piece we highlight the utility data monetization imperative and how utilities can build the right strategies to take advantage of this opportunity.
Overview of Business Analytics and career lessons learnt / advice. Presentation delivered to Melbourne Business School - Masters of Business Analytics - July 2016.
Master Data Management (MDM) for Mid-Market - Vivek Mishra
Over the years, MDM has catered to the data requirements of big players and large organizations. But in the last few years, thanks to its benefits, small and medium businesses (SMBs) have been moving toward MDM to organize, categorize, and localize their master data according to the scale of their operations and business processes.
Check out our whitepaper to learn how master data management (MDM) can help you refine your data for optimal use. Discover the essential features of successful MDM software, and explore how MDM can improve your organization’s data quality, business insight, and more.
This is a presentation from a meetup called "Business of Data Science". Data science is being leveraged extensively in banking and financial services, and this presentation gives a brief, foundational overview of this evergreen field.
Slides: Data Monetization — Demonstrating Quantifiable Financial Benefits fro... - DATAVERSITY
Data monetization is a cross-functional discipline that draws on best practices in Enterprise Data Management (EDM), technology, legal engineering, and finance to leverage data to increase revenues, reduce costs, and manage risk. EDM programs have generally found it extremely difficult to get senior management buy-in in the absence of regulatory pressures or the fear of a data breach. Data monetization is an approach to driving quantifiable business benefits from data and information. This bottom-line-driven approach is key to generating business adoption among stakeholders.
This session will review the key aspects of data monetization:
• Introduction to Data Monetization
• Identify Stakeholders
• Build Inventory of Use Cases
• Develop Business Cases
• Execute Initiatives
• Realize Business Benefits
• Legal Engineering and Regulatory Compliance
• Data Marketplace
Business Intelligence made easy! This is the first part of a two-part presentation I prepared for one of our customers to help them understand what Business Intelligence is and what it can do...
Advanced Topics In Business Intelligence - guest1a9ef2
The blurring of the line between decision support systems and operational systems because of real-time warehousing, the use of Enterprise Information Integration (EII), and closed-loop business processes
Master Data Management's Place in the Data Governance Landscape - CCG
For many organizations, Master Data Management is a necessity to ensure consistency and accuracy of essential business entities. It further plays alongside data architecture, metadata management, data quality, security & privacy, and program management in the Data Governance ecosystem.
Join CCG's data governance subject matter experts as they overview the fundamentals of Master Data Management at our Atlanta-based Data Analytics Meetup. This event will discuss how to enable components of data governance within your organization and review how to best leverage Microsoft's SQL Server Master Data Services.
The Institution's Innovation Council (a Ministry of HRD initiative) and the Institution of Electronics and Telecommunication Engineers (IETE) invited me to speak on "World Telecommunication & Information Society Day" on 18 May 2020.
DataEd Slides: Data Strategy Best Practices - DATAVERSITY
Your Data Strategy should be concise, actionable, and understandable by business and IT! Data is not just another resource. It is your most powerful, yet poorly managed and therefore underutilized organizational asset. Data are your sole non-depletable, non-degradable, durable strategic assets, and they are pervasively shared across every organizational area. Overcoming lack of talent, barriers in organizational thinking, and seven specific data sins are organizational prerequisites to be satisfied before (a measurable) nine out of 10 organizations can achieve the three primary goals of an organizational Data Strategy, which are to:
- Improve the way your people use data
- Improve the way your people use data to achieve your organizational strategy
- Improve your organization’s data
In this manner, your organizational Data Strategy can be used to best focus your data assets in precise support of your organization's strategic objectives. Once past the prerequisites, organizations must develop a disciplined, repeatable means of improving the data literacy, standards, and supply as business objectives in specific areas that become the foci of subsequent Data Governance efforts. This process (based on the theory of constraints) is where the strategic data work really occurs, as organizations identify prioritized areas where better assets, literacy, and support (Data Strategy components) can help an organization better achieve specific strategic objectives. Then the process becomes lather, rinse, and repeat. Several complementary concepts are covered, including:
- A cohesive argument for why Data Strategy is necessary for effective Data Governance
- An overview of prerequisites for effective Data Strategy, as well as common pitfalls that can detract from its implementation, such as the “Seven Deadly Data Sins”
- A repeatable process for identifying and removing data constraints, and the importance of balancing business operation and innovation while doing so
Elevating customer analytics - how to gain a 720 degree view of your customer - Actian Corporation
Big data creates significant opportunities for marketers. Using big data analytics tools, marketers can improve decision making, deliver better value for their marketing spend, create truly personalized customer experiences, and understand their audience at the level of each individual consumer.
Requirements for a Master Data Management (MDM) Solution - Presentation - Vicki McCracken
Working on Requirements for a Master Data Management solution and looking for thoughts on how to approach the requirements? This is an overview presentation that complements my guide on how to approach requirements for a Master Data Management solution (Requirements for an MDM Solution). You may be able to leverage all or some of the approach described in this guide to formulate your approach.
Overcoming the Challenges of your Master Data Management Journey - Jean-Michel Franco
This presentation runs you through all the key steps of an MDM initiative. It showcases the key milestones and building blocks that you will have to roll out on your MDM journey.
-> Please contact Talend for a dedicated interactive session with a storyboard by customer domain
What Is Unstructured Data And Why Is It So Important To Businesses? - Bernard Marr
Unstructured data is created at an incredible rate each day, and with the advent of artificial intelligence and machine learning tools to gather, process, analyse and report insights from it, unstructured data now provides important business value to organizations. It’s essential for all businesses to start making the most of their unstructured data.
Becoming a Data-Driven Organization - Aligning Business & Data Strategy - DATAVERSITY
More organizations are aspiring to become ‘data-driven businesses’. But all too often this aim fails, as business goals and IT and data realities are misaligned, with IT lagging behind rapidly changing business needs. So how do you get the perfect fit, where data strategy is driven by and underpins business strategy? This webinar will show you how, by demystifying the building blocks of a global data strategy and highlighting a number of real-world success stories. Topics include:
•How to align data strategy with business motivation and drivers
•Why business & data strategies often become misaligned & the impact
•Defining the core building blocks of a successful data strategy
•The role of business and IT
•Success stories in implementing global data strategies
Data Integration is a key part of many of today’s data management challenges: from data warehousing, to MDM, to mergers & acquisitions. Issues can arise not only in trying to align technical formats from various databases and legacy systems, but in trying to achieve common business definitions and rules.
Join this webinar to see how a data model can help with both of these challenges – from ‘bottom-up’ technical integration, to the ‘top-down’ business alignment.
Watch full webinar here: https://bit.ly/2Y0vudM
What is Data Virtualization and why do I care? In this webinar we intend to help you understand not only what Data Virtualization is, but why it's a critical component of any organization's data fabric and how it fits. We'll cover how data virtualization liberates and empowers your business users, from data discovery and data wrangling through to the generation of reusable reporting objects and data services. Digital transformation demands that we empower all consumers of data within the organization, and it demands agility too. Data Virtualization gives you meaningful access to information that can be shared by a myriad of consumers.
Register to attend this session to learn:
- What is Data Virtualization?
- Why do I need Data Virtualization in my organization?
- How do I implement Data Virtualization in my enterprise?
Data Strategy - Executive MBA Class, IE Business School - Gam Dias
For today's enterprise, data is very much a corporate asset, vital to delivering products and services efficiently and cost-effectively. Few organizations can survive without harnessing data in some way.
Viewed as a strategic asset, data can be a source of new internal efficiencies, improved competitive advantage or a source of entirely new products that can be targeted at your existing or new customers.
This slide deck contains the highlights of a one day course on Data Strategy taught as part of the Executive MBA Program at IE Business School in Madrid.
Making data based decisions makes instinctive sense, and evidence is mounting that it makes strong commercial sense too.
Whilst being aware of this kind of potential is undoubtedly valuable, knowing it and doing something about it are two very different things.
So how do you go about becoming a data driven organization?
And how does the Data Management Maturity Assessment help in achieving your data strategy goals?
Introduction to Data Governance
Seminar hosted by Embarcadero technologies, where Christopher Bradley presented a session on Data Governance.
Drivers for Data Governance & Benefits
Data Governance Framework
Organization & Structures
Roles & responsibilities
Policies & Processes
Programme & Implementation
Reporting & Assurance
This Cartesian Insight examines how operators are leveraging subscriber data for new revenue streams, cost savings and customer experience enhancement.
www.cartesian.com
Data is poised to play an important role in the enterprises of the future, with businesses looking to scale up production and recover costs. Visit: https://www.raybiztech.com/blog/data-analytics/what-are-big-data-data-science-and-data-analytics
Cartesian explores the future of customer data monetization for mobile operators, from advertising, to sophisticated customer engagement and adjacent revenue growth.
Who needs Big Data? What benefits can organisations realistically achieve with Big Data? What else required for success? What are the opportunities for players in this space? In this paper, Cartesian explores these questions surrounding Big Data.
www.cartesian.com
Are you making money on your data assets? You could be. And there is more than one way. Boost innovation, tap into new revenue streams and industry sectors. It’s time to see the potential of big data.
Optimising Supply Chain With Big Data Logistics - eTailing India
Dear friends, here we move to the 3rd part of Logistics Week Series. The logistic industry is going through an unprecedented transformation and today, we are going to study the role of Big Data in optimizing Supply Chain Operations.
Big Data is still a relatively untapped asset that logistics companies can exploit once they adopt a shift of mindset and apply the right drilling techniques. Sophisticated data analytics can consolidate this traditionally fragmented sector, and these new capabilities put logistics providers in pole position.
The basis for optimizing and strengthening decision-making in any company is information. Not raw information, but information from which we can derive value through analysis.
Managing the Energy Information Grid - Digital Strategies for Utilities - Indigo Advisory Group
In this piece we highlight the digital imperative for the industry and how utilities can optimize their digital strategies, build business cases and incorporate emerging technologies.
Big Data Update - MTI Future Tense 2014 - Hawyee Auyong
The Futures Group first wrote about the emerging phenomenon of Big Data in 2010 as it was about to enter the mainstream. It was envisaged that Big Data would create a demand for new skills (Google has identified statisticians as the “sexy job of the decade”) and generate new industries. This report updates on the industry value chain and business models for the data analytics industry, latest developments as well as the opportunities for Singapore.
Proposed ranking for point of sales using data mining for telecom operators - ijdms
This study helps telecom companies make decisions that optimize their points of sale to reduce costs, and also identify profitable customers and churners. The study builds two research models: a physical model for continuous mining of the database wherever it resides (i.e., just as we have On-Line Analytical Processing (OLAP), we must have On-Line Data Mining (OLDM)), and a logical model using the Technology Acceptance Model. Previous studies showed that using basic customer information, call details, and customer-service-related data, a model can effectively achieve accurate predictions.
This research gives a new definition and classification of telecommunication services from the data mining point of view. It then proposes a formula for the total rank of a shop, where each term of the formula contributes a sub-rank. The worked example shows that even a shop with a lower surrounding population and fewer visitors can still achieve a higher rank.
The research suggests that telecom operators should concentrate more on e-shopping and e-payment, as these are more cost-effective, and should use data collected from shops for marketing. Some assumptions made in this study need to be validated using surveys, and the proposed ranking should be applied to a live database.
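The paper's actual ranking formula is not reproduced in this summary, so the sketch below is purely illustrative: it assumes a total rank built as a weighted sum of normalized sub-ranks, with hypothetical sub-rank names and weights, and shows how a shop with a lower population and fewer visitors can still come out on top.

```python
# Purely illustrative: the sub-rank names and weights below are hypothetical,
# not taken from the paper. Each term contributes weight * sub_rank to the total.
def total_rank(sub_ranks: dict, weights: dict) -> float:
    """Weighted sum of normalized sub-ranks (all values assumed in [0, 1])."""
    assert set(sub_ranks) == set(weights), "every sub-rank needs a weight"
    return sum(weights[k] * sub_ranks[k] for k in sub_ranks)

# Hypothetical weights favouring sales and e-payment uptake over raw footfall:
w = {"population": 0.1, "visitors": 0.2, "sales": 0.4, "e_payment": 0.3}

# Shop A: low-population area, but strong sales and e-payment uptake.
shop_a = {"population": 0.3, "visitors": 0.4, "sales": 0.9, "e_payment": 0.8}
# Shop B: busy location, but weaker sales and e-payment uptake.
shop_b = {"population": 0.9, "visitors": 0.8, "sales": 0.4, "e_payment": 0.2}

print(total_rank(shop_a, w) > total_rank(shop_b, w))  # shop A outranks shop B
```

Under these hypothetical weights, shop A's total rank is 0.71 against shop B's 0.47, mirroring the paper's observation that footfall alone need not determine a shop's rank.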
Learn about Addressing Storage Challenges to Support Business Analytics and Big Data Workloads and how Storage teams, IT executives, and business users will benefit by recognizing that deploying appropriate storage infrastructure to support a wide range of business analytics workloads will require constant evaluation and willingness to adjust the infrastructure as needed. For more information on IBM Storage Systems, visit http://ibm.co/LIg7gk.
Visit the official Scribd Channel of IBM India Smarter Computing at http://bit.ly/VwO86R to get access to more documents.
We are very excited to share our first edition of our OpenInsight. This is the first of a regular series of updates which will include insights covering extending the value of data within your business, tips and tricks on the use of the RAPid platform and generic articles on business analytics.
Similar to Strategies to Monetize Energy Data - How Utilities Can Increase Their 'Earnings per Byte'
Grid Tech is a rapidly growing and evolving market. The decarbonization, economic and customer benefits derived from applying sensors, communications and software on power networks are enormous. The market is comprised of several stakeholders. On the one hand, you have a broad set of utilities, with various ownership structures and other buyers who have experienced decades of changing physical grid requirements and evolving regulatory frameworks. On the other hand, a range of suppliers ranging from conglomerates, specialized mid-sized companies and startups are innovating and integrating sophisticated solutions into new and existing product sets. This is all underpinned by a rapidly growing climate VC market, substantial amounts of public money and new standalone business models that are enabling new economic arrangements to quickly spread throughout the industry.
As we enter the ‘Decade of Deployment’ for clean technologies and as public and private finance ramps up, analyzing this market as a whole becomes a necessity. The last round of significant public funding, over a decade ago, paved the way for a robust Smart Grid market, while an evolving regulatory structure simultaneously enabled a burgeoning Grid Edge market. The economic impact of that stimulus was significant and has formed the technical foundation for future upgrades.
The multibillion-dollar Grid Tech and software market enables utility decarbonization, investment deferral, and improved customer satisfaction. In terms of value, our initial analysis of submarkets and a refresh of older smart grid models suggests the Grid Tech market will be worth $641 billion by 2030, with a CAGR of over 19%.
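As a back-of-the-envelope check on what such a forecast implies, the sketch below backs out the base-year market size from the $641 billion 2030 figure and a 19% CAGR. The 7-year horizon (i.e., a 2023 base year) is an assumption for illustration only; the report's actual baseline is not given here.

```python
# Hypothetical illustration: the 2030 value and 19% CAGR come from the text
# above; the 7-year horizon (2023 base year) is an assumption of this sketch.
def implied_base_value(future_value: float, cagr: float, years: int) -> float:
    """Back out the starting market size implied by a compound growth forecast."""
    return future_value / (1.0 + cagr) ** years

base = implied_base_value(641e9, 0.19, 7)
print(f"Implied base-year market size: ${base / 1e9:.0f}B")
```

Under those assumptions the implied base-year market is roughly $190 billion, which is one way to sanity-check how aggressive a 19% CAGR forecast is.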
The Grid Tech market is comprised primarily of software-related solutions, sensors, and applications. The constituent submarkets are generation-source agnostic and focus on grid performance and utility innovation. These submarkets enable new sources of generation, both centrally and at the grid edge. Enabling these solutions are thousands of startups, conglomerates, and tangential companies that are deploying software and hardware across the energy value chain.
Taken as a whole, the Grid Tech market comprises six key submarkets: Core Systems, Digital Asset Management, Robotics & Connected Worker, Optimization Technologies, Flexibility, and eMobility. Across these markets, a range of vendors, use cases and deployments exist in the power industry at various levels of maturity. The diagram below highlights some of the major applications we are tracking in each area.
We invite readers to visit our research hub and discover analysis, data sets, and evaluations of what is a fast moving, high impact market. We also invite readers to download our complementary introductory Grid Tech report “The Decade of Deployment”
Addressing the shifting landscape across policy & regulation, revenue, business model innovation, technology innovation and changing consumer behavior, Indigo Advisory Group has created a series of tools, frameworks and strategies to help utilities manage the energy transition
In this piece, we explore how AI has the potential to deliver the active management that will be required for the grid of the future. Powerful intelligence will be able to balance grids, manage demand, negotiate actions, enable self-healing and facilitate a host of new products and services.
Artificial intelligence in Energy and Utilities – Market Overview - Indigo Advisory Group
Artificial Intelligence has been around for decades, however, over the past 2-3 years the technology has been finding applications across a series of sectors, including energy and utilities. This presentation includes some of the highlights given on an Engerati Webinar on September 27th including three major application areas.
Presentation to the New York Association for Energy Economics on October 12th 2017 on how blockchain and distributed ledger technology is being applied to the power sector. The talk focused on examining emerging applications, the limitations of the technology, while also looking to the future of distributed ledgers and their potential impact on the energy value chain.
In this infographic, we explore how AI has the potential to deliver the active management that will be required for the grid of the future. Powerful intelligence will be able to balance grids, manage demand, negotiate actions, enable self-healing and facilitate a host of new products and services.
This pivotal moment of transformation in the utility industry is providing large scale and unprecedented opportunity for traditional power providers and those operating at the edge of the grid. In this capability primer, we highlight some of the broader industry technology trends and the resulting tools, approaches and insights that Indigo Advisory Group employs to help utilities navigate uncertainty and create the right strategies.
As we announced early this year, Indigo is focusing on four key research themes over the coming year. Our first-quarter research theme is monetizing utility data, a largely untapped and potentially huge revenue stream for utilities. Our research builds on our insights from 2017 — "Monetizing Utility Data — The Utility Data as a Service Opportunity". In that piece we outlined our initial Utility Data Monetization Framework of basic data and value-added data and explored fee-based structures for value-added data services, ultimately advancing the "UDaaS" opportunity. In this piece we go a step further, highlighting how the market has matured over the past three years and looking forward over the next decade at how utilities can capitalize on the growing data points they are gathering both internally and externally.
Importantly, when we discuss utility data monetization in our research, we are not referring to the endless opportunities data provides utilities to improve operations and productivity, but rather the opportunity for utilities to sell data sets, insights and value-added data services to customers and partners. (For operational use cases, check out our utility analytics and UtiliAPP offering.) That said, and as the MIT Sloan Review points out, these paths are not mutually exclusive, and some companies accomplish both, as is the case with telecommunications companies such as Verizon, Deutsche Telekom, and Telefónica. They have achieved internal monetization by using data to optimize operations and client services, and they have also leveraged that data, anonymized and aggregated, across various use cases for their B2B clients and partners. In that second offering these companies focused on products such as geotargeting for retailers, traffic flow for city planners, fraud detection for banking institutions, smart targeting for digital advertisers and IoT applications for a variety of companies. It is this type of offering that we are researching and benchmarking for utilities.
Utility Data Monetization Market Potential
Frost & Sullivan (2019) estimated that data monetization markets in the power and utilities industry could be worth nearly $20 billion by 2020, growing at a 12.2% CAGR globally, with the volume of data created reaching 175 zettabytes (ZB) by 2025.
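As a rough sanity check on figures like these, compound annual growth can be projected with the standard CAGR formula. The sketch below is illustrative arithmetic only, not Frost & Sullivan's model:

```python
# Illustrative compound-growth projection (not Frost & Sullivan's model).
def project_market(base: float, cagr: float, years: int) -> float:
    """Project a market size forward using compound annual growth."""
    return base * (1 + cagr) ** years

# E.g. a $20B market growing at a 12.2% CAGR for 5 years:
size_in_5y = project_market(20e9, 0.122, 5)
print(f"${size_in_5y / 1e9:.1f}B")  # roughly $35.6B
```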
Without a doubt, this demand is a huge opportunity for energy organizations that can best harness and maximize the value of their data. Indeed, this year at CES we saw data monetization from sensors and antennas emerge as a cross-industry mega-trend. From the evolution of Connected Services, Location-Based Commerce and new In-Car Payment techniques, to the significant amount of work being done around the collection, cleansing, and shaping of Data Exchange itself, the market has very much evolved from a hardware play. Over the next decade the fourth industrial revolution, spurred on by the convergence of AI, Big Data, 5G, Distributed Ledger Technology and IoT, will unleash a host of revenue opportunities for utilities. To assess these opportunities it is useful to examine what is occurring in other industries, as very often these trends eventually translate into the power sector. To that end, a recent cross-industry survey conducted by the German-based Business Application Research Center (BARC) found the beginnings of a data monetization market across multiple sectors.
These results are consistent with a recent McKinsey study on data monetization, which found that across industries most respondents agree the primary objective of their data-and-analytics activities is to generate new revenue. Interestingly, that study also found that more than half of respondents in energy say their companies have begun monetizing data. What's more, these efforts are also proving to be a source of differentiation: most notably, data monetization seems to correlate with industry-leading performance. All this being said, this is still a very nascent area for utilities, with many factors to assess such as market, regulatory and technology complexities. In the next section we outline a staged process that utilities can employ as they move forward with this upcoming revenue opportunity.
Starting the Utility Data Monetization Journey
Although every company has the potential to earn revenue from the information it generates, a recent study of more than 400 companies in 34 countries found that only 1 in 12 were monetizing their data to its fullest extent. Modern data monetization strategies can help utilities open brand-new revenue streams. In the diagram below and in this section we highlight Indigo's seven-step process for monetizing utility data.
Step 1, completing a data inventory, may include utility data from operational systems (GIS, ADMS, OMS, DSMS, DERMS, EMS, etc.), enterprise systems such as ERP, and customer data such as CIS data. In essence, this is a complete review of all the available data across a utility, both structured and unstructured. As a utility moves to step 2, data can be organized into various needs with an eye to step 3. For example:
• Grid Needs and Planned Investment Data (Grid Need Type, Location, Scale of Deficiency, Planned Investment, Reserve Margin, Historical Data, Forecast Data and Expected Forecast Error)
• Hosting Capacity Data (Circuit Model, Loading, Equipment Ratings and Settings)
• Locational Value Data (Energy + Losses, Generation, Transmission & Distribution Capacity, Ancillary Services, Renewable Energy Compliance, Societal Benefits, Voltage and Power Quality, Conservation Voltage Reduction, Equipment Life Extension, Reliability and Resiliency, Market Price Suppression)
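One minimal way to sketch steps 1 and 2 is a simple catalogue that tags each source system's data sets with the need they serve. The system names match the text; the individual data-set names and need categories below are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    source_system: str   # e.g. GIS, ADMS, OMS, CIS, ERP
    name: str
    structured: bool
    need: str            # e.g. "grid", "hosting-capacity", "locational-value"

# Step 1: inventory data across operational, enterprise and customer systems.
inventory = [
    DataAsset("ADMS", "feeder loading history", True, "grid"),
    DataAsset("GIS", "circuit model", True, "hosting-capacity"),
    DataAsset("CIS", "customer usage intervals", True, "locational-value"),
    DataAsset("OMS", "outage event logs", False, "grid"),
]

# Step 2: organize the inventory by need, with an eye to monetization analysis.
by_need: dict[str, list[str]] = {}
for asset in inventory:
    by_need.setdefault(asset.need, []).append(asset.name)

print(by_need["grid"])  # ['feeder loading history', 'outage event logs']
```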
Across a utility, some of the most common types of data or data services that could be monetized include packaged data products that are ready to consume and require little or no analysis or transformation. They may also include data insights such as dashboards, metrics and indices; going further, they may include data enhancement, where data sets have been augmented with customer data (e.g. zip codes) for additional insights.
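Data enhancement of the kind described, augmenting a data set with customer attributes such as zip codes, is mechanically a join. A minimal pure-Python sketch, with made-up field names:

```python
# Hypothetical example: enrich interval usage records with customer zip codes.
usage = [
    {"customer_id": "C1", "kwh": 412.5},
    {"customer_id": "C2", "kwh": 198.0},
]
customers = {"C1": {"zip": "10001"}, "C2": {"zip": "94105"}}

# Join each usage row to its customer record on customer_id.
enriched = [
    {**row, "zip": customers[row["customer_id"]]["zip"]}
    for row in usage
]
print(enriched[0])  # {'customer_id': 'C1', 'kwh': 412.5, 'zip': '10001'}
```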
In step 3, monetization analysis, a utility must recognize that in the majority of jurisdictions utilities are required to make some level of data available to customers and third parties at no cost. However, where customers request information that is more detailed and/or more frequent than the basic required data, utilities could supply this value-added data for a nominal fee. This second type of service, additional data, would derive directly from the monopoly function and could be treated as platform service revenue. A third case could be where utilities perform analysis of customer-specific data and provide recommendations based on that analysis, conditioned on utilities implementing tools that allow customers to easily share their usage data with third-party vendors, including firms providing data analysis. For example, EVs are now able to capture and share many types of data, including geolocation, vehicle performance, driver behavior, energy data and biometric data. In this case OEMs and utilities could explore a wide variety of data-based products and service offerings, including usage-based insurance, mobile commerce, mobility-as-a-service (MaaS), behavioral, energy and geo-based advertising, infotainment, and personal health monitoring. In general, however, as the graph below highlights, adding insights to data sets increases their value to a utility.
Step 4 entails examining the end customer for the data or data services. Part of this step is to create a market forecast by data type and, ultimately, a total addressable market number. This will help in creating the business case that results from later steps, and in further refining the product by customer type. Step 5 entails creating a price point for the data. There are two methods a utility or energy company could apply here. First, cost pricing, which involves adding a percentage to the actual costs of data collection, storage, preparation, and transformation. Second, value pricing, which involves charging for the value your data will bring a customer. In the first instance, cost pricing requires understanding your costs for data collection, storage, preparation, transformation, and sharing, so that you can add a percentage margin and price your data above your costs. To inform that business case, three major elements should be examined: the cost of data sourcing, the cost of data packaging and the cost of data sharing. That said, your goal may not be to maximize data revenue but rather to use the offering as a customer acquisition tool, for example for a DER or DR product. If so, you might price your data at or below cost as a loss leader, or even give some of it away for free. The size of the discount might then depend on the value of the new business sought and the expected conversion rate of prospects into clients. Value pricing, on the other hand, involves looking at your data from a customer's perspective and identifying the value it will bring. With this pricing strategy, utilities should consider elements such as the uniqueness of the data, access restrictions, technology and expertise, market alternatives (e.g. Green Button), analysis and, most importantly, business value. In this scenario, reducing the cost of customer acquisition for a DER provider (which can run in excess of 30% of a provider's operating costs) would be priced according to the business value of the data.
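Cost pricing as described, summing the sourcing, packaging and sharing costs and adding a percentage margin, can be sketched as follows. The cost categories mirror the text; the figures are placeholders, not benchmarks:

```python
def cost_plus_price(sourcing: float, packaging: float, sharing: float,
                    margin: float) -> float:
    """Price a data product by adding a percentage margin to its total cost."""
    total_cost = sourcing + packaging + sharing
    return total_cost * (1 + margin)

# Placeholder annual costs for a hypothetical data product, 25% margin:
price = cost_plus_price(sourcing=40_000, packaging=15_000, sharing=5_000,
                        margin=0.25)
print(price)  # 75000.0
```

Pricing at or below cost as a loss leader, as the text suggests, simply means choosing a margin of zero or below in the same calculation.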
A useful exercise at this point is to plot these elements on a quadrant like the one shown below to
help guide internal discussions around pricing. On the y-axis, plot the level of insights the data offers.
On the x-axis, plot the range and level of proprietary data.
In terms of step 5, determining the price, Snowflake Computing recommends a tiered pricing structure. This type of plan can help attract new users with lower costs for data access only, while ensuring that existing customers get the data and services they need at a cost that best fits their needs and budget. Utilities will also need to decide whether to sell data by the set or by subscription, perhaps monthly or annually, or whether to charge based on usage of the data. By plotting the different attributes of the data and the elements that comprise value for customers, utilities can create a matrix like the one below to help identify the different packages they can offer.
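A tiered structure of the sort described might look like the sketch below; the tier names, prices and the per-gigabyte usage component are illustrative assumptions only:

```python
# Illustrative tiers: data-access only, data plus insights, full service.
TIERS = {
    "access":  {"monthly_fee": 500,  "includes": ["raw data sets"]},
    "insight": {"monthly_fee": 2000, "includes": ["raw data sets", "dashboards",
                                                 "indices"]},
    "full":    {"monthly_fee": 5000, "includes": ["raw data sets", "dashboards",
                                                 "indices", "custom analysis"]},
}

def annual_price(tier: str, usage_gb: float = 0.0, per_gb: float = 2.0) -> float:
    """Annual cost for a tier, with an optional usage-based component."""
    return TIERS[tier]["monthly_fee"] * 12 + usage_gb * per_gb

print(annual_price("access", usage_gb=100))  # 6200.0
```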
Steps 6 and 7 involve packaging the product and selling it. While a direct data transfer to customers cuts out intermediaries and may give a utility more control over the final product, the downside is that the power company will have to do all the work, often with standards such as FTP and APIs. This method can incur storage, security and ETL costs for both parties. Additionally, while a data broker can help market your data (and will sometimes also control pricing), brokers offer limited opportunities for promotion and incomplete control over presentation. As such, we recommend a data exchange, similar to the "Trust Portal" we defined with the Joint Utilities of NY. At this stage a distributed ledger solution is both elegant and efficient. As highlighted below, conceptually a utility data marketplace or 'data factory' defines a standard data model and interfaces for buyers and sellers to exchange data.
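Conceptually, the 'data factory' reduces to a common listing schema plus publish and search interfaces for sellers and buyers. A minimal sketch follows; all class, field and method names are our assumptions for illustration, not the Trust Portal specification:

```python
from dataclasses import dataclass, field

@dataclass
class Listing:
    """A data product offered on the exchange, described in a common schema."""
    seller: str
    product: str
    schema_version: str
    price_per_month: float

@dataclass
class Exchange:
    """Matches buyers to sellers' listings via a standard interface."""
    listings: list[Listing] = field(default_factory=list)

    def publish(self, listing: Listing) -> None:
        """Seller-side interface: add a listing to the marketplace."""
        self.listings.append(listing)

    def search(self, product_keyword: str) -> list[Listing]:
        """Buyer-side interface: find listings by product keyword."""
        return [item for item in self.listings if product_keyword in item.product]

ex = Exchange()
ex.publish(Listing("UtilityA", "hosting capacity data", "v1", 1500.0))
print(ex.search("hosting")[0].seller)  # UtilityA
```

The standard schema is what makes the exchange work: every seller describes its data the same way, so buyers can discover and compare offerings without bilateral integration.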