These are the slides from my 2013 SQL Saturday presentations in Mountain View and Sacramento. I suggest you view the (newer) videos, as they cover all that material and more. However, here is the session description these slides cover:
A recent survey by InformationWeek found that data quality is the greatest barrier to BI adoption in enterprises. MDS addresses this challenge with modeling, validation, alerting, and security capabilities. In this presentation, you will learn how to use MDS to model your data to ensure correctness, update it with changes from your ERP, and create workflows with notifications. Next you will learn the capabilities of DQS and see how it addresses data standardization, completeness, and other challenges. You will then see how to use them together to enable Enterprise Information Management. BI professionals will come away knowing how to use the tools that address the greatest risk to BI project success: data quality.
Microsoft Master Data Services (MDS) Overview, by Eugene Zozulya
Master data management (MDM) is a technology discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's official shared master data assets.
MDM tools support this discipline by removing duplicates, standardizing data (mass maintenance), and enforcing rules that keep incorrect data from entering the system, with the goal of creating an authoritative source of master data.
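The dedup-and-standardize idea above can be sketched in a few lines. This is an illustrative toy, not MDS itself; the field names, code mappings, and the keep-first survivorship rule are all assumptions made up for the example:

```python
# Toy MDM-style standardization and deduplication (hypothetical rules).

# Map source-specific codes onto one agreed standard (e.g. gender codes).
GENDER_MAP = {"0": "U", "1": "M", "2": "F", "M": "M", "F": "F", "U": "U"}

def standardize(record):
    """Return a copy of the record with codes and names normalized."""
    out = dict(record)
    out["gender"] = GENDER_MAP.get(str(record.get("gender", "U")), "U")
    out["name"] = " ".join(record.get("name", "").split()).title()
    return out

def deduplicate(records):
    """Collapse duplicates after standardization; keep the first record
    seen for each key (a crude survivorship rule)."""
    seen, masters = set(), []
    for r in map(standardize, records):
        key = (r["name"], r["gender"])
        if key not in seen:
            seen.add(key)
            masters.append(r)
    return masters

rows = [
    {"name": "john  ryan", "gender": "1"},
    {"name": "John Ryan", "gender": "M"},   # duplicate once standardized
    {"name": "Ann Lee", "gender": "2"},
]
print(deduplicate(rows))
```

After standardization the first two rows collapse into one master record, leaving two "golden" records out of three inputs.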
Microsoft Master Data Services (MDS) is the SQL Server solution for master data management. Master data management (MDM) describes the efforts made by an organization to discover and define non-transactional lists of data, with the goal of compiling maintainable master lists. An MDM project generally includes an evaluation and restructuring of internal business processes along with the implementation of MDM technology. The result of a successful MDM solution is reliable, centralized data that can be analyzed, resulting in better business decisions.
Other Master Data Services features include hierarchies, granular security, transactions, data versioning, and business rules.
Master Data Services includes the following components and tools:
- Master Data Services Configuration Manager, a tool you use to create and configure Master Data Services databases and web applications.
- Master Data Manager, a web application you use to perform administrative tasks (like creating a model or business rule), and that users access to update data.
- MDSModelDeploy.exe, a tool you use to create packages of your model objects and data so you can deploy them to other environments.
- Master Data Services web service, which developers can use to extend or develop custom solutions for Master Data Services.
Introduction to Master Data Services in SQL Server 2012, by Stéphane Fréchette
What is Master Data Services? Why is it important? This session discusses Master Data Services capabilities and its underlying architecture, then demos creating a model, using the SQL Server 2012 MDS Add-in for Microsoft Excel, creating hierarchies and business rules, and exposing/integrating data with other interfaces (such as a data warehouse).
Microsoft SQL Server 2012 Master Data Services, by Mark Ginnebaugh
Author: Mark Gschwind, DesignMind
San Francisco, California
Master Data Services had a major upgrade in the SQL Server 2012 release. BI Consultant Mark Gschwind takes you through the new Excel interface, the new Silverlight look and feel, and integration improvements.
Knowing how to use this tool can be a valuable addition to your repertoire as a BI professional, allowing you to address data quality and other challenges.
Mark will show how to create a model, add columns and rows, manage security, and create hierarchies. He demos the new Excel interface and discusses how it lets you manage master data yourself. He'll also touch on integrating with a data warehouse and migrating from development to production.
You'll learn:
* How to let users manage dimensions and hierarchies for your DW
* How to create workflows to improve data quality in your DW
* Tips from real-life projects to help you achieve a successful implementation
Mark Gschwind, Partner at DesignMind, is an expert on data warehousing, OLAP, and ERP migration. He has authored three enterprise data warehouses and over 80 OLAP cubes for 46 clients in a wide range of industries. Mark has certifications in SQL Server and Oracle Essbase.
Introduction to SQL Server Master Data Services, by Eduardo Castro
In this presentation we give an introduction to SQL Server 2008 R2 Master Data Services.
Regards,
Ing. Eduardo Castro Martínez, PhD – Microsoft SQL Server MVP
http://mswindowscr.org
http://comunidadwindows.org
Costa Rica
http://ecastrom.blogspot.com
http://ecastrom.wordpress.com
http://ecastrom.spaces.live.com
http://universosql.blogspot.com
http://todosobresql.blogspot.com
http://todosobresqlserver.wordpress.com
http://mswindowscr.org/blogs/sql/default.aspx
http://citicr.org/blogs/noticias/default.aspx
http://sqlserverpedia.blogspot.com/
Master Data Management (MDM) is a feature of Microsoft Dynamics AX 2012 R3 that lets you synchronize master data records across multiple instances of Microsoft Dynamics AX 2012. By creating and maintaining a single copy of master data, you can help guarantee the consistency of important information, such as customer and product data, that is shared across AX 2012 instances.
More than 70% of Master Data Management implementations fail to reach full ROI due to inadequate implementation. I have tried to highlight some of the key areas to watch for during an MDM implementation.
This session was about Master Data Services and what else it can be used for: the client wanted an application to validate and submit warehouse inventories.
Data Quality Services in SQL Server 2012: an introduction to Data Quality Services. DQS enables you to discover, build, and manage knowledge about your data, and to use that knowledge for data cleansing, matching, and profiling. We will explore the features and capabilities of Data Quality Services and its integration with SSIS via the DQS Cleansing transform.
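The knowledge-base idea behind DQS can be sketched roughly as follows. This is a toy illustration, not the DQS API; the domain, its valid values, and the correction rules are invented for the example:

```python
# Toy DQS-style knowledge base: a domain holds known valid values plus
# correction rules, and cleansing applies them to incoming values.
# All names and rules here are hypothetical.

knowledge_base = {
    "country": {
        "valid": {"USA", "Canada", "Mexico"},
        "corrections": {"U.S.A.": "USA", "United States": "USA", "CAN": "Canada"},
    }
}

def cleanse_value(domain, value):
    """Return (cleansed_value, status), the way a cleansing step might."""
    kb = knowledge_base[domain]
    if value in kb["valid"]:
        return value, "correct"
    if value in kb["corrections"]:
        return kb["corrections"][value], "corrected"
    return value, "invalid"

print(cleanse_value("country", "United States"))  # corrected to "USA"
```

Invalid values that match no rule fall out with an "invalid" status, which is where a data steward would step in to extend the knowledge base.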
A Crash Course in SQL Server Administration for Reluctant Database Administrators, by Chad Petrovay
Reluctant DBAs are those of us who aren’t formally trained in database administration, but manage through a combination of our wits, technical manuals, and online forums. This practical session will explore best practices for installing, configuring, and maintaining Microsoft SQL Server, and highlight some SQL Server features (and Easter eggs) that can improve your user experience and institutional ROI.
Introduction to Microsoft's Master Data Services (MDS), by James Serra
Master Data Services is bundled with SQL Server 2012 to help resolve many of the Master Data Management issues that companies face when integrating data. In this session, James gives an overview of Master Data Services 2012, including the out-of-the-box web UI and the highly developed Excel Add-in, and shows how to get started with loading MDS with your data.
Suresh Menon, Vice President, Product Management - Information Quality Solutions at Informatica, shares how to master your data and your business from the 2015 Informatica Government Summit.
This session shows, via live demonstration, how Integration Services, Data Quality Services, and Master Data Services work together as a closed-loop information management solution that cleanses, standardizes, merges, and purges data using the new data curation tools in SQL Server 2012. The session also covers principles and best practices for each of the technologies used.
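The closed loop described above (cleanse, match, then merge and purge) can be sketched, much simplified, in a few functions. This is an illustrative sketch, not SSIS/DQS/MDS; all function and field names are assumptions:

```python
# Toy closed-loop pipeline: cleanse incoming rows, match them against
# existing master data, merge survivors, purge duplicates.

def cleanse(row):
    """Standardize casing and whitespace (DQS-style cleansing, simplified)."""
    return {k: v.strip().title() if isinstance(v, str) else v
            for k, v in row.items()}

def match(row, master):
    """Return the id of a matching master record, if any. Exact match here;
    a real matching step would typically be fuzzy."""
    for mid, m in master.items():
        if m["name"] == row["name"] and m["city"] == row["city"]:
            return mid
    return None

def load(incoming, master):
    """Merge new rows into master; duplicates are purged (dropped)."""
    next_id = max(master, default=0) + 1
    for row in map(cleanse, incoming):
        if match(row, master) is None:
            master[next_id] = row
            next_id += 1
    return master

master = {1: {"name": "Acme Corp", "city": "Sacramento"}}
incoming = [{"name": "  acme corp ", "city": "sacramento"},  # duplicate
            {"name": "globex", "city": "mountain view"}]
print(load(incoming, master))
```

The duplicate "acme corp" row is recognized after cleansing and purged, while the new "Globex" row is merged in as a new master record.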
Why BI?
- Performance management
- Identify trends (e.g., cash flow trends)
- Fine-tune operations (e.g., sales pipeline analysis)
- Future projections and business forecasting
- Decision-making tools
- Convert data into information
How to think?
- What happened?
- What is happening?
- Why did it happen?
- What will happen?
- What do I want to happen?
• History of Data Management
• Business Drivers for implementation of data governance
• Building Data Strategy & Governance Framework
• Data Management Maturity Models
• Data Quality Management
• Metadata and Governance
• Metadata Management
• Data Governance Stakeholder Communication Strategy
Webinar #2 - Transforming Challenges into Opportunities for Credit Unions, by Denodo
Watch full webinar here: https://buff.ly/3vhzqL5
Join our exclusive webinar series designed to empower credit unions with transformative insights into the untapped potential of data. Explore how data can be a strategic asset, enabling credit unions to overcome challenges and foster substantial growth.
This webinar will delve into how data can serve as a catalyst for addressing key challenges faced by credit unions, propelling them towards a future of enhanced efficiency and growth.
Mike Ferguson, managing director of Intelligent Business Strategies, highlights his top ten worst practices in Master Data Management (MDM) in this Information Builders webinar slideshow.
Salesforce Architect Group, Frederick, United States, April 2023 - Architect's…, by NadinaLisbon1
Data governance has always been one of the top priorities for executive leadership and architects. The impact of good data governance spans from mitigating compliance risks to efficient processes and happy stakeholders. In this session, Amey will go over the difference between Data Governance vs. Data Management and how Salesforce architects can guide organizations to get started with data governance in their Salesforce org.
Data Science Operationalization: The Journey of Enterprise AI, by Denodo
Watch full webinar here: https://bit.ly/3kVmYJl
As we move into a world driven by AI initiatives, we find ourselves facing new and diverse challenges when it comes to operationalization. Creating a solution and putting it into practice are certainly not the same. The challenges span various organizational and data facets. In many instances, the data scientists may be working in silos, and connecting to the live data may not always be possible. But how does one guarantee that a model developed in a silo is still relevant to live data? How can we manage the data flow and data access across the entire AI operationalization cycle?
Watch on-demand to explore:
- The journey and challenges of the Data Scientist
- How Denodo data virtualization with data movement streamlines operationalization
- The best practices and techniques when dealing with siloed data
- How customers have used data virtualization in their data science initiatives
Enterprise-Level Preparation for Master Data Management, by AmeliaWong21
Master Data Management (MDM) continues to play a foundational role in the Data Management Architecture of every 21st century enterprise. In a forward-looking organization, MDM is significant in the Enterprise Integration Hub.
Then & Now: Strategic Considerations for Data Quality, by Precisely
In today's digital age, data is considered the lifeblood of organizations and decision-making. However, not all data is created equal, and poor data quality can lead to bad decisions and wasted resources. Many organizations are implementing data architectures based on cloud data stores.
We will discuss how modern data quality solutions provide scalability to keep up with processing needs and run natively where the data resides, while using data observability to proactively identify outliers and anomalies. Precisely, the global leader in data integrity, will share the value of using modern data quality solutions combined with a business-first data governance approach to ensure you can trust the data used in your important business decisions.
KashTech and Denodo: ROI and Economic Value of Data Virtualization, by Denodo
Watch full webinar here: https://bit.ly/3sumuL5
Join KashTech and Denodo to discover how Data Virtualization can help accelerate your time-to-value from data while reducing the costs at the same time.
Gartner has predicted that organizations using Data Virtualization will spend 40% less on data integration than those using traditional technologies. Denodo customers have experienced time-to-deliver improvements of up to 90% within their data provisioning processes and cost savings of 50% or more. As Rod Tidwell (Cuba Gooding Jr.) said in the movie 'Jerry Maguire', "Show me the money!"
Register to attend and learn how Data Virtualization can:
- Accelerate the delivery of data to users
- Drive digital transformation initiatives
- Reduce project costs and timelines
- Quickly deliver value to your organization
Good data is like good water: best served fresh, and ideally well-filtered. Data management strategies can produce tremendous procedural improvements and increased profit margins across the board, but only if the data being managed is of high quality. Organizations with chronic business challenges can often trace the root of the problem to poor data quality. Showing how data quality should be engineered provides a useful framework for applying data quality management effectively in support of business strategy. This, in turn, allows for speedy identification of business problems, delineation between structural and practice-oriented defects in data management, and proactive prevention of future issues.
Learning objectives:
-Help you understand foundational Data Quality concepts for improving Data Quality at your organization
-Demonstrate how chronic business challenges for organizations are often rooted in poor Data Quality
-Share case studies illustrating the hallmarks and benefits of Data Quality success
Conference Chairman Keynote & Welcome
Capitalizing on MDM in Times of Crisis
Aaron Zornes, Founder & Chief Research Officer, The MDM Institute
--------------------------------------------------------------------------------
MDM is particularly important in today’s increasingly complex and harsh global business landscape – in part due to increasingly demanding suppliers, trading partners, customers … as well as financial challenges and government regulations. Despite the current economic crisis, analyst firms have declared MDM to be “recession proof” as businesses strive to dramatically reduce costs, meet compliance reporting mandates, deliver increased sales and marketing effectiveness, and provide superior service to customers and suppliers. MDM and its variants – customer data integration (CDI), product information management (PIM), and data governance – all significantly contribute to these tactical business priorities.
Research analysts at the MDM Institute annually produce a set of twelve milestones for their MDM Road Map to help Global 5000 enterprises focus efforts for their own large-scale, mission-critical MDM projects. This keynote will focus on this set of strategic planning assumptions and present an enlightening view of the key trends and issues facing IT organizations during 2009-10 and beyond by highlighting:
Understanding the impact of MDM market momentum, maturation, and consolidation
Coping with the skills shortage for data governance, MDM project leadership, & enterprise architecture
Identifying the essential (vs. desirable) features of an enterprise-strength MDM solution
Forrester Presentation - Micropole MDM Forum 2014, by Micropole Group
Presentation by the Forrester analyst firm at the 3rd Micropole MDM Forum, November 19, 2014, in Paris.
Forrester presents trends in the Master Data Management market and in data governance.
Similar to Enterprise Information Management (EIM) in SQL Server 2012 (20)
2. Mark Gschwind
Independent Consultant
Business Intelligence practitioner, manager since 1995
Over 50 BI projects
Data Warehousing/Cubing/Reporting/Data Mining/EIM
MCP, certified in Oracle Essbase, Melissa Data MVP
Working with clients on EIM since 2008
mark@gschwindconsulting.com
find me on
www.linkedin.com/in/markgschwind
Blog Site:
www.marksbiblog.com
3. Agenda
Enterprise Information Management (EIM)
What is it and why do we need it?
Microsoft EIM, 3 technologies working together
DQS
• Capabilities
• Demo
SSIS
MDS
• Capabilities
• Demo
EIM = DQS + MDS + SSIS
Wrap up
Questions
8. DQS: What is Data Quality?
Data Quality represents the degree to which data is suitable for business use
Data Quality is built through People + Processes + Technology
Bad Data → Bad Business
“Poor data quality can cost companies 15%
to 25% (or more) of their operating budget”
- Larry English (International Data Quality Expert)
9. Common Data Quality Issues
• Standard: Are data elements consistently defined and understood? Sample problem: Gender code = M, F, U in one system and Gender code = 0, 1, 2 in another system.
• Complete: Is all necessary data present? Sample problem: 20% of customers' last names are blank; 50% of zip codes are 99999.
• Accurate: Does the data accurately represent reality or a verifiable source? Sample problem: a supplier is listed as 'Active' but went out of business six years ago.
• Valid: Do data values fall within acceptable ranges? Sample problem: salary values should be between 60,000 and 120,000.
• Unique: Does the same data appear several times? Sample problem: both John Ryan and Jack Ryan appear in the system; are they the same person?
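The standardization issue above can be made concrete with a small sketch. The following Python snippet is purely illustrative (DQS handles this through knowledge-base domains with leading values and synonyms, not hand-written code); it maps the two systems' gender codes onto one standard domain:

```python
# Two source systems encode gender differently; a standard domain
# maps both onto one agreed set of values.
STANDARD_GENDER = {
    # System A uses letters
    "M": "Male", "F": "Female", "U": "Unknown",
    # System B uses numbers
    "0": "Unknown", "1": "Male", "2": "Female",
}

def standardize_gender(value):
    """Return the standard value, or 'Unknown' for anything unmapped."""
    return STANDARD_GENDER.get(str(value).strip().upper(), "Unknown")

print(standardize_gender("f"))   # Female
print(standardize_gender(1))     # Male
print(standardize_gender("x"))   # Unknown
```

In DQS terms, the dictionary plays the role of a domain's value list with synonyms, and unmapped values would be routed to a steward for review rather than silently defaulted.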
10. Common Issues DQS Addresses

Cleansing example, before:
Name | Gender | Street | House # | Zip code | City | State | D.O.B
John Doe | Male | 60th street | 45 | (blank) | New York | New York | 08/12/64
Jane Doe | Male | Jonathan ln | 36 | 10023 | Poughkeepsy | NY | 21-dec-1954

Cleansing example, after:
Name | Gender | Street | House # | Zip code | City | State | D.O.B
John Doe | Male | E 60th St | 45W | 10022 | New York | NY | 08/12/64
Jane Doe | Female | Jonathan Lane | 36 | 10023 | Poughkeepsie | NY | 12/21/54

Matching example, before:
Name | Address | Postal Code | City | State
John Smith | 545 S Valley View Drive # 136 | 34563 | Anytown | New York
Margaret & John smith | 545 Valley View ave unit 136 | 34563-2341 | Anytown | New York
Maggie Smith | 545 S Valley View Dr | (blank) | Anytown | New York
John Smith | 545 Valley Drive St. | 34253 | NY | NY

Matching example, after (duplicate records grouped into clusters):
Name | Address | Zip Code | City | State | Cluster
John Smith | 545 S Valley View Drive # 136 | 34563 | Anytown | New York | 1
Margaret & John smith | 545 Valley View ave unit 136 | 34563-2341 | Anytown | New York | 1
Maggie Smith | 545 S Valley View Dr | (blank) | Anytown | New York | 1
John Smith | 545 Valley Drive St. | 34253 | NY | NY | 2

Data quality dimensions addressed: Completeness, Accuracy, Conformity, Consistency, Uniqueness
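The matching example above (assigning similar records to the same cluster) can be sketched in a few lines. This is not how DQS matching works internally (DQS uses a trained matching policy with weighted domain rules); the snippet below is a simplified, hypothetical greedy string-similarity clustering that only illustrates the idea:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Similarity ratio in [0, 1] between two strings, case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def cluster_records(records, threshold=0.8):
    """Greedy clustering: each record joins the first existing cluster
    whose representative string is similar enough; otherwise it starts
    a new cluster. Returns a cluster number per record, in input order."""
    reps = []          # representative string per cluster
    assignments = []
    for rec in records:
        key = f"{rec['name']} {rec['address']}"
        for idx, rep in enumerate(reps):
            if similarity(key, rep) >= threshold:
                assignments.append(idx + 1)
                break
        else:
            reps.append(key)
            assignments.append(len(reps))
    return assignments

records = [
    {"name": "John Smith", "address": "545 S Valley View Drive # 136"},
    {"name": "John Smith", "address": "545 S Valley View Dr # 136"},
    {"name": "Alice Wong", "address": "9 Elm Road"},
]
print(cluster_records(records))  # [1, 1, 2]
```

A production matcher would first standardize the fields (as in the cleansing step), compare per-field with appropriate weights, and send borderline matches to a data steward instead of deciding by a single threshold.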
11. DQS Use Cases
• One-Time cleanups
o Merge/Migrate multiple divisional CRMs into one
• Continuous Process with Steward Intervention
o Vendor master with continuous trickle of data
o Customer master with incomplete data
• Continuous Process with Minimal Intervention
o Database marketing mailing list
15. MDS: What is Master Data?
Continuous quality management
Ease of use for business users (not just IT)
Effective sharing (producing and consuming)
Centralized maintenance, by different departments
Changes that keep pace with the business
Master Data contains different attributes for different departments (marketing, finance, operations, business groups…)
The challenge: to make a trusted single source of business data used across multiple systems, applications, and processes
16. MDS Use Cases

Regulatory: Enable security management and auditing of data used for regulatory reporting.
Example: There are 3 G/L systems whose G/L accounts need to be consolidated and rolled up to create financial statements for regulatory reporting to several countries. MDS enables an approval process for changes with role-based security and transactional auditing of all changes.

Data Warehouse / Data Marts Management: Enable business users to manage the dimensions and hierarchies of the DW / Data Marts.
Example: The IT department has built a data warehouse and reporting platform, but business users complain about the correctness of the dimensions and the lack of agility in making updates. MDS empowers the business users to manage dimensions themselves while IT can govern the changes.

Operational Data Management: Central data records management and consumption, sourced by other operational systems.
Example: A company has adopted 6 “best of breed” systems from different vendors. They need to be able to propagate the correct customer information to each system in a consistent way. MDS provides a platform for central schema, integration points and validation for SI/ISV/Internal IT to develop a custom solution.
18. MDS Capabilities

Master Data Stewardship through the Excel Add-In and Web UI
Modeling: Entities, Attributes, Hierarchies
Versioning
Validation: authoring business rules to ensure data correctness
Role-based Security and Transaction Annotation
Workflow / Notifications
Data Matching (DQS Integrated)
Enabling Integration & Sharing with external systems (CRM, ..), Excel and the DWH:
• Loading batched data through Staging Tables
• Consuming data through Views
• Registering to changes through APIs
19. MDS Architecture

MDS Database: Entity Based Staging Tables and Subscription Views
MDS Service hosted in IIS
Clients: Excel Add-In (via WCF), Web UI (Silverlight), PowerPivot
External systems (CRM/ERP, BizTalk / others) integrate through the WCF service or SSIS
DWH loaded through SSIS, feeding BI / OLAP
Excel Cleansing and Matching (DQS)
Workflow / Notifications
21. Business Rules
Business Rules are expressions and actions that can govern the conduct of business processes*
Enable data governance by:
• Enforcing data standards
• Alerting users to data quality issues
• Creating simple workflows
They have limitations, but can be extended
*EIM = DQS+MDS+SSIS+People+Process
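As a rough illustration of the “expressions and actions” idea, here is a hypothetical Python rule engine applying two rules like the ones MDS lets you author (including the salary-range check from the data quality slide). In MDS these are authored declaratively in the web UI; this code is only a conceptual sketch:

```python
# Minimal "if condition fails, report it" rule engine, purely
# illustrative of MDS-style business rules; not an MDS API.

def salary_in_range(member):
    return 60_000 <= member.get("Salary", 0) <= 120_000

def email_present(member):
    return bool(member.get("Email"))

RULES = [
    ("Salary must be between 60,000 and 120,000", salary_in_range),
    ("Email is required", email_present),
]

def validate(member):
    """Return the descriptions of all rules the member violates."""
    return [desc for desc, check in RULES if not check(member)]

print(validate({"Salary": 150_000, "Email": ""}))
print(validate({"Salary": 80_000, "Email": "a@b.com"}))  # []
```

In MDS, a failed rule marks the member as invalid and can trigger a notification to a steward, which is how the simple workflows mentioned above are built.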
22. Security
Functional area permissions
Model/Entity level permissions provide column-level security
Hierarchy permissions allow row-level security
Use AD groups, not individual users
Only use Hierarchy permissions if row-level security is required
24. Key Takeaways
SQL Server has tools to address EIM, which tackles data quality, the biggest impediment to BI success
EIM is People + Processes enabled by Technology
Editor's Notes
Working with these EIM technologies for 5 years; 7 implementations
How many people are using MDS or DQS? How many people are using something else for MDM? Need to start with a little background…
http://reports.informationweek.com/cart/index/downloadasset/id/8574
“2013 Analytics & Information Management Trends” (in 2012 it was “2012 BI and Information Management Trends”). Data quality was the top barrier in 2011 as well.
Today I will show you 3 tools that address these top 3 impediments to success
Microsoft has 3 tools that work together to address these challenges. These technologies + People + Processes are the MSFT strategy to produce accurate, trustworthy data. MDS appeared in SQL Server 2008 R2 (acquired from Stratature), DQS in 2012 (acquired from Zoomix). Integration of these products is a work in progress.
Data Quality is kind of like doing the dishes; a lot of work you don’t get much credit for
Larry English claims that “Poor data quality can cost companies 15% to 25% (or more) of their operating budget”. A good discussion on the cost of bad data is here: http://dataqualitybook.com/?p=300
<skip>
Now, how to address DQ use cases
DQS is a knowledge-driven data quality solution, i.e. you must know some things about your data in order to cleanse it: rules to identify valid values, lists of valid values, etc. Create a process to continually improve the knowledge base. Reference data can come from the Azure Marketplace.
Transition: from a “Continuous Process with Steward Intervention” use case to “Continuous Process with Minimal Intervention”. Map values to a knowledge base + domains in DQS; can do a conditional split on bad values, etc.
Transition: we’ve gone through 2 legs of EIM (DQS and SSIS), but not the 3rd leg, MDS. Most of us know what master data is, but stating some things about it will help frame our discussion. Because of its importance, it can be in the center of many business processes and hence must be effectively shared for both producing and consuming. What MDS does is enable these different groups to bring their objects together so they can be cared for centrally. Once an organization has this, it can be used in a number of scenarios.
Explaining by saying where it ends up
Let’s talk about MDS’s capabilities for addressing these use cases. In the center we have our data steward, who uses the MDS web UI and Excel add-in to continuously maintain data quality. Modeling an enterprise’s master data objects is a capability brought to the data stewardship process, as well as… DQS: some integration, won’t be showing tonight. Data Quality Services was acquired from Zoomix in 2008; MDS was acquired from Stratature in 2007.
Now let’s talk about the underlying technologies supporting these capabilities. A requirement for any MDM system these days is that it has to be SOAP-enabled, to interact with ERPs like SAP and Oracle. The Windows Communication Foundation (WCF) is an application programming interface (API) in the .NET Framework for building connected, service-oriented applications. The Excel add-in communicates through WCF; the Web UI uses Silverlight 5 (new in 2012, which enhances performance). BizTalk allows organizations to more easily connect disparate systems with over 25 multi-platform adapters and a robust messaging infrastructure. External systems can interact with MDS either through WCF to the MDS service, or more directly with SQL tables. Mention the database can be SQL 2008 or SQL 2012.
DEMOS TO DO: TileSample
Slide Goal: Review what was said. These technologies + People + Processes are the MSFT strategy to produce accurate, trustworthy data.