Master Data Management (MDM) is a feature of Microsoft Dynamics AX 2012 R3 that lets you synchronize master data records across multiple instances of Microsoft Dynamics AX 2012. By creating and maintaining a single copy of master data, you can help guarantee the consistency of important information, such as customer and product data, that is shared across AX 2012 instances.
More than 70% of Master Data Management initiatives fail to reach full ROI due to inadequate implementation. I have tried to highlight some of the key areas to watch for during an MDM implementation.
Enterprise Information Management (EIM) in SQL Server 2012 – Mark Gschwind
These are the slides from my 2013 SQL Saturday presentations in Mountain View and Sacramento. I suggest you view the (newer) videos, as they cover all that material and more. However, here is the session description these slides cover:
A recent survey by Information Week found that data quality is the greatest barrier to BI adoption in enterprises. MDS addresses this challenge with modeling, validation, alerting, and security capabilities. In this presentation, you will learn how to use MDS to model your data to ensure correctness, update it with changes from your ERP, and create workflows with notifications. Next you will learn the capabilities of DQS and see how it addresses data standardization, completeness, and other challenges. You will then see how to use them together to enable Enterprise Information Management. BI professionals will come away with knowledge of how to use tools that address the greatest risk to success for BI projects: data quality.
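To make the validation idea concrete, here is a minimal Python sketch of the kind of business rule MDS lets a data steward define declaratively. The entity attributes and the rule itself are invented for illustration; in MDS, such rules are configured in the Master Data Manager web UI rather than coded.

```python
# Sketch of an MDS-style business rule: "if Country is US, Postal Code is required".
# Attribute names ("Name", "Country", "PostalCode") are hypothetical examples.
def validate_member(member: dict) -> list:
    """Return a list of validation issues; an empty list means the member passes."""
    issues = []
    if not member.get("Name"):
        issues.append("Name is required")
    if member.get("Country") == "US" and not member.get("PostalCode"):
        issues.append("Postal Code is required when Country is US")
    return issues

# A member missing a conditionally required attribute fails validation:
print(validate_member({"Name": "Contoso", "Country": "US"}))
# A complete member passes:
print(validate_member({"Name": "Contoso", "Country": "US", "PostalCode": "98052"}))
```

In MDS, members failing such rules are flagged in the UI and can trigger email notifications to the responsible steward, which is the workflow-with-notifications pattern the session describes.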
Introduction to Master Data Services in SQL Server 2012 – Stéphane Fréchette
What is Master Data Services? Why is it important? This session discusses Master Data Services capabilities and its underlying architecture, and demos creating a model, using the SQL Server 2012 MDS add-in for Microsoft Excel, creating hierarchies and business rules, and exposing/integrating data with other interfaces (such as a data warehouse).
Microsoft Master Data Services (MDS) Overview – Eugene Zozulya
Master data management (MDM) is a technology discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's official shared master data assets.
Master data management tools support this discipline by removing duplicates, standardizing data (mass maintenance), and incorporating rules that prevent incorrect data from entering the system, in order to create an authoritative source of master data.
Microsoft Master Data Services (MDS) is the SQL Server solution for master data management. Master data management (MDM) describes the efforts made by an organization to discover and define non-transactional lists of data, with the goal of compiling maintainable master lists. An MDM project generally includes an evaluation and restructuring of internal business processes along with the implementation of MDM technology. The result of a successful MDM solution is reliable, centralized data that can be analyzed, resulting in better business decisions.
Other Master Data Services features include hierarchies, granular security, transactions, data versioning, and business rules.
Master Data Services includes the following components and tools:
- Master Data Services Configuration Manager, a tool you use to create and configure Master Data Services databases and web applications.
- Master Data Manager, a web application you use to perform administrative tasks (like creating a model or business rule), and that users access to update data.
- MDSModelDeploy.exe, a tool you use to create packages of your model objects and data so you can deploy them to other environments.
- Master Data Services web service, which developers can use to extend or develop custom solutions for Master Data Services.
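As a hedged illustration of how MDSModelDeploy.exe fits into a deployment workflow, the Python sketch below assembles the `createpackage` and `deploynew` commands in the shape Microsoft documents for the tool; the service, model, version, and package names are hypothetical.

```python
# Sketch: assembling MDSModelDeploy.exe commands for packaging a model with its
# data and deploying it to another environment. All names below are invented.
def build_mds_commands(service: str, model: str, version: str, package: str):
    create = [
        "MDSModelDeploy.exe", "createpackage",
        "-service", service,
        "-model", model,
        "-version", version,
        "-package", package,
        "-includedata",  # bundle master data along with the model objects
    ]
    deploy = [
        "MDSModelDeploy.exe", "deploynew",
        "-service", service,
        "-package", package,
        "-model", model,
    ]
    return create, deploy

create_cmd, deploy_cmd = build_mds_commands(
    "MDS1", "Customer", "VERSION_1", "customer.pkg")
print(" ".join(create_cmd))
print(" ".join(deploy_cmd))
```

On an actual MDS server these commands would be run from the tool's install directory (or via `subprocess.run`); the sketch only shows how the packaging and deployment steps pair up.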
The Importance of Master Data Management – DATAVERSITY
Despite its immaterial nature, data has a tendency to pile up as time goes on, and can quickly be rendered unusable or obsolete without careful maintenance and streamlining of processes for its management. This presentation will provide you with an understanding of reference and Master Data Management (MDM), one such method for keeping mass amounts of business data organized and functional towards achieving business goals.
MDM’s guiding principles include the establishment and implementation of authoritative data sources and effective means of delivering data to various business processes, as well as increases to the quality of information used in organizational analytical functions (such as BI). To that end, attendees of this webinar will learn how to:
- Structure their Data Management processes around these principles
- Incorporate Data Quality engineering into the planning of reference and MDM
- Understand why MDM is so critical to their organization’s overall data strategy
- Discuss foundational MDM concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK)
Introduction to Microsoft’s Master Data Services (MDS) – James Serra
Master Data Services is bundled with SQL Server 2012 to help resolve many of the Master Data Management issues that companies are faced with when integrating data. In this session, James will show an overview of Master Data Services 2012, including the out of the box Web UI, the highly developed Excel Add-in, and how to get started with loading MDS with your data.
Introduccion a SQL Server Master Data Services – Eduardo Castro
In this presentation we give an introduction to SQL Server 2008 R2 Master Data Services.
Regards,
Ing. Eduardo Castro Martínez, PhD – Microsoft SQL Server MVP
http://mswindowscr.org
http://comunidadwindows.org
Costa Rica
This session was about Master Data Services and what else it can be used for: the client wanted an application to validate and submit warehouse inventories.
Enterprises face information overload. Big data appears as an opportunity, but it has no relevance until enterprises can put it in the context of their activities, processes, and organizations. Applying MDM principles to Big Data is therefore an opportunity that enterprises should target.
This presentation covers the following topics:
- What is MDM and Information Management?
- What is Big Data, and what are the use cases?
- Why and how can Big Data take advantage of MDM? Why and how can MDM take advantage of Big Data?
Microsoft SQL Server 2012 Master Data Services – Mark Ginnebaugh
Author: Mark Gschwind, DesignMind
San Francisco, California
Master Data Services had a major upgrade in the SQL Server 2012 release. BI Consultant Mark Gschwind takes you through the new Excel interface, the new Silverlight look and feel, and integration improvements.
Knowing how to use this tool can be a valuable addition to your repertoire as a BI professional, allowing you to address data quality and other challenges.
Mark will show how to create a model, add columns and rows, manage security, and create hierarchies. He demos the new Excel interface and discusses how it lets you manage master data yourself. He'll also touch on integrating with a data warehouse and migrating from development to production.
You'll learn:
* How to let users manage dimensions and hierarchies for your DW
* How to create workflows to improve data quality in your DW
* Tips from real-life implementations to help you achieve a successful implementation
Mark Gschwind, Partner at DesignMind, is an expert on data warehousing, OLAP, and ERP migration. He has authored three enterprise data warehouses and over 80 OLAP cubes for 46 clients in a wide range of industries. Mark has certifications in SQL Server and Oracle Essbase.
Master Data Management (MDM) & PLM in the Context of Enterprise Product Management – Tata Consultancy Services
The presentation discusses the classical features and advantages of a Master Data Management (MDM) system, along with the situations in which it is appropriate to use one. How do companies that design, manufacture, and sell their products across several geographies apply MDM when they face challenges in making the right decisions on their investments in the PLM and MDM space?
Another important aspect is the comparison and relationship between an MDM system (or Product Master system) and an enterprise PLM system. How can you maximize your ROI on both PLM and MDM investments? With examples from different industries, the key takeaways help you determine whether or not your organization requires an MDM solution.
The What, Why, and How of Master Data Management – Mohammad Yousri
This presentation explains what MDM is, why it is important, and how to manage it, while identifying some of the key MDM patterns and best practices that are emerging. This presentation is a high-level treatment of the problem space.
The presentation summarizes Microsoft's article in a simple way.
https://msdn.microsoft.com/en-us/library/bb190163.aspx
Reference and master data management:
Three categories of structured data:
- Master data: data associated with core business entities such as customer, product, and asset.
- Transaction data: records of business transactions such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
- Reference data: any kind of data used solely to categorize other data found in a database, or solely for relating data in a database to information beyond the boundaries of the enterprise.
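The distinction between the categories can be sketched in a few lines of Python; the table contents below are invented purely for illustration.

```python
# Sketch: master, reference, and transaction data, with invented sample rows.
master = {  # core business entities, maintained centrally
    "CUST-001": {"name": "Contoso Ltd.", "country_code": "US"},
}
reference = {  # used solely to categorize or relate other data
    "US": "United States",
    "CR": "Costa Rica",
}
transactions = [  # records of business events, referencing master data by key
    {"order_id": 9001, "customer": "CUST-001", "amount": 125.50},
]

# A transaction is interpreted by joining it back to master and reference data:
order = transactions[0]
customer = master[order["customer"]]
country = reference[customer["country_code"]]
print(f"{order['order_id']}: {customer['name']} ({country})")  # 9001: Contoso Ltd. (United States)
```

The point of the sketch: transactions accumulate and refer outward, master data is the shared single copy those references resolve to, and reference data exists only to classify the other two.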
"In order to modernize their IT for better performance, greater scalability, and better cost management, many companies want to migrate their Oracle environments to more flexible SQL environments hosted in Azure, but are held back by the prospect of significant downtime, complex processes, and resource consumption. Thanks to our Double-Take Share solution, these database migrations can finally be performed live, with no interruption to production.
We will show how, with its flexible, high-performance cross-platform, cross-database replication technology, Double-Take Share removes the barriers between IT platforms, databases, and operating systems, enabling an automated, transparent migration from Oracle to SQL. By keeping databases online for the entire duration of the migration, Double-Take Share eliminates downtime, simplifying and accelerating your migration projects."
Seamless, Real-Time Data Integration with Connect – Precisely
As many of our customers have come to learn, integrating legacy data into a modern data architecture is easier said than done! View this on-demand webinar to learn all about Precisely's seamless data integration solutions and how they have helped thousands of customers like you trust their data.
Learn about the two flavors of Precisely's Connect:
• Collect, prepare, transform, and load your data to various targets using Connect ETL, with the flexibility of using clusters and running in many different environments. With our 'design once, deploy anywhere' feature, what is built on premises today can run on a cloud platform tomorrow, with no development or mainframe expertise required.
• Capture data changes in real time with no coding, tuning, or performance impact using Connect CDC, replicating exactly WHAT you need and HOW you need it with over 80 built-in data transformation methods.
Do you lose precious time due to data quality problems?
Do you need to integrate data from multiple sources and provide an integrated view of your customer or product attributes to other systems?
SQL Server 2016 Data Quality and Master Data Services can help you.
Mainframe Modernization with Precisely and Microsoft Azure – Precisely
Today’s businesses are leveraging Microsoft Azure to modernize operations, transform customer experience, and increase profit. However, if the rich data generated by the mainframe applications is missed in the move to the cloud, you miss the mark.
Without the right solutions in place, migrating mainframe data to Microsoft Azure is expensive, time-consuming, and reliant on highly specialized skillsets. Precisely Connect can quickly integrate mainframe data at scale into Microsoft Azure without sacrificing functionality, security, or ease of use.
View this on-demand webinar to hear from Microsoft Azure and Precisely data integration experts. You will:
- Learn how to build highly scalable, reliable data pipelines between the mainframe and Microsoft Azure services
- Understand how to make your Microsoft Azure implementation ready for mainframe
- Dive into case studies of businesses that have successfully included mainframe data in their cloud modernization efforts with Precisely and Microsoft Azure
“A broad category of applications and technologies for gathering, storing, analyzing, sharing and providing access to data to help enterprise users make better business decisions” -Gartner
The new Microsoft Azure SQL Data Warehouse (SQL DW) is an elastic data-warehouse-as-a-service and a Massively Parallel Processing (MPP) solution for "big data" with true enterprise-class features. The SQL DW service is built for data warehouse workloads from a few hundred gigabytes to petabytes of data, with truly unique features like disaggregated compute and storage that allow customers to tailor the service to their needs. In this presentation, we take an in-depth look at implementing a SQL DW, elastic scale (grow, shrink, and pause), and hybrid data clouds with Hadoop integration via PolyBase, allowing for a true SQL experience across structured and unstructured data.
A Complete BI Solution in About an Hour! – Aaron King
In this presentation Aaron will cover how to collect data from multiple sources using SQL Server 2012 Integration Services (SSIS). Then he will use SQL Server Reporting Services (SSRS) to report detail on that data. After that he will use SQL Server Analysis Services (SSAS) to create a KPI. Finally he’ll present that KPI on a dashboard via a web page. The goal of this presentation is to show how seamless the Microsoft Business Intelligence products are. If you’ve only used a few of these products, you’ll appreciate seeing them together all at once. Code will be provided.
Bridging to a hybrid cloud data services architecture – IBM Analytics
Enterprises are increasingly evolving their data infrastructures into entire cloud-facing environments. Interfacing private and public cloud data assets is a hallmark of initiatives such as logical data warehouses, data lakes and online transactional data hubs. These projects may involve deploying two or more of the following cloud-based data platforms into a hybrid architecture: Apache Hadoop, data warehouses, graph databases, NoSQL databases, multiworkload SQL databases, open source databases, data refineries and predictive analytics.
Data application developers, data scientists and analytics professionals are driving their organizations’ efforts to bridge their data to the cloud. Several questions are of keen interest to those who are driving an organization’s evolution of its data and analytics initiatives into more holistic cloud-facing environments:
• What is a hybrid cloud data services architecture?
• What are the chief applications and benefits of a hybrid cloud data services architecture?
• What are the best practices for bridging a logical data warehouse to the cloud?
• What are the best practices for bridging advanced analytics and data lakes to the cloud?
• What are the best practices for bridging an enterprise database hub to the cloud?
• What are the first steps to take for bridging private data assets to the cloud?
• How can you measure ROI from bridging private data to public cloud data services?
• Which case studies illustrate the value of bridging private data to the cloud?
Sign up now for a free 3-month trial of IBM Analytics for Apache Spark and IBM Cloudant, IBM dashDB or IBM DB2 on Cloud.
http://ibm.co/ibm-cloudant-trial
http://ibm.co/ibm-dashdb-trial
http://ibm.co/ibm-db2-trial
http://ibm.co/ibm-spark-trial
Migrating on-premises workloads to Azure SQL Database – Parikshit Savjani
Azure SQL Database is a fully managed cloud database service with built-in intelligence, elastic scale, performance, reliability, and data protection that enables enterprises and ISVs to reduce their total cost of ownership and operational overhead. In this session, I will share real-world experience of successfully migrating existing SaaS applications and on-premises workloads for some of our tier-1 customers and ISV partners to the Azure SQL Database service. The session walks through planning, assessment, migration tools, and best practices drawn from the proven experience of migrating real-world applications to the Azure SQL Database service.
CRM Integration Options–Scribe, SmartConnect, Microsoft Connector. What's the... – BDO IT Solutions
Integrating CRM with your financial or operational systems can increase overall value, reduce manual entry effort, and reduce errors. Integration will also accelerate the speed of your business. During this session, learn about the integration options available, price points, and implementation effort.
PHP Frameworks: I want to break free (IPC Berlin 2024) – Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards more flexible and future-proof PHP development.
Connector Corner: Automate dynamic content and events by pushing a button – DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... – Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
UiPath Test Automation using UiPath Test Suite series, part 3 – DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Key Trends Shaping the Future of Infrastructure – Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
This keynote covers the key trends across hardware, cloud, and open source, exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality – Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
2. Agenda
• What is MDM
• MDM & AX
• Setup of MDM
• Data Import/Export Framework
3. What is Master Data Management
• Master data is shared across computer systems in the enterprise
• One source of truth
• Used only with reference data that is non-transactional (people, products, and locations)
4. MDM & AX
• MDM builds on top of (is an extension of) the Data Import/Export Framework (DIXF) that shipped in R2
• Synchronizes multiple instances of AX
• Uses SQL Server as the master document store
• Hub-and-spoke relationship
5. Master data management overview
[Diagram: Contoso HQ and Regional Sites 1, 2, and 3, each connected to a central data store (SQL MDS)]
6. Master data management overview
[Diagram: Contoso HQ and Regional Sites 1, 2, and 3, each connected to a central data store (SQL MDS)]
• Data synchronization by leveraging SQL MDS
• AX and non-AX systems as spokes
• Single/multi-master modes
• Central conflict management
• Policy-based selective synchronization
• First step towards holistic data governance
7. High level data flow
[Diagram: AX 1 hosts MDM (MDM UX, MDM client configuration UX, MDM provisioning UX) along with the Customer and Customer Address entities. A data sync batch job connects AX 1 to the Customer MDS entity in SQL MDS: (1) change tracking in AX, (2) push to the customer entity subscription view, with change tracking and conflict detection in SQL MDS, and (4) invoke the Data Import/Export import job through staging. AX 2 and a non-AX system connect as additional spokes.]
8. Data management scenarios
[Diagram: SQL MDS as the hub between AX Instance 1 and AX Instance 2. Customers are in multi-master mode: both instances push/pull. Products are in single-master mode: Instance 1 creates/modifies and pushes/pulls, while Instance 2 is read-only and pulls only.]
9. Data management modes
Single master
• Only one instance is allowed to write to MDS
• All other instances are read-only; MDS records overwrite any changes in other instances
• Mode can be defined at the entity level
• Enforced by customizing security and defining business processes
Multi-master
• Any instance is allowed to create/modify records
• Conflicts are automatically detected in MDS at the record level
• Conflict resolution is done manually by data stewards using the MDS Excel add-in
10. Setup of MDM
• Update AX and SQL Server
• Connect AX to SQL Server
• Determine which spokes can write data
• Configure Synchronization
• Filter data that is pulled and pushed
11. Data Import/Export Framework
• Once MDM is set up, you can use the Data Import/Export Framework (DIXF, DMF) as usual
• DIXF is an extension that helps you import and export data in AX (master data, open stock, and balances)
• MDM will automatically back up DIXF data on SQL Server and check for any conflicts
Hello, my name is Hai Nguyen and I’m from Dallas, TX.
All right, today I'm going to go over Master Data Management, or MDM, in Microsoft Dynamics AX.
So we are going to start off by talking about what MDM is, and then we are going to talk about how it is implemented in AX, covering some of the required setup and how it ties into the Data Import/Export Framework.
So master data is data that is going to be shared across all your systems in the entire company.
So basically, people refer to this as the one source of truth.
We know that master data is always the correct information; it is the information that everyone is going to build off, and it is highest in the hierarchy of information. We want to keep this data clean and pure so that we can always reference back to master data when there is a discrepancy, a naming-convention issue, or anything like that.
The type of information is non-transactional information: information that is not going to change all the time, such as people's names, addresses, product names, and locations.
MDM builds on top of the Data Import/Export Framework that shipped in R2; MDM itself is released in R3. Basically, it is an extension of that framework, built on those ideas, and it takes things one step further. So if you know how to use the DIXF framework, this is just a little extra step that is going to help you out. What it does is take all your instances of AX, synchronize them, use the master data set up in your system, and make sure every instance uses that data, so we don't have users come in and change someone's last name and throw the naming convention off. We want to make sure everything is the same.
How it works in AX: MDM uses SQL Server as the master document store. It takes all the information and uploads it to SQL Server, which holds that information there, and everything is drawn off that SQL Server document. SQL MDS is actually where the one true copy is stored. People sometimes get confused: if we are talking about MDM for AX, how come the data is stored in MDS? Because Microsoft takes advantage of capabilities already offered in MDS. It gives us conflict resolution, versioning, push-and-pull thinking, and all that good stuff.
So essentially we have a hub-and-spoke architecture. SQL MDS is your hub, and the rest of the systems are the spokes. A spoke can be your AX system or a non-AX system.
------------------
It works through a hub-and-spoke relationship. What that means is SQL Server is the hub of a wheel, and the spokes come off to the different instances. All instances funnel back to the hub, and the hub sends information out to the spokes.
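As a rough mental model, the hub-and-spoke flow just described can be sketched in a few lines of Python. This is purely illustrative; it is not how AX or SQL MDS is implemented, and the `Hub`/`Spoke` classes and their methods are hypothetical names invented for this example.

```python
# Conceptual hub-and-spoke sketch: the hub holds the master copy,
# and every spoke funnels changes through it. All names are hypothetical.

class Hub:
    """Central data store (the role SQL MDS plays)."""
    def __init__(self):
        self.master = {}          # record key -> record data
        self.spokes = []          # registered spoke systems

    def register(self, spoke):
        self.spokes.append(spoke)

    def push_from_spoke(self, key, record):
        # A spoke sends a change to the hub...
        self.master[key] = record
        # ...and the hub fans it out to every spoke's local copy.
        for spoke in self.spokes:
            spoke.local[key] = record

class Spoke:
    """One AX (or non-AX) instance holding a local copy."""
    def __init__(self, name, hub):
        self.name = name
        self.local = {}
        hub.register(self)

hub = Hub()
hq = Spoke("Contoso HQ", hub)
site1 = Spoke("Regional Site 1", hub)

hub.push_from_spoke("CUST-001", {"name": "Acme Corporation"})
print(site1.local["CUST-001"])  # the change reached every spoke
```

The key point the sketch captures is that spokes never talk to each other directly; every change travels through the hub.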
Let's take a look at how MDM was born.
Imagine there is a customer, Contoso Global, that has multiple distributed systems. Contoso Global's headquarters is in Europe, with regional sites in various parts of the world. Furthermore, each of these regional sites has its own independent AX application instance, and Contoso Global has its own AX application instance. Now Contoso Global would like to share master data among all its regional sites. Each of these instances currently creates its own copies of master data. It is important for Contoso Global to control the master data and to avoid duplicating the same data across different regional sites. For instance, if a customer, Acme Corporation, does business with regional site 1 and regional site 2, Contoso Global would prefer that there is just one copy of Acme Corporation in both of these sites. Contoso Global would also want to make sure that if any changes are made to the customer record in one of the sites, the changes are propagated to all of the other sites. There also need to be clear systems and processes to ensure that conflicts are managed properly. All of these requirements are basically what is driving the need for MDM. More importantly, having a good master data management system would enable Contoso Global to reliably and efficiently share master data across all systems. Let's now look at an overview of MDM and how it solves the above requirements.
As we saw, Contoso Global has multiple regional sites, each running its own installation of AX. The master data in all of these regional sites must be in sync with that of Contoso Global.
MDM accomplishes this by storing all the master data in a central data store, also known as the hub. For this, MDM uses SQL Server Master Data Services, or SQL MDS, as the central data store.
The central data store synchronizes the data with each of the regional sites, which are called spokes. Synchronizing with AX spokes is done using Data Import/Export Framework entities as the units of synchronization. You can also configure non-AX systems as spoke systems. Some of the most requested entities have already been enabled for MDM out of the box: customers, vendors, employees, global address book, and products. You can also create customizations to support other Data Import/Export Framework entities for use within MDM.
MDM supports two modes of synchronization, single and multi-master modes. In a single master mode, the data is created in only one place and then synchronized to all other spokes. In a multi-master mode, each site may choose to create or edit master data, and this data is then synchronized across all of the other nodes.
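The difference between the two modes can be illustrated with a small sketch in Python; the `can_write` function and the configuration shape are hypothetical names for this illustration, not an AX API:

```python
# Sketch of single- vs multi-master write rules per entity.
# Purely illustrative; all names are made up for this example.

def can_write(entity_config, instance):
    """Return True if `instance` may create/modify records for this entity."""
    if entity_config["mode"] == "multi-master":
        return True                      # any instance may write
    # single-master: only the designated master instance may write
    return instance == entity_config["master"]

products = {"mode": "single-master", "master": "AX-HQ"}
customers = {"mode": "multi-master"}

print(can_write(products, "AX-HQ"))      # True  - HQ owns Products
print(can_write(products, "AX-Site1"))   # False - read-only spoke
print(can_write(customers, "AX-Site1"))  # True  - multi-master entity
```

Note that, per the source, the mode is defined per entity, which is why the sketch keys the rule off an entity-level configuration rather than a global switch.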
When configured in this mode, MDM leverages SQL MDS to enable conflict detection and resolution.
MDM also allows you to configure policy-based selective synchronization so that not all master data is synchronized to all sites. You may choose to have certain data synchronized to only certain sites and not other sites.
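Policy-based selective synchronization can be pictured as a filter applied per site before data flows out of the hub. The sketch below is a conceptual illustration with made-up names (the region-based policy is an invented example), not the actual MDM policy engine:

```python
# Sketch of policy-based selective synchronization: a policy decides
# which master records flow to which site. Illustrative only.

def records_for_site(records, site, policy):
    """Filter master records down to the subset a given site should receive."""
    allowed_regions = policy.get(site, set())
    return [r for r in records if r["region"] in allowed_regions]

master_records = [
    {"id": "C1", "region": "EU"},
    {"id": "C2", "region": "NA"},
    {"id": "C3", "region": "EU"},
]
policy = {"Site-EU": {"EU"}, "Site-NA": {"NA"}}

print(records_for_site(master_records, "Site-EU", policy))  # C1 and C3 only
```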
The MDM feature built into AX 2012 R3 can be considered a first step towards supporting holistic data governance within AX.
This is a typical multi-instance scenario: we have an HQ somewhere in Europe and regional sites scattered across the globe. The MDM shipped in R3 is built on top of SQL MDS (SQL Server Master Data Services), which, as noted above, is where the one true copy is stored.
And then we have single-master and multi-master modes. In single-master mode, we might have three instances of AX spread out, but only my HQ is allowed to create new customers, products, or other new data. The rest of the instances are consumers of master data. Normally this is the easiest thing to do compared to multi-master mode, because when you have data coming from different places, things get more complicated as far as global data governance and data stewardship are concerned. Things get very hairy when you allow multi-master mode.
A typical scenario: I create data in HQ and do a sync between HQ and SQL MDS. My data is maintained in SQL MDS, and then the rest of the spoke instances pull data from SQL MDS, in an on-demand, push-and-pull model.
We have centralized conflict management when we run in multi-master mode. For example, if the same customer or vendor has its phone number modified in both the North America and Brazil regional sites, it is going to cause a conflict in MDS. And that is where your data steward steps in and resolves the conflict.
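That record-level conflict scenario can be sketched conceptually: two sites edit the same field of the same record relative to a common base version, and the hub flags the field for a data steward. This is an illustrative three-way comparison with hypothetical names, not the algorithm SQL MDS actually uses:

```python
# Sketch of record-level conflict detection in multi-master mode:
# two sites change the same field of the same record between syncs.
# Illustrative only; hypothetical names throughout.

def detect_conflicts(base, change_a, change_b):
    """Return fields both sites changed (relative to base) to different values."""
    conflicts = []
    for field in base:
        a, b = change_a.get(field), change_b.get(field)
        if a != base[field] and b != base[field] and a != b:
            conflicts.append(field)
    return conflicts

base = {"name": "Acme", "phone": "555-0100"}
north_america = {"name": "Acme", "phone": "555-0199"}   # NA edits the phone
brazil = {"name": "Acme", "phone": "555-0155"}          # Brazil edits it too

print(detect_conflicts(base, north_america, brazil))    # ['phone']
```

If only one site had touched the phone number, the sketch would report no conflict, which matches the idea that the steward is only pulled in when the hub cannot decide automatically.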
This is a first step towards holistic data governance. We all know that MDM is a very complicated space, but this absolutely leads you in the right direction as far as managing all your AX instances is concerned. You do have to build on top of it, though, if you want to integrate with other non-AX systems.
This is the high-level data flow of MDM.
The master data management capabilities of AX 2012 R3 leverage SQL Server Master Data Services and the Dynamics AX Data Import/Export Framework to make everything work. Let's look at how these components work together.
First we have an instance of AX, let’s call it AX 1, on which we have configured MDM for the customer entity.
Then we have SQL MDS, which is used to manage the master data. Once MDM has been configured in AX, the framework will also create the corresponding entity in the SQL MDS. In this case, the customer MDS entity.
Change tracking has also been enabled on both the AX and SQL MDS databases. Then we have a data sync job that is used to push and pull data between AX and SQL MDS.
The data sync job runs periodically to find changes that have been made in AX, and when a change is detected in one of the entities configured for MDM, the data sync job pushes these changes to the corresponding MDS entity in SQL.
Conflict detection is performed in the SQL MDS in case there have been updates to the same record from other instances of AX.
The data sync job then pulls any changes that have occurred in SQL MDS, apart from the ones it just pushed, back into AX.
These changes are pushed into AX using the Data Import/Export Framework via a staging table and an import job.
In a typical installation, there would be other AX installations that are also configured to connect to the same SQL MDS instance. Each of these instances performs the same steps to synchronize its master data with SQL MDS. While the standard implementation of MDM in AX does not include adapters for non-AX sources, the framework supports synchronizing data with other non-AX data sources through customizations.
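Putting the pieces together, one run of the data sync batch job can be sketched as: collect tracked local changes, push them to the hub, then pull everything else back through a staging area and apply it. The Python below is a conceptual sketch with hypothetical names, not the DIXF implementation:

```python
# Sketch of one run of the data sync batch job. Conceptual only;
# all names and data shapes are hypothetical.

def sync_cycle(local, hub, change_log, last_sync_version):
    # 1. Change tracking: collect local changes since the last sync.
    pushed = {k: v for ver, k, v in change_log if ver > last_sync_version}
    # 2. Push them to the hub (conflict detection would happen hub-side).
    hub.update(pushed)
    # 3. Pull hub records we don't have (or that differ) into staging...
    staging = {k: v for k, v in hub.items() if local.get(k) != v}
    # 4. ...and invoke the import job to apply staging to the local store.
    local.update(staging)
    return pushed, staging

hub = {"C2": {"name": "Globex"}}                 # change made by another spoke
local = {"C1": {"name": "Acme"}}
change_log = [(5, "C1", {"name": "Acme"})]       # (version, key, record)

pushed, staged = sync_cycle(local, hub, change_log, last_sync_version=4)
print(sorted(local))   # ['C1', 'C2'] - local now has both records
```

The version-number cursor stands in for the change-tracking bookmark that lets the periodic job find only what changed since its last run.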
This is the multi-master mode that I talked about, where you have two instances that can push and pull data.
And you can also have single-master mode, where one instance creates data and the other instances consume it.
Again, a summary of Single master vs Multi Master mode.
I have two more slides on the setup of MDM and DIXF. Instead of boring you to death with those two slides, I decided to demo how we are going to set all of this up.
So now for the setup of MDM
First you want to make sure you have the latest versions of AX and SQL Server. Because this feature is new to both products, the MDM service needs to be installed and updated.
Once you have those updated, there is a way to connect AX to SQL Server so that it knows what to look for, where to go, and where to put the information.
Then you can determine who can write data: you can allow every single instance to write and upload information, or just have one instance able to write it.
So you can restrict changes to master data to one specific instance, which gives some people permission to update that master list and withholds it from others.
You can pick which specific data is going to be in the master data, how it is going to be pulled, and when it is going to be pushed; whether you want to do it every few minutes or every couple of days, you can set all of that up.
Once MDM is set up, you can use the Data Import/Export Framework as usual to import from an Excel document or move data over.
Master Data Management is then going to run automatically based on the service that you set up, back up anything that you used through DIXF on your SQL Server, check for conflicts, and all of that.
Basically, you keep running the same way, while it backs everything up to a different area, checks all instances to make sure they work with each other, and detects if you have any problems.
When we look at AX, we go to the Data Import/Export Framework section. There is a new section called Master Data Management.
The first thing you are going to do is to go to Configure SQL Server Master Data Services connection and it will bring up the screen here.
You are going to put in your SQL Server information and endpoint information. Basically you are telling AX where to go to get the master data.
The next thing you are going to do, once that is set up, is to go to Manage synchronization. Everything is going to be set up in here. You create your new configuration group, and once you are in there you can create entities. That is where you pick what data to upload and download and set up a schedule to synchronize. It also shows conflicts.
That's it; now we are into the demo. Before we start, I can take a few questions.