The document provides guidance on data governance and stewardship best practices. It begins by outlining the importance of having accurate and relevant data to drive business growth. It then discusses getting started with data governance, including assessing data assets, understanding governance options, and planning an approach. The document provides numerous tips for setting up a data governance program, such as establishing a governance structure and processes, defining roles and responsibilities, and developing a high-level rollout plan. It also offers best practices for improving data quality through techniques like validation rules, dependent picklists, approval workflows, and regular data cleansing activities.
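Techniques like validation rules and dependent picklists can be sketched generically in code. Below is a minimal Python illustration; the rules, field names, and country/state lists are invented for this example and are not taken from the document:

```python
import re

# Hypothetical validation rules: each is an error message plus a predicate
# applied to a record before it is saved.
RULES = [
    ("email must be well-formed",
     lambda r: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email", "")) is not None),
    ("country is required when state is set",
     lambda r: not r.get("state") or bool(r.get("country"))),
]

# Dependent picklist: the allowed states depend on the chosen country.
STATES_BY_COUNTRY = {"US": {"CA", "NY", "TX"}, "CA": {"ON", "QC", "BC"}}

def validate(record):
    """Return a list of rule violations for a record (empty list = clean)."""
    errors = [msg for msg, ok in RULES if not ok(record)]
    country, state = record.get("country"), record.get("state")
    if state and state not in STATES_BY_COUNTRY.get(country, set()):
        errors.append(f"state {state!r} is not valid for country {country!r}")
    return errors

print(validate({"email": "a@example.com", "country": "US", "state": "CA"}))  # []
print(validate({"email": "bad", "state": "ZZ"}))
```

Rules like these catch bad records at entry time, which is cheaper than cleansing them later.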
Unlocking Success in the 3 Stages of Master Data Management – Perficient, Inc.
Master data management (MDM) comprises the processes, governance, policies, standards and tools that define and manage critical data. MDM is used to conduct strategic initiatives such as customer 360, product excellence and operational efficiency.
The quality of enterprise information depends on the master data, so getting it right should be a high priority. This webinar will highlight key factors needed for success in each of the three stages of the MDM journey:
Planning
Implementation
Steady state
We review each stage in detail and provide insight into planning and collaborative activities. In this slideshare you will learn:
Best practices, tips and techniques for a successful MDM program
Top considerations for business case building, architecture and going live
How to support the overall program after launching your MDM program
Federated data organizations in public sector face more challenges today than ever before. As discovered via research performed by North Highland Consulting, these are the top issues you are most likely experiencing:
• Knowing what data is available to support programs and other business functions
• Data is more difficult to access
• Without insight into the lineage of data, it is risky to use as the basis for critical decisions
• Analyzing data and extracting insights to influence outcomes is difficult at best
The solution to these challenges lies in creating a holistic enterprise data governance program and enforcing it with a full-featured enterprise data management platform. Kreig Fields, Principal, Public Sector Data and Analytics at North Highland Consulting, and Rob Karel, Vice President, Product Strategy and Product Marketing, MDM at Informatica, will walk through a pragmatic, “How To” approach, full of useful information on how you can improve your agency’s data governance initiatives.
Learn how to kick-start your data governance initiatives and how an enterprise data management platform can help you:
• Innovate and expose hidden opportunities
• Break down data access barriers and ensure data is trusted
• Provide actionable information at the speed of business
Data Quality Management: Cleaner Data, Better Reporting – Accenture
In this new Accenture Finance & Risk presentation we explore a process to investigate, prioritize and resolve data quality issues, key to creating a more efficient and accurate reporting environment. View our presentation to learn more.
For more on regulatory reporting, see presentation on Financial Reporting Robotics: http://bit.ly/2qaLK9y
Visit our blog for latest Regulatory Insights: https://accntu.re/2qnXs1B
DAMA Australia: How to Choose a Data Management Tool – Precisely
The explosion of data types, sources, and use cases makes it difficult to make the right decisions around the best data management tools for your organisation. Why do you need them? Who is going to use them? What is their value?
Watch this webinar on-demand to learn how to demystify the decision making process for the selection of Data Management Tools that support:
· Data governance
· Data quality
· Data modelling
· Master data management
· Database development
· And more
Enterprise Data Management Framework Overview – John Bao Vuu
A solid data management foundation is necessary for today’s organizations to support big data analytics and, more importantly, a data-driven culture.
A mature Data Management Program can reduce operational costs and enable rapid business growth and development. The program must evolve to monetize data assets, deliver breakthrough innovation, and help drive business strategies in new markets.
Capacity Management Maturity: A Survey of IT Professionals – Precisely
Implementing or maturing a Capacity Management process takes executive buy-in, proper planning and the tools to make it possible – plus it helps when you get to enjoy a significant return on investment from the process! Based on the results of our Capacity Management Maturity Assessment survey, we learned that organizations willing to make minor changes in their capacity management processes can reap major benefits.
View this webinar to learn the full results of the survey along with key indicators of capacity management maturity such as:
• How your organization captures key component level capacity metrics
• Where capacity reports are available and how they are generated
• If your organization stores performance and capacity data centrally in a CMIS
Linking Data Governance to Business Goals – Precisely
The importance of data to businesses has increased exponentially over recent years as companies seek benefits such as gains in efficiency, the ability to respond to growing privacy regulations, the capacity to scale quickly, and increased customer loyalty.
Despite being a vital part of any Data Transformation, Data Governance has sometimes been misrepresented as a restrictive and controlling process leaving governance leaders having to continually make the case for business buy-in.
In this on-demand webinar we will explore the concept of business-first Data Governance, an approach that promotes adoption by the organisation, lays the foundation for data integrity and consistently delivers business value in the long term.
Improve IT Security and Compliance with Mainframe Data in Splunk – Precisely
Avoid security blind spots with an enterprise-wide view. If your organization relies on Splunk as its security nerve center, you can’t afford to leave out your mainframes. They work with the rest of your IT infrastructure to support critical business applications, and they need to be viewed in that wider context to address potential security blind spots.
Although the importance of including mainframe data in Splunk is undeniable, many organizations have left it out because Splunk doesn’t natively support IBM Z® environments. Learn how Precisely Ironstream can help with a straightforward, powerful approach for integrating your mainframe security data into Splunk and making it actionable once it’s there.
The Data Governance Annual Conference and International Data Quality Conference in San Diego was very good. I recommend this conference for business and IT professionals responsible for data quality and data governance. There will be a similar event in Orlando in December 2010. This is the presentation I delivered to a grateful audience.
Developing & Deploying Effective Data Governance Framework – Kannan Subbiah
This is the slide deck presented at the Customer Privacy and Data Protection India Summit 2019 held in Mumbai, India. The specific topics touched upon are the guiding principles, Aligning with Data Architecture, Data Quality & Compliance.
Revolution In Data Governance - Transforming the Customer Experience – Paul Dyksterhouse
The foundation of managing data security and big data is implementing data governance. Data owners, metadata tagging, customer feedback, and continuous improvement are critical facets that provide the transparency and consistency needed so that customers can trust the data and make informed decisions.
Building an Effective & Extensible Data & Analytics Operating Model – Cognizant
Building an effective and scalable operating model requires a strong basis in data and analytics management. Creating such an operating model is a step-by-step process, as outlined here.
You Need a Data Catalog. Do You Know Why? – Precisely
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata describing data sets: how they are defined and where to find them. TDWI research indicates that implementing a data catalog is a top priority among the organizations we survey. The data catalog can also play an important part in the governance process: it provides features that help ensure data quality and compliance, and that trusted data is used for analysis. Without in-depth knowledge of data and its associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
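As a rough sketch of the idea described above, a data catalog entry pairs a data set’s name with its definition, location, owner, and governance tags. The class and field names below are assumptions for illustration only, not any vendor’s API:

```python
from dataclasses import dataclass, field

# A minimal, hypothetical catalog entry holding the kinds of metadata a data
# catalog typically records: definition, location, ownership, governance tags.
@dataclass
class CatalogEntry:
    name: str                                 # data set name
    definition: str                           # business definition
    location: str                             # where to find it (table, path, API)
    owner: str                                # accountable data steward
    tags: list = field(default_factory=list)  # governance tags, e.g. ["PII"]

class DataCatalog:
    """Central repository of data set metadata, searchable by tag."""
    def __init__(self):
        self._entries = {}

    def register(self, entry: CatalogEntry):
        self._entries[entry.name] = entry

    def find_by_tag(self, tag: str):
        return [e.name for e in self._entries.values() if tag in e.tags]

catalog = DataCatalog()
catalog.register(CatalogEntry("customer_master", "Golden record of customers",
                              "warehouse.crm.customers", "jane.doe", ["PII"]))
print(catalog.find_by_tag("PII"))  # ['customer_master']
```

Tag-based lookup like this is what lets a governance program answer questions such as “where does all our PII live?”.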
This presentation reports on data governance best practices. Based on a definition of fundamental terms and the business rationale for data governance, a set of case studies from leading companies is presented. The content of this presentation is a result of the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen, Switzerland.
Big data governance as a corporate governance imperative – Guy Pearce
Poor data governance drives reputation risk through data breaches, privacy violations, and decisions based on poor-quality data. Furthermore, there are some important differences between what data governance means for big data and what it means for operational data.
Because poor data governance affects reputation risk, it has considerable implications for the Board of Directors, for whom reputation risk is the number-one risk according to Deloitte (2013).
This presentation, delivered at the National Data Governance and Privacy Congress in Calgary, Canada, targets the Board of Directors and the C-Suite and presents some of the reasons why data governance is critical from the perspective of both.
(Also on YouTube at http://youtu.be/QR4KO3Yx0n4)
Financial Services Technology Leader Turns Mainframe Logs into Real-Time Insi... – Precisely
A global financial services leader in payment, banking, and investment solutions stays ahead of the competition by relying on powerful mainframe processing and real-time analytics. The company delivers an exceptional – and secure – digital experience to its global customer base by using Splunk and Precisely Ironstream.
View this on-demand webinar to hear how this customer turns mainframe security and operational log data into real-time insights with Ironstream and Splunk to:
- Proactively detect fraud and monitor security
- Meet SLAs with near-instant application response times
- Save time, effort and resources while lowering MTTI & MTTR
Develop and Implement an Effective Data Management Strategy and Roadmap – Info-Tech Research Group
Treat data as an asset and gain a competitive advantage.
Your Challenge
Despite the growing focus on data, many organizations struggle to develop an effective strategy for their data assets. This is due to their intangible nature and varying use across the business.
Data Management is a business process managed by IT. This creates a challenge for IT as it is required to create and manage complex systems of operations that link closely to integral business operations.
Our Advice
Critical Insight
Data Management is not one size fits all. Cut through the noise related to Data Management and create a strategy and process that is right for your organization.
Have the business drive your Data Management project.
It all starts and ends with Data Governance. At a minimum, invest in Data Governance initiatives.
Impact and Result
Coordination between IT and the business will create a Data Management strategy that understands and satisfies the data requirements of the business.
Data Management requirements and initiatives will be derived from business goals and strategic plans, current capability assessments, an understanding of market and technology opportunities, and a clear understanding of the business’s drivers regarding data.
Creating a clear Data Management Strategy and developing a roadmap of initiatives will allow IT to create a plan for how to bridge the gap between IT and the business and create a Data Management framework that supports the business’s immediate and long-term data requirements.
2013 Data Governance Professionals Organization (DGPO) Digital River Webinar – Deepak Bhaskar, MBA, BSEE
Hosted by the Data Governance Professionals Organization (DGPO) for webinar attendees. Successful Data Governance at Digital River. 2013 DGIQ Data Governance Best Practice Award: Finalist.
Slides from tutorial at EDW 2017 in Atlanta, GA on Implementing Agile Data Governance. Discusses how to write and add governance stories into existing Agile projects.
The Data Quality Assessment Manager (DQAM) is a data quality product designed to manage data quality assessments and scores, review and correct quality issues, and manage the workflow across all stakeholders involved in an assessment. DQAM is the industry’s first platform designed to put data quality in the hands of the data stewards and business owners who know and understand the data best.
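The scoring idea behind such assessments can be illustrated with a toy completeness metric: rate each field for how often it is populated, then roll the results up into an overall score a data steward can track over time. This is a hedged sketch, not DQAM’s actual logic; the records and field names are invented:

```python
def completeness_scores(records, fields):
    """Return per-field completeness (0.0-1.0): the share of non-empty values."""
    scores = {}
    for f in fields:
        filled = sum(1 for r in records if r.get(f) not in (None, ""))
        scores[f] = filled / len(records) if records else 0.0
    return scores

# Invented sample records with deliberate gaps.
records = [
    {"name": "Acme",    "email": "info@acme.test",    "phone": ""},
    {"name": "Globex",  "email": "",                  "phone": "555-0100"},
    {"name": "Initech", "email": "sales@initech.test", "phone": "555-0101"},
]
scores = completeness_scores(records, ["name", "email", "phone"])
overall = sum(scores.values()) / len(scores)
print(scores)                # name fully populated; email and phone partially
print(round(overall, 2))
```

Real assessment tools track many more dimensions (validity, consistency, timeliness), but the pattern of per-field scores rolled up to a dashboard number is the same.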
Stop the madness - Never doubt the quality of BI again using Data Governance – Mary Levins, PMP
Does this sound familiar? "Are you sure those numbers are right?" "Why are your numbers different than theirs?"
We've all heard it and had that gut-wrenching feeling of doubt that comes with uncertainty around the quality of the numbers.
Stop the madness! Presented in Dunwoody on April 18 by industry-leading expert Mary Levins, who discusses what it takes to successfully take control of your data using the Data Governance Framework. This framework is proven to improve the quality of your BI solutions.
Mary is the founder of Sierra Creek Consulting.
DGIQ - Case Studies: Applications of Data Governance in the Enterprise (Final... – Enterprise Knowledge
Thomas Mitrevski, Senior Data Management and Governance Consultant, and Lulit Tesfaye, Partner and Vice President of Knowledge and Data Services, presented “Case Studies: Applications of Data Governance in the Enterprise” on December 6th, 2023 at DGIQ in Washington, D.C.
In this presentation, Thomas and Lulit detailed their experiences developing strategies for multiple enterprise-scale data initiatives and provided an understanding of common data governance and maturity needs. Thomas and Lulit based their talk on real-world examples and case studies and provided the audience with examples of achieving buy-in to invest in governance tools and processes, as well as the expected return on investment (ROI).
Check out the presentation below to learn:
How Leading Organizations are Benchmarking Their Data Governance Maturity
Why End-User Training was Imperative in Seeing Scaled Governance Program Adoption
Which Tools and Frameworks were Critical in Getting Started with Data Governance
How Organizations Achieved Success with Data Governance in Under 12 Weeks
What Successful Data Governance Implementation Roadmaps Really Look Like
Standards make it easier to create, share, and integrate data by making sure that there is a clear understanding of how the data are represented and that the data you receive are in a form that you expected. Data standards are the rules by which data are described and recorded. In order to share, exchange, and understand data, we must standardize the format as well as the meaning. Simply put, using standards makes using things easier. If different groups are using different data standards, combining data from multiple sources is difficult, if not impossible.
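A small worked example of why a shared standard matters: two hypothetical sources record the same kind of date in different formats, and normalizing both to an agreed standard (ISO 8601 here) removes the ambiguity when the data are combined. The field names and formats are assumptions for illustration:

```python
from datetime import datetime

# Two sources encoding the same field differently.
SOURCE_A = [{"id": 1, "joined": "03/04/2021"}]   # US-style MM/DD/YYYY
SOURCE_B = [{"id": 2, "joined": "2021-04-03"}]   # ISO 8601 already

def to_standard(record, fmt):
    """Normalize a record's 'joined' date to the agreed ISO 8601 standard."""
    out = dict(record)
    out["joined"] = datetime.strptime(record["joined"], fmt).date().isoformat()
    return out

# Without a standard, "03/04/2021" is ambiguous (March 4 or April 3?).
# With one, data from both sources can be combined safely.
combined = ([to_standard(r, "%m/%d/%Y") for r in SOURCE_A] +
            [to_standard(r, "%Y-%m-%d") for r in SOURCE_B])
print(combined)  # both records now use YYYY-MM-DD
```

The same principle applies to units, code lists, and identifiers: agreeing on representation and meaning is what makes data exchange reliable.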
how to successfully implement a data analytics solution.pdf – basilmph
The adoption of data analytics in business has demonstrated a transformative power in modern entrepreneurship. By analyzing vast reservoirs of data, businesses can make informed decisions, optimize operations and predict trends, thus fueling growth.
Getting Ahead Of The Game: Proactive Data Governance – Harley Capewell
Data today is getting bigger, more widely available, and changing more quickly than ever before. Data Governance coach Nicola Askham shares her advice on why you need to embrace Data Governance NOW and what good governance looks like.
The Merger is Happening, Now What Do We Do? – DATUM LLC
This was presented on October 24, 2018 at the ASUG EIM Conference. One of the many challenges presented by an acquisition and divestiture event is unifying disparate data and integrating systems together. If you are leading an integration, you may have more questions than answers on how to approach this event. Learn how to best leverage the momentum and budgets that accompany these activities to jump start good governance practices up front, as well as how to measure the return on investment, ensuring data and EIM professionals' ongoing success.
MANAGING RESOURCES FOR BUSINESS ANALYTICS BA4206 ANNA UNIVERSITY – Freelance
A business analyst is an individual who statistically analyzes large data sets to identify effective ways of boosting organizational efficiency. They bridge the gap between the client and the development team.
Enabling Success With Big Data-Driven Talent Acquisition – David Bernstein
Adopting an evidence-based recruitment marketing strategy is not just reserved for large employers. In fact, a targeted sourcing strategy can in some ways have a greater impact on small and mid-size businesses who need to allocate already-limited resources to the areas that will provide the most value. Ultimately, hiring the right candidate means profitability for your business. How can talent acquisition professionals gain the insights their organizations need to make better-informed decisions about their recruitment marketing efforts?
Data governance is a set of strategies and practices that ensure high quality throughout the complete lifecycle of your data. Data governance is a practical and actionable framework that helps a wide range of data stakeholders across any organization identify and meet their data requirements.
DC Salesforce1 Tour Data Governance Lunch Best Practices deck
1. Data Governance and Stewardship Best Practices
Beth Fitzpatrick, Director Product Marketing, Data.com
Eric Kasserman, Solution Engagement Manager, Data.com
2. Driving Growth and Success through Business Strategy
Company Strategy: Profitable growth by leveraging existing customers and products, and by expanding internationally.
Data Strategy: Have the right, accurate data at the right moment to drive growth, optimize business processes, and scale.
4. There is SO much to do, where do we even start?
Keep it simple; there is no magic bullet, and things WILL change as you go through this journey.
Step 1: Assess Your Entire Portfolio. Catalog your assets.
Step 2: Understand Your Options. Not every application should be run in Salesforce.com.
Step 3: Plan Your Approach. Create your team, create your plan, validate it, and socialize it.
5. Ok, so a little more than just 3 steps…
It's really not hard to do, once you know where to start:
1. Catalog all portfolio and data assets
2. Identify the 6-18 month treatment plan for each
3. Identify the current-state "health and wellness" of the data
4. Identify the future-state "health and wellness" of the data
5. Identify ancillary modules or capabilities that will be used
6. Identify data volumes and paths (new records, updated records)
7. Identify business stakeholders, IS stakeholders, and organizational stakeholders
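The catalog steps above lend themselves to a simple tabular structure. A minimal Python sketch follows; the `DataAsset` fields and the example row are illustrative assumptions, not prescribed by the deck:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One row of a hypothetical portfolio/data-asset catalog.
    Field names are illustrative, not from the deck."""
    name: str
    owner: str                     # business stakeholder (authorship)
    kind: str                      # e.g. "master", "transactional"
    current_health: str            # current-state "health and wellness"
    target_health: str             # future-state "health and wellness"
    treatment_plan: str            # the 6-18 month treatment for this asset
    record_volume: int = 0         # data volume
    feeds: list = field(default_factory=list)  # data paths in/out

# Build the catalog as a simple list; a real program might keep this in
# a spreadsheet, wiki, or metadata tool instead.
catalog = [
    DataAsset("Accounts", "Sales Ops", "master",
              "fair", "good", "de-dupe and enrich", 120_000,
              feeds=["ERP", "Marketing Automation"]),
]

# Example query: which assets still need to reach their target state?
needs_work = [a.name for a in catalog if a.current_health != a.target_health]
```

The point is less the data structure than the discipline: every asset gets the same set of questions answered.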
6. Data Assets – They Are YOUR Assets
You own them, manage them, care and pay for them, but they grow and need your guidance.
For each data asset, determine the following considerations:
• What kind of data is it? Public, proprietary, private? Master record, transactional record (dated), structured, unstructured, etc.?
• Who is the business audience that consumes the data? What level of importance is the asset to them, functionally speaking?
• Who owns the data (authorship)? Does this data allow you to be actionable, and if so, how?
• Where does new data come from (primarily)? Is the data fed in from other systems?
• Where do updates come from (primarily)? New record updates, data enrichment from 3rd parties, etc.?
• How much data do you have, how fast does it change, and what varieties exist? Data volume, velocity, and variety together determine veracity (trustworthiness).
7. Data – Lifecycle and Process
These areas reflect common touch points in the data lifecycle:
Acquisition:
• Document all sources; streamline manual and offline record creation (manual, purchased lists, spreadsheets, W2L, ERP, etc.)
Quality:
• Structured hierarchies using the DUNS number will help you see the full picture of an organization and where you penetrate
• Define 'must have' and 'nice to have' attributes that define your records
• Matching and cleansing with trusted external sources is paramount
Hygiene:
• An on-going process that requires a strategy; data goes bad over time, so use automated cleansing routines (even offline, if large data sets are involved)
• Work with the stewardship team to divide and conquer
Governance & Stewardship:
• Define roles for the governance team
• Define the company's data assets
• Determine integration points and paths; make sure to identify which systems are masters
• Identify whether there are any legal, compliance, or security restrictions
Reporting & Dashboards:
• Data should be reportable
• You should be able to track inbound data
• It's difficult to manage what you cannot measure
8. Portfolio Assessment
What exactly does this mean, and what should you include?
• Catalog Applications First
  – Identify core functionality: what is the value to the organization?
  – Identify current-state health and wellness
  – Identify on-going financial costs (licensing, support, etc.)
• Identify Consumers, Users, Stakeholders
  – What people use the application, and how many are there?
  – Who, and how many, support resources are needed?
  – Who, and how many, development/IT resources are needed?
• Create Optimum Future State Direction
  – Even if you can't do it for a while, include it for consideration
9. Questions To Aid In Your Analysis: General Functionality, Data Support, Operational
These critical areas should also be cataloged when evaluating your portfolio.
General Functionality:
• What area of your business does the application currently provide value for?
• How long has it been since a significant update was made to the application's capabilities?
• Is the application needed for a regulatory, legal, or compliance purpose?
Data Support:
• Does the data in the application provide critical information that still meets your business growth needs?
• How do new records get added, and how are records updated?
• Can you extract data easily?
• Does data from the application feed into another system? Does it receive data from another system?
Operational:
• Would you need to provide additional support to users if changes were made?
• Is it currently difficult to obtain resources to provide support?
• Is the application evolving and getting updates, or is it standing still?
• Does the application contain highly sensitive or proprietary information?
11. Data Governance – At The Organizational Level
Direction:
• Board
• Owners
• Enterprise Architecture
• Change Management
Governance:
• Data Quality
• Master and Reference Data
• Reporting and Analytics
• Data Management/Architecture
• Models and Meta-Data Management
Teams:
• Working Groups (SMEs)
• Domain Working Groups (SMEs)
• Business Intelligence
• Repository / ETL Tools
• Enterprise Applications (Security, Compliance, Risk Management, Lifecycle Management)
Roles:
• Stewards – Quality Analysts
• Stewards – Custodians
• Business Analysts – Providers
• Architects
• Modelers & Analysts
12. Governance Structure – Processes and Activities
Governance Committee: Council & Organization; Terms and Definitions; Working Groups; Alignment Liaison
Roles and Responsibilities: Owners; Stewards; Custodians; Data Governance Office; Data Management
Policies and Processes: Principles; Policies; Standards; Processes
Program: Maturity Matrix; Strategy; Scope; Business Case; Implementation
Reporting and Assurance: Performance Measurement; Continuous Improvement; Evidence Repository; Communications
13. Governance Structure – People (Organizational Areas)
Governance Committee:
Sales: Customer Management; Lead to Opportunity; Quoting / Ordering
Sales Operations: Price Books / Pricing Strategy; Pipeline Reporting; Territory Planning; Data Acquisition
Marketing: Lead Generation; Content / Publishing; Events; Branding
Marketing Ops: Data Acquisition; Event Planning; Event Reporting; Reporting / Analytics; Customer Intel
Information Systems: Data Model; Data Quality; Analytics; Support; Application Development
Executive: Strategic Vision; Product Direction; Services Direction; Organizational Growth; Funding
14. Meet Your Governance Team – Data Custodians
Qualities for Data Custodians
Custodians understand the importance and criticality of the data assets in their purview. They are often super users or core SMEs, and can articulate to others the details of 'how things work'.
Social/Personal:
• Work with average end users to correct data, and/or make recommendations to prevent recurrence of problems
• Able to converse with data stewards to effectively problem-solve
• Collaborative: can work with others to ensure buy-in and effective change
Technical:
• Understand the high-level data model; can define, and potentially write, reports for the data assets they understand, but they may not be super technical (don't expect them to be developers)
• Not afraid to get their hands dirty (updating data when necessary)
YOU WILL NOT GET A CAPE, TROPHY, RAISE OR PUBLIC RECOGNITION AS A DATA CUSTODIAN
15. Meet Your Governance Team – Data Stewards
Qualities for Data Stewards
Data Stewards are the business people within your company who can provide the knowledge behind many of your applications. They often know the reasons 'why' things are done 'that way'.
Social/Personal:
• Work well with end users (correcting data, isolating problem areas) as well as with custodians, other stewards, and executives. Great data stewards are good communicators, but are even better facilitators.
• Able to converse with mid-line management on potential risk factors
Technical:
• Usually stewards will not be getting their hands dirty; however, they may need to help data custodians validate changes and prioritize elements as determined by the governance board
• Should be knowledgeable about basic report writing and able to understand complex models in the organization
YOU WILL NOT GET A CAPE, TROPHY, RAISE OR PUBLIC RECOGNITION AS A DATA STEWARD
16. How Do The Teams Work Together?
Most conflicts are resolved at the operational level; when additional guidance is needed, it's recommended to have the 'council' assist.
17. High Level Plan To Get You Started
Planning:
• Identify Stakeholders
• Identify Goals, Objectives, Vision & Drivers
• Complete Application Portfolio Catalog
• Create Business Case for Change
Baseline & Target:
• Define Data Policies, Standards, Org Structure & Roles
• Define Governance Process
• Establish Current Maturity Level & Target Maturity Level
• Check Baseline Against Target
• Review and Confirm Data Architecture Roadmap
• Identify and Prioritize Projects & Activities
• Develop Data Management Roadmap
• Conduct Gap Analysis
• Establish Rollout Plan and Organizational Readiness (Impact)
Rollout:
• Fill Governance Roles and Socialize Structures
• Develop Policies and Standards
• Train and Communicate Throughout The Organization
• Monitor Performance and Obtain Feedback
Deliverables for Your Governance Plan:
• Business Case For Change
• Data Strategy
• Data Principles
• Terms of Reference
• Role Definitions
• Data Management Roadmap
• Gap Analysis
• Organization Readiness
• Rollout Plan
• Feedback Processes
• Communication & Training Plan(s)
• Standards and Policies
• Data Governance Maturity Assessment
• Conceptual Models
• Process Definitions
19. Assess
Get a sense of the state of your current data:
- Who are your users? (reports/adoption)
- What fields are being used? (Field Trip)
- What do they do? (integration/workflow/dependencies/docs/Conga, etc.)
- How is the overall quality? (3rd party, self-check)
- What do your users "use" it for? (ask them/stalk them)
- What tools are dependent? (integrations/downstream)
- What analytics are important? (dashboards/reports/BI)
Goal: get an inventory and current state
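A first pass at "what fields are being used" can be approximated by measuring fill rates in a raw export. A hedged Python sketch, where the field names and sample rows are made up for illustration:

```python
from collections import Counter

def field_fill_rates(records):
    """Measure what fraction of records actually populate each field,
    as a crude field-usage inventory. Records are plain dicts, e.g.
    rows exported from the CRM."""
    filled = Counter()
    for rec in records:
        for name, value in rec.items():
            if value not in (None, "", []):   # treat blanks as unfilled
                filled[name] += 1
    total = len(records)
    return {name: count / total for name, count in filled.items()}

sample = [
    {"Name": "Acme", "Phone": "555-0100", "Fax": ""},
    {"Name": "Globex", "Phone": "", "Fax": ""},
]
rates = field_fill_rates(sample)
# "Fax" does not appear in the result at all, since no record fills it
```

Fields that never appear in the result are candidates for removal or for tools like Field Trip to confirm.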
20. Clean It Up
Initiate some "level 1" cleansing:
- Standardize outliers (normalize)
- Self-append (inferred fixes)
- Baseline duplicate management (careful of dependencies/history considerations)
- Kill useless records: FHD (Flag, Hide, Delete)
- 3rd-party append (internal and external)
- Advanced duplicate management
Goal: get your baseline in order
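The first two cleansing moves above (standardize outliers, self-append inferred fixes) might look like this in plain Python; the alias table, field names, and the state-to-country inference are assumptions for illustration:

```python
# "Level 1" cleansing sketch: normalize obvious outliers and self-append
# inferred values. Mapping table and field names are illustrative.
STATE_ALIASES = {"calif.": "CA", "california": "CA", "b.c.": "BC"}

def level1_clean(record):
    rec = dict(record)  # work on a copy; never mutate the source row
    # Standardize outliers (normalize)
    state = rec.get("State", "")
    rec["State"] = STATE_ALIASES.get(state.strip().lower(), state)
    # Self-append (inferred fix): derive Country from a known state code
    if not rec.get("Country") and rec["State"] in ("CA", "WA", "NY"):
        rec["Country"] = "US"
    return rec

cleaned = level1_clean({"State": "Calif.", "Country": ""})
```

Automated routines like this are exactly what the deck means by running cleansing offline when large data sets are involved.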
21. Develop a Strategy
- Two choices: distributed or managed
- What will work within your "culture" today?
- What is sustainable looking forward?
- Recommendation: develop a distributed data management model
Goal: get your baseline in order
22. Levers
• Forced business processes: contract generation / automated replies / dashboards
• Entitlement and ownership: labeling, ownership, naming
• SWAT team: a "call for help" tactical support team
• Gift of time
• Gift of focus and analytics
• Gift of assignment
24. Getting Tactical
Moving from talking to doing: 9 declarative elements in SFDC that are excellent governance/stewardship enablers.
Check the www.tractionondemand.com blog for additional details.
25. Data Quality – Security
What: Leverage SFDC field-level security to restrict access to certain data validation fields, e.g. approval status or record condition.
Why: Allocate the responsibility of determining what is "trusted" to a specific group of people. Hide fields to improve usability.
How:
• Set up custom profiles for ALL; catalogue access
• Manage field access
• Then create Permission Sets
• Hide/restrict access to certain fields that are strategic in nature
26. Data Quality – Validation Rules/Dependencies
What: Block users from entering misaligned values via validation rules. Leverage rules to create gentle blocks and encourage correct process.
Why: If you give people workarounds, they'll use them. Typically, workarounds = bad data and no governance.
How:
• Conditional validation statements using mixed AND/OR
• In English: if the record type is Prospect and the state/province is empty, require it
• Give GREAT explanations and embed your brand
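The English rule above can be sketched as plain logic. This Python sketch only mirrors a validation rule's behavior (returning an error message blocks the save); field names are illustrative, and this is not actual SFDC formula syntax:

```python
def prospect_state_rule(record):
    """Sketch of the slide's example rule: a Prospect record must have
    State/Province filled in before it can be saved."""
    if record.get("RecordType") == "Prospect" and not record.get("State"):
        # The returned message is the "GREAT explanation" the deck asks for
        return "Prospects need a State/Province so territory routing works."
    return None  # None means the save is allowed

blocked = prospect_state_rule({"RecordType": "Prospect", "State": ""})
allowed = prospect_state_rule({"RecordType": "Customer", "State": ""})
```

In Salesforce itself, the same condition would be expressed declaratively with formula functions over the record's fields.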
27. Data Quality – Record Types / Layouts / Visual Indicators
What: Use record types to segment an object based on status, ensuring only relevant information is presented based on the stage in the process.
Why: Don't show users information that is meaningless within the context they are operating in.
- Record types/layouts by status
- Record types/layouts by type
How:
• Establish your profiles
• Establish your types of records (account type)
• Establish your status/progress by type
• Use icons to clearly indicate stage/quality
• Determine what is relevant by type/status
• Develop custom page layouts for each
• Create workflow to auto-move the record type based on defined actions
28. Data Quality – Dependent Picklist Fields
What: Only show relevant values on a particular record. Don't give users incorrect choices.
Why: Noise makes your system look poorly thought through, and this is an easy, logical fix.
How:
• Set up profiles
• Set up record types
• Create fields and assign values by record type
• Create additional dependent fields, following the same path
• Use Excel to map your matrix out
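The Excel matrix the slide suggests is essentially a mapping from each controlling value to the dependent values that remain visible. A small Python sketch (the picklist values are illustrative, not from the deck):

```python
# Controlling value -> dependent picklist options that stay visible.
# This is the same matrix you would map out in Excel before configuring
# field dependencies declaratively.
PICKLIST_MATRIX = {
    "Prospect": ["Cold", "Warm", "Hot"],
    "Customer": ["Active", "At Risk", "Churned"],
}

def allowed_values(controlling_value):
    """Return the dependent options for a controlling value; an unknown
    controlling value exposes no options (no noise)."""
    return PICKLIST_MATRIX.get(controlling_value, [])
```

Keeping the matrix in one place makes it easy to review for gaps before it is configured in the org.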
29. Data Quality – Approval Workflows
What: Prior to record lock, or hand-off to an integration, leverage an approval workflow as the final gate.
Why: Not all data gets migrated. Apply expensive resources to a sample. Ensure the data that is propagated is good.
How:
• Set up profiles
• Set up record types
• Set up page layouts
• Set up the approval workflow: apply the Submit for Approval button to specific layouts, and block progress without approval via validation
30. Data Quality – System / User Fields
What: Create custom fields to allow users to enter basic information without disturbing synced data. Leverage formula fields to differentiate the two.
Why: Battle user frustration. Open up usability without losing data quality. A small step in managing business expectations.
How: Save standard fields for native synchronizations and leverage custom fields for variable data.
31. Data Quality – Add a Data Quality Score
What: Establish a basic point-scoring formula to provide data quality ratings on records.
Why: Expose your "trust" in a record and detach the typical link between data quality and adoption. Set user expectations on records. Create positive motivation to improve.
How:
• Create a single formula field to score completeness from priority fields
• Use a conditional statement that evaluates: consistency; recency (last changed, last activity); completeness; no duplicates; 3rd-party validation
• Represent point ranges with a graphic, as one score
• Use Analytic Snapshots to measure over time
• Report by rep for accountability
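The point-scoring idea can be sketched outside of a formula field. In this Python sketch, the weights, field names, and the 90-day recency window are all assumptions for illustration, not values from the deck:

```python
from datetime import date, timedelta

def quality_score(record, today=None):
    """Illustrative data quality score: completeness of priority fields,
    recency of last activity, duplicate status, and external validation,
    combined into one 0-10 number."""
    today = today or date.today()
    score = 0
    priority_fields = ["Name", "Phone", "Industry", "BillingState"]
    # Completeness: up to 4 points, one per filled priority field
    score += sum(1 for f in priority_fields if record.get(f))
    # Recency: 2 points if touched within the last 90 days
    last = record.get("LastActivityDate")
    if last and (today - last) <= timedelta(days=90):
        score += 2
    # Duplicates: 2 points if no known duplicate
    if not record.get("HasDuplicate", False):
        score += 2
    # 3rd-party validation: 2 points if matched to an external source
    if record.get("ExternallyValidated", False):
        score += 2
    return score
```

Represented as a single number (or a graphic mapped from point ranges), the score does exactly what the slide asks: it exposes trust at a glance.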
32. Data Quality – Kill Suspects
What: Simply put, most systems have 2x the data they need. Clean house!
Why: Eliminate noise. Give ownership to users. Invest resources in high-profile prospects.
How: Never delete first.
1. Isolate suspects
2. Flag for elimination and color-code
3. Hide with security
4. Wait
5. Back up
6. Delete
Sample warning: "This record has been flagged for deletion. Please update its details with complete information by #formula to prevent removal."
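The Flag-Hide-Delete sequence is effectively a small state machine. A hedged Python sketch, where the status names and gating conditions (removal date passed, backup taken) are illustrative assumptions:

```python
from datetime import date

def next_fhd_status(record, today):
    """Flag-Hide-Delete (FHD) sketch: a flagged suspect is hidden once
    its removal date passes, and deleted only after a backup exists."""
    status = record.get("Status", "Active")
    removal = record.get("RemovalDate")
    if status == "Flagged" and removal and today >= removal:
        return "Hidden"      # step 3: hide with security
    if status == "Hidden" and record.get("BackedUp"):
        return "Deleted"     # step 6: delete, but only after backup (step 5)
    return status            # step 4: otherwise, wait

rec = {"Status": "Flagged", "RemovalDate": date(2024, 1, 1)}
```

Encoding the wait and backup gates explicitly is what keeps "never delete first" honest.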
33. Data Quality – De-dupe
What: Follow a consistent method/process when de-duping, and NEVER deviate.
Why: Duplicates are easy to eliminate, and very expensive to restore should you have made a mistake.
How (main order):
1. Accounts vs. Accounts
2. Contacts within Accounts
3. Contacts between Accounts
4. Accounts vs. Accounts
5. Leads
6. Leads to Contacts
Also: search before create; address correction.
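Step 1 of the order above (Accounts vs. Accounts) usually starts with a normalized match key. This Python sketch shows one crude way to bucket candidate duplicates for steward review; the normalization choices (stripping legal suffixes, dropping "www.") are assumptions, not the deck's method:

```python
from collections import defaultdict

def match_key(account):
    """Build a crude duplicate-match key from name + website domain."""
    name = account.get("Name", "").lower().strip().rstrip(".")
    for suffix in (" inc", " ltd", " llc"):   # strip common legal suffixes
        name = name.removesuffix(suffix)
    domain = account.get("Website", "").lower().removeprefix("www.")
    return (name.strip(), domain)

def group_duplicates(accounts):
    """Bucket accounts by match key; only multi-member buckets are
    candidate duplicate groups for a steward to review."""
    groups = defaultdict(list)
    for acct in accounts:
        groups[match_key(acct)].append(acct)
    return {k: v for k, v in groups.items() if len(v) > 1}
```

Grouping first, then reviewing, is what makes the process repeatable rather than ad hoc, which is the point of never deviating from the order.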
34. Data Quality – Make it Easy
What: Consider how record generation can be made easy and convenient.
Why: If data entry is easy, there is value in entering details, and it supports workflow, people will do it.
How:
• Search before create (DDC API applications)
• Address tools
• Clicktools forms to flatten SFDC record generation
• Experian QAS / Postcode Anywhere
• Workflow to infer values
• Social search
36. Data Quality Best Practices
Tips and Tricks for Improving Data Quality

Issue: Duplicates from multiple systems (like SAP)
• Tip #1: For duplicates in the existing databases, run an analysis and de-dupe project to get to a baseline. DemandTools is recommended as a product to assist on an ongoing basis.
• Tip #2: For duplicates from incoming systems (ERP, etc.), work with those system owners to identify the common key to use for prevention.
• Tip #3: If a lead source is creating duplicates, determine the lead process and the use of the Lead object in conjunction with a baseline de-dupe.

Issue: Need control points to evaluate data accuracy and quality
• Tip #1: Identify the specific fields that are of value to your organization (by object); write reports to ascertain where the data is non-standard or missing, then create validation rules to maximize correct entry going forward.
• Tip #2: Be a formula ninja to help with data quality: http://www.youtube.com/watch?v=r1T767LzrZY#t=2377

Issue: Lack of a record-delete strategy for data quality
• Tip #1: Establish and communicate an archiving policy. It should consider all existing records with and without transactions, and treatment should be defined accordingly based on business processes.
• Tip #2: Auto-delete within 60 days if there is no activity (record retention).

Issue: No good definition of an account today
• Tip #1: Standardize on a definition of an account using a 3rd-party referential data source like D&B.
• Tip #2: Don't try to recreate your ERP in your CRM. ERPs handle many complex relationships; determine what is required in the CRM and port data accordingly.

Issue: Salesforce is a dumping ground for new data
• Tip #1: Leverage the Lead object to filter out the garbage. Potentially do an offline cleanup first, then build out a process to manage it on an ongoing basis.
37. Compliance and Administration Best Practices
Tips and Tricks for Improving CRM Compliance, Security and Adherence

Issue: How to determine and manage centralized and decentralized processes
• Tip #1: Use delegated administration Salesforce functions to provide support for users. Establish a governance team with input from business units that cross geographic boundaries, to enable adequate support in end users' time zones.
• Tip #2: Identify what areas of the business should be managed and supported via business-specific teams, versus the overall management of the org.
• Tip #3: Use centralized support when 3rd-party system integrations are in use; use decentralized support when users can be managed at a ratio of 1 admin to every 400 users.

Issue: Security best practices for compliance and regulatory adherence; profile updates
• Tip #1: Structure your governance plan so that it dictates the process to follow when initially assigning users to profiles, what validation needs to be done when profile updates are necessary, and who should sign off on them from the functional and technical sides.
• Tip #2: Identify what data fields need to be maintained due to regulatory requirements, or hidden from certain profiles/users.

Issue: Compliance of data acquisition and routing processes
• Tip #1: For any data sets that cannot be automatically updated within Salesforce (like D&B information), assign a data steward who can periodically run updates and check for inconsistencies.
• Tip #2: Always include a data source on files that are assigned to users; this will help you identify whether source data is problematic or internal SF actions have updated it (e.g., workflow field updates changing values). Turn on field history tracking only for critical fields.
38. Compliance and Administration Best Practices
Tips and Tricks for Improving CRM Compliance, Security and Adherence

Issue: Data acquisition purchases vs. user-obtained and organizational data
• Tip #1: Put in place a formal process for users to submit alternate data sources to an administrator for uploading to Salesforce.
• Tip #2: Establish the ROI on all purchased data sources first; understand what your business benefits are versus the overall maintenance, care, and feeding of additional data assets. Assign an SME or data steward to each data asset and work with them to establish proper maintenance routines.

Issue: Centralization issues: one group doing all updates and data loads
• Tip #1: Limiting the number of people who can upload data is a best practice. Only certified admins with a clear understanding of the system and the data being considered should be permitted to load data.
• Tip #2: All data loads should be tested and signed off in a sandbox org prior to loading to production.

Issue: Opportunities not created until deals close
• Tip #1: This requires a top-down driven approach. Tie it to compensation. You need to do this to create pipeline visibility and a forecast.
39. Success Metrics Best Practices
Tips and Tricks for Driving Success and User Behavior

Issue: Identifying and defining fields and metrics for accuracy
• Tip #1: Identify where field population enhances organizational reporting; define what you need to report on first, then go back and fill in missing fields with information.
• Tip #2: Use consistent picklists, and run reports for missing information (eliminate 'unknown').
• Tip #3: Create dashboards that can be scheduled for refresh to show progress each month.

Issue: How to deploy a stewardship model without dedicated resources or funding
• Tip #1: Turn your users into stewards. Leverage field tracking to track changes and owners, creating an audit trail for updates and changes to records. Create field requirements on all records to ensure that proper data is entered at the point of creation, avoiding duplicates and shell records.
• Tip #2: Deploy a de-dupe tool and search-before-create to build in data quality tools that prevent user abuse and limit the level of effort required from users.
• Tip #3: Create an archiving strategy to assist with the currency and quality of data. Implement a flagging strategy so that users can denote records they want to keep. If a record is not maintained or updated, it can move to a hidden status prior to archive. Follow steps similar to those outlined in the data quality section.

Issue: How do we motivate the right behavior from sales?
• Tip #1: In many cases sales (and other departments, for that matter) are motivated by 1) recognition, 2) compensation, and 3) impact to the business. All three should be considered when putting a data governance plan together.
• Tip #2: Create a hero board that highlights those who are driving data improvements. Potentially leverage a spiff to drive record completeness and accuracy.
• Tip #3: Link opportunity ownership to data accuracy and completeness. An opportunity requires certain data points for creation, and without an opportunity a contract cannot be created, nor can a rep be compensated.
40. Data Management Best Practices
Tips and Tricks for Improved Data Management

Issue: Hierarchies without restrictions, so you can view accounts with open opportunities
• Tip #1: Establish what fields you wish to have in a hierarchy, e.g. legal, by brand (tradestyle), etc.
• Tip #2: Establish and review account sharing/access rules (e.g. read-only for all users); make sure every user has a manager in SF and that the manager is active.
• Tip #3: Using permission sets and role hierarchies, allow management team members more visibility into entire family trees.

Issue: Blocking account creation for reps
• Tip #1: Allow reps to create "shell" or "prospect" records. Enforce rules that require a minimum amount of info for the account to become a "customer", or allow opportunities to be created/closed on the account.
• Tip #2: A rep requests a record to be created, and the request routes to the stewardship team. They create the account once vetted, or use a dupe blocker with DDC on insert.
• Tip #3: Use a dupe blocker with Data.com on insert to prevent dupes from being created, and drive user behavior toward importing records from Data.com rather than manual creation.

Issue: Segmentation: governments, affiliates, hierarchies; non-standard categorization; routing, analysis and validation (territory assignments)
• Tip #1: Apply consistency across the board with segmentation rules; make fields required if they are needed for segmentation purposes.
• Tip #2: If territory alignment changes frequently, consider using territory management. This will give you the ability to move accounts with related items, including open opportunities, to new users.
41. Data Management Best Practices
Tips and Tricks for Improved Data Management
Data Management Issue Best Practice Tip #1 Best Practice Tip #2
Issue: Data performance; data volume management
and handling of data; Eloqua activities (email
views being logged as activities)
Tip #1: Establish effective governance to identify and put in
place clean-up policies, i.e. deletion routines when
there is no activity on an account or contact within a
certain period of time (flag, hide, delete); conduct
weekly (or, for large orgs, monthly) data exports to
quickly identify records that should be removed. Flag
records with a potential removal date and status. Then,
once that date has passed, hide the records (workflow
field update to change record type), and eventually
delete them after an org export has been done
Tip #2: For Eloqua and marketing system integrations, keep
nurturing and drip activities in those systems and
only pass qualified records back to Salesforce when
thresholds are met
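The flag → hide → delete routine in Tip #1 can be sketched as a small state machine. This is a hypothetical illustration; the field names, the inactivity window, and the 90-day grace period are all assumptions.

```python
from datetime import date, timedelta

INACTIVITY_WINDOW = timedelta(days=365)  # assumed retention policy
GRACE_PERIOD = timedelta(days=90)        # assumed flag-to-hide delay

def advance_lifecycle(record, today):
    """Move an inactive record one step along flag -> hidden -> deleted.
    Run periodically (e.g. with the weekly/monthly export cycle)."""
    if record["status"] == "active":
        if today - record["last_activity"] > INACTIVITY_WINDOW:
            record["status"] = "flagged"
            record["removal_date"] = today + GRACE_PERIOD
    elif record["status"] == "flagged" and today >= record["removal_date"]:
        # e.g. a workflow field update changes the record type
        record["status"] = "hidden"
    elif record["status"] == "hidden":
        # only after the org export has captured the record
        record["status"] = "deleted"
    return record
```

Staging deletion this way keeps a recoverable copy (via the export) before anything is permanently removed.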
Issue: Lead source and categories
Tip #1: Define and segregate leads by inbound source; every
lead should have a source so you can track metrics on
results when converted
Tip #2: Use lead scoring to help with routing and
governance; i.e. if critical fields are missing,
update lead assignment rules to route sketchy leads
to an inbound data steward queue first, then
supplement them before sending to sales users
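Tip #2's scoring-based routing could look like the sketch below. The critical-field list and queue names are illustrative assumptions, not Salesforce configuration.

```python
# Assumed critical fields; a real org would define its own list.
CRITICAL_FIELDS = ["email", "company", "lead_source"]

def score_lead(lead):
    """Score 0-100 by completeness of critical fields."""
    present = sum(1 for f in CRITICAL_FIELDS if lead.get(f))
    return int(100 * present / len(CRITICAL_FIELDS))

def route_lead(lead, threshold=100):
    """Complete leads go straight to sales; incomplete ("sketchy")
    leads go to the data steward queue to be supplemented first."""
    if score_lead(lead) < threshold:
        return "inbound_data_steward_queue"
    return "sales_queue"
```

The same score can also feed reporting on which sources deliver the most complete leads.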
Issue: Not using leads in Salesforce
Tip #1: Using leads allows your company to maintain two
separate lists, one for prospective customers and one
for existing customers. You can store your prospects as
leads, and then once a lead becomes qualified, you can
convert it to an account, contact, and, optionally, an
opportunity
Tip #2: Leads are especially useful if your company has two
separate teams, one that handles lead generation
and mass marketing and one that handles sales.
The lead generation team can concentrate their
work on the Leads tab, and the opportunity team
can use the Account, Contact, and Opportunity tabs
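The lead-conversion flow in Tip #1 can be sketched minimally. The record shapes here are illustrative dictionaries, not Salesforce's actual data model.

```python
def convert_lead(lead, create_opportunity=False):
    """Convert a qualified lead into an account, a contact, and,
    optionally, an opportunity, mirroring the Salesforce pattern."""
    account = {"name": lead["company"]}
    contact = {"name": lead["name"], "email": lead["email"],
               "account": account["name"]}
    opportunity = None
    if create_opportunity:
        # Stage name is an illustrative assumption
        opportunity = {"name": lead["company"] + " - New Business",
                       "account": account["name"], "stage": "Prospecting"}
    return account, contact, opportunity
```

Keeping prospects in a separate lead list until this conversion step is what keeps the customer list clean.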
42. Data Management Best Practices
Tips and Tricks for Improved Data Management
Data Management Issue Best Practice Tip #1 Best Practice Tip #2
Issue: How to build out hierarchies
Tip #1: D&B upward linkage is a very effective way to
understand account family structures. This
understanding enables effective territory
management and cross-sell/upsell opportunities, as well
as roll-up reporting to understand total
sales/exposure to a particular business
Tip #2: Define and determine the role of a legal hierarchy
vs. an internal, company-defined hierarchy. Some
customers need to understand both and reflect legal
ownership, as well as the unique view of the account
from the account management perspective
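The roll-up reporting mentioned in Tip #1 is a straightforward recursion over the account family tree. A minimal sketch, assuming parent→children links have already been built (e.g. from D&B upward linkage):

```python
def rollup_sales(account_id, children, sales):
    """Sum sales for an account and every descendant in its
    family tree, giving total exposure to that business."""
    total = sales.get(account_id, 0)
    for child in children.get(account_id, []):
        total += rollup_sales(child, children, sales)
    return total
```

The same traversal supports territory assignment and cross-sell analysis across the whole family.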
Issue: Marketing automation integration
Tip #1: Marketing automation integration can be handled in
two main ways: 1) bring all records into SFDC,
clean and enrich them, then push applicable data back to
the MA tool
Tip #2: 2) If only a subset of records is desired in Salesforce,
Data.com APIs can be utilized to enrich data within
the MA tool. Note that licensing restrictions may apply. In
both cases enriched data is critical for lead routing,
scoring and reporting
Issue: Need a simplified view of the customer; too many related
objects; inline dashboards might be good
Tip #1: Ongoing system governance and system audits to
prohibit and weed out unneeded configuration and
technical debt are key to a successful system
Issue: MDM team that owns the golden record, which is not visible
via Salesforce views today
Tip #1: Record visibility is determined via Salesforce sharing
and role hierarchy set-up. Any integration should
keep these rules in mind
Issue: Each platform owner owns data; 34 instances of
Salesforce
Tip #1: To enable cross-platform visibility, a CDM/MDM
strategy should be considered. Salesforce-to-
Salesforce integrations as well as consolidation orgs
are considered for these needs. Clean, keyed data is
the foundation that enables such cross-org visibility
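The "clean, keyed data" point above is the crux of any multi-org consolidation: records can only be merged into a golden view if they share a reliable match key. A hypothetical sketch, where the key field and the first-non-empty survivorship rule are assumptions:

```python
def consolidate(orgs):
    """Merge account records from many orgs into one golden view,
    matched on a shared key (e.g. a D-U-N-S-style identifier).

    orgs: {org_name: [record_dict, ...]} where each record carries
    a "match_key" field. Returns {match_key: golden_entry}."""
    golden = {}
    for org_name, records in orgs.items():
        for rec in records:
            entry = golden.setdefault(rec["match_key"],
                                      {"sources": [], "fields": {}})
            entry["sources"].append(org_name)
            # Survivorship rule (assumed): first non-empty value wins
            for field, value in rec.items():
                if field != "match_key" and value:
                    entry["fields"].setdefault(field, value)
    return golden
```

Without that shared key, cross-org matching degrades to fuzzy name matching, which is far less reliable at 34-org scale.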