If you’ve migrated donor data from one CRM to another, you have no doubt faced a lot of difficult decisions. In fact, our donor data migration clients often express surprise at the number of decisions they have to make. In this post, we discuss our list of Top 10 Tough Donor Data Migration Decisions, drawn from our webinar hosted by Bloomerang on August 20, 2014, and presented by Gary Carr, CEO of Third Sector Labs.
10 Decisions You Will Face With Any Donor Data Migration Project – Bloomerang
Donor data migration to a new CRM can be downright frustrating for some nonprofits. Planning is critical. More importantly, however, you need to prepare for the inevitable decisions you will have to make during the process.
In this webinar, we will examine 10 decisions for which every nonprofit needs to be prepared in order to experience a successful transition to a new CRM.
Learning Objectives:
Understand the CRM data migration process.
Identify the key decisions that will be made along the way.
Discuss pros and cons of decision options.
Take away from the event a sense of preparedness and control over your next data migration project.
Be able to apply what you’ve learned to other data migration projects at your organization.
Artificial Intelligence Expert Session Webinar ibi
Tom Redman of Data Quality Solutions and Information Builders' CMO Michael Corcoran share the latest on artificial intelligence trends in this webinar.
Growth, Engagement & Search Metrics: Snake Oil or North Stars – June Andrews
Talk at Social Media & Web Analytics
LinkedIn's homepage contains content from over 40 product areas and has evolved over hundreds of experiments. For modern websites this is not an unusual phenomenon. To parallelize website development and work in harmony, product teams rely on two guidance systems: organizational cohesion and analytical feedback. Our focus is analytics and, in particular, metrics. Unfortunately, not all metrics are created equal. Common metrics such as mean average precision and engagement stickiness have massive downsides if used incorrectly. Here we explore criteria to align optimizing metrics with improving user experience and reaching company goals.
Replication in Data Science - A Dance Between Data Science & Machine Learning... – June Andrews
We use Iterative Supervised Clustering as a simple building block for exploring Pinterest's content. But simplicity can unlock great power, and with this building block we show the shocking result of how hard it is to replicate data science conclusions. This leads us to ask: when is data science a house of cards?
Predictive Analytics - How to get stuff out of your Crystal Ball – DATAVERSITY
Everyone wants to leverage data. The optimal implementation of analytics is an organization-wide set of capabilities. These are called advantageous organizational analytic capabilities in that a clear ROI is demonstrable from these efforts. It turns out that there are a number of prerequisites to advantageous organizational analytics. These include:
Adopting a crawl, walk, run strategy
Understanding current and potential organizational maturity and corresponding capabilities
Achieving an appropriate technology/human capability balance
Implementing useful IT systems development practices
Installing necessary non-IT leadership
This webinar will explore these and other topics using examples drawn from DOD, healthcare researchers, and donation center operations.
Your AI and ML Projects Are Failing – Key Steps to Get Them Back on Track – Precisely
With recent studies indicating that 80% of AI and machine learning projects fail due to data quality issues, it’s critical to think holistically about the problem. This is not a simple topic: data quality issues can occur anywhere from the start of a project through to model implementation and usage.
View this webinar on-demand, where we start with four foundational data steps to get our AI and ML projects grounded and underway, specifically:
• Framing the business problem
• Identifying the “right” data to collect and work with
• Establishing baselines of data quality through data profiling and business rules
• Assessing fitness for purpose for training and evaluating the subsequent models and algorithms
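As a rough illustration of the third step, a data quality baseline can be expressed as the pass rate of named business rules over the records. The donor fields and rules below are hypothetical assumptions for the sketch, not content from the webinar:

```python
# Minimal sketch of a data quality baseline: the pass rate of each named
# business rule over the records. Fields and rules are hypothetical.

donors = [
    {"id": 1, "email": "a@example.org", "last_gift": 50.0},
    {"id": 2, "email": "", "last_gift": -10.0},   # violates both rules
    {"id": 3, "email": "c@example.org", "last_gift": 25.0},
]

# Business rules expressed as named predicates over a record.
rules = {
    "email_present": lambda r: bool(r["email"]),
    "gift_non_negative": lambda r: r["last_gift"] >= 0,
}

def profile(records, rules):
    """Return each rule's pass rate -- the baseline to track over time."""
    return {name: sum(rule(r) for r in records) / len(records)
            for name, rule in rules.items()}

baseline = profile(donors, rules)
print(baseline)
```

Re-running the same profile after each cleanup pass (or after the migration) shows whether quality is actually improving against the baseline.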
Data-Ed Online: Emerging Trends in Data Jobs – DATAVERSITY
Data is the lifeblood of just about every organization and functional area today. As businesses struggle to come to grips with the data flood, it is even more critical to focus on data as an asset that directly supports business imperatives, as other organizational assets do. Organizations across most industries attempt to address data opportunities (e.g. Big Data) and data challenges (e.g. data quality) to enhance business unit performance. Unfortunately, the results of these efforts frequently fall far below expectations due to haphazard approaches. Overall, poor organizational data management capabilities are the root cause of many of these failures. This webinar covers three lessons (illustrated by examples), which will help you to establish realistic plans and expectations, and help demonstrate the value of such actions to both internal and external decision makers.
Takeaways:
Organizational thinking must change: Value-added data management practices must be considered and included as a vital part of your business strategy.
Walk before you run with data focused initiatives: Understand and implement necessary data management prerequisites as a foundation, then build upon that foundation.
There are no silver bullets: Tools alone are not the answer. Specifying business requirements, business practices and data governance are almost always more important.
JDO 2019: Data Science for Developers - Matthew Renze – PROIDEA
Data science is revolutionizing the world around us. We’re incorporating artificial intelligence, machine learning, and data-driven decision making into all aspects of business. However, many software developers have yet to learn how to leverage these practices to create better software. In this presentation, we’ll learn how expert developers are using data science to create better software. We’ll learn how to use data analytics, machine learning, and anticipatory design to create more intelligent software. In addition, we’ll learn how to use data from our dev-ops pipeline to improve our software development practices.
Data quality testing – a quick checklist to measure and improve data quality – Javeria Gauhar
Don't wait for a data migration event to test your data quality. Perform data quality tests now before it gets too late. Here's everything you need to know!
https://dataladder.com/data-quality-test-checklist/
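In that spirit, a pre-migration data quality test can be as simple as a handful of named checks run against an export of the source system. The records and checks below are illustrative assumptions, not items from the checklist itself:

```python
import re

# Illustrative pre-migration data quality tests over hypothetical records:
# run checks like these on the source system's export well before the
# migration, so there is time to fix what they catch.

records = [
    {"id": 1, "email": "a@example.org", "zip": "46204"},
    {"id": 2, "email": "b@example.org", "zip": "02139"},
]

def ids_unique(rows):
    ids = [r["id"] for r in rows]
    return len(ids) == len(set(ids))

def emails_well_formed(rows):
    pattern = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    return all(pattern.match(r["email"]) for r in rows)

def zips_five_digits(rows):
    return all(re.fullmatch(r"\d{5}", r["zip"]) for r in rows)

checks = {
    "ids_unique": ids_unique,
    "emails_well_formed": emails_well_formed,
    "zips_five_digits": zips_five_digits,
}
failures = [name for name, check in checks.items() if not check(records)]
print(failures)  # an empty list means every check passed
```

Keeping each check small and named makes the failure list readable by non-technical stakeholders, which matters when the fixes have to happen in the source system.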
Data Quality: The Data Science struggle nobody mentions - Data Science MeetUp... – University of Twente
Presentation about data quality at the second Data Science MeetUp Twente https://www.meetup.com/Data-Meetup-Twente/events/241545781/ on "Responsible Data Analytics", 7 Sep 2017.
Slides: How to Avoid the 10 Big Data Analytics Blunders — Best Practices for ... – DATAVERSITY
As a steward for your enterprise’s data and digital transformation initiatives, you’re tasked with making the right choice. But before you can make those decisions, it’s important to understand what not to do when planning for your organization’s big data initiatives.
Michael Stonebraker shares the top 10 big data blunders that he has witnessed in the last decade or so. As a pioneer of database research and technology for more than 40 years, Michael understands the mistakes enterprises often made and knows how to correct and avoid them. By learning about the major blunders, you’ll know how best to future-proof your big data management and digital transformation needs. Common blunders include problems from not planning on moving everything to the cloud to believing that a data warehouse will solve all your problems to succumbing to the “innovator’s dilemma.” To illustrate the blunders, he shares a variety of corrective tips, strategies, and real-world examples.
DataEd Slides: Data Strategy – Plans Are Useless but Planning Is Invaluable – DATAVERSITY
Too often we hear the question – can you help me with a data strategy? Unfortunately, for most, this is the wrong request because it focuses on its least valuable aspect. The more useful request is – can you help me apply data strategically in support of strategy? Yes, at early maturity phases, the process is more important than the product! Trying to write a good (much less perfect) data strategy on the first attempt is generally not productive – particularly given the widespread acceptance of Mike Tyson’s truism: “Everybody has a plan until they get punched in the face.” By refocusing lesson learning on crawl, walk, run approaches to using data strategically, data is able to keep up with agile, evolving strategies. This approach will contribute more to three primary organizational data goals than other efforts. Learn how improving:
• Your organization’s data
• The way your people use data
• The way your people use data to achieve your organizational strategy
contributes more than predetermined plans. Data are your sole non-depletable, non-degradable, durable strategic assets, and they are pervasively shared across every organizational area. Addressing existing challenges pervasively includes overcoming necessary but insufficient prerequisites and developing a disciplined, repeatable means of improving business objectives. This process (based on the theory of constraints) is where the strategic data work really occurs, as organizations identify prioritized areas where better assets, literacy, and support (data strategy components) can help an organization better achieve specific strategic objectives. Then the process becomes lather, rinse, and repeat. Several complementary concepts are covered including:
• A cohesive argument for why Data Strategy is necessary for effective Data Governance
• An overview of prerequisites for effective strategic use of Data Strategy, as well as common pitfalls
• A repeatable process for identifying and removing data constraints
• The importance of balancing business operation and innovation
Join our #DataTalk on Thursdays at 5 p.m. ET. This week, we tweeted with Dr. Michael Wu, the Chief Scientist at Lithium, where he applies data-driven methodologies to investigate the complex dynamics of the social web.
Michael works with big data and has developed many predictive and prescriptive social analytics with actionable insights. His R&D earned him recognition as a 2010 Influential Leader by CRM Magazine.
You can see all tweets and resources here:
http://www.experian.com/blogs/news/about/data-scientists/
Every company has legacy applications.
And every framework, language, and technology will eventually die or become legacy.
How can we modernize them without rewriting everything by hand?
Is automatic migration of source code realistic for large applications?
Can Eclipse technologies help with these tasks?
This talk will answer these questions through a real use case executed for a French ministry: the migration of a large-scale application (5 million lines of code) from Forte to Java.
We will explain our project process for guaranteeing 99.9999% correctness of the migrated code and how we created a custom migration factory with:
EMF to represent source code as a structured model
CDO to store several gigabytes of data with very good performance
Agility to reverse existing source code
GMF and ATL for Software Mining and cartography
Acceleo for translating
The talk will also show demos of this migration tooling applied to other technologies and other needs of legacy analysis:
ADA to C++
VB to DotNet
OracleForms to JavaEE
Natural Sonar (quality checker)
Cobol cartography
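The stack above is Eclipse-specific, but the core pattern (parse legacy source into a structured model, then generate target code from that model) can be sketched in miniature. In this toy Python sketch, the ast module stands in for EMF's source model and a string emitter stands in for Acceleo; everything here is illustrative, not the talk's actual tooling:

```python
import ast

# Toy analogue of a model-driven migration factory: parse "legacy" source
# into a structured model (Python's ast stands in for EMF), then translate
# the model into the target language (a stand-in for Acceleo templates).

def to_java_expr(node):
    """Translate a tiny subset of expression nodes into Java syntax."""
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
        return f"{to_java_expr(node.left)} + {to_java_expr(node.right)}"
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Constant):
        return repr(node.value)
    raise NotImplementedError(type(node).__name__)

def migrate_assignment(src):
    """Turn 'x = a + 1' into 'int x = a + 1;' via the structured model.
    (The 'int' type is hard-coded here; a real factory would infer it.)"""
    stmt = ast.parse(src).body[0]          # the model of the source code
    target = stmt.targets[0].id
    return f"int {target} = {to_java_expr(stmt.value)};"

print(migrate_assignment("x = a + 1"))     # int x = a + 1;
```

The point of the model layer is that the translator works on structure rather than text, which is what makes bulk, repeatable migration of millions of lines feasible.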
Lift Your Legacy UNIX Applications & Databases into the Cloud – Fadi Semaan
Unlock efficiency and innovation while reducing costs. In this presentation we will address:
1) Legacy pain overview
2) Dell application modernization services
3) UNIX to Linux migration
4) Case studies
Presented by Rich Cronheim, Executive Director, Dell Application Modernization Services
This session provides an overview of how organizations can migrate workloads to the AWS cloud at scale. We will go through available migration frameworks and best practices, with common use case examples.
Migrating large fleets of legacy applications to AWS cloud infrastructure requires careful planning, since each phase needs to balance risk tolerance against the speed of migration.
Through participation in many large-scale migration engagements with customers, AWS Professional Services has developed a set of successful best practices, tools, and techniques that help migration factories optimize speed of delivery and success rate. In this session, we cover the complete lifecycle of an application portfolio migration with special emphasis on how to organize and conduct the assessment and how to identify elements that can benefit from cloud architecture.
UNIX to SUSE Linux Enterprise Server: Tools and Tips for a Successful Migration – Novell
Tired of expensive, proprietary UNIX systems? Today’s IT professionals are making strategic investments in Linux, preferring its open architecture and low cost to the proprietary—and very expensive—UNIX platform. This session will target all those who are looking to move their software application from UNIX to SUSE Linux Enterprise. We will cover the programming languages, developer tools, and programming APIs that are available for many software applications types. We will also discuss how to plan and structure a porting project for SUSE Linux Enterprise Server.
Where to Begin? Application Portfolio Migration - Miha Kralj, Principal Consultant, AWS
Application portfolio assessment is a technique used at the beginning of the enterprise application migration process. It helps the migration team gather, analyze, and understand their app portfolio before deciding on priorities and sequences of application migration. This session will present the app assessment process, the most common migration strategies and tools, and the placement of application portfolio migration in a complete IT transformation process.
Migrating Enterprise Applications to AWS: Best Practices & Techniques (ENT303...) – Amazon Web Services
This session discusses strategies, tools, and techniques for migrating enterprise software systems to AWS. We consider applications like Oracle eBusiness Suite, SAP, PeopleSoft, JD Edwards, and Siebel. These applications are complex by themselves; they are frequently customized; they have many touch points on other systems in the enterprise; and they often have large associated databases. Nevertheless, running enterprise applications in the cloud affords powerful benefits. We identify success factors and best practices.
Adcieo recently co-hosted a data management webinar with Third Sector Labs. The topic, "Why don't you have a data management plan?", focused on why you need, and how to create, a nonprofit data management plan.
Why don't you have a data management plan? – Brandon Fix
A thought-provoking webinar where we looked at:
- How bad data traps us into putting out data fires reactively
- Why every organization needs a proactive data management plan
- The difference between a data map and a data plan, and why you need both
- What are the characteristics of a good data management plan
Speakers included:
Debbie Snyder - Adcieo, Vice President, Sales and Marketing
Debbie is responsible for leading sales and marketing for Adcieo and ensuring that Adcieo’s digital expertise, strategies, and tools are positioned to create solutions for our nonprofit clients. Debbie is focused on improving financial performance for Adcieo clients by driving faster and stronger constituent engagement across all channels, thus increasing the overall value of constituent relationships.
Gary Carr is the President and CEO of Third Sector Labs, a company challenging nonprofit organizations to re-think their data practices. Believing that today's brand relationships begin and end with data, Gary's goal is to help nonprofits succeed through data best practices. Understanding the challenges of good data management becomes the key to realizing opportunities to succeed through data.
Healthcare Best Practices in Data Warehousing & Analytics – Dale Sanders
This is from a class lecture that I gave in 2005. Rather dated, but 95% of the content is still very relevant today, which is a bit unfortunate: it's an indication of how little we've progressed in the healthcare domain.
Top 7 Reasons why Maintenance Work Orders are Closed Out Accurately – Ricky Smith, CMRP, CMRT
Closing out work orders accurately is critical for leadership to make the “right decisions at the right time with accurate data” and it can only occur if work orders are “Closed with the Right Information/Data”.
If metrics and key performance indicators are so important, where are people pulling the data from when work orders are closed into that dark hole called the CMMS or EAM without the right data on them?
Without good data you are lost and probably are making decisions based on passion and not facts.
Anyone that works with data downstream in an organization has seen things go...wrong, while upstream managers and business leaders are being held accountable. Whether it's a failure in process, or something technically goes wrong, working with data is not always easy. What happened? How can we prevent it from happening again? What's next?
This talk, given at the Portland Data Science Group on October 27, 2016, uncovers 4 common foibles of working with organizational data.
Big Data - it's the big buzz. But is it dead on arrival?
In this presentation Daragh O Brien looks at the history of information management, the challenges of data quality and governance, and the implications for big data...
Good data is like good water: best served fresh, and ideally well-filtered. Data management strategies can produce tremendous procedural improvements and increased profit margins across the board, but only if the data being managed is of high quality. Determining how data quality should be engineered provides a useful framework for applying data quality management effectively in support of business strategy. This in turn allows organizations to identify business problems more quickly, distinguish structural defects from practice-oriented defects in data management, and proactively prevent future issues. Organizations must realize what it means to utilize data quality engineering in support of business strategy. This webinar will illustrate how organizations with chronic business challenges can often trace the root of the problem to poor data quality.
4 Steps to Creating an Effective Sales DashboardDomo
Sales executives deal with a daily barrage of data - forecast numbers, pipeline velocity, lead volume, territory effectiveness, win/loss reports. The biggest challenge is figuring out how to consume the data and translate it into better decision-making. How is this accomplished? The answer for an increasing number of successful sales leaders is an effective sales dashboard.
Discussion Topics include:
· The Explosion of Sales and Marketing Data
· Key Metrics Sales Leaders are Tracking
· Lessons Learned from a Sales Veteran
· New Sales Dashboard Technologies
4 Barriers to creating predictive talent analytics and how to overcome themMartin Sutherland
This presentation highlights the 4 big barriers to creating a talent analytics platform. HR systems have, and still are, mostly designed from the inside (HR) out (employees). This means they do not have a consumer mindset that engages people and provides a steady stream of relevant people data to analyze.
If employees get no value from providing an organization with data, they stop doing it. No data=No analytics. By giving every employee a self-directed career management tool that helps them to identify their strengths, identify the strengths they need to be successful and provides them with personalized advice on how to develop new strengths, the organization maintains a steady stream of relevant talent data.
This presentation was delivered at the Chief Human Resource Officers conference in Cape Town 2015. In the 20 minutes it took to do the presentation, we were able to provide a delegate from a large food manufacturer with a live site.
From Asset to Impact - Presentation to ICS Data Protection Conference 2011Castlebridge Associates
This is a presentation I delivered to the Irish Computer Society Data Protection Conference in February 2011 and again on a webinar for dataqualitypro.com in March 2011.
It looks (for what I believe was the first time) at the relationship between Information Quality and Data Governance principles and practices and the objectives of Data Protection/Privacy compliance. it includes my first version of the mapping of the 8 Data Protection principles to the POSMAD Information Life Cycle referred to by McGilvray and others in the IQ/DQ fields.
As part of your fundraising campaigns and online engagement, you likely collect many metrics and data points. But do you take the time to reflect on this data and use it to improve for next time? In this session, we’ll discuss metrics you can collect, share each other’s best practices for data collection processes, and demo dashboard tools that will help you see the big picture.
You've heard the news, Data Science is the cool new career opportunity sweeping the world. Come learn from Thinkful Mentors all about this new and exciting industry.
Delta Analytics is a 501(c)3 non-profit in the Bay Area. We believe that data is powerful, and that anybody should be able to harness it for change. Our teaching fellows partner with schools and organizations worldwide to work with students excited about the power of data to do good.
Welcome to the course! These modules will teach you the fundamental building blocks and the theory necessary to be a responsible machine learning practitioner in your own community. Each module focuses on accessible examples designed to teach you about good practices and the powerful (yet surprisingly simple) algorithms we use to model data.
To learn more about our mission or provide feedback, take a look at www.deltanalytics.org.
Similar to 10 tough decisions donor data migration decisions (Webinar hosted by Bloomerang, presented by Gary Carr, Third Sector Labs) (20)
Jennifer Schaus and Associates hosts a complimentary webinar series on The FAR in 2024. Join the webinars on Wednesdays and Fridays at noon, eastern.
Recordings are on YouTube and the company website.
https://www.youtube.com/@jenniferschaus/videos
ZGB - The Role of Generative AI in Government transformation.pdfSaeed Al Dhaheri
This keynote was presented during the the 7th edition of the UAE Hackathon 2024. It highlights the role of AI and Generative AI in addressing government transformation to achieve zero government bureaucracy
Understanding the Challenges of Street ChildrenSERUDS INDIA
By raising awareness, providing support, advocating for change, and offering assistance to children in need, individuals can play a crucial role in improving the lives of street children and helping them realize their full potential
Donate Us
https://serudsindia.org/how-individuals-can-support-street-children-in-india/
#donatefororphan, #donateforhomelesschildren, #childeducation, #ngochildeducation, #donateforeducation, #donationforchildeducation, #sponsorforpoorchild, #sponsororphanage #sponsororphanchild, #donation, #education, #charity, #educationforchild, #seruds, #kurnool, #joyhome
What is the point of small housing associations.pptxPaul Smith
Given the small scale of housing associations and their relative high cost per home what is the point of them and how do we justify their continued existance
Presentation by Jared Jageler, David Adler, Noelia Duchovny, and Evan Herrnstadt, analysts in CBO’s Microeconomic Studies and Health Analysis Divisions, at the Association of Environmental and Resource Economists Summer Conference.
This session provides a comprehensive overview of the latest updates to the Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards (commonly known as the Uniform Guidance) outlined in the 2 CFR 200.
With a focus on the 2024 revisions issued by the Office of Management and Budget (OMB), participants will gain insight into the key changes affecting federal grant recipients. The session will delve into critical regulatory updates, providing attendees with the knowledge and tools necessary to navigate and comply with the evolving landscape of federal grant management.
Learning Objectives:
- Understand the rationale behind the 2024 updates to the Uniform Guidance outlined in 2 CFR 200, and their implications for federal grant recipients.
- Identify the key changes and revisions introduced by the Office of Management and Budget (OMB) in the 2024 edition of 2 CFR 200.
- Gain proficiency in applying the updated regulations to ensure compliance with federal grant requirements and avoid potential audit findings.
- Develop strategies for effectively implementing the new guidelines within the grant management processes of their respective organizations, fostering efficiency and accountability in federal grant administration.
2. Our Agenda
Please participate in our online poll while we get organized for today's event.
1. Overview
2. Some ground rules
3. Data migration - the process, the plan
4. 10 unavoidable decisions - and what to do about them
5. Takeaways and Q&A
3. Nonprofit Data Services
Founded in 2013 by professionals with 20+ years of technology and data experience with Fortune 500 companies, the federal government, and nonprofits.
Offices in the Washington, DC and Seattle, WA metro areas.
www.thirdsectorlabs.com
Level 1: Assessments and cleaning
Level 2: Data management, enrichment, migration
Level 3: Warehousing, mining, visualization
Gary Carr
President, Co-founder
gcarr@thirdsectorlabs.com
linkedin.com/in/gpfcarr
5. 10 Decisions you will face in any donor data migration
"Decisions" as in "unavoidable": no decision is still a decision.
Donor data is highly degradable … just like people's lives.
And there is always risk when you move something.
6. Data confound us … why?
"It is a capital mistake to theorize before one has data."
• Sherlock Holmes
"Data is the new oil."
• Attributed to many people
"Data is not the new oil, but instead a new kind of resource entirely."
• Jer Thorp, in a Harvard Business Review article
Confound (kon-FOUND), v.
- To perplex or amaze
- To throw into confusion or disorder
7. Here's the heart of the problem …
"Personally, the NSA collecting data on me freaks me out. And I'm from the generation that wants to put a GPS in their kids so I always know where they are."
• Joss Whedon, screenwriter, director
We are feeling overwhelmed … big data = big confusion.
What data do we need … and what can we ignore?
8. Answering this question …
"What donor data do we need … and what can we ignore?"
… sums up the purpose of today's webinar.
9. You are here today because …
1. You are in the midst of a CRM migration and you are looking for insights
2. You have a CRM migration coming up
3. You have completed a CRM data migration recently and you are still wrestling with some problems
4. Data inspires you!
– Then you must want a job with Third Sector Labs
10. Let's set some ground rules
"Never tear down a bridge before you know why it was built. It may be your only means of retreat."
- Seasoned general
- Successful technologist
11. Our data migration ground rules
1. Your donor relationships depend on data - all of them. Therefore you need your donor data to be as "complete" as possible.
2. "Complete" = what you will actually use.
3. Your shiny new CRM represents your fundraising future, NOT your past.
4. Not making a decision is still making a decision.
5. All data migrations start with an understanding of the process, and they require a plan.
15. The technical process … really
1. ANALYSIS
2. MAPPING
3. DATA EXTRACTION
4. Clean now or later?
5. Parse now or later?
6. NEW DATABASE CONFIGURATION
7. Test file
8. Re-configure database
9. CREATE DATA IMPORT FILES
10. IMPORT
11. Test
12. Re-import
13. Test
14. Remaining cleaning, parsing
15. Create archives
(The steps in capitals are the ones most people focus on.)
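The iterate-and-test portion of the process above (the test file and the import / test / re-import cycle in steps 7 and 10-13) can be sketched as a simple loop. Everything here is illustrative: `import_fn` and `validate_fn` are hypothetical stand-ins for whatever import tooling and checks your CRM provides.

```python
# Illustrative sketch of the import / test / re-import cycle.
# import_fn and validate_fn are placeholders, not real CRM APIs.
def run_test_import(records, import_fn, validate_fn, max_rounds=3):
    """Import a test file, validate it, and repeat until it comes back clean."""
    for attempt in range(1, max_rounds + 1):
        import_fn(records)
        problems = validate_fn(records)
        if not problems:
            return attempt  # clean import on this round
    raise RuntimeError(f"still failing after {max_rounds} rounds: {problems}")

# Toy usage: a stand-in "import" that just collects records,
# and a stand-in validator that reports no problems.
imported = []
rounds = run_test_import(
    [{"l_name": "Abrams"}],
    import_fn=imported.extend,
    validate_fn=lambda recs: [],
)
```

The point of the sketch is the shape of the loop, not the placeholders: testing is not a one-time step, and the plan should budget for several rounds.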
16. Creating a plan
Actually, your data experts will build the plan. You want to plan ahead, be prepared … and ask better questions.
Start with a checklist. Here's one from the Third Sector Labs website:
http://3rdsectorlabs.com/resources/data-migration-checklist/
21. Correct answer
"Yes!"
Why? Without policies and standards, you won't be able to make the necessary decisions to complete your data migration. There will be too many unanswered questions.
22. Examples
1. Purpose
– For what purposes do we store donor / constituent data?
– What defines a "complete" donor record?
2. Processes
– What are our processes for data gathering / input?
– How frequently (and on what schedule) will we clean / update / enrich our donor data?
3. Storage
– How long do we store old records?
– When does a prospect stop being a prospect and just become 'bad data'?
– How many instances of an address or phone # or email do we store?
4. Security
– What are our data security standards?
5. Other … compliance? Systems integration?
23. #2
How many years of donor data do we migrate?
24. Wrong answer
The data hoarder in us all says: "Bring it all!"
25. Correct answer
(Answering a question with a question.) When was the last time you logged into your CRM and studied donors or gifts older than 3 years?
"Start with 3 years."
Justify anything else with specific use cases … not fear of losing data.
Archive the rest.
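The "start with 3 years" rule amounts to a date cutoff applied before the import. A minimal sketch, assuming gift records with hypothetical `donor`, `date`, and `amount` fields:

```python
# Sketch: split gift records into "migrate" vs "archive" using a 3-year window.
# Field names and the reference date are illustrative only.
from datetime import date, timedelta

def within_migration_window(gift_date, years=3, today=None):
    """Return True if a gift falls inside the migration window."""
    today = today or date.today()
    return gift_date >= today - timedelta(days=365 * years)

gifts = [
    {"donor": "Abrams", "date": date(2014, 3, 1), "amount": 500},
    {"donor": "Randel", "date": date(2009, 6, 15), "amount": 250},
]

as_of = date(2014, 8, 20)  # using the webinar date as the cutoff reference
migrate = [g for g in gifts if within_migration_window(g["date"], today=as_of)]
archive = [g for g in gifts if not within_migration_window(g["date"], today=as_of)]
```

Anything that fails the cutoff goes to the archive file rather than being deleted, which keeps the "fear of losing data" objection off the table.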
26. #3
What about lapsed donors - do we import them too?
27. Hint
• This is a communications / fundraising problem.
• NOT a data problem.
28. Correct answer: "It depends"
Option A: "Segment your lapsed donors upon import."
• For newer, retention-based CRMs like Bloomerang
Why? You need a separate outreach strategy for lapsed donors:
- 2 or 3 communications
- New, targeted messaging
- Anyone responding goes into the new CRM
- Purge non-respondents
29. Correct answer: "It depends"
Option B: "Do not import lapsed donors."
• If you can use your old system to manage the targeted outreach campaign mentioned on the previous slide
Why? The majority of your lapsed donors are probably lost.
- Don't muck up your new CRM engine with a bunch of gunk
- Only bring over the lapsed donors that you re-engage
31. Wrong answer
• "Keep trying … there's got to be a way to get it all in there."
• "But it all fits in the old system!"
32. Correct answer
"Archive it."
• No, not in an actual file cabinet … Microsoft Excel, Access … something simple.
Why?
• Legacy data may be poorly formatted
• Corrupt
• Doesn't fit the new CRM data structure
• Doesn't fit with new data governance policies
• You want to be able to get to it later … if you need it
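"Something simple" can be as plain as a CSV file, which Excel and Access both open directly. A minimal sketch of that kind of archive step, with hypothetical field names:

```python
# Sketch: dump legacy records that won't fit the new CRM into a plain CSV
# archive, so they stay retrievable after migration. Field names are
# illustrative; io.StringIO stands in for a real file on disk.
import csv
import io

def archive_records(records, fileobj):
    """Write legacy records to CSV, using the union of all fields seen."""
    fieldnames = sorted({key for record in records for key in record})
    writer = csv.DictWriter(fileobj, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(records)

legacy = [
    {"l_name": "Forresta", "f_name": "Jacque", "notes": "4/17 - spoke about giving"},
    {"l_name": "Nevers", "f_name": "Alicia", "notes": "Only send emails"},
]
buf = io.StringIO()
archive_records(legacy, buf)
```

Taking the union of fields means records with ragged, inconsistent columns (common in legacy exports) still archive cleanly.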
33. #5
We have a couple of ad hoc text fields with lots of notes - what do we do about them?
34. Wrong answer
"We need text fields in our new CRM database." "You never know when we may need the flexibility."

L Name   | F Name | Gift | Notes
Abrams   | Sally  | $500 | Born 3/4/74; Married, Dave; One child, Cindy; Michigan State; Attended gala; Referred Dave Smith
David    | Randel | $250 | Has vacation home in Florida; Wife, Cheryl; Subscriber to newsletter
Forresta | Jacque |      | 4/17 - spoke about giving; made pledge; 5/14 - followed up about gift pledge
Nevers   | Alicia | $50  | Only send emails; do not direct mail
35. Correct answer
"Save it, and parse it … later"
Why?
• Don't let a parsing project interfere with a data migration … it will slow you down.
• The text data needs analysis.
• The parsing potential needs to be assessed against your CRM database.
36. What is parsing?
1. Analyze fields
2. Look for opportunities to break data into multiple fields
3. Export to a suitable tool (Excel often works)
4. Separate the data in a new file
5. Map the new fields to the database
6. Re-import data in the new file format
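Step 4, separating the data, is where pattern matching does the work. A sketch of pulling structured fields out of the sample Notes column above; the patterns are hypothetical and only cover these toy examples - a real parsing pass starts with the field analysis in step 1:

```python
# Sketch: extract structured fields from an ad hoc Notes column.
# The regex patterns below are illustrative, tuned to the slide's examples.
import re

def parse_notes(notes):
    """Return whatever structured fields the patterns can recover."""
    parsed = {}
    match = re.search(r"Born\s+(\d{1,2}/\d{1,2}/\d{2,4})", notes)
    if match:
        parsed["dob"] = match.group(1)
    match = re.search(r"(?:Married,|Wife,)\s*(\w+)", notes)
    if match:
        parsed["spouse"] = match.group(1)
    if re.search(r"only send emails", notes, re.IGNORECASE):
        parsed["comm_choice"] = "Email"
    return parsed

row = parse_notes("Born 3/4/74 Married, Dave One child, Cindy")
# row -> {"dob": "3/4/74", "spouse": "Dave"}
```

Whatever the patterns cannot recover stays in a residual Notes field, which is exactly the "Notes" column that survives in the parsed result on the next slide.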
37. The result

L Name   | F Name | Gift | D.O.B. | Spouse | Children | Alma Mater     | Subscriber | Comm Choice | Soft Credit | Notes
Abrams   | Sally  | $500 | 3/4/74 | Dave   | Cindy    | Michigan State |            | All         | Dave Smith  |
David    | Randel | $250 |        | Cheryl |          |                | Yes        | All         |             | Has vacation home in Florida
Forresta | Jacque |      |        |        |          |                |            | All         |             | 4/17 - spoke about giving; made pledge; 5/14 - followed up about gift pledge
Nevers   | Alicia | $50  |        |        |          |                |            | Email       |             |

Ground rule reminder: "Complete" = what you will use
38. #6
When should our data be cleaned, before or after the data migration?
39. Data hygiene polling data
When was the last time you cleaned your donor data?
- 3 months: 29%
- 6 months: 13%
- 12 months: 4%
- Not sure: 53%
*Data from TSL 2014 webinar attendees
40. Correct answer: "It depends"
Rule of thumb: "Before migration."
Why? Only bring over clean data:
- Apply data governance
- Normalize
- De-dupe
- Purge
Post-import:
- Append
- Parse
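Of the pre-migration steps, normalize and de-dupe fit together: normalize first, so that trivially different copies of the same donor collapse to one record. A minimal sketch, assuming a hypothetical match key of last name, first name, and email; real matching rules are usually more involved:

```python
# Sketch: normalize donor records, then de-dupe on a simple match key.
# Field names and the match key are illustrative assumptions.
def normalize(record):
    """Trim whitespace and lowercase all string fields."""
    return {key: value.strip().lower() if isinstance(value, str) else value
            for key, value in record.items()}

def dedupe(records):
    """Keep the first record seen for each (last, first, email) key."""
    seen, unique = set(), []
    for record in map(normalize, records):
        key = (record.get("l_name"), record.get("f_name"), record.get("email"))
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

rows = [
    {"l_name": "Abrams ", "f_name": "Sally", "email": "SALLY@EXAMPLE.ORG"},
    {"l_name": "abrams", "f_name": "sally", "email": "sally@example.org"},
]
clean = dedupe(rows)  # the two spellings collapse to one record
```

Running the same two steps in the other order would leave both "Abrams" rows in place, which is why normalization comes first.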
41. Correct answer: "It depends"
Exception to the rule: "After migration."
Why?
• If the plan calls for it
• If too many records are co-mingled in a larger database … uncertainty about record ownership
• If there is migration urgency
42. #7
We are three months into our data migration project and we just figured out that some data fields won't translate to the new CRM. What do we do now?
44. This is not uncommon
1. This usually occurs after analysis, data mapping, CRM configuration, and initial testing are underway.
2. Then … ah-ha!
3. Some fields in the new CRM are not interpreting data the way you expected.
4. How do you know?
– Reports look wrong
– Data seems to be missing
– Donor profiles appear incomplete
45. What to do
1. Stop the imports
2. Identify data gaps and mistakes
3. Re-map
– This can be tedious
4. Re-configure the new CRM database
– Do you need new or custom fields?
5. Create new test files
– Does the problem lie with the test file itself?
6. Then re-run your test imports
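Step 2, identifying the gaps, can be partly automated: compare the fields in your source records against the field map and flag anything with no destination, since those are the fields that would be silessly dropped on import. A sketch with hypothetical field names; the map itself is the artifact produced in the mapping step:

```python
# Sketch: flag source fields missing from the old-CRM -> new-CRM field map,
# so mapping gaps surface before a re-import. All names are illustrative.
FIELD_MAP = {
    "l_name": "last_name",
    "f_name": "first_name",
    "gift": "gift_amount",
}

def unmapped_fields(source_record, field_map=FIELD_MAP):
    """Return source fields that have no destination in the new CRM."""
    return sorted(set(source_record) - set(field_map))

gaps = unmapped_fields({"l_name": "Nevers", "f_name": "Alicia", "notes": "..."})
# gaps -> ["notes"]
```

A non-empty result is exactly the "ah-ha" from the previous slide, caught before the import rather than in a wrong-looking report afterward.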
46. But be open-minded
• If you can't figure out a way for the new CRM to accommodate the old data, you probably don't need it … and you were trying to hold onto it for the wrong reasons.
• Is the real issue that the old database is suffering from bad data management practices that the new CRM won't tolerate?
Ground rule reminder: The new CRM represents your future, not your past!
47. #8
We can't agree on what data to keep and what to purge. Can't we just bring it all over to the new CRM and decide later?
48. Correct answer
"No!"
Why?
• You are stuck on one or more data governance policies that you don't want to follow.
• Work through the problem.
• Remember: archiving data is your peace of mind.
Ground rule reminder: No decision IS a decision.
49. #9
Once the migration is completed - and our data is rock solid - who is responsible for data quality?
50. Potential answers
1. Tech team or DBA (database administrator)
2. Marketing / communications
3. Fundraising
4. Consultant
(Just don't expect this level of enthusiasm.)
51. Correct answer
"Any of them"
Why?
• All are good choices
• Depends on your org structure
What is necessary:
1. Accountability
2. Budget
3. Manage data quality on its own schedule
52. What do we know about data quality?
"If your data isn't getting better, it's getting worse."
-- TSL data scientist
"What! Why?"
-- audience
53. Data quality vs. data degradation
“Data degrades”
• What does that mean?
54. Data degradation
Cause #1: your organization
– Lack of data entry standards
– Unskilled data entry workers
– Common mistakes
– Record fragmentation
Cause #2: the technology
– Multiple, disparate systems
– System upgrades
– Integration, processing errors
– Sheer volume of data
Cause #3: those darned donors … life!
– Change in address … every 5 to 7 years
– Change in jobs … 9 to 11 jobs in a lifetime
– Family / life event … divorce rate, birth of children, death … what else?
55. That's why data quality management requires …
Three necessary ingredients:
1. Accountability
2. Budget
3. Manage data quality on its own schedule
56. #10
Do we need a data consultant to complete our CRM migration, or can we just rely on our new vendor?
57. At the risk of sounding self-serving …
"Probably" (unless you have in-house staffing)
Why? You need one or more resources who can:
– Extract legacy data
– Clean, normalize, and purge
– Create import files for the new CRM
– Create post-migration archives
58. New CRM vendor tech resources …
• Want to receive a clean data set
• Configure the CRM database
• Import the clean data
• Get done as quickly as possible
Be sure to review a plan - including roles and responsibilities - with your new vendor.
Ground rule reminder: Data migrations require a plan.
62. There are many
1. Clean data
2. Future focused
3. No wasted money on per-record SaaS costs
4. No wasted time due to bad data clogging up systems, exports, etc.
5. Improved fundraising results
6. Better donor relationships
65. Take-aways
1. Understand the CRM data migration process
2. Identify the key decisions that will be made along the way
3. Discuss pros and cons of decision options
4. Have a sense of preparedness and control over your next data migration project
66. How we can help
Data basics
• Assessments, hygiene, management
Data intermediates
• Migrations, integrations, security
Data advanced
• Warehousing, mining, analytics, visualizations
Gary Carr
President, Co-founder
ThirdSectorLabs.com
gcarr@thirdsectorlabs.com
linkedin.com/in/gpfcarr
67. For your time and attendance … and … a special thanks to our host:
Thank you!
68. We'd like to hear from you!
Please submit your questions …
Q & A
Editor's Notes
Moderator: Welcome everyone … and thank you for attending.
Run the poll … it gives people additional time to log in.
To Chris …
The event topic is full of LOADED words …
Quote 1 is traditional, solid thinking. Quote 2 is the information age. Quote 3 is data geeks trying to take over the world! Quote 4 is reality …
This is a GREAT example of a ground rule … NOT DEBATABLE.
The process is more complicated than we at first think … more steps … repeated steps. The order of tasks will vary from one migration project to the next.
With emphasis …
Let's stick to the term "donor data."
Lapsed donors are LOST donors … and that means they are unwanted data unless you can get them to re-engage.
Chris: make initial points, then ask Gary to comment.
Gary: The important point here is that we can de-fragment our donors without having to replace all of our existing databases. The reality is that no software will ever be "one size fits all." Relationships, communications … life! … are too complicated.
In the data science world, the goal is to build databases and create data architectures that make it easier to both store and share data.
There are a couple of approaches for nonprofits to address this problem. Chris is showing us one of those in this example about one CRM having a complete view. With 2Dialog, for example, you can leave your existing systems in place, and sync that data to the 2Dialog database in order to utilize their multi-channel marketing capabilities. This is what data science wants to see - flexibility, integration, sharing of data.
This is an important topic … and one of our case study examples that we will dig into deeper tomorrow.