A data-informed culture is something very different from a data-driven culture. The term “data-driven” has been used to describe organizations that rely solely on cold, hard data to make decisions. Being data-driven sounds great in theory, but because it doesn’t acknowledge the importance of basing decisions on multiple information sources, the phrase “data-informed” is a far more useful label. Data-informed describes agile, responsive, and intelligent businesses that are better able to succeed in a rapidly changing environment. Data-informed cultures are not slaves to their data. Mario Morino uses the phrase “information-based introspection” to refer to using and applying data in context to excel. Multiple sources for decision-making are critical. “Data is an important part of the story, but not all of it. Nonprofits have to balance an overreliance on passion or belief in one's mission with over-fetishisation of data and analysis.”
Organizations with data-informed cultures have the conscious use of assessment, revision, and learning built into the way they plan, manage, and operate. From leadership, to strategy, to decision-making, to meetings, to job descriptions—a data-informed culture has continuous improvement embedded in the way it functions. Key Performance Indicators (KPIs) are the specific quantifiable metrics that an organization agrees are necessary to achieve success. They are the mileposts that tell a data-informed organization whether it is making progress toward its goals. Measurement and data visualization are tools that nonprofits with data-informed cultures use to improve their programs; they observe the results of their programs, and then learn from those results to improve and refine their next programs. Data-informed cultures design measurement into their projects—not just so they have measurable outcomes, but so they provide the data necessary to guide how to improve them. Measurement can be used for many things, some of them undesirable, like justifying your existence, getting someone fired, or proving a point. A data-informed culture uses measurement to continuously improve.
DoSomething.org is a fully functioning data-informed culture. Founded by Andrew Shue and Michael Sanchez, the organization aims to convince young people that community service is as popular, cool, and, most importantly, as normal as watching TV or playing sports. Their idea was that if community service could become ingrained in young people, then they wouldn’t think twice about helping others or volunteering. Back in 1993, Shue approached Aaron Spelling, the executive producer of “Melrose Place,” and asked for 30 seconds of airtime during the show to tell the world about DoSomething.org. Spelling agreed, and DoSomething.org was officially launched! DoSomething.org, a mid-sized nonprofit with about 40 staff members including a full-time data analyst, focuses on social change makers under 25 years of age and delivers most of its programs through the web, mobile messaging, and social networks. They don’t collect data for data’s sake. They use their data to shape programs and drive social change, making decisions based on a balance of data and experience.
It Starts at the Top

Creating a data-informed culture comes down to leadership. At DoSomething.org it starts with the board, which is dominated by leaders in the tech field, including Reid Hoffman, co-founder of LinkedIn, and Raj Kapoor, co-founder of Snapfish. Reid has famously said, “The future of the web is data.” CEO Nancy Lublin purposefully developed a data-informed culture, building her team with staff members who share her passion, like CTO George Weiner. Weiner manages the Internet, computer, and online communication strategy for DoSomething.org. He says, “One of the biggest challenges to nonprofits becoming more data-informed is the HiPPO in the room that no one wants to talk about. ‘HiPPO’ stands for ‘Highest Paid Person in the Organization,’ and it’s usually your CEO. The HiPPO has to buy into data-informed decisions; otherwise it doesn’t happen.”
Don’t Just Count, Understand Why

The DoSomething.org staff mine their program data for actionable insights that they share with Lublin at regular meetings. “I think one of the reasons our organizational culture has evolved is that our nonprofit is 90% funded by corporate sponsorships,” explains Lublin. “They look at us as a media purchase. As a result, we’ve always collected key performance metrics. Not just traffic, but engagement metrics, and, of course, actions taken. But we don’t just count, we try to understand why.” Lublin has brought in leading thinkers from the corporate sector to mentor her and her staff on how to think about their data. She says, “I was fortunate to spend some time with John Lilly from Mozilla. He encouraged us to have a more open philosophy for sharing and analyzing our data. If we’re transparent about sharing our dashboards, it generates feedback and discussion from our stakeholders that leads to improvement.”
Don’t Be A Slave To Data: Think

Lublin says that nonprofits must listen to the data, but stay focused on the mission. “I don’t mean that we don’t have a spine. If the data told us to focus on senior citizens, we’d package it up and send it to another organization, because it doesn’t support our goal of activating young people to take action.” Lublin also talks about the importance of constant experimentation. “And it isn’t just saying, ‘O.K., we’ll try this.’ We state a specific hypothesis with a number and measure against that.” DoSomething.org integrates critical metrics from social media, e-mail, SMS, and the Web. They don’t just count their data; they use sophisticated methods like A/B testing to determine what is working and to improve their tactics.
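The “hypothesis with a number” approach Lublin describes can be made concrete with a standard two-proportion z-test, a statistic commonly used to read A/B test results. Here is a minimal sketch in Python; the campaign numbers are entirely hypothetical, not DoSomething.org’s actual data:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic comparing the conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B perform the same
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test, with the hypothesis stated as a number up front:
# "the new signup page will lift conversion from 2% to at least 3%."
z = two_proportion_z(conv_a=40, n_a=2000, conv_b=75, n_b=2000)
print(round(z, 2))  # |z| > 1.96 means significant at the 5% level
```

Stating the target rate before the test runs, as Lublin describes, keeps the team honest: the data either clears the pre-agreed bar or it doesn’t.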
Fail Fest And Pink Boas: Don’t Be Afraid To Fail

DoSomething.org doesn’t use its data to pat itself on the back or make the staff feel good. Lublin notes that they’re not afraid of failure. They hold regular “Fail Fest” meetings, where each person on staff has to present a campaign or program failure. They share three things they learned about themselves and three things the organization learned. To remove the stigma from failure, Lublin says, “We have to wear pink boas when we present.”
Spend More Time Thinking About The Data, Less On Collecting It

DoSomething.org uses its data to continuously improve programs, develop content, and shape campaign strategies. So DoSomething.org wants its staff to spend more of their brainpower thinking about the data, rather than collecting it. To ensure that this happens, Data Analyst Bob Filbin’s job is more than programming formulas in Excel spreadsheets. Says Filbin, “One of the biggest barriers in nonprofits is finding the time to collect data, the time to analyze it, and the time to act on it. Unless someone is put in charge of data, and it’s a key part of their job description, accelerating along the path toward an empowered, data-informed culture is going to be hard, if not impossible.”
Tear Down Those Silos

Lublin says that it is important not to silo your data analysts. “You can’t treat them like accountants who sit quietly in the background and assign categories to expenses. I’ve made sure that our data analyst shares an office and works interactively with staff.” Filbin is responsible for making sure that departmental and overall organizational goals are aligned, and that social media data are seamlessly integrated into achieving key organizational results. He says, “My goal is to make sure that every person and department has access to the data they need in order to create actionable changes in their work. Each person has an automated dashboard that has different levels of detail and relates to organizational results.”
Make It Personal And Make It Relevant

Filbin also talks about how to overcome staff resistance. “Reports should be presented in a way that seeks to avoid bruised egos. Rather than bringing a number to a meeting, people should be reviewing their own statistics and data. This is part of what I am doing at DoSomething.org—closing the data loop; making sure each department can access its data to answer their questions.”
Even The Smallest Victory Is A Win

Filbin says it is important to start with sharing small wins. “For example, I shared an analysis of A/B testing for Facebook ads for an event signup. We discovered the conversion rate was very low because we directed people to an external site (our own website) rather than a signup page on Facebook. This insight helped us use Facebook ads more effectively to bring people to the event.”
The Stages of Becoming Data-Informed: Crawl, Walk, Run, Fly

Obviously, not all nonprofits are born with the data-informed gene. And it’s not a culture you can acquire along with your analytics software. It’s an evolutionary process that happens in stages:

Crawl: At this stage, the organization does not know where to start. It collects data from time to time, but doesn’t do formal reporting. What data is collected doesn’t relate to decision-making. There are no systems in place, no dashboards, and no collection methods. Staff is often overwhelmed by the thought of measurement, and the task falls to the bottom of the to-do list. There is no process for analyzing success or failure. Decisions are all passion-driven.

Walk: At this stage, the organization is regularly collecting data, but not in a consistent manner. For example, different people and departments may be collecting data but not sharing it. Or data is focused on metrics specific to social media channels but not linked to high-level organizational results or mission-driven goals across programs, and could, in fact, be the wrong data. Discussions on how to improve results are rarely part of staff meetings, nor are there linkages to organizational experience. The organization does not understand the fine distinction between being data-driven and the intelligent use of data.

Run: At this stage, the nonprofit has an organization-wide system and dashboard for collecting measurement data that is shared with different departments. Decisions are not based solely on data or intuition, but on multiple sources. Managers hold weekly check-ins to evaluate what’s working and what’s not across communications channels, as well as any specific social media feedback received that would help shape future campaigns or social media use. At this stage the organization monitors feedback from target audiences in real time but supplements that information with trend or survey data.
The organization may work with measurement consultants or specialists to improve skills and capacity, and it provides training and professional development for staff to learn how to use measurement tools.

Fly: At this stage, the nonprofit has established key performance indicators that are used across programs. The organization has a staff person responsible for managing the organization’s data, but staff are empowered to check and apply their own data. In addition to weekly check-ins, the organizational dashboard includes key performance metrics related to goals. The organizational dashboard is shared across departments, and there is a process for analyzing, discussing, and applying results. They use data visualization techniques to report the data analysis and also to reflect on best practices culled from the data. There is no shame or blame game because of “failures”; instead, these are embraced as learning opportunities. There is a regular report to senior leadership that details high-level successes, challenges, and recommendations for moving forward. Staff performance reviews incorporate how well the organization is doing on KPIs. Leadership celebrates successes by sharing measurement data across the organization.
Becoming Data-Informed: Change Is Easy With Baby Steps

Changing an organization’s culture to a more data-informed approach must begin with baby steps. While it does not have to be difficult to orchestrate, it does need to start from the top. Unless senior management can agree on the definitions of success and how they will be measured, you can waste a tremendous amount of time accumulating data but not using it. In Chapter 4, we describe the basic steps of any measurement program and discuss how to set up a measurement pilot program. Chapter 5 discusses how to identify the value of success. Getting started on the path to becoming a data-informed nonprofit is a matter of having some important internal conversations. It is not just about having new inspiration about measurement or working with new tools; it means thinking differently about the organization and how it works.

Begin at the End: Discuss and Identify Results

If your organization doesn’t know exactly what you’re going to measure, you can’t become data-informed. Unless you have an upfront discussion of what success looks like, you’ll end up collecting data, but it won’t help you make decisions. You will waste your time. So begin at the end by carefully identifying desired outcomes. Don’t be afraid of a bit of healthy disagreement. The best measurement programs are borne of—and benefit from—lively conversations about what really matters to the organization and who can “claim credit” for what. You need to keep your “mission” hat on and keep the conversation focused on the ultimate goals of the organization. Just keep repeating: it’s not about “credit”—it’s about achieving the mission. You will also want to manage expectations: What is realistic to expect given your current investment in social media, or compared to peer organizations? What do short-term, medium-term, and longer-term results look like?
You might need to bring in an outside consultant to facilitate a meeting to help get consensus on what you want to measure, or clarity on results. Or you may need to bring in a measurement expert to help you clarify what you want to measure and why. This doesn’t have to be expensive. For example, as we discuss in Chapter 8, the Analytics Exchange helped the American Leadership Forum by supplying an analytics volunteer to help create a framework and system for gathering data.

Become a Curator of Metrics

If you are the person responsible for implementing social media for your organization, either part time or as your whole job, you need to become what John Lovett defines as a “Curator of Metrics” in his book Social Media Metrics Secrets. This is someone, like Carie Lewis from the Humane Society, whom we introduced in Chapter 1, who knows the difference between different types of metrics and ensures that her organization is using data in an intelligent way. A curator of metrics knows how to help guide their organization into choosing the right metrics, and knows how to report insights in a way that connects them to organizational goals.

Use Experiments To Make The Case To Evolve

One way to evolve into a data-informed organization is through implementing a series of social media measurement experiments, as described below and in Chapter 4. Each one needs to have solid metrics, and should be designed to provide results that will help you make the case to evolve. Keep the end in mind when agreeing on how experiments will be structured, run, and measured. The experiments should not be willy-nilly; they should help you develop and test your strategies and tactics, and lead the way to best practices.

Take a Baby Step: My First Data Collection Project

To get started, select a project, event, small campaign, or program that is a high priority on your organization’s work plan for the year, that incorporates social media, and that you can apply a couple of good metrics to.
Be mindful of other organizational deadlines that may divert energy and focus from this important first baby step. You might find it difficult to set aside quality time to focus on it. Don’t try to measure every objective or collect all potentially relevant data. Make it easy to manage. You should also have a very clear idea about what you want to learn. Keep in mind that you are going to take your report and use it to make the case for a more comprehensive measurement program. It’s important to make sure that anyone who is going to use the data, or sit in a meeting and review the data, buys into your metrics. That could be the Executive Director, a program manager, the board of trustees, or other people in your department. If there are many different decision makers, you may need to do a formal survey to make sure that everyone ends up on the same page. Sara Thomas, who handles social media for the Ocean Conservancy, says, “It was really useful to bring in my entire department on the effort rather than working solo on the project. This helped with buy-in.”

Learn from Your Results

Once you collect your data, analyze it and understand how it can help inform decisions. Make sure you educate through examples. Show how adding a data-informed approach to your social media, other media, or programs can avoid ineffective campaigns and increase audience satisfaction. More importantly, you don’t just need to develop discipline around collecting data; what you want is the discipline to look at what you’ve collected and generate insights. That requires reflection, not just counting. Doing a measurement pilot will help create the discipline of stepping back from the whirlwind of social media tactical implementation, and of wrestling with larger questions about how social media fits into an organization’s overall efforts. Which vehicles and channels gain us the most traction? How should we adjust our workload internally to reflect those results?
How are our social media activities helping us meet our overall strategic goals? How are our efforts using social media supporting our programs? Reflecting does not have to be a private activity. It can be done in connected, transparent ways. The organization’s blog or website can be a place to share lessons learned with readers, and to ask them for their feedback and suggestions as well. The result: a powerful way to learn and improve over time.

Conclusion

To start the shift to a data-informed culture, you must begin with small incremental steps with the full support of leadership. It’s important to think big, looking at key results, but since many outcomes deal with long-term changes, you can’t get there overnight, nor can your organization transform its culture overnight. Keep the steps small and manageable. As your organization’s culture begins to shift, when you present reports on social media activities, you get better questions from your executive director or board. You don’t get asked how many fans you have or what that means. You get questions that help you improve.

Kanter, Beth. (October 2011). Are You a Curator of Metrics? [Blog post]. Retrieved from http://www.bethkanter.org/curator-metrics/
Thomas, Sara. Private conference call, peer learning group with David and Lucile Packard grantees, with Beth Kanter, September 2011.
This is why I got into data/infoviz: We’re pretty sure that human brains are wired to speak—and understand—SPOKEN language. [SLIDE ADVANCE] But…people have to be TAUGHT to read and write. [SLIDE ADVANCE] So…understanding the WRITTEN word is not a natural human ability—but many of our communications overlook this point. But, since it seems that our brains DO innately process shapes and colors, using more data/infoviz may improve our communication power—and ultimately help us accomplish our goals.
Just to give you an idea of who I am and why I care about data/infoviz: My background is in social sector evaluation. I work with all types of organizations. As an evaluator, I LOVE data. But one of the challenges rampant in my field is the issue of SHELF REPORTS. Reports that may contain great data and findings, but are NEVER USED. They sit on a shelf. More broadly than research and evaluation reports—I think this is true for a lot of data that is collected. It’s great data. It can give great insight, but if it’s not presented in a way that supports and encourages use, it’ll be no better than a shelf report. So I love data/infoviz because it increases the likelihood that great data will be used.
Data/infoviz is a big, broad field. Like other fields, it has its own cast of characters, schools of thought, and running feuds. So just like economics or psychology, if you’re going to practice in the field of data/infoviz, it’s helpful to figure out which tribe you belong to—or which tenets you choose to hew to. I have to admit—I don’t follow one guru. I think Edward Tufte has some great ideas and he is an amazing scholar and practitioner in the field. But there are times that I purposefully deviate from his guidelines, and—I think—end up creating a more useful data/infoviz product given the audience and purpose. I’m going to highlight four design principles that I’ve found particularly relevant to data/infoviz for the social sector. First principle: maximize data-ink. This particular slide is definitely Edward Tufte approved. He is a big proponent of the concept of the data-ink ratio. Data-ink is the non-erasable ink used for the presentation of data; if it were removed from the image, the graphic would lose its content. Non-data-ink is the ink that does not carry the information but is used for scales, labels, and edges. The image on the right does a better job at maximizing data-ink. The theory is that the image on the left makes your brain work harder to extract the same meaning.
The second principle is to make color and contrast work for you. Depending on the data and the type of product, we usually work in schemes of 4–6 colors. We select a color scheme that includes an assortment of neutral, complementary, and accent colors. This color palette is the inspiration for an example I’ll show on the next slide. When selecting the color scheme for a particular report, presentation, or other product, keep in mind how the colors will work together to provide emphasis, clarity, unity, and flow. Emphasis: is there a color that can be used to make text or other visual elements stand out? Clarity: does the color scheme lend itself to being used consistently throughout the report or presentation? This mostly has to do with not selecting a scheme that has too many colors. Unity: do the colors work well together, and convey to your audience that the report or presentation is one holistic piece? Flow: is there an assortment of light and dark, bright and muted colors that can be used to visually lead an audience through the material?
Quick case example: This is the color palette of one of our clients. You’ll notice that this client appreciates bright, bold colors. When I put together a report and presentation for this client I needed to select a color palette—and their colors didn’t quite fit the bill. Their palette lacked neutrals and was a little too bright to work with. I set out to find a complementary palette. [SLIDE ANIMATION] This one had a good combination of bright and muted colors that could be used as accents and contrasts. I felt that this palette would work well for the charts, graphs and other visualizations. I especially selected a palette with a broad range of colors because of the number of answer choices that needed to be represented in the bar charts.
Third design principle: allow the purpose to select the medium. Form should follow function. Begin by considering how the product should be used, and then select the appropriate medium—digital or physical. For example, just because the most common paper size is 8.5 by 11, doesn’t mean it is the right size for every printed product you need to produce. We often begin working on a new product by opening a new Microsoft Word or PowerPoint file. But instead—before doing any of that—we should be thinking through the purpose of the product, how it will be used, and what format or medium would best support use.
And finally—draw on the excellent foundations of the field and get comfortable with graphic design principles. Now you know a little more about where I’m coming from when I’m talking about data/infoviz.
At the outset of this presentation I said I would focus on data/infoviz for data, assessment, learning, and management. In my world as an evaluator, the shorthand way to say that is STRATEGIC LEARNING. Strategic learning can be applied to many facets of nonprofit work—human resources, technology, programs, communications, etc. Strategic learning is the process of using evaluative and other information to make course corrections and improvements. In the social sector, strategic learning may take the place of a program evaluation. I don’t expect you to remember the definition of strategic learning, but the points that I want to make are that strategic learning is of value for every single nonprofit organization, and strategic learning insists on use. Strategic learning can’t happen with shelf reports. So what are the approaches and tools that help strategic learning happen? As I talk through these, keep an eye out for how they employ the design principles I just talked about. By the way—there are many more data/infoviz tools and approaches for strategic learning—these are just a few that are broadly applicable.
Mapping is a nearly universal tool. In evaluation—especially systems change and advocacy evaluation—maps are an incredibly valuable but underused tool. This particular map is of a philanthropic initiative to enlarge the group of grantmakers supporting a particular type of intervention. The nodes represent people, and the ties represent relationships. The color of the node indicates each person’s functional group, and the overlaid numbers indicate each person’s depth of involvement in the work. A lot of information is encoded in the map so it can be used by the initiative’s leaders to understand progress and make decisions. Maps can be used by advocacy organizations to track relationships with policymakers. In web design and analysis, maps can be used to illustrate movement through a website. At an organization level, maps can be used to show information flow. A systems map can show the interactions and domains of various entities and actors across an entire community. Useful applications for maps are nearly limitless. I could go on for hours! One caution on maps: there can be resistance to them. They’re not one of the preset chart or graph options in Excel—they’re less common. Out of the box, your audience may be less inclined to find them useful.
This may be heresy at a nonprofit technology conference, but mapping can be low—or even no—tech! Net-Map is a method for mapping social networks using facilitated stakeholder interviews and discussions. It was developed by Dr. Eva Schiffer while she was a post-doctoral fellow for the International Food Policy Research Institute. She was living in a small town in northern Ghana where she developed Net-Map as an answer to local governance problems. Net-Maps are developed with paper, figurines, and markers.
I don’t know what it is about the human psyche, but there’s something about putting everything in one place that cues the brain into making connections. So here is a whole bunch of related information, all in one place, on what we like to call a data placemat. It’s on—gasp—11 by 17 inch paper. In strategic learning, we involve the project’s stakeholders in sense making. Before, we put each individual piece of data on a PowerPoint slide. It wasn’t until we stumbled on the 11 by 17 multiple data points presentation that we got great traction. Think about other applications—like your organization’s digital media reports, or communications reports. Group related data together, create visualizations where helpful, and involve some stakeholders in sense making. It is one of the funnest types of data meetings you can have.
Dashboards are a management tool that have garnered a lot of buzz in the sector. We recommend a ‘right-sized’ approach to data dashboards. More than 80% of nonprofit organizations have annual budgets of less than $1 million, and for this vast majority, the right-sized approach often involves basic tools like Excel. For organizations with more capacity, expertise, and resources, more powerful software packages may be appropriate. Regardless of the technology you use, you still have to systematically collect the data. This is an example of a simple dashboard we created for a project. This table illustrates a number of key indicators that are grouped into categories. Looking at this, you can quickly process which indicators are ‘in the red,’ which ones are in warning, and which ones are good.
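A right-sized dashboard can start as a small script before it ever becomes an Excel template or a software package. The sketch below shows one way to compute the kind of ‘red / warning / good’ statuses described above; the indicator names, targets, and the 80%-of-target warning threshold are all hypothetical:

```python
def status(actual, target, warning_at=0.8):
    """Classify a KPI: 'good' at or above target, 'warning' at or
    above warning_at * target, 'red' below that."""
    if actual >= target:
        return "good"
    if actual >= warning_at * target:
        return "warning"
    return "red"

# Hypothetical indicators grouped into categories, as in the table:
# each entry is (actual value, target value).
kpis = {
    "Engagement": {"newsletter signups": (520, 500), "event RSVPs": (90, 100)},
    "Reach": {"site visits": (30000, 50000)},
}

for category, indicators in kpis.items():
    for name, (actual, target) in indicators.items():
        print(f"{category} | {name}: {status(actual, target)}")
```

The same three-color logic works whether the table lives in a script, an Excel conditional format, or a commercial dashboard tool; what matters is that the thresholds are agreed on in advance.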
And finally, the most obvious, use data/infoviz to up the communication power of your reports. Because after all, you’re making the report because it says something important. There is a message or a finding you want to be heard. Believe me—I work in the field of evaluation—most evaluation reports are NOT sexy—and they don’t get used. They become shelf reports. But if the MOST IMPORTANT data, findings, and recommendations can be illuminated with data/infoviz, the report has a better chance of being used. This is a finding from our research, State of Evaluation 2010: Evaluation Practice and Capacity in the Nonprofit Sector. It makes it easy to see that the number one audience for nonprofit evaluation is the funder audience—duh! But the viz captures attention and invites analysis in a way a table never would.
Another example from the same report: In ONLY 21% of nonprofits—one-fifth—are EVALUATORS—the orange guys—actually responsible for evaluation. And in fact, in about another one-fifth of organizations, NO ONE is responsible for evaluation. And those blue guys in the middle—I think I even feel worse for them—a hodgepodge of staff with a multitude of other hats were tagged as responsible. Again, I think this has more communication power than a text and numbers table, and it livens up a presentation on an otherwise dry topic of evaluation practice and capacity.
Time for a little reflection. If I had a soapbox, these are four messages that I’d shout:
Lousy data infects good data. Collect more good data, less lousy data. Nowadays data collection costs are the cheapest they’ve ever been, so a LOT of data is being collected. Cheap data collection costs have led to a lot of bad data being collected. Stop it! When bad data is included in analysis and used with good data, it’s like a bad apple: it infects the whole lot. It’s better to collect a few good pieces of data than a lot of lousy data.
Fit visualizations to audience and purpose. So often a visualization is made by one person, and I think the deeper you get into designing a viz, the easier it is to forget about the audience. Don’t let a personal design aesthetic take over. Sometimes design preferences need to bend to engage an audience and achieve the viz’s purpose. This has been a particularly hard pill for me to swallow. For example, some audiences just aren’t comfortable with relationship maps. Sometimes you have to figure out what your audience can work with, meet them there, and try to increase their comfort level over time. Also, don’t forget the no/low tech options. Remember, a really great viz isn’t great if no one uses it.
Visualizations do not equal truth. First, numbers can lie. So just because it is based on numbers, doesn’t mean it is a fact. Second, two pieces of data can be drawn together to imply relationship, when in fact there may not be a relationship. Third, visualizations are not objective. Computers and software may be used to draw visualizations, but they are designed by humans. There is an embedded point of view. And fourth, data/infoviz is PART of analysis—not in lieu of analysis.
And finally, you don't always need the Cadillac—sometimes what you really need is a beat-up old truck to do some heavy lifting. Sometimes you just need some workaday visualizations: simple, to the point, probably made in Excel without any aftermarket tweaks. Just the basics to jumpstart data use. There's nothing wrong with that. Know them for what they are, and know when to use them.
Thank you very much. It's always nice to end with humor, so here is one of my favorite pie chart cartoons.
Children Now’s evolution toward using infographics and visuals to communicate about our work more effectively began, believe it or not, with introducing meaningful sub-section headers that communicated the gist of what we wanted the reader to take away. Instead of the standard and meaningless “Introduction” and “Scope of Work,” we redesigned our proposals to follow a problem/solution structure and pushed ourselves to clearly communicate both. We figured that if we could just get you to read a well-written problem and solution, we were at least halfway to convincing you to get behind our work.
Perhaps reflective of my generation, and definitely not understood by my parents, I’ve worked at many different organizations, including advertising agencies and a brief stint at a multimedia learning company. Those experiences led me to believe that in aiming to communicate effectively, we should always aim to incorporate this fundamental learning principle.
As many of you may or may not know, Children Now is coordinating The Children’s Movement of California; what that is and why we need it will hopefully be very clear to you in a moment. So, what are the components of the problem underlying the need for The Children’s Movement? (go through bullets) There are two years of strategic planning and analysis behind this, including the likes of McKinsey & Company and others. We didn’t dream it up overnight, though I wish we had, as it would have been a lot less expensive than McKinsey. When we add visuals, the story gets stronger because we’re showing you just how badly kids are getting overpowered.
And what are the core components of the solution (go through bullets)? It’s arguable that more people in our society support kids than any other “issue.” Children’s needs always poll exceptionally well, which is why those seeking office give them a lot of lip service. We need to turn lip service into meaningful action, and the powerful network of support being built by the Movement can do it. We have more than 200 diverse member organizations already, and it’s growing daily.
The key to effective data design for us is to clearly articulate what we’re aiming for. For this project it was enabling meaningful comparative analysis—across a ton of data points—to surface bright spots. And, perhaps like many of you, we didn’t have access to Flash and ColdFusion developers; in other words, we used what we had and did the best we could.
We added a couple of dimensions to the data to give us more to work with…and then we comped up lots of different designs – so we knew what we were aiming to do and took a lot of shots at doing it before hitting our target.
When the “fingerprints” are grouped together, the distinctiveness of individual county profiles emerges—which is critical for policymakers to recognize, because “one size fits all” approaches may not be optimal—and bright spots emerge.
Agenda
Opening
Data/Infoviz for:
• Measuring the Networked Nonprofit: Creating a Data-Informed Culture (Beth Kanter)
• Data, Assessment, Learning, and Management (Johanna Morariu)
• Communications and Advocacy (Brian Kennedy)
Closing
(Picturing Your Data is Better Than 1,000 Numbers, Slide 2)
Data-Informed: Data Is Used For Continuous Improvement
“Based on our interviews with three program leaders, we have ways to improve our strategy to get better results next quarter.”
“According to the data, we have reached our KPIs for awareness raising in all but one program.”
Stages of a Data-Informed Culture
CRAWL: No formal data collection • Lacks consistent data collection • Lacks systems • Decisions are passion-driven • Rarely makes decisions to improve
WALK: Data collection, but not consistent or shared between departments • System and structure for collection • Data not linked to results, could be wrong data
RUN: Data from multiple sources • Data is shared across departments • Formal process for analyzing, discussing, and applying results
FLY: Has org-wide KPIs or results reporting • Organization-wide dashboard with customized views • Discussed at staff meetings • Uses data for planning and decisions • Data visualization for reports and reflection
Becoming Data-Informed: Change Is Easy With Baby Steps
• Begin at the end: discuss and identify results
• Be a curator of metrics
• Use experiments to help you evolve
• Get started with a small data collection project that is high priority in your organization
• Learn from your results
Understanding the written word is not a natural human ability, but human brains do innately process shapes. For a great discussion of these and related topics, I recommend Designing with the Mind in Mind (2010) by Jeff Johnson—especially chapter four, “Reading Is Unnatural.”
Design Principles
1) Classic graphic design principles: Balance, Rhythm, Proportion, Dominance, Unity
For a great discussion of these principles I recommend The Principles of Design by Joshua David McClurg-Genevese, available at http://www.digital-web.com/articles/principles_of_design/
Data/infoviz strategic learning examples
1) Maps: relationships, networks, systems
For more information about Eva’s work and Net-Map, visit http://netmap.wordpress.com/
Toward more effective communication: One organization’s evolution
• In the beginning, we were all about words…lots and lots of them
• Academic-style communication that challenged accomplished academics
A 63-word sentence?
We learn more deeply from words & pictures than from words alone
• This is known as the “multimedia principle”
• It is applied by the masters of communication…the advertising industry
• In TV ads, the magic moment is called “audio-visual lock”
EXAMPLE #1: Communicating why we need The Children’s Movement of California
• Children clearly are not a priority in public policymaking today
• Lobbying dollars, campaign contributions, and large organized groups of voters are the sources of power in politics today
• Kids need a source of power
EXAMPLE #1 (continued)
• The broad support for children in our society has been too diffuse to have the impact it should on California’s priorities
• Connecting the many, many organizations and people that are pro-kid can change the game
EXAMPLE #2: Adding dimension, meaning & usefulness through data design
The 2010 California County Scorecard of Children’s Well-Being challenge:
• Track 26 data indicators, for each of the state’s 58 counties, over time and by race/ethnicity
• Aim to surface bright spots
• The raw data (the slide shows only 1/13 of it)
EXAMPLE #2 (continued)
• Calculated “Bottom, Middle, Top” terciles
• Added Rural/Urban and Income Level designations
• Unique county “fingerprints” of child well-being emerged
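The tercile step can be sketched roughly as follows. This is a minimal illustration, not the Scorecard's actual method: the `tercile_labels` helper, the county names, and the scores are all invented. The real project repeats this across 26 indicators and 58 counties, so each county ends up with a 26-label "fingerprint."

```python
def tercile_labels(values):
    """Label each value 'Bottom', 'Middle', or 'Top' by rank tercile."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    labels = [None] * n
    for rank, i in enumerate(order):
        if rank < n / 3:
            labels[i] = "Bottom"
        elif rank < 2 * n / 3:
            labels[i] = "Middle"
        else:
            labels[i] = "Top"
    return labels

# One hypothetical indicator across six counties (invented scores):
reading_scores = {"Alameda": 61, "Butte": 48, "Colusa": 35,
                  "Del Norte": 52, "El Dorado": 70, "Fresno": 44}
counties = list(reading_scores)
fingerprint = dict(zip(counties,
                       tercile_labels([reading_scores[c] for c in counties])))
print(fingerprint)
```

Terciles trade precision for comparability: every indicator, whatever its units, collapses to the same three-level scale, which is what makes side-by-side county "fingerprints" readable at a glance.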
EXAMPLE #2 (continued)
• Where are other counties like mine doing better?