This document provides an overview of developing an effective measurement strategy for content by focusing on storytelling. It discusses planning measurement by understanding goals, content, and key measures. It emphasizes collecting contextual and non-analytics data alongside analytics data and organizing all of it centrally. Finally, it discusses presenting data persuasively through focus and storytelling, with examples that tell the story of where an organization was, what changed, the resulting impact, and the next steps. The overall message is that numbers should be turned into stories, and stories should drive action through an effective measurement process.
Number Stories: Win Friends and Influence HiPPOs with an Effective Measurement Strategy - Michael Powers
Data overload has come to content strategy. With so many things to measure and tools to measure it with, how do you find a way to use analytics without succumbing to analysis paralysis? And without spending all your time on analytics? This session will walk through the creation of a measurement strategy that supports your existing content strategy. Then we’ll look at the ways you can use those analytics to tell the kinds of stories that persuade your peers and superiors to make smarter content decisions.
In this session, you will:
Learn how to decide what to measure and why
Find out how to create an analytics routine that provides actionable insights without taking up all your time
Learn to present measurements and analytics in ways that influence and persuade others
High Quality Program Evaluation in Nonprofits - Isaac Castillo
Presentation for Tidewater Community College Workforce Solutions school - the Academy for Nonprofit Excellence.
http://www.tccworkforce.org/non-profit-management
Can your nonprofit prove you are making a difference? - Isaac Castillo
Slides from a class taught at Tidewater Community College's Academy for Nonprofit Excellence. Focuses on the basics of data collection, outcome measurement, logic models, and performance management.
VMCS14 REanalyze: What is your EVP Data Saying? - VolunteerMatch
2014 VolunteerMatch Client Summit Best Practice Cafe
Demonstrating your employee volunteer program's impact internally and externally is critical to its success. While the industry as a whole is still looking for ways to get beyond traditional metrics, some companies are taking it upon themselves to identify outcomes that reflect their priorities. They are also looking for new ways to quantify the engagement and impact of their employees so that they can better tell their stories to leadership, employees, nonprofit partners, and the community.
Join Jake Sanches, internal metrics and analytics guru at Palantir Technologies, to discuss VolunteerMatch's recent metrics benchmarking project. We'll review our findings and key takeaways, cover industry trends across key metric benchmarks, and discuss metrics analysis in finer detail and how it can be leveraged to drive improved programmatic and reporting approaches. Jake will also provide recommendations and demonstrate examples of ways to increase your use and presentation of data in your communications.
Infographics: E-volving Instruction for Visual Literacy
Melanie Parlette-Stewart, Lindsey Robinson - University of Guelph, Guelph, Ontario
WILU 2014 - London, ON
Infographics involve the bringing together of information, data, and design. There is increasing need to be visually literate, as is highlighted in the ACRL Visual Literacy Competency Standards for Higher Education. This session presents the ACRL Visual Literacy Competency Standards and the application of these to an introductory infographics instruction session. This session will highlight the active learning approach used to allow students to engage with and create infographics at an introductory level.
Can you measure whether the content in your eLearning system provides an enriching and engaging experience for your learners? If you can't answer this important question, you're not alone. Organizations struggle with the complex work of analyzing data to identify opportunities to improve learner engagement with their content. It's worth it to find out: courses and related resources that are not as valuable as intended can result in decreased interest and attendance rates, leading to poor learning outcomes. There are many ways to measure and analyze course engagement data in your LMS. These insights enable managers to identify and prioritize changes to learning programs and step up their engagement game.
Presented at the 2014 SLATE conference (www.slategroup.org)
Faculty development is occurring increasingly online through text-based guides, just-in-time video tutorials, and social media, which is convenient for faculty looking for information on teaching or using technology. However, this makes it difficult for faculty development centers, used to traditional forms of assessments, to assess the quality and effectiveness of these programs and resources.
In this session, we will share how the Faculty Development and Instructional Design Center at Northern Illinois University has used web analytics to evaluate the usage of online materials and how the results have impacted our practice.
Stop Wasting Your Analytics Budget - edUi 2016 - Mitch Daniels
When approached with clear intentions, web analytics can be a game-changing part of any online presence. It can inform massive redesigns, drive additional engagement, and spur continued site improvements.
Despite its potential, the full power of analytics is often neutered by a misappropriation of priorities and resources, leading to a stream of sterile, uninspiring reports and dashboards. Learn to recognize these challenges, identify them within your own organization, and confront them head on.
We’ll explore the distinction between ‘interesting’ and ‘actionable’ data, the downsides of monthly reports, and the importance of the 10/90 rule. Finally, we’ll identify a single word that will immediately push your analytics strategy in the right direction: “Why?”.
MEASURING THE BUSINESS IMPACT OF LEARNING: WHAT WE’VE LEARNED - Human Capital Media
The ‘Measuring the Business Impact of Learning’ benchmarking survey, conducted by LEO Learning and Watershed (on behalf of Learning Technologies Group), is entering its fourth year. With the survey launching on November 1st and closing on December 13th, LEO Learning and Watershed are holding a webinar to reflect on the results so far and to discuss how the organizations they’re working with have overcome the barriers in measurement planning and implementation. The insights are drawn from their experience working with a range of clients in this field and should be valuable for anyone who wants to get started in learning analytics and sustainable business impact assessment.
Join your hosts as they cover the state of the world of measurement, and you’ll receive:
An understanding of how well-known organizations have overcome the barriers in measurement planning and implementation.
Real-world examples of getting management buy-in, designing for data, building data ecosystems, implementing a learning analytics strategy, and more.
The opportunity to take this year’s ‘Measuring the Business Impact of Learning’ survey and see the results coming in live!
Presented by:
Dr. Lisa D’Adamo-Weinstein, Director of Academic Support, SUNY Empire State College
Dr. Tacy Holliday, Governance Coordinator, Montgomery College, NCLCA Learning Center Leadership Level
Description: Measuring and evaluating student success is crucial to retention efforts and program development. Join us as we talk about the key elements necessary to measure student success in your tutoring and learning centers. We will assist you in developing an assessment plan for your own center.
Getting your voice of the customer program up and running can be challenging. But successful implementation will determine whether yours is a high-performing program with actionable insights or a data collection system that drowns in information overload.
Join Kyle Goff, former JetBlue VoC Analyst, and Innes Vanderniepen of Brussels Airlines, as they share their experiences implementing successful VoC programs that increased brand ROI and transformed customer interactions. You’ll learn how to create a high-level VoC implementation plan, and build a powerful program to increase your return on investment.
For seven years, CLO’s LearningElite awards have recognized the organizations that excel at managing the learning function from end to end. How do elite learning organizations align learning with organizational goals, engage their learners, measure success, engage leadership in employee development, and use learning to make a measurable impact on the organization? Join Sarah Kimmel, vice president of research at Human Capital Media, as she discusses the practices that make LearningElite organizations effective and that contribute to their high scores on the LearningElite benchmark.
You’ll learn:
The practices that distinguish LearningElite organizations from the rest.
How elite learning organizations achieve alignment of learning with organizational goals.
How the LearningElite engage leaders at all levels to support employee development and create a culture of learning.
What metrics elite learning organizations use to measure impact on the learner and on the business.
Tips for maximizing your organization’s scores on the LearningElite application.
Participants will receive early access to the 2018 LearningElite application worksheets.
Measure what matters for your agile project - Munish Malik
While working on Agile projects, we simply can't get away from tracking and showcasing the progress of the project. A typical Agile project works with estimates, story points, velocities, and burn-up or burn-down charts.
I have witnessed numerous sprint reviews and showcases where the business is waiting only for those few slides of the presentation where the "actual" red worm runs against the "planned" green worm, trying to catch up. If the red worm is ahead, I have seen smiles on the faces of the stakeholders. If it matches the green one, there is a sigh of relief. And as a development team you should just pray that the poor red guy isn't falling behind the green one, lest it lead to a lot of questions starting with why, how, and what.
There have also been unfortunate heated discussions that last forever about why the team ended up not claiming a few points it had committed to. What gets lost is what the team accomplished in the sprint that adds real value to the product. There have also been times when the estimates are questioned by the product owner or account managers. If you are working in a distributed setup where the product owner works out of a different country, the problem is even bigger.
Now consider a scenario where the project is completed on time, on budget, and in scope. The majority (or all) of the estimates were correct. Yet when the product went live, it failed badly in the market. What is the use of building such a product?
Are we focusing too much on numbers and points and overlooking the other important aspects of Agile software development, such as producing software that delights customers, and are we looking for ways to measure that? Are we measuring whether we are creating a solid, robust, and scalable platform that is ready for future development and enhancements? Are we measuring the outcomes of the time we spend in the shoes of the people who will actually use the software?
The objective of this presentation is to promote measuring what matters for your project: the goals your software development wants to achieve. I don't plan to showcase an exhaustive list of measurements that can solve all your problems; instead, I want to highlight some examples my team and I have used in our projects, which helped us measure things that add value to the business and to development, versus simply creating burn-down charts.
Above all, I want to encourage thinking outside the box to identify which measurements will really matter for your projects, perhaps through the eyes of the users and the business, and to see which things, if measured, will add far more value than estimates alone and will help create a product that truly delights the business and the users of the product.
1. Number Stories
Mike Powers
Director of Electronic Communications
Indiana University of Pennsylvania
Confab Higher Ed 2015
Win friends and influence HiPPOs
with an effective measurement strategy
24. What’s the story?
1. What are your goals?
2. What is your content?
3. How will this content achieve those goals?
4. What would success look like?
5. What would failure look like?
6. What measures would show success or failure?
7. What are your targets?
27. 1. What are your goals?
1. Bring in enough students over the next four years to make the program viable
2. Acquire a reputation for excellence in CS that brings in students and helps them find employment
28. 2. What is your content?
• New microsite about the program
• Press releases about the program, faculty
• Blog posts about CS written by faculty members
• Presentations by faculty and students at CS conferences
• Online ads
30.-33. 3. How does content achieve your goals?
[Funnel diagram, built up across four slides: Ad, Blog, and Earned Media bring prospects to the Microsite ("Come to Microsite"), where they Learn More and then Request Info, Request a Visit, or Apply]
34. 4. What does success look like?
Image by velkr0 https://www.flickr.com/photos/velkr0/
35. 4. What does success look like?
Image by velkr0 https://www.flickr.com/photos/velkr0/
• Highly qualified
• Diverse
• Likely to succeed
36. 5. What does failure look like?
Image by Andrew Allio https://www.flickr.com/photos/allio/
37. 5. What does failure look like?
Image by velkr0 https://www.flickr.com/photos/velkr0/
• Not qualified
• All the same
• Unable to afford the program
38. 6. What measures would show success or failure?
Enrollment? Test Scores? Diversity Data?
Image by gozalewis https://www.flickr.com/photos/gozalewis/
39. Interim Goals
A. Do prospects understand content?
B. Does content communicate value propositions?
C. Does content appeal to/engage prospects?
D. Does content encourage conversions?
53. 7. What are your numeric targets?
• If you want 50 students (donations, etc.)
• How many applications?
• How many inquiries?
• How many web sessions (visits)?
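The arithmetic behind these targets can be run backward through the funnel. Below is a minimal Python sketch; the 50-student target comes from the slide, but every conversion rate is a hypothetical placeholder to be replaced with your own historical numbers.

```python
# Work a numeric target backward through an enrollment funnel.
# The 50-student target comes from the slide; every conversion rate
# below is an assumed placeholder -- substitute your own history.

def required_at_each_stage(target, rates):
    """Given a final target and per-stage conversion rates, ordered
    from the stage nearest the goal back up the funnel, return the
    count needed at each stage (goal first)."""
    needed = [target]
    for rate in rates:
        needed.append(round(needed[-1] / rate))
    return needed

# Assumed rates: 50% of applicants enroll, 25% of inquiries apply,
# 2% of web sessions produce an inquiry.
stages = required_at_each_stage(50, [0.50, 0.25, 0.02])
labels = ["enrollments", "applications", "inquiries", "web sessions"]
for label, n in zip(labels, stages):
    print(f"{label}: {n}")
```

With these assumed rates, hitting 50 enrollments implies roughly 100 applications, 400 inquiries, and 20,000 web sessions, which is exactly the chain of questions on the slide.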
84. Context often means non-analytics data
• How many applications?
• How many showed up for an event?
• How many students haven’t yet registered for spring?
• How much did we spend on advertising/promotion?
85. A master spreadsheet gathers
• non-sampled data and
• non-analytics data
in the same place.
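As a sketch of what such a master spreadsheet might hold, the snippet below merges an analytics export with hand-collected context, keyed by month; the column names and figures are invented for illustration.

```python
# A minimal sketch of the master-spreadsheet idea: merge an analytics
# export with hand-collected, non-analytics context, keyed by month.
# All column names and figures are invented for illustration.

analytics = {  # e.g. exported monthly from your web analytics tool
    "2015-09": {"sessions": 1200, "inquiries": 30},
    "2015-10": {"sessions": 1500, "inquiries": 45},
}
context = {  # gathered by hand: applications received, ad spend, etc.
    "2015-09": {"applications": 8, "ad_spend": 500},
    "2015-10": {"applications": 14, "ad_spend": 750},
}

def build_master(analytics, context):
    """One row per month, with both kinds of data side by side."""
    master = {}
    for month in sorted(set(analytics) | set(context)):
        row = {}
        row.update(analytics.get(month, {}))
        row.update(context.get(month, {}))
        master[month] = row
    return master

master = build_master(analytics, context)
print(master["2015-10"])
```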
145. The structure of the stories we’ll tell
1. Here’s where we were
2. Then we changed x
3. Here’s what happened
4. Here’s what we need to do next
146. Example
• 40% of new students didn’t bring the right materials to orientation, even though it was on the website
• We rewrote that content and provided a checklist
• This fall, only 20% of students didn’t bring the right materials to orientation
• Next, we’ll look at the way this content is labeled
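The four-part story structure can even be filled in mechanically from before/after measures. The helper name and wording below are illustrative, not from the deck; measures are expressed as fractions (0.40 means 40%).

```python
# Fill in the four-part number story straight from the measures.
# Helper name and wording are illustrative assumptions, not from
# the deck; before/after are fractions (0.40 == 40%).

def number_story(measure, before, after, change, next_step):
    direction = "fell" if after < before else "rose"
    return "\n".join([
        f"1. {measure} was {before:.0%}.",
        f"2. Then we {change}.",
        f"3. {measure} {direction} to {after:.0%}.",
        f"4. Next, we'll {next_step}.",
    ])

print(number_story(
    "The share of students missing orientation materials",
    0.40, 0.20,
    "rewrote the content and added a checklist",
    "review how this content is labeled",
))
```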
147. Example
• We’ve had an increase in students who start an application but don’t complete it.
• We increased the number of reminder e-mails we send them.
• But traffic from e-mail actually dropped afterward.
• Next, we’ll cut back on the number of e-mails and make the ones we do send more personalized.
149. 1. Plan for measurement
• Understand your goals
• Understand how your content gets you there
• Use that analysis to find a small number of appropriate measures
• Set numeric targets
150. 2. Collect data effectively
• Collect contextual data
• Organize your measurements centrally
• Have a plan to sustain data collection