From Investment to Impact: Recent Outcomes Evaluations of Legal Aid Tech Projects
August 17, 2016
Claudia Johnson, Pro Bono Net
Keith Porcaro, SIMLab
Valerie Oliphant, SIMLab
Tara Saylor, Q2 Consulting
How we think of evaluations
An unpleasant but necessary task
A routine task or duty
A waste of time and money—black box
A fun helpful activity!
Why do we do evaluations?
1) We have to!
2) They made us do it to get the funding…
3) We love to do evaluations!
4) We find them helpful in figuring out how to use limited resources
–Guide future technology investments
–Do needs assessments
–Find out what we can do better
–Because we are curious and we love to learn and be smart about how we use what we have
Different types of evaluations
• There are many different types of evaluations.
• In this call we will go over:
–Different approaches to evaluations, borrowing from other fields
–Tips on how to do resource-constrained evaluations
–Ways to use evaluations to find more money and save money
Who we are:
SIMLab is a nonprofit with a mission to realize a world
where technology helps build societies that are
equitable and inclusive.
What are “inclusive technologies”?
• Incorporate as many people as
possible, including low-income and
marginalized communities.
• Communications approaches that have
a wide reach.
• Capitalize on and integrate with existing
tools and familiar communications
channels, wherever possible.
What is M&E?
Monitoring refers to an on-going, periodic process of tracking implementation
with the primary purpose of informing day-to-day project management decisions
and tracking how an initiative is progressing. In some programs, monitoring
includes “real-time” data and feedback from program participants that can
inform immediate decisions.
Evaluation is a more discrete activity: the systematic and objective
assessment of an ongoing or completed project or program, looking at its
design, implementation, and results. Evaluations may also aim to
determine the worth of an activity, intervention, or program.
M&E: What is it good for?
A monitoring and evaluation (M&E) process is put in place for 3 main purposes:
● As a management tool to drive change
● As an accountability tool
● To provide lessons and learning
M&E may be used to:
● inform future funding decisions
● judge the performance of contractors or partners
● gather evidence to establish whether a particular approach is useful
● examine how a particular inclusive technology, or inclusive technology overall,
contributes to wider programmatic goals.
Challenges to M&E
● Technology adds an additional layer of complexity to an already complex project.
● Technology projects are frequently new operational partnerships and each
partner may have different priorities in terms of things to measure.
● Technologist partners and implementing partners have different project
cycles and working styles.
● Differences in what partners would like to measure: the “business case”
(improvements to efficiency, effectiveness, and ease of communication)
contributes only indirectly to the development-focused goals that traditional
M&E focuses on.
● Limited capacity and resources.
SIMLab’s M&E Framework
● OECD-DAC criteria are widely used by the international
development community.
● ALNAP (the Active Learning Network for Accountability and
Performance) later adapted these criteria to better fit complex
humanitarian emergencies.
● SIMLab built upon ALNAP’s work and added additional
considerations to make the criteria more applicable to inclusive
technology projects.
Relevance/Appropriateness is the extent to which the technology choice is
appropriately suited to the priorities, capacities, and
context of the target group or organization.
Effectiveness is a measure of the extent to which an information and
communication channel, technology tool, technology
platform, or a combination of these attains its
objectives.
Efficiency measures the outputs (qualitative and quantitative) in
relation to the inputs. It is an economic term which signifies that the
project or program uses the least costly technology possible in order to
achieve the desired results. This generally requires comparing
alternative approaches (technological or non-technological) to achieving
the same outputs, to see whether the most efficient tools, platforms,
channels and processes have been adopted.
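The efficiency comparison described above can be reduced to simple cost-per-output arithmetic. Below is a minimal sketch; the channel names and all figures are invented for illustration and do not come from the presentation or any real program.

```python
# Illustrative efficiency comparison: cost per output for two
# hypothetical delivery channels (all figures are made up).

def cost_per_output(total_cost, outputs_delivered):
    """Return the cost of producing one unit of output."""
    return total_cost / outputs_delivered

# Hypothetical inputs: an SMS helpline vs. a walk-in clinic.
channels = {
    "SMS helpline": {"cost": 12_000, "clients_served": 3_000},
    "Walk-in clinic": {"cost": 40_000, "clients_served": 2_500},
}

for name, data in channels.items():
    cpo = cost_per_output(data["cost"], data["clients_served"])
    print(f"{name}: ${cpo:.2f} per client served")
    # SMS helpline: $4.00 per client served
    # Walk-in clinic: $16.00 per client served
```

A lower cost per output does not by itself settle the question; as the text notes, the comparison should cover non-technological alternatives and qualitative outputs as well.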
Impact covers the positive and negative changes produced by the introduction of
or change in a technology tool or platform on the overall development
intervention, directly or indirectly, intended or unintended. This involves the
main impacts and effects resulting from the technology tool or platform on
local social, economic, environmental, and other development indicators. The
examination should be concerned with both intended and unintended results, and
must also include the positive and negative impact of external factors, such as
changes in terms of trade, financial conditions, and the digital information
and communication environment.
Sustainability is concerned with measuring whether
the benefits of a technology tool or platform are
likely to continue after donor funding has been
withdrawn. Projects need to be environmentally as
well as financially sustainable.
Coherence is related to the broader policy context
(development, market, communication networks,
data standards and interoperability mandates,
national and international law) within which a
technology was developed and implemented.
Use data to build a bigger picture of the world
● Use tools that produce data
○ Auditable logs
○ Exportable data
● Use data you already have
○ Time tracking and case management
○ Email and phone records
○ Qualitative data and opinions
● Use data that exists out in the world
○ Court records
○ Other organizations (legal aid and otherwise)
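The data sources above can feed simple monitoring metrics without custom tooling. A minimal sketch, assuming an exported case-management CSV; the column names (`case_type`, `status`) and sample rows are hypothetical, not taken from any specific product:

```python
import csv
import io
from collections import Counter

# Hypothetical export from a case-management system.
SAMPLE_EXPORT = """case_type,status
eviction,closed
eviction,open
benefits,closed
"""

def summarize_cases(csv_text):
    """Tally cases by (case_type, status) from exported CSV text."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter((row["case_type"], row["status"]) for row in reader)

summary = summarize_cases(SAMPLE_EXPORT)
print(summary[("eviction", "closed")])  # 1
print(summary[("benefits", "closed")])  # 1
```

In practice you would point the same kind of tally at a real export file; the point is that data you already collect can answer basic monitoring questions cheaply.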
● Document for replicability
○ Regular use
● Document for errors
○ Practical difficulties
○ Mistaken input
○ Kludges, hacks, and compromises
● Document for the public
● Tools: we like Skitch for screenshots and markup, LICEcap for animated GIF screen captures
● Budget: Expect to allocate 10-20% of the project cost for evaluation, especially if you’re
trying a new approach.
● Find an independent evaluator (we can help). Your technology provider should definitely
not be evaluating the job they just did.
● A report that’s thrown in a drawer is a waste of money; use evaluations to learn things that
are useful to your organization.
● SIMLab’s M&E Framework: http://simlab.org/resources/mandeoftech/
Source: Wayne State University, Center for Urban Studies
[Logic model example for self-represented litigants (SRLs): education for SRLs, improved technologies, and enhanced community resources lead to SRLs using technology and to efficiencies for SRLs and courts; these in turn lead to fewer criminal records and to improved job, housing, and education outcomes for SRLs.]
• Who we reach.
• What we do and create.
• Why we did this.
Know what to spend your evaluation budget on in a resource-constrained setting
Ideally, a PhD evaluator will:
–Design your evaluation study using a logic model
–Determine sampling plans
–Create your instruments
–Teach you how to collect data properly
–Analyze your data
Reduce your Evaluation Budget by:
–Collecting the data yourself
–Recording the data in a format that works for your evaluator
–Writing the final report