Evaluating digital inclusion projects
in practice
Methods at Manchester
15th April 2015
Dr Alice Mathers (Head of Innovation and Research)
James Richardson (Research and Innovation Manager)
Today’s session
1. Seminar presentation:
a. Defining Digital Exclusion
b. Digital & Social Exclusion
c. The need to evidence what works
d. Tinder Foundation - evaluation in practice
2. Session workshop: working through real life
examples
3. Session close: group reflections and feedback
About Tinder Foundation
● Sheffield-based, staff-owned social enterprise
● Primarily funded by BIS, NHS England and DCLG
● National network of 3,000 UK online centres: hyperlocal organisations supporting the needs of their own communities
● The largest organisation in the UK working to
overcome digital exclusion
Defining Digital Exclusion
Exclusion from “the best use of digital technology,
either directly or indirectly, to improve the lives and
life chances of all citizens and the places in which they
live.” - William Gibson (yes, that William Gibson)
Defining Digital Exclusion
In practice, Digital Exclusion is generally broken down into:
1. A lack of Basic Digital Skills
2. A lack of access to internet-connected devices
3. A lack of motivation and awareness of the value of
being online
Defining Digital Exclusion
● 10.5m UK adults lack Basic Digital Skills
● 43% are under the age of 65
● 69% are in socioeconomic groups C2DE
● Digital exclusion costs an individual on average £1,064 p/a
● SMEs could generate an extra £18.8bn p/a
● Central government could save £1.8bn p/a
● Strong correlation with social exclusion
Digital & Social Exclusion
Social exclusion ‘involves both the lack or denial of resources,
rights, goods and services, and the inability to participate in the
normal relationships and activities, available to the majority of
people in a society, whether in economic, social, cultural or
political arenas. It affects both the quality of life of individuals
and the equity and cohesion of society as a whole’
Levitas et al. (2007) The multi-dimensional analysis of social exclusion. University of Bristol
Digital & Social Exclusion
Among learners at UK online centres:
● 47% are educated below Level 2
● 31% are unemployed
● 56% are in receipt of means-tested benefits
● 34% live in social housing or are homeless
● 36% are disabled
● 30% are in HBAI relative income poverty
Digital & Social Exclusion
The overlap is clear!
Addressing individual barriers and motivation through tailored, local support providing:
+ digital skills
+ digital access
+ digital resources
= social inclusion
(Levitas et al. (2007) The multi-dimensional analysis of social exclusion. University of Bristol)
The Policy Landscape
- Government Digital Service (GDS)
  - Created in 2011 to be the centre for digital government
  - Digital Inclusion Strategy
  - Assisted Digital
- Labour Digital Government Review (Nov 2014)
  - Pledge to get everyone online by 2020
  - Digital Nation 2020 Report by Catherine McDonald
The need to evidence what works
In practice this is driven by a number of factors:
● Facilitates agility (in a changing funding and socio-economic climate)
● Demonstrates professionalism
● Drives innovation
● Identifies productisation opportunities
● Avoids stagnation
● Maintains relevance
● Supports a transparent and inclusive approach
● Enables an organisation to be ‘positively critical’ about what it does
The need to evidence what works
Evaluation should:
● Not be seen as an add-on or afterthought
● Be carried out within a practice culture of reflection and iteration
● Follow a consistent and clear approach within the organisation and across the programme
The need to evidence what works
Key stages of data capture and evaluation (a simple sketch follows this list):
1. Baseline (including aims and objectives)
2. Progression (reflecting on distance travelled)
3. Impact (reflecting on project impact against
baseline, aims and objectives)
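As an illustration of how these three stages could be recorded for an individual learner, the minimal sketch below assumes a self-reported 1-10 digital confidence score; the field names and scale are hypothetical, not Tinder Foundation's actual data-capture instruments.

```python
# Minimal sketch: capturing baseline, progression and impact for one learner.
# The 1-10 confidence scale and field names are illustrative assumptions,
# not Tinder Foundation's actual data-capture instruments.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LearnerRecord:
    learner_id: str
    aims: List[str]                          # the learner's own aims and objectives at baseline
    baseline_score: int                      # self-reported digital confidence (1-10) at enrolment
    progression_score: Optional[int] = None  # mid-point check-in
    impact_score: Optional[int] = None       # end-of-project follow-up

    def distance_travelled(self) -> Optional[int]:
        """Change from baseline to the latest available measurement."""
        latest = self.impact_score if self.impact_score is not None else self.progression_score
        return None if latest is None else latest - self.baseline_score

# Example: a learner who enrolled at 3/10 and reported 7/10 at the impact stage.
record = LearnerRecord("L001", aims=["use online banking safely"],
                       baseline_score=3, progression_score=5, impact_score=7)
print(record.distance_travelled())  # -> 4
```

Summing or averaging distance travelled across a cohort gives a simple quantitative progression measure to sit alongside qualitative evidence such as learner stories.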
Evidence drives innovation
Tinder Foundation evaluation in practice
Case Study 1: eReading Rooms
- The project ran in 2012-13 with the
aim of engaging non-traditional
learners and supporting them to
access informal learning
opportunities in their communities,
using technology as a key enabler.
- 20 pilot centres took part in the project, employing 77 paid staff, who engaged 134 volunteers, who in turn reached 1,337 learners.
Tinder Foundation evaluation in practice
Case Study 2: Vodafone Mobile Devices
“Can mobile connectivity and
devices help people to become
socially and digitally included?”
Session workshop
Each table has been given a real-life project brief for one of the
following two projects:
1. NHS: Widening Participation
2. Vodafone: the Financial Benefits of Being Online
We want you to design an evaluation framework for this project, including details of the different methodologies that will capture the intervention's impact at a variety of relevant scales.
Session workshop
In developing your evaluation framework you should consider the following:
● How will you determine the link between digital exclusion and other forms of social exclusion?
● What individual and organisational outcomes need to be measured?
● What methods will you use to strike the right balance between quantitative
and qualitative study?
● To what extent does your evaluation framework allow comparison between
different digital inclusion projects?
● How will your approach provide ‘value for money’? (A key consideration for delivery organisations, whose funding is usually attached to significant numeric targets with minimal allocation for formal evaluation; one possible way to structure outcomes and methods is sketched below.)
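One hypothetical way to hold individual and organisational outcomes, and the quantitative/qualitative balance, in a single structure is sketched below; the outcomes and methods named are illustrative assumptions, not drawn from the NHS or Vodafone project briefs.

```python
# Hypothetical framework skeleton: every outcome is paired with one
# quantitative and one qualitative method. The outcomes and methods are
# illustrative only; they are not drawn from the actual project briefs.
framework = {
    "individual": {
        "basic digital skills gained": {
            "quantitative": "pre/post skills checklist",
            "qualitative": "follow-up learner interview",
        },
        "confidence using online services": {
            "quantitative": "1-10 self-rating at baseline, progression and impact",
            "qualitative": "learner case study",
        },
    },
    "organisational": {
        "centre capacity to support new learners": {
            "quantitative": "tutor and volunteer hours delivered per month",
            "qualitative": "staff focus group",
        },
    },
}

# Print the framework as a quick checklist for the feedback session.
for level, outcomes in framework.items():
    for outcome, methods in outcomes.items():
        print(f"[{level}] {outcome}: {methods['quantitative']} + {methods['qualitative']}")
```

Pairing a numeric measure with a qualitative method for every outcome also helps the value-for-money argument: the targets funders track sit next to the evidence of why they moved.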
Session workshop
Imagining you are a practice research team, you now have 20 minutes to discuss the project you have been given and address the following, ready to feed back to the rest of the room:
1. Provide an overview of the project
2. Identify suitable evaluation aim(s)
3. Identify any potential issues and considerations
4. Describe the methodological approach you would take
5. Identify the appropriate outcome measures you would use
6. Outline your argument for how this evaluation would add value to
the project delivery
Thank you, any questions?
Dr Alice Mathers
Head of Innovation and Research
Tinder Foundation
alice@tinderfoundation.org
www.tinderfoundation.org
James Richardson
Research and Innovation Manager
Tinder Foundation
jamesrichardson@tinderfoundation.org
www.tinderfoundation.org
