Tina Phillips (Cornell Lab of Ornithology) presenting the DEVISE project and learning in citizen science research at the Citizen Cyberlab Summit, 17-18 September 2015, University of Geneva (UNIGE).
1. A membership institution interpreting and conserving the earth's biological diversity through research, education, and citizen science focused on birds
birds.cornell.edu
2. Public Engagement in Science Program
Funding provided in part by:
Rick Bonney, Director; Tina Phillips, Manager, Evaluation and Research; Jennifer Shirk, Manager, Professional Development; Jody Enck, Evaluation Specialist
• We provide advice and consultation
• We help design and evaluate projects
• We conduct research on learning in CS
• We support youth development through CS
• We develop evaluation tools and resources
• We provide leadership for citizen science professionals
• We design, evaluate, and research projects that engage the public in scientific research and conservation.
• We conduct field-building activities for professionals who develop, evaluate, and research citizen science.
3. Public Participation in Scientific Research: Defining the Field and Assessing Its Potential for Informal Science Education
CAISE Inquiry Group Report, July 2009
Bonney, R., Ballard, H., Jordan, R., McCallie, E., Phillips, T., Shirk, J., and Wilderman, C.
Center for the Advancement of Informal Science Education (CAISE)
4. CS/PPSR models: Contributory, Collaborative, Co-Created
Steps in the research process that participants may engage in:
• Define a question/issue
• Gather information
• Develop explanations
• Design data collection methods
• Collect samples
• Analyze samples
• Analyze data
• Interpret data/conclude
• Disseminate conclusions
• Discuss results/inquire further
5. Findings from CAISE Report
• Scarcity of quality evaluations
• Higher engagement suggested deeper learning
• Need for more sensitive measures
• No opportunity for cross-programmatic analyses
• Cry for help!
6. DEVISE: Developing, Validating, and Implementing Situated Evaluations
Goal: improve evaluation quality and capacity across the field of citizen science
• Assessed the state of evaluation in citizen science
• Determined common goals, objectives, and indicators across projects
• Inventoried existing instruments; developed new and modified existing evaluation tools
• Provided professional development opportunities
• Built a community of practice for evaluation of citizen science projects
DEVISE instruments can be found at: CitizenScience.org/Evaluation
7. Evaluation is the systematic collection of data to determine the strengths and weaknesses of programs, policies, and products, so as to improve their overall effectiveness.
8. Framework Development
• Focus on Individual Learning Outcomes (ILOs)
• Literature review, interviews, and past experience of the research team
• Inventory of goals & outcomes from 300 CS web sites
• Online survey of CS practitioners (N = 200)
• In-person survey at 2012 CS conference (N = 104)
• Online survey of practitioners reaching underserved audiences (N = 400)
Phillips et al., in preparation
9. Theoretical Underpinnings
Goal is to provide measures for individual learning outcomes
• Sociocultural Theory (Vygotsky 1978, Rogoff 2003)
• Situated Learning, Communities of Practice (Lave and Wenger 1991)
• Free-choice Learning (Falk and Dierking 2000)
• Life-long Learning (Bell et al. 2009)
• Self-Determination Theory (Ryan and Deci 2000)
11. DEVISE Scales
• Interest in Science and Nature (adult & youth versions)
• Nature Relatedness Scale*
• Self-Efficacy for Science (adult & youth versions)*
• Self-Efficacy for Environmental Action*
• Motivation for Science (adult & youth versions)*
• Motivation for Environmental Action*
• Perceptions of Science Scale
• Skills of Science Self-Report*
• Data Interpretation Quiz
• General Environmental Stewardship Scale
*customizable
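Instruments like these are typically Likert-type scales scored by averaging item responses, with reverse-coded items flipped first. A minimal sketch of that scoring logic, using hypothetical items and responses (not actual DEVISE items):

```python
def score_scale(responses, reverse_items, points=5):
    """Average Likert responses after flipping reverse-coded items.

    On a 1..points scale, a reverse-coded response r becomes (points + 1 - r).
    `reverse_items` holds the zero-based indices of reverse-coded items.
    """
    adjusted = [
        (points + 1 - r) if i in reverse_items else r
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) / len(adjusted)

# One respondent's answers to a hypothetical 4-item self-efficacy scale,
# where item index 2 is reverse-coded: 2 flips to 4, giving mean 4.25.
print(score_scale([4, 5, 2, 4], reverse_items={2}))  # -> 4.25
```

Averaging (rather than summing) keeps scores on the original response scale, which makes them comparable across scales with different numbers of items.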
12. User's Guide to Project Evaluation
Available for free download at: Citizenscience.org/evaluation
• Practical overview of evaluation techniques, tips, and best practices
• Templates and worksheets to help with planning and implementation
14. Appendices and Resources
Nearly half of the guide is made up of useful appendices and resources, including templates, worksheets, and sample evaluation plans.
15. Who uses the Guide?
- Zoos and Aquariums
- Monitoring Programs
- Science Museums
- Colleges and Universities
- Consulting Firms
- Conservation Organizations
- Educational Organizations
- Public Schools
- Government Agencies
- Wildlife Societies
- Citizen Science Projects
- Environmental Service Groups
- Non-profit Organizations
- Individuals
16. Collaborative Research: Exploring Engagement and Science Identity through Participation (EESIP)
Engagement – Learning – Identity
Q1: What are the dimensions of citizen science engagement across different types of projects?
Q2: What is the relationship between participant engagement and science learning outcomes?
Q3: How does the degree and quality of citizen science participation develop and/or reinforce science identity in participants?
UC Davis
18. Project Partners
Model typology from the meta-synthesis by Bonney et al. 2009:
• Contributory: NestWatch; Monarch Larva Monitoring Project
• Collaborative: Hudson River Estuary Program; CoCoRaHS
• Co-Created: ALLARM; Global Community Monitors
Scientist Driven >>> Community Driven
UC Davis
19. Phase 1 Methods
Data collection:
• 83 semi-structured interviews, ranging from 60 minutes to 2.5 hours (low, medium, and high participants)
• Literature review informed development of the interview protocol
• Feedback from PL and COV on the interview protocol
Data analysis:
• Coded using a priori constructs from the literature and emergent themes
• NVivo software for coding and analysis
UC Davis
20. Interview Protocol
• What was it that led you to start participating in the project? (motivation)
• What keeps you involved? (facilitators)
• Please describe what a typical day participating in this project looks like. (engagement activities)
• Can you describe the most memorable day you had while participating in the project? (feelings)
• What things did you do to learn how to participate? (learning)
• How has your involvement in the project changed since you began participating? (CoP, role expansion)
21. Examples of Coded "Extra" Activities
"I never thought of myself as a trainer. But I feel like it's a – it's been a good thing for me, as a retiree, I'm getting old enough that I think about what's my legacy, what do I leave behind, what do – I have no children, so what is my contribution to the world and humanity or whatever. So I think training other people is one way to pass on skills that can go further than just me looking in boxes and enjoying the birds for myself." (NW – TRAINING PARTICIPANTS)
"Well, in the very beginning we met at Albany High School. My class went to Albany High School and met all the students that we're gonna be helping. And the DEC discussed the protocol with us." (EELS – LEARNING PROTOCOLS)
"I have a display to try to round up some interest in the public to see if we have any other future monitors that might be willing to do that. So we've been doing that a little bit more lately since I've been able to help out for recruitment." (ALLARM – RECRUITING PARTICIPANTS)
22. "Extra" Activities
"It completely changes the way that they respond to us and actually how much time they even give us because previously without that data, without something tangible you walk into a room and you sit down, and you try to start having a conversation and most of it's revolving around anecdotal evidence." (GCM – COMMUNICATING FINDINGS)
"I kind of do my own little graphing too of my particular spot on the planet. Again, the number of instars I've collected over the weeks and then to graph it into a chart that I can follow from year to year and see how things are different on my particular landscape." (MLMP – ANALYZING/INTERPRETING DATA)
"Attempting to read the data at the same time every day and then with the snow in terms of how you collect the snow amounts and so there's a certain rigorous process that in theory everybody's using so that the data set's consistent." (CoCoRaHS – USING STANDARDIZED METHODS)
23. Engagement activities (draft PEM)
Counts per activity: # Sources / # Refs
data collection: 54 / 1126
communicating with others: 54 / 540
  – friends, family, coworkers, public: 51 / 219
  – scientists or project leaders: 45 / 199
  – other project participants: 25 / 76
  – media, policy decision makers: 16 / 36
learning protocols: 52 / 182
submitting data: 47 / 161
communicating findings to others: 40 / 181
  – friends, family, co-workers, public: 25 / 67
  – media, policy, decision makers: 18 / 61
  – scientists, project leaders: 16 / 30
  – other project participants: 10 / 19
miscellaneous: 39 / 174
getting updates and feedback: 35 / 104
exploring data: 33 / 107
recruiting participants: 31 / 80
using standardized methods: 27 / 67
coordinating participant activities: 23 / 98
analyzing or interpreting data: 23 / 52
training participants: 21 / 66
attending meetings: 21 / 80
asking questions: 21 / 50
compiling data (spreadsheets): 20 / 55
using data: 19 / 32
forming hypotheses: 14 / 28
conducting investigations: 6 / 15
tool building: 5 / 13
habitat improvement: 4 / 8
adapting or modifying protocols: 3 / 6
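Counts like "# Sources / # Refs" come straight out of qualitative coding: each coded passage is one reference, and the number of distinct interviews containing a code is its source count. A minimal sketch of that tally, with made-up interview IDs and codes (a toy stand-in for an NVivo coding export, not the actual study data):

```python
from collections import defaultdict

# Hypothetical coded segments: one (interview_id, code) pair per coded
# reference. IDs and codes are illustrative only.
coded_refs = [
    ("NW-01", "data collection"),
    ("NW-01", "data collection"),
    ("MLMP-02", "data collection"),
    ("ALLARM-03", "recruiting participants"),
    ("MLMP-02", "recruiting participants"),
]

def tally(refs):
    """Return {code: (distinct sources, total references)}."""
    sources = defaultdict(set)   # code -> set of interview IDs
    counts = defaultdict(int)    # code -> number of coded passages
    for source, code in refs:
        sources[code].add(source)
        counts[code] += 1
    return {code: (len(sources[code]), counts[code]) for code in counts}

print(tally(coded_refs))
# -> {'data collection': (2, 3), 'recruiting participants': (2, 2)}
```

Tracking sources and references separately matters: a code mentioned many times by one enthusiast looks very different from one mentioned once each by fifty participants.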
24. Emerging Themes
• Majority come in with a high interest in science and/or nature
• Lower engagers tend to feel less connected to the project; some crave more social outlets
• Across the board, even new participants have a strong understanding of citizen science and their role in it
• Data transparency is both a barrier and a facilitator
• For the most part, they all feel that what they are doing is part of science, contributing to something bigger
• QA/QC is both personally and organizationally important
• Participants are engaged in a wide array of activities, beyond simply data collection
• Group projects may have more impact than individual projects
26. Research Q2
Q2: How do motivation and engagement influence efficacy, skills, and stewardship?
Independent variables: motivation for science and/or environment; demographics; Participant Engagement Metric
Dependent variables: efficacy for science; efficacy for environmental action; science process skills; environmental stewardship
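An independent/dependent variable structure like this maps naturally onto a regression model: each outcome is modeled as a function of motivation, engagement, and demographic controls. A minimal sketch using ordinary least squares on synthetic data (variable names and effect sizes are illustrative, not DEVISE measures or findings):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical number of survey respondents

# Synthetic standardized predictors (illustrative stand-ins):
motivation = rng.normal(0, 1, n)  # motivation for science/environment
engagement = rng.normal(0, 1, n)  # participant engagement metric
age = rng.normal(0, 1, n)         # a demographic control

# Synthetic outcome: efficacy for science as a linear combination
# of the predictors plus noise (coefficients chosen arbitrarily).
efficacy = 0.5 * motivation + 0.3 * engagement + 0.1 * age \
    + rng.normal(0, 0.5, n)

# Ordinary least squares: fit [intercept, motivation, engagement, age].
X = np.column_stack([np.ones(n), motivation, engagement, age])
coefs, *_ = np.linalg.lstsq(X, efficacy, rcond=None)
print(coefs)  # recovers approximately [0, 0.5, 0.3, 0.1]
```

In practice, an analysis like this would be run once per dependent variable (efficacy, skills, stewardship), with each DEVISE scale score serving as the measured outcome.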