This document summarizes a workshop on preparing and curating research data from the Prevention and Early Intervention Initiative (PEII) in Ireland. It describes several data collections that were generated from evaluations of PEII programs, including the Preparing for Life (PFL) study, the Children's Profile at School Entry (CPSE) study, and others. The PFL study involved home visiting and supports for families from pregnancy to age 4, while the CPSE study collected data on school readiness for junior infants. Both studies used mixed methods and longitudinal designs. The document outlines the process of preparing, anonymizing, and curating these datasets so they can be safely and ethically archived and reused.
A presentation to the Health Psychology in Public Health Network annual meeting on practical, policy and research challenges in applying research to public health practice
Presentation of the first results of EUA's 2016/17 survey on Open Access. The survey is focused on the degree of implementation of institutional policies on:
1. Open Access to research publications
2. Research Data Management
3. Open Access to research data
Respondents: 338 universities from 39 countries (2015/16: 169 institutions; 2014: 106 institutions; 100% increase compared to 2015/16)
Talk at Medical Library Association Pacific Northwest Chapter meeting in Portland, OR on October 18, 2016.
http://pnc-mla.cloverpad.org/annual2016
Authors: Nicole Vasilevsky, Jackie Wirz, Bjorn Pederson, Ted Laderas, Shannon McWeeney, William Hersh, David Dorr, and Melissa Haendel
The metric tide – Stephen Curry, Imperial College London, and Ben Johnson, HEFCE
Open infrastructures - Clifford Tatum, Leiden University
Open citation – Cameron Neylon, Curtin University
Jisc and CNI conference, 6 July 2016
Taking advantage of openness: understanding the variety of perspectives on op... – OER Hub
There has been considerable coverage of the growth of Massive Open Online Courses (MOOCs) that give free access to courses that have familiar structures. However, there are many other ways in which Open Educational Resources are being used and influencing education. In the OER Research Hub we have worked across educational sectors looking at ways that OER are being adopted and used. In this paper we step back from some of the detailed work with collaborating projects to consider their different motivations and shared challenges. The case studies show how openness acts as inspiration, however the impact of openness can be harder to see. Our survey data is showing how open aspects can seem less important as projects seek to build to broad engagement, and that aims of widening access are challenged by findings that open education appeals to those who already have existing confidence and experience. The actions of the collaborating partners seek to address these issues for example through courses that help develop understanding of openness and by understanding the groups that they serve who have special needs.
Can medical education take advantage of Learning Analytics techniques? How? Where? In this presentation a study is analyzed pinpointing three areas in which Medical Education needs to invest and all three are related to Learning Analytics.
Principles, key responsibilities, and their intersection – ARDC
Dr Daniel Barr from RMIT University presented at the University of Technology Sydney's RIA Data Management Workshop on 21 June 2018. In partnership with the Australian Research Council, the National Health and Medical Research Council, the Australian Research Data Commons, and RMIT University, this is part of a national workshop series in data management for research integrity advisors.
Research Integrity Advisor and Data Management – ARDC
Dr Paul Wong from the Australian Research Data Commons presented at the University of Technology Sydney's RIA Data Management Workshop on 21 June 2018. In partnership with the Australian Research Council, the National Health and Medical Research Council, the Australian Research Data Commons, and RMIT University, this is part of a national workshop series in data management for research integrity advisors.
Presentation about OHSL's new initiative, Mycroft Cognitive Assistant®, which is intended to streamline the operational aspects of research using IBM Watson cognitive computing capabilities.
Two professionals from the University of Maryland compare and share best practices for measuring student success with the University of Johannesburg. This presentation is a summary of their visit.
Practical challenges for researchers in data sharing – Varsha Khodiyar
Presentation given at the Research Data Alliance Plenary 12 session: IG Open Questionnaire for Research Data Sharing Survey, on Tuesday 6th November 2018, Gaborone, Botswana
‘Evidence-based forestry’: Constructing bridges that connect science, policy... – CIFOR-ICRAF
This presentation by Gillian Petrokofsky from the University of Oxford shows what one should consider when talking about evidence-based forestry, what the bigger picture is and why a collaboration between EBF and landscape management might be the solution to many problems.
Ruth Geraghty (PEI) - The PEI Research Initiative at the Children's Research ... – dri_ireland
Presentation given as part of co-hosted event between Digital Repository of Ireland and the Children's Research Network of Ireland, 13 November 2018, Royal Irish Academy, Dublin
Progress on Sustainable Development Goal 4 will require new partnerships to support improvements in data systems, measurement, monitoring, and evaluation of programmes and policies. This webinar presented efforts from national and regional levels that involve researcher networks and partnerships engaged in SDG 4 and its targets and indicators.
Evaluating Impact of OVC Programs: Standardizing our methods – MEASURE Evaluation
Jen Chapman presents on the Orphans and Vulnerable Children Program Evaluation Tool Kit, which supports PEPFAR-funded programs and helps fulfill the aims presented in the USAID Evaluation Policy.
A presentation by Emla Fitzsimons as part of the Comparability of Measurement Instruments Across Ages and Contexts panel discussion at the International Symposium on Cohort and Longitudinal Studies in Developing Contexts, UNICEF Office of Research - Innocenti, Florence, Italy 13-15 October 2014
Using research findings to inform policy and practice: the approach taken in ... – Mike Blamires
Presentation by Isabella Craig, DCSF; Caroline Thomas, University of Stirling and Academic Co-ordinator for the ARi; Mary Beek, Adoption Team Manager, Norfolk Children's Services & Professional Advisor to the ARi and Mary Lucking, Head of Adoption, Children in Care Division, DCSF.
Sebba & O'Higgins – Educational outcomes of children in care, 4 Nov 2014 – Young Lives Oxford
Understanding the Educational Outcomes of Young People in Care - presentation by Professor Judy Sebba and Aoife O'Higgins from the Rees Centre for Research in Fostering and Education. Gives an overview of research to date and some of the sources of data about education for children in care. Outlines a new study to assess and promote 'what works' to improve education outcomes for young people in care in the UK.
Join Dr. Anthony Levinson and Kalpana Nair, PhD from McMaster University as they discuss the Early Years Check-In (EYCI) and its companion web-based resource, Play&Learn. Designed for parents of children 18 months to 6 years of age, the EYCI helps parents quickly identify any concerns they may have about their child’s development across four domains: social and emotional, language, movement, and thinking and learning. The EYCI can be used as a discussion aid to foster dialogue about early child development between parents and practitioners providing early years services, creating opportunities to build relationships as well as provide education and support to parents to foster their child’s development.
Lisa Marriott - Working with Local Schools on Nutrition Education – SeriousGamesAssoc
Lisa Marriott, Assistant Professor, OHSU/PSU School of Public Health
This presentation was given at the 2017 Serious Play Conference, hosted by the George Mason University - Virginia Serious Play Institute.
Games for improving health and education: approaches for integrating data collection and persuasive system design on an academic budget
A presentation delivered by IPPOSI CEO, Derick Mitchell at a conference organised by the Clinical Research Facility, St. James's Hospital, Dublin, May 2018
Finding & accessing research data: Introduction to ISSDA, Key concepts, From survey to dataset, Finding data: Ireland, Europe, International, Accessing data
Accessing and using TILDA data, available through ISSDA – ISSDA
Accessing and using TILDA data, available through Irish Social Science Data Archive (ISSDA), Using publicly archived TILDA datasets. Delivered by TILDA in conjunction with the Irish Social Science Data Archive (ISSDA) and Gateway to Global Aging.
Anonymisation and Social Research
Ruth Geraghty
Data Curator on the CRNINI-PEI Research Initiative
Children’s Research Network for Ireland and Northern Ireland
Anonymising Research Data workshop, University College Dublin, 22nd June 2016
www.childrensresearchnetwork.org
Anonymising quantitative data
Dr Sharon Bolton
UK Data Service
UK Data Archive, University of Essex
Anonymising Research Data workshop
Dublin, 22 June 2016
This presentation covers the background to the Irish Social Science Data Archive (ISSDA), some of the sports-related datasets that are available, how to access these data, and tips for researchers using these data for secondary analysis.
PEI Research Initiative
1. PEI Research Initiative
Ruth Geraghty
Data Curator
Children’s Research Network
Finding and re-using research data workshop
UCD Health Sciences Library, 15th June 2017
Session 12:00 – 13:00
2. This session will cover the following:
Part 1: Background to the PEI Research Initiative – how we got here
and lessons learned so far
Part 2: Preparing for Life collection
Part 3: Children’s Profile at School Entry collection
Part 4: Early Childhood Care and Education collection
Part 5: What do you get in a curated PEII collection?
Later on today – PEI Research Grants
4. • Prevention and early intervention (PEI) was one of the ‘issues’
tackled by The Atlantic Philanthropies’ investment in Ireland
• Prevention = prevent a problem occurring
• Early intervention = intervene before a problem gets worse
• 2004 – 2014: extended period of investment
• Sometimes in conjunction with government and other organisations
5. Ten Years of Learning, Rochford et al., 2014
€127 million investment by Atlantic Philanthropies over 10yrs
30 partner agencies and community groups
52 prevention and early intervention services and programmes across RoI and
NI
Services operating across a diverse range of areas in child and youth well-being:
• Early childhood
• Learning
• Child behaviour and development
• Parenting
• Inclusion
“A condition of funding for the PEII projects was that the organisations involved
were required to rigorously evaluate the effectiveness of their services in
improving outcomes for children” (p. 4)
http://www.effectiveservices.org/resources/article/prevention-and-early-intervention-in-children-and-young-peoples-services-te
http://www.effectiveservices.org/downloads/PEII_10_Years_of_Learning_Report.pdf
6. Why make evaluation a condition of funding?
EVIDENCE
• “Evidence can be a powerful engine for advancing change when
practitioners are given robust tools or support to apply it in their work”
(from AP website)
SUSTAINABILITY
• “Building partnerships with government, NGOs and communities from the
start can lead to better and more sustainable outcomes”
(from AP website)
A LARGE BODY OF DATA OF INTERNATIONAL SIGNIFICANCE AND RANGE
But where is the data and how do we make it accessible?
7.
8. Picking the right tool for the job:
(below a small sample of actual evaluation designs used by PEII projects)
• Randomised control trial, plus qualitative interviews with
staff, plus a cost analysis
• Retrospective survey of administrative data that was
collected a few years ago and repurposed
• Two-group pre-post test design using a survey of parents
and teachers
• Quasi-experimental mixed methods design
• Longitudinal randomised control trial using a comparison
community, focus groups with parents at key stages in the
study, qualitative interviews with staff, and a cost
analysis
• Exploratory qualitative research
The result is a mixed bag of
research data types
- Population demographics
- Scale and indices scores
- Qualitative transcripts
- Behavioural indicators
9. The drive to archive
DISSEMINATION
Data can go further than a report (which can age very quickly)
COMPARABILITY
International relevance – data starts to be used comparatively
VALUE FOR MONEY
AP investment unprecedented
All involved (organisations, evaluators) agreed – when approached by CRN
in 2016 – that in principle archiving this data would be a very good thing.
10. Evaluation data is an extreme example of the
sensitivities and tensions in social research data
Outcome evaluation
How the programme performed
What makes it sensitive:
• Control group vs trial group
• Standardised measures, for example child’s cognitive and
emotional development or a parent’s sense of confidence
• Demographic info used to explore causal factors
• Baseline – most risky in terms of identification of
individuals
• Longitudinal element = characteristics gathered
repeatedly
Process evaluation
How the programme worked in practice
What makes it sensitive:
• Opinions of participants, or feedback
from staff - could be quite critical
• Honest feedback is essential
• Personal stories – these tend to be
vulnerable populations
11. Safe and ethical archiving
Three steps to sharing sensitive data (as proposed by most social
science archives)
Gain
consent to
share
Anonymise
the data
Control
access
http://ukanon.net/ukan-resources/ukan-decision-making-framework/
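Of the three steps, anonymisation is the one that is also a concrete data operation. A minimal sketch of two common techniques – pseudonymising direct identifiers and banding exact values – using hypothetical field names, not the actual PEII variables:

```python
import hashlib

def pseudonymise(case_id: str, salt: str) -> str:
    """Replace a direct identifier with a stable, non-reversible code."""
    return hashlib.sha256((salt + case_id).encode()).hexdigest()[:8]

def band_age(age: int) -> str:
    """Coarsen an exact age into a five-year band to reduce re-identification risk."""
    lower = (age // 5) * 5
    return f"{lower}-{lower + 4}"

record = {"case_id": "PFL-0042", "mother_age": 23}
safe_record = {
    "pseudo_id": pseudonymise(record["case_id"], salt="keep-this-secret"),
    "mother_age_band": band_age(record["mother_age"]),  # "20-24"
}
```

The salt must be kept out of the shared data so the mapping cannot be reversed by re-hashing known IDs; controlled access (the third step) then governs who may see anything closer to the raw values.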
12. Why do you have to ‘process’ research data?
1. Locate the correct version of the file
2. Check and add meaningful labels
3. Clean data & standardise missing data codes
4. Anonymise the data
5. Check copyright on scale measures
6. Collate contextual material
7. Create metadata and publish data in the right archive
Example of impact analysis data
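Steps 2 and 3 lend themselves to a small illustrative sketch – attaching meaningful labels and standardising missing-data codes. The variable names and labels below are made up, and the 996/997-style codes are the standardised values mentioned later in this deck:

```python
def add_labels(row, labels):
    """Step 2: replace terse survey variable names with meaningful ones."""
    return {labels.get(name, name): value for name, value in row.items()}

def standardise_missing(row, missing_codes=(996, 997)):
    """Step 3: map project-specific missing-data codes to None."""
    return {name: (None if value in missing_codes else value)
            for name, value in row.items()}

raw = {"q1a": 996, "q1b": 4}
labels = {"q1a": "child_age", "q1b": "household_size"}
clean = standardise_missing(add_labels(raw, labels))
# clean == {"child_age": None, "household_size": 4}
```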
13. Part 2: Preparing for Life collection
Northside Partnership and UCD Geary Institute
14. The PFL programme
• Operated by Northside Partnership in Dublin
• Objective: improve levels of school readiness of young children from several
disadvantaged areas in North Dublin
• Intervention: pregnancy to age 4 (school entry)
• Bottom up development
• Home visiting programme: PFL families receive a range of supports including:
• Assigned a mentor who visits home regularly
• Tip sheets – easy to read info on child development
• Parent training course (Triple P Parenting Programme)
15. The PFL evaluation
• UCD Geary Institute (Dr. Orla Doyle)
• 2008 – 2015
• 7 waves of data collection:
• prenatal baseline, 6 months, 12 months, 18 months, 24 months, 36 months, 48 months
• Mixed methods approach
• Impact evaluation, implementation analysis, process evaluation, direct observation of
children using British Ability Scale at school entry
• Randomisation into low support and high support, plus a comparison
community
16. The archived PFL data
• Approx. 300 cases – repeat data collection over 7 points
• Series of standardised measures and indices used – some at every point and
some at key points
Other data:
• At four years of age direct observation using BAS
• Connection to the CPSE – school readiness assessed by child’s teacher and by
primary caregiver using the Early Development Instrument (S-EDI)
• Qualitative data will be prepared next for IQDA
• Focus groups with primary caregivers
• Focus groups and interviews with fathers
• Interviews with PFL staff (e.g. mentors)
17. Impact evaluation data
7 waves of measures and scales
from CAPI survey with primary carer
High, low and non-support
measured
Long list of standardised scales used
Archived in ISSDA
BAS at school entry
Direct assessment of 4yr olds by
researchers using BAS
High versus low treatment groups
Archived in ISSDA
Process evaluation data: parents
Qualitative interviews with
programme staff
Focus groups with mothers
Focus groups with fathers
Will be archived in IQDA
Qualitative interviews with
children at 4yrs of age
Children’s drawings and follow up
qualitative interview
Will be archived in IQDA
Profile at school entry
General sample of children from
the local area at school entry
Compared with
High and low treatment PFL
children at school entry
2008 – 2015 (8 years)
Archived in ISSDA
PFL: something for every researcher
18. What’s inside PFL: Domains
EIGHT DOMAINS
1. Child development
2. Child health
3. Parenting
4. Home environment
5. Maternal health and wellbeing
6. Social support
7. Childcare and service use
8. Household factors and socioeconomic status
19. What’s inside PFL: A small sample of the
standardised instruments used
• Ages & Stages Questionnaire (ASQ)
• Developmental Profile-3 (DP 3)
• Achenbach Child Behaviour Checklist
(CBCL)
• Strengths and Difficulties Questionnaire
(SDQ)
• Children’s Sleep Habits Questionnaire
(CSHQ)
• Parenting Daily Hassles Scale (PDH)
• Parenting Stress Index (PSI)
• Parenting Styles and Dimensions
Questionnaire (PSDQ)
• Home Learning Environment (HLE)
• Framingham Safety Survey (FSS)
• Rosenberg Self-esteem Scale (RSE)
• Edinburgh Postnatal Depression (EPD) Scale
In the archived dataset the scale acronym (in caps) is used to distinguish the
standardised measures – to allow cross-dataset tracking
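One practical consequence of this convention: a user can pull out every item belonging to a given standardised measure with a simple prefix match on the capitalised acronym. The variable names below are invented for illustration:

```python
variables = ["SDQ_1", "SDQ_2", "age_child", "CBCL_total", "sdq_note"]

# Scale acronyms are upper-case, so a case-sensitive prefix match
# isolates the items of one measure without touching other variables.
sdq_items = [name for name in variables if name.startswith("SDQ_")]
# sdq_items == ["SDQ_1", "SDQ_2"]
```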
20. What’s inside PFL: Some other topics
examined – a small sample
• Prenatal intention to breastfeed followed by breastfeeding
practice
• Maternal nutrition and child nutrition
• Social support – partner to neighbours
• Participation, voting
• Substance use and changes during pregnancy
• Children’s screen time
• Children’s sleeping habits….
…to name a few
21. Part 3: Children’s Profile at School Entry
collection
Northside Partnership and UCD Geary Institute
22. Purpose of CPSE study – connection to PFL
• Conducted by PFL evaluation team from 2008 – 2015
• Annual survey of levels of school readiness of Junior Infant
children at entry to Primary School
• Provided a benchmark from the general population in the PFL
catchment area against which PFL children could be compared
• 8 waves of data collection (Oct-Dec every year)
• PFL children included in the sample from Wave 5 (sample reached
school age from 2012 onwards) – PFL cases are listed in the data
23. The CPSE study design
• Teacher survey
• Demographic questions about self
• School readiness using SEDI measure
• School readiness using PFL (and subjective) measures
• Parent survey
• Socio-demographic info about self/household
• Mental health of primary carer
• Parenting
• School readiness using SEDI measure
• School readiness using PFL (and subjective) measures
24. The archived CPSE data
• 8 waves (years) of data
• Approx. 980 cases – data collected once per case but from two sources
• Set of standardised measures and indices used
• Connection to the PFL data – the child’s PFL ID is included to allow for
cross-dataset analysis
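Because the child's PFL ID travels with the CPSE records, the two archived datasets can be joined case by case. A sketch of that link, with invented field names (the archived variable names will differ); CPSE children from the general population are kept but left unlinked:

```python
def link_datasets(cpse_rows, pfl_rows):
    """Attach PFL information to CPSE records via the shared PFL ID."""
    pfl_by_id = {row["pfl_id"]: row for row in pfl_rows}
    linked = []
    for row in cpse_rows:
        match = pfl_by_id.get(row.get("pfl_id"))
        linked.append({**row, "pfl_group": match["group"] if match else None})
    return linked

cpse = [{"child": "A", "pfl_id": 7},      # a PFL child, included from Wave 5
        {"child": "B", "pfl_id": None}]   # general-population child, no PFL ID
pfl = [{"pfl_id": 7, "group": "high"}]
linked = link_datasets(cpse, pfl)
# linked[0]["pfl_group"] == "high"; linked[1]["pfl_group"] is None
```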
25. What’s inside CPSE
DOMAINS
• Household factors and
socioeconomic status
• Indications of school
readiness
• Parenting
• Caregiver (maternal)
health and wellbeing
• Teacher demographics
MEASURES
• Early Development Instrument
about child (S-EDI)
• Mental well-being assessed using
WHO-5
• Center for Epidemiologic Studies
Depression Scale (CES-D)
• Parenting Styles and Dimensions
Questionnaire (PSDQ)
26. Part 4: Early Childhood Care and Education
collection
Tallaght West Childhood Development Initiative
Centre for Social and Educational Research, Dublin Institute of Technology
and Institute of Education, University of London
27. The Early Years programme
• 2-year programme targeted at children and their families in
Tallaght West
• Targeted all families in Tallaght West but especially children who
face barriers to educational achievement and well-being
• The CDI Early Years Programme explicitly sought to improve
children’s ability to learn and to make them more ready for
school, both socially and cognitively (school readiness)
• Intervention delivered via the early years service: low-cost
childcare with HighScope principles, minimum staff qualifications,
longer working week, lower child to staff ratio, access to
services…
28. The Early Years evaluation
• A cluster randomised trial: Early Years services were randomly
allocated to the intervention or the control
• Two cohorts x three data collection points
• Cohort 1
• Baseline: Autumn 2008
• Mid-phase (end of first year): Summer 2009
• End phase (end of second year): Summer 2010
• Cohort 2
• Baseline Autumn 2009
• Mid-phase (end of first year): Summer 2010
• End phase (end of second year): Summer 2011
29. The Early Years evaluation
Child level
assessment
Collected at Baseline, Mid-Phase
and End phase
Standardised child assessments
taken over time measuring
cognitive, verbal, spatial
development, and emotional and
pro-social development
Instrument: direct assessment of
children by researcher (similar
method to PFL BAS data)
Suite of standardised scales
Parent level
assessment
Collected at Baseline and End
phase
Child social/behavioural profiles
and assessment of child’s
development. Other measures of
parenting and home learning
environment collected.
Instrument: survey with primary
caregiver (similar method to PFL)
Mix of household demographics
and standardised scales
Care setting level
assessment
Collected once during study
Quality assessments of the Early Years
services of the care setting
environment using ECERS-R and
ECERS-E and CIS
Assessment conducted by specially
trained researchers
Used a specific scale that had been
designed by one of the co-investigators
30. What’s inside ECCE - overview
Child level measures
• British Ability Scales 2nd Edition [BAS II]
• Rhyme and Alliteration
• Lower letter recognition
• Adaptive Social Behaviour Inventory [ASBI]
• Strengths and Difficulties Questionnaire
[SDQ]
Parent level measures
• Strengths and Difficulties Questionnaire
(SDQ) – measured child
• Parent Stress Scale
• Home Learning Environment Index
Service level measures
• Early Childhood Environmental Rating
Scale (ECERS-R)
• Early Childhood Environmental Rating
Scale - Extension (ECERS-E)
• Arnett Caregiver Interaction Scale (CIS)
31. Part 5: What do you get in a curated PEII
collection?
Curated by the Children’s Research Network in 2016 - 2017
32. Description of data file
• Delivered in .sav but should be usable across a range of formats
• Variables are named and labelled to match survey as closely as
possible – for ease of reference
• Anonymised and derived variables content clearly labelled and
info given in codebook
• Missing data in categorical variables marked with standardised
value codes (996, 997 etc)
• All cases included in multi-wave studies to allow merging (so
missing cases included)
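As a sketch of how a reuser might handle these conventions before analysis, the standardised missing-value codes can be recoded to true missing values; the variable names and the exact code meanings below are illustrative, not taken from the archive:

```python
# Hypothetical sketch: recode standardised missing-value codes
# (e.g. 996, 997) to None before analysis. Field names are invented.

MISSING_CODES = {996, 997, 998, 999}

def recode_missing(record):
    """Return a copy of a case record with standardised missing codes set to None."""
    return {var: (None if val in MISSING_CODES else val)
            for var, val in record.items()}

# Example case from a hypothetical wave file
case = {"case_id": 101, "sdq_total": 12, "parent_stress": 997}
clean = recode_missing(case)
# clean["parent_stress"] is now None; other values are unchanged
```

Because every case is retained in each wave's file, wave files recoded this way can then be joined on the case identifier without losing participants who skipped a wave.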
33. Description of codebook
Variables are listed chronologically as they appear in the archived data file. For each
variable the following information is provided:
• Variable name
• Position
• Label
• Type
• Code values are listed for nominal variables
• Missing values including “don't know”, “not applicable” and “refuse”
• Some basic frequencies
Each codebook concludes with a chronological index of variables in the appendix. The name and label for each variable is provided in this index. Variables are also categorised into the eight major domains of the study for quick reference, and in some cases additional information is provided to help the user find groups of variables of particular interest.
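The fields listed above can be illustrated with a minimal, hypothetical codebook entry built from raw values; all names, labels and value codes here are invented for the example:

```python
from collections import Counter

def codebook_entry(name, position, label, var_type, values, value_labels):
    """Build a minimal codebook-style entry: name, position, label, type,
    code values, and basic frequencies, as described for the PEII codebooks."""
    return {
        "name": name,
        "position": position,
        "label": label,
        "type": var_type,
        "codes": value_labels,
        "frequencies": dict(Counter(values)),
    }

# Hypothetical nominal variable with a standardised missing code (997 = refused)
entry = codebook_entry(
    name="sdq_band", position=42, label="SDQ difficulties band",
    var_type="nominal",
    values=[1, 1, 2, 3, 997],
    value_labels={1: "normal", 2: "borderline", 3: "abnormal", 997: "refused"},
)
# entry["frequencies"] == {1: 2, 2: 1, 3: 1, 997: 1}
```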
34. Description of survey
• A copy of the survey used to collect the data is provided in PDF
• Scale materials that cannot be shared due to copyright restrictions are redacted but cited
• Users should refer to the survey for the full text of each original question (it is not always possible to fit the full text into a variable label)
35. Ruth Geraghty, Data Curator and Researcher
rgeraghty.crn@effectiveservices.org
Questions?
Editor's Notes
Operated by the Northside Partnership in Dublin
Objective: improve levels of school readiness of young children from several disadvantaged areas in North Dublin
Intervening during pregnancy and working with families until the children start school
Developed by 28 local agencies and community groups (bottom up) over 5 years; tailored to meet the needs of the local community and grounded in empirical evidence
home visiting programme
assisting parents in developing skills to help prepare their children for school
PFL families receive a range of supports
Families in the high treatment group were assigned a mentor who visited their home fortnightly or monthly, for between 30 minutes and two hours, from pregnancy (about 21 weeks)
parents make informed decisions
Tip Sheets, which were colourful representations of information related to child development presented in a clear, concise manner and were developed by PFL staff based on available information from local organisations such as the Health Service Executive, the Department of Health and Department of Children and Youth Affairs, and Barnardos Children’s Charity
Triple P aims to improve positive parenting through the use of videos, vignettes, role play, and tip sheets in a group-based setting for eight consecutive weeks
The PFL programme was evaluated between 2008 and 2015 by the UCD Geary Institute at University College Dublin, Ireland to provide evidence on the effectiveness of the PFL programme to positively impact on parent and child outcomes
The Children’s Profile at School Entry (CPSE) study was conducted between 2008 and 2015 as part of the wider Preparing for Life intervention by the Northside Partnership and its evaluation by the UCD Geary Institute at University College Dublin
The CPSE study took place in parallel to the PFL evaluation, and provided an annual, representative survey of the levels of school readiness of all Junior Infant children in the PFL catchment area – both those participating in PFL and the general population not participating in PFL. The annual survey 1) indicated the general level of school readiness of children attending schools in the PFL catchment area, and 2) indicated whether the PFL programme was generating positive effects. (Junior Infants is the entry-level class for primary education in the Republic of Ireland; children commence Junior Infants at approximately 4 to 5 years of age.)
Data files
cleaned, labelled in full
Anonymised
Derived variables labelled as such
Missing data marked in file
Checked to ensure waves can be merged
Data files are usually restricted to approved users – access given under certain conditions
Codebooks
Notes contain supplementary information
Some basic frequencies
This is openly available material