The transformational shift in educational outcomes in London 2003 to 2013: the
contribution of local authorities
Sean Hayes and Robert Cassen
ABSTRACT
Introduction
The paper explores the transformational shift in educational outcomes in London between
2003 and 2013. London’s schools have improved rapidly over the past decade, with primary
and secondary schools now out-performing the rest of the country, at Key Stages 2 and 4,
respectively. Improvements in many London boroughs have been staggering. England is
now one of only a small number of countries in the developed world to have its capital city,
London, outperforming the rest of the nation.
Focus of the enquiry
Many reasons have been put forward for the transformational shift. National education
policy over the last decade has affected London as much as anywhere: investment in
facilities, the growth of academies, changes to the national curriculum and testing,
developments in teacher training, school accountability and the changing relationship
between central and local government have all played some part in shaping the educational
outcomes of London's schools. The impact of the Department for Education's London
Challenge, which ran from 2003 to 2010, should not be underestimated. There is little doubt
that the London Challenge was an important lever in raising standards; however, this paper
argues that there are many other reasons behind London's success, and it explores the role
of local authorities and their perspective on, and contribution to, London's educational success.
Research methods and mapping of the literature
The research report reviews and critiques the current literature on the reasons put
forward for London's success. The main research method is a survey of London Local
Authority education research and statistics officers, supported by feedback from a workshop
organised by the London Education Research Network (LERN), and a series of interviews
with London Local Authority Directors of Education or Children's Services and their school
improvement officers.
Analytical framework
The research includes a quantitative analysis of the survey responses and a qualitative
narrative based on one author's (Hayes's) lived experience of working in two London
Local Authorities (Hammersmith & Fulham and Greenwich), the views expressed by
individual local authority research and statistics staff in the workshop, and the interviews
with local authority officers.
Research findings
The research shows that there are many reasons behind the transformational shift in
educational outcomes in London, and that local authorities were an important part of that
process of change, contributing directly to its success.
Introduction and background
The paper explores the transformational shift in educational outcomes in London between
2003 and 2013. London’s schools have improved rapidly over the past decade, with primary
and secondary schools now out-performing the rest of the country, at Key Stages 2 and 4,
respectively. Improvements in many London boroughs have been staggering, with London
first outperforming national at Key Stage 2 in 2009 and at Key Stage 4 in 2004. However, it
was not always thus and the picture of education and educational outcomes in London was
historically a mixed one. Radford [i] reminds us that state education in inner London was
delivered through the Inner London Education Authority (ILEA) from 1964 until April 1990,
when the ILEA was abolished by the Conservative Government through the Education
Reform Act 1988. This ended the unitary system of education that had existed in
inner London for over a hundred years. On reflection now and in comparison to educational
outcomes in London in 2013, the results achieved by schools in the ILEA in the 1980s do not
look that good, even though they were robustly defended at the time. Frances Morrell,
writing in 1983 in the book 'Education and the City' [ii], reported that:
“When we return to a review of educational achievement [in the ILEA’s schools] as
expressed in external examination results, it is clear that in the face of the
disadvantages already outlined, and in the face of considerable institutional fluidity
arising out of many factors, these results are commendable.”
Peter Newsam [2], an ILEA Education Officer around the same time, writing about the 1979
and 1980 examination results in the ILEA, remarked that:
“If examination results are to be taken as the test, the suggestion that results in ILEA
secondary schools have fallen, though frequently made, is unsubstantiated.”
The educational outcomes reported by Radford, based on data published by the ILEA [iii]
on performance by ethnicity and gender, show that in 1981 only 13.9% of White British boys
achieved 5+ CSE Grade 1 or O Levels at Grade A – C, while the most recent data available
on Black Caribbean boys in the same report shows that in 1979 only 10.0% of them
achieved the same benchmark. Judged from the vantage point of 2014, those outcomes
were not very good at all, no matter how acceptable or even commendable they were held
to be over 30 years ago. A Sheffield University report by Gray and Jesson [iv] on local
authority examination performance in the inner cities placed the ILEA
about 'par for the course' in relation to other similar local authority areas in England. But it is
clear from the failure of many subsequent Conservative and New Labour initiatives in the
1990s and early 2000s to raise standards in inner city schools that the problem is complex.
The 1988 Education Reform Act [v], which devoted a lot of its legislative content to abolishing
the ILEA, also led to the introduction of the national curriculum and a new assessment
framework in England at Key Stages 1 to 5. The closure of the ILEA in 1990 meant
that it never benefitted from any of the advantages that the national curriculum brought to
England's schools thereafter. The demise of the ILEA is much lamented by many who
worked in it and in its schools, and Radford concluded in his thesis that:
“The evidence does not sustain the claims made against the ILEA [that it tolerated low
standards in education and failed to give value for money] and that therefore, its
demise can better be explained by the polarisation of politics at the time.”
Educational standards in the majority of the 12 inner London local education authorities,
which were created following the closure of the ILEA in 1990, did not immediately begin to
rise in the early part of that decade. In 1995 one of this paper's authors (Hayes),
who was working in Hammersmith & Fulham local authority at the time, remembers three
secondary schools in the borough having fewer than 10% of students achieving 5+ A* - C
GCSEs [vi], against a national average at the time of 43.5%. By 1998 only nine London local
authorities were performing above the national average for the percentage of students
achieving 5+ A* - C GCSEs (Inc. English & maths) and all of them were outer London local
authorities. Table 1 shows the GCSE results for the percentage of students achieving 5+ A*
- C GCSEs (inc. English & maths) in 1998, 2003, 2008 and 2013 for the 12 London local
authorities that had made up the ILEA.
Table 1: GCSE performance in 1998, 2003, 2008 and 2013 in the 12 London LAs that
made up the ILEA
Local Authority (all ex-ILEA)   Designation   % 5+ A* - C (inc. Eng & maths)
                                              1998   2003   2008   2013
Camden Inner 34.6 40.1 45.3 60.4
Greenwich Inner 23.3 26.4 39.5 65.4
Hackney Inner 16.9 26.5 42.4 61.2
Hammersmith & Fulham Inner 35.0 42.6 55.9 66.5
Islington Inner 15.6 22.5 38.3 63.5
Kensington & Chelsea Inner 29.5 45.1 58.1 80.2
Lambeth Inner 19.7 30.1 46.9 65.9
Lewisham Inner 22.2 30.1 45.8 58.0
Southwark Inner 18.2 26.3 42.7 65.2
Tower Hamlets Inner 17.7 25.5 41.2 64.7
Wandsworth Inner 26.3 37.1 50.0 61.3
Westminster Inner 24.8 37.1 49.6 69.6
England 37.0 41.9 48.4 60.8
Number of LAs above England Average 0 2 4 10
Source: DfE Statistical First Releases (SFRs)
The data in Table 1 show that none of the 12 local authorities of the former ILEA had
reached the England average by 1998; only two had done so by 2003, and four by 2008.
However, by 2013, 10 of the 12 had exceeded the England average, while the two that had
not, Camden and Lewisham, were within 1 and 3 percentage points respectively of the England
average. By 2013 the London landscape, in terms of educational outcomes for young
people at 16 years old, had well and truly changed from the 1980s and the 1990s.
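The 'Number of LAs above England Average' row in Table 1 is a simple count, which can be reproduced from the published figures. The sketch below is purely illustrative (variable names are our own); the data are copied from Table 1.

```python
# Reproducing Table 1's bottom row: how many of the 12 ex-ILEA local
# authorities exceeded the England average in each year (DfE SFR figures).
results = {  # LA: (% 5+ A*-C inc. English & maths for 1998, 2003, 2008, 2013)
    "Camden": (34.6, 40.1, 45.3, 60.4),
    "Greenwich": (23.3, 26.4, 39.5, 65.4),
    "Hackney": (16.9, 26.5, 42.4, 61.2),
    "Hammersmith & Fulham": (35.0, 42.6, 55.9, 66.5),
    "Islington": (15.6, 22.5, 38.3, 63.5),
    "Kensington & Chelsea": (29.5, 45.1, 58.1, 80.2),
    "Lambeth": (19.7, 30.1, 46.9, 65.9),
    "Lewisham": (22.2, 30.1, 45.8, 58.0),
    "Southwark": (18.2, 26.3, 42.7, 65.2),
    "Tower Hamlets": (17.7, 25.5, 41.2, 64.7),
    "Wandsworth": (26.3, 37.1, 50.0, 61.3),
    "Westminster": (24.8, 37.1, 49.6, 69.6),
}
england = (37.0, 41.9, 48.4, 60.8)

# Count, for each of the four years, the LAs above the England average.
above = [sum(v[i] > england[i] for v in results.values()) for i in range(4)]
print(above)  # [0, 2, 4, 10]
```

The output matches the published row: none above the England average in 1998, two in 2003, four in 2008 and ten in 2013.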
An overview of educational performance in London up to 2013
This section provides an overview of educational performance in London, focusing on Key
Stage 2 and Key Stage 4 and highlighting when performance in London began to outstrip
the national average and where it had reached by 2013.
Key Stage 2
Key Stage 2 performance improved steadily year on year from 2005 to 2012, both in London
and nationally, with only a slight drop nationally in 2009. Chart 1 shows
the performance in terms of the percentage of children achieving Level 4+ in English &
Maths Combined from 2005 to 2012 [1], comparing London with the national average. The
graph shows London first outperforming national at Key Stage 2 in 2009 and then moving
further ahead of national year on year up to 2012.
[1] This series is shown up to 2012 because from 2013 it has not been possible to calculate an overall level in English:
from 2013 the outturns for English are reported separately as the Reading Test level and the Writing Teacher Assessment
level. The combined measure from 2013 is the % achieving Level 4+ in reading, writing (TA) and mathematics combined.
Chart 1: Key Stage 2 % Level 4+ in English & Maths Combined from 2005 to 2012
London v National
Source: DfE Statistical First Releases (SFRs) 2005 to 2012
Chart 2 shows the Key Stage 2 results for every London local authority in 2013 based on the
percentage of children achieving Level 4+ in Reading, Writing and Mathematics combined.
In 2013 performance in inner, outer and greater London was above national, with
performance in inner London now above that of outer London. Twenty-nine local authorities
in London performed above the national average in 2013, while only four were below:
Barking & Dagenham, Croydon, Haringey and Waltham Forest. Appendix 1 shows which
local authorities are in Inner and Outer London, based on the DfE’s designation in 2013.
Chart 2: Key Stage 2 Level 4+ in reading, writing & maths by London LA in 2013
Source: DfE Statistical First Release (SFR) 2013
Chart 3 shows Key Stage 2 performance in London and England by Free School Meal
(FSM) Eligibility in 2013 based on the percentage of children achieving Level 4+ in reading,
writing and mathematics combined. The performance of children eligible for free school
meals is much better in London than it is in the whole of England and within London
performance is higher in outer London than inner London.
Chart 3: Key Stage 2 Performance by FSM Eligibility in 2013 % Level 4+ in reading,
writing and mathematics
Source: DfE Statistical First Release (SFR) 2013
Table 2 shows the data from Chart 3 with the addition of the percentage-point attainment
gap between FSM-eligible and non-eligible children. The attainment gap in London is six
percentage points lower than the national gap, and the gap in inner London is nine points
lower than the national gap. Based on these outturns, primary schools in London, and particularly in
inner London appear to be very effective at maximising outcomes for their free school meal
children and at addressing longstanding attainment gaps.
Table 2: Key Stage 2 Performance by FSM Eligibility in 2013 % Level 4+ in reading,
writing and mathematics with the attainment gap
Region          FSM    Not FSM    All pupils    % point gap (FSM v Not FSM)
Inner London 73.0 83.0 79.0 10.0
Outer London 65.0 82.0 78.0 17.0
London 69.0 82.0 79.0 13.0
England 60.0 79.0 76.0 19.0
Source: DfE Statistical First Release (SFR) 2013
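The attainment-gap arithmetic in Table 2 (gap = not-FSM minus FSM) can be sketched in a few lines. The figures are those published in the SFR; the dictionary layout and function name are illustrative.

```python
# Illustrative sketch: reproducing the attainment gaps in Table 2 from the
# FSM / not-FSM outturns (% Level 4+ in reading, writing & maths, 2013).
ks2_level4 = {
    "Inner London": {"fsm": 73.0, "not_fsm": 83.0},
    "Outer London": {"fsm": 65.0, "not_fsm": 82.0},
    "London":       {"fsm": 69.0, "not_fsm": 82.0},
    "England":      {"fsm": 60.0, "not_fsm": 79.0},
}

def attainment_gap(region: str) -> float:
    """Percentage-point gap between not-FSM and FSM pupils in a region."""
    r = ks2_level4[region]
    return round(r["not_fsm"] - r["fsm"], 1)

for region in ks2_level4:
    print(region, attainment_gap(region))
```

Running this gives gaps of 10, 17, 13 and 19 points respectively, confirming that London's gap is six points below the national gap and inner London's nine points below.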
Chart 4 shows the proportion of primary schools that were below the national Key Stage 2
floor standard [2] in 2012 and 2013. Nationally, the proportion of primary schools below the
floor standard improved slightly in 2013, to 6.1% from 6.5% in 2012. There was a similar
improvement in London, but the point of real significance is that in 2013 only 2.7% of
London's primary schools were below the floor standard, less than half the national
percentage. Primary
schools in London are closing the social class attainment gap and also appear to be tackling
the long tail of underachievement, which for many decades has been associated with
educational performance in England, according to Marshall et al [vii], who suggest that: 'One
[2] In the Key Stage 2 tests for 2012/13 a school was below the floor standard if fewer than 60% of its children achieved
Level 4 or above in reading, writing and mathematics, and it was below the England median for progression by two or more
levels in reading, in writing and in mathematics.
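The floor-standard rule can be expressed as a simple predicate: a school is below the floor only if both the attainment condition and the progression condition hold. The sketch below is hypothetical; the field names are our own, and the real DfE calculation works from the underlying test and progression data rather than two pre-computed flags.

```python
# Hypothetical sketch of the 2012/13 Key Stage 2 floor-standard test:
# below the floor if under 60% of pupils reach Level 4+ in reading, writing
# and maths AND the school is below the England median for two-levels
# progress in all three subjects.
from dataclasses import dataclass

@dataclass
class SchoolKS2:
    pct_level4_rwm: float        # % achieving Level 4+ in reading, writing & maths
    below_median_progress: bool  # below England median progress in all three subjects

def below_floor_standard(school: SchoolKS2) -> bool:
    # Both conditions must hold; strong progress keeps a low-attaining
    # school above the floor, and vice versa.
    return school.pct_level4_rwm < 60.0 and school.below_median_progress

print(below_floor_standard(SchoolKS2(55.0, True)))   # True
print(below_floor_standard(SchoolKS2(55.0, False)))  # False
```

The conjunction matters: a school at 55% Level 4+ with at-or-above-median progress is not below the floor.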
child in five leaves schools in England without basic skills in literacy and numeracy [and that]
it has become increasingly common to refer to these children as the tail.’ The GCSE
performance data in Charts 7 and 8 of this paper would suggest that secondary schools in
London are also starting to tackle the long tail of underachievement successfully.
Chart 4: % of Schools below the Key Stage 2 Floor Standard in 2012 and 2013
Source: DfE Statistical First Releases (SFRs) 2012 and 2013
Key Stage 4
Key Stage 4 performance improved steadily year on year from 1998 to 2013, both in London
and nationally. Chart 5 shows performance in terms of the percentage of students achieving
5+ GCSE grades at A* - C (inc. English & maths). The
graph shows London first outperforming national at Key Stage 4 in 2004 and then moving
further ahead of national year on year up to 2013.
Chart 5: Key Stage 4 % achieving 5+ A* - C (Inc. English & maths) from 1998 to 2013
London v National
Source: DfE Key Stage 4 School Performance Tables 1998 to 2013
Chart 6 shows the performance of London local authorities in terms of the percentage of
students achieving 5+ GCSE grades at A* - C (inc. English & maths) at four points in time,
1998, 2003, 2008 and 2013, against the national performance at the same point in time. In
1998, 28 out of 32 London local authorities were below national on this measure and the
number dropped to 21 in 2003 and 16 in 2008. However, the most dramatic improvement
was in 2013, when the number of London local authorities below the national average
dropped to only six. Between 1998 and 2013, national performance on this measure
improved by 23.8 percentage points. Over the same period 31 out of 32 London local
authorities improved by more than 23.8 percentage points, with nine of them improving by
more than 40 percentage points.
Chart 6: London LAs GCSE Performance v England % 5+ A* - C (Incl. English &
maths) in 1998, 2003, 2008 and 2013
Source: DfE Statistical First Releases (SFRs) 1998, 2003, 2008 and 2013
The six local authorities with the lowest results in 1998 (Islington, Hackney, Tower Hamlets,
Southwark, Lambeth and Haringey) and therefore, those with the greatest distance to travel
to reach the national average, were among those who made the greatest improvements
between 1998 and 2013. By 2013, all six of them were above national average for the
percentage of students achieving 5+ GCSE grades at A* - C (inc. English & maths). Five of
these six local authorities (all but Haringey) were also ex-ILEA councils, and the other three
councils that made similarly large improvements between 1998 and 2013, Greenwich,
Westminster and Kensington & Chelsea, were also ex-ILEA councils.
Chart 7 shows Key Stage 4 performance in London and England comparing disadvantaged
pupils [3] with all others in 2013. The chart shows the performance of both groups and the
attainment gap, in inner and outer London as well as greater London and England. More
disadvantaged pupils in London achieve 5+ GCSE grades at A* - C (inc. English & maths)
than do so nationally and more disadvantaged pupils in inner London do so than in outer
London. In a similar manner to primary schools in London, secondary schools are also
closing the social class attainment gap.
[3] Disadvantaged pupils include all those pupils in the Key Stage 4 cohort who are eligible for free school meals and those
who are Looked After Children.
Chart 7: GCSE Performance and Attainment Gaps in London in 2013
Source: DfE School Performance Tables 2013
Chart 8 shows the proportion of secondary schools that were below the national Key Stage
4 floor standard [4] in 2012 and 2013. Nationally, the proportion of secondary schools below
the floor standard improved in 2013, down to 5.3% from 6.6% in 2012. There was a similar
improvement in London, but the point of real significance is that in 2013 only 1.2% of
London's secondary schools, equating to just five schools, were below the floor standard, a
small fraction of the national percentage. Just like primary schools in London, secondary
schools are also closing the social class attainment gap and appear to be tackling the long
tail of underachievement.
Chart 8: Key Stage 4 % of schools below the Floor Standard in 2013 London v
National
Source: DfE Statistical First Releases (SFRs) 2012 and 2013
[4] At Key Stage 4 in 2012/13 a school was below the floor standard if fewer than 40% of its pupils achieved 5+ GCSE
grades at A* - C (inc. English & maths) and it was below the England median for progression by three or more levels in GCSE
English and in GCSE mathematics.
Chart 9 shows the GCSE performance of the main ethnic groupings in terms of the
percentage of students achieving 5+ GCSE grades at A* - C (inc. English & maths). It
compares performance in inner, outer and greater London with national. The chart shows
that more pupils in each of the five ethnic groups achieves 5+ GCSE grades at A* - C (inc.
English & maths) in London compared to the same groups nationally. The performance of
White pupils more or less mirrors the performance of all pupils for each of the London and
national benchmarks, while the performance of mixed race, Asian and Chinese pupils
exceeds that of all pupils. The performance of Black pupils in London at 60.1% is just below
that of all pupils nationally, that is, within less than one percentage point of the national
average of 60.8%. The comparative performance for Black pupils in London back in
2005/06 was 35.0% compared with 44.0% for all pupils nationally, a gap of nine percentage
points. Secondary schools in London have made significant inroads in terms of closing the
attainment gaps between Black pupils and White pupils and Black pupils and all pupils
nationally. Greater proportions of Asian, Chinese and Mixed Race pupils in London are
achieving GCSE success compared to the same groups nationally and to all pupils
nationally.
Chart 9: Key Stage 4 Performance by Ethnicity in 2013 % 5+ A* - C (Including English
& maths)
Source: DfE Statistical First Release (SFR) 2013
The analyses of performance quoted above have all been taken from the Department for
Education’s publicly available data. There has been other analytical work done that shows
that London is outperforming national at GCSE. One example is by Chris Cook [viii], a
journalist at the Financial Times, who produced an analysis using the 2012 GCSE National
Pupil Dataset (NPD). Chart 10 shows one of the representations of Cook's analysis [5], which
plots pupils' average GCSE points against their deprivation ranking, split by English region.
This analysis shows that pupils in London are outperforming pupils in all other regions
regardless of their deprivation ranking. In fact, the more deprived pupils in London are
outperforming similar pupils in all other regions, to an even greater extent than less deprived
pupils.
[5] Cook's analysis is based on the following concepts: (1) the average GCSE points score (called the "FT score") is based on
attributing 8 points for an A* down to one for a G and adding up the score for English, maths and the pupil's three best other
subjects; (2) this is plotted against each pupil's IDACI (Income Deprivation Affecting Children Index) score, an index of
poverty which measures how poor the neighbourhood in which a child lives is; the lower the child's IDACI score, the more
deprived they are likely to be; (3) the analytical method uses regression analysis; (4) the outcomes are split by English region.
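The "FT score" described in footnote 5 is straightforward to compute. In this sketch the grade-to-points mapping is as Cook describes (A* = 8 down to G = 1), while the function signature and the example pupil are illustrative assumptions rather than Cook's actual code.

```python
# Hypothetical sketch of Cook's "FT score": points for English and maths
# plus the pupil's three best other GCSE grades.
GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def ft_score(english: str, maths: str, other_grades: list[str]) -> int:
    """English + maths + the three best of the pupil's other GCSE grades."""
    best_three = sorted((GRADE_POINTS[g] for g in other_grades), reverse=True)[:3]
    return GRADE_POINTS[english] + GRADE_POINTS[maths] + sum(best_three)

# Example pupil: B in English, C in maths; other subjects A, B, B, D, E.
print(ft_score("B", "C", ["A", "B", "B", "D", "E"]))  # 6 + 5 + (7 + 6 + 6) = 30
```

With five subjects counted at up to 8 points each, the score runs from 5 to 40; Cook then regresses this score against each pupil's IDACI deprivation ranking, by region.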
Chart 10: GCSE Performance in 2012 based on Average GCSE Points Score plotted
against pupils’ deprivation ranking and split by English Region
Source: Chris Cook at the Financial Times 2013 – http://blogs.ft.com/ftdata/author/christophercook/
Ofsted Inspection Outcomes
Chart 11 compares the Ofsted inspection profile of schools in London with the rest of
England as at 31 August 2013. In 2013, over 80% of London’s primary and secondary
schools were judged by Ofsted to be good or outstanding, while the comparable figures for
the rest of England’s schools were under 80% for primary and under 70% for secondary.
The better Ofsted profile of London’s schools compared to the rest of England reflects the
better educational outcomes at Key Stage 2 and Key Stage 4.
Chart 11: Ofsted Inspection Profile of Schools – London v National (as at 31/08/13)
Source: Ofsted 2013 and Merryn Hutchings [ix], Institute for Policy Studies in Education, London
Metropolitan University
What factors contributed to London's success? A review of the literature
Many individuals and organisations will want to claim that they played some part in the
transformational change in London's educational outcomes in the 10 years from 2003 to
2013. These range from the most obvious, the students themselves, teachers, school
leaders and governors, through local authority directors, school improvement officers and
elected members, the trade unions, educational researchers and performance data
analysts, the Department for Education and its initiatives such as the London Challenge,
and Ofsted with its rigorous inspection frameworks, to politicians of every hue, in particular
the Labour government from 1997 to 2010 and the Coalition government from 2010
onwards. There is also the strong notion that certain propitious circumstances particular to
London may have helped enable the transformational change: the resilience of the London
economy, London's much more ethnically and linguistically diverse population compared
with the rest of England, and the ability of London's teachers to meet effectively the
language, cultural and learning needs of that diverse pupil population.
In reality, it is likely that most of these individuals and organisations and London factors will
have played some part in improving educational outcomes and it will be near impossible to
disentangle which of these made the most impact, and difficult to assess which other factors
might have contributed to the success. This is one of the methodological challenges with
attempting to research such a complex phenomenon as changing outcomes in schools in a
city with over 8,000,000 inhabitants. As the Centre for London report, Lessons from
London's Schools, by Barrs et al [x] points out, there have been no randomised controlled
trials to measure the impact: "none of the major London reforms were planned with a concurrent
rigorous evaluative element or any randomised controlled trial (RCT) element." In practice, it
would have been almost impossible to put randomised controlled trials in place to measure
what made the most impact on educational outcomes in a city like London over a five to 10
year period.
In the period 2003 to 2010, the single biggest educational intervention in London was the
Department for Education’s London Challenge school improvement programme. The
London Challenge was established in 2003 to improve outcomes in low-performing
secondary schools in the capital, with primary schools included from 2008. The London
Challenge was led by two key players: Tim Brighouse, the ex-Birmingham Council Chief
Education Officer and London Schools’ Commissioner, and David Woods, the ex-Principal
National Challenge Adviser for England and Chief Adviser for London Schools. It used
independent, experienced education experts (London Challenge advisers), to identify need
and broker support for underperforming schools. The advisers were supported by a small
administrative team based in the DfE. The cost of the support and brokered services came
directly from the DfE and was spent as the advisers directed. The London Challenge had
four core elements:
• a consistent message of the pressing need to improve educational standards;
• programmes of support to local authorities, managed by experienced and credible
London Challenge advisers;
• a main focus on improving the quality of teaching and learning in schools;
• robust systems to track pupils' progress and the use of data to evaluate effectiveness.
The most significant evaluation of the London Challenge was carried out by Hutchings et al [xi]
in a report for the Department for Education, Evaluation of the City Challenge Programme,
which evaluated the City Challenge programme in London, Greater Manchester and the
Black Country, and included a retrospective review of the London Challenge from 2003 to 2008.
Another feature of the London Challenge was that it worked with local authority school
improvement advisers, and the evaluation by Hutchings et al [xii] acknowledges that: "To be
effective, capacity building with local authorities has to involve working as partners". Many
local authority advisers reported that there was effective partnership and that this benefitted
them and schools. There was some evidence that working with the Challenge advisers had
resulted in some changes to the way that local authorities conducted school reviews; that
is, they became more focused on teaching and learning. Several LA interviewees talked
about the key role that Challenge advisers had played, both in swelling the number of people
about the key role that Challenge advisors had played, both in swelling the number of people
working on school improvement in the borough, and in developing the expertise of the local
authority school improvement team. However, in other LAs the relationship was more
limited, partly because the Challenge was focusing its work on those local authorities whose
schools were most in need of improvement and in some cases because there were barriers
to partnership working, for example poor communication on the part of City Challenge [xiii].
Some local authority officers also felt that City Challenge did not recognise the work that the
local authorities had been doing in their schools over an extended period, and were claiming
credit for improved results which also related to previous groundwork they had undertaken.
One local authority officer argued, 'they're very much airbrushing out the contribution made
by local authorities to the success of the Challenge' [xiv].
This is the essence of the tension that this paper explores. It is the view of the authors that,
yes, the London Challenge did play a part in the transformational shift in educational
outcomes in London, but it was not the only factor that contributed to the change. The
Challenge provided a significant catalyst for many London local authorities
and their schools to embark on a journey of rapid improvement but it did not achieve the
successful educational outcomes on its own. It probably worked best when it worked in
effective and collaborative partnerships with local authority school improvement teams and
together they played an important part in improving educational standards. The view of Tim
Brighouse [xv] is that the London Challenge played a large part and that: 'it made more good
things happen and fewer bad things happen'. The main conclusion drawn by Hutchings et al [xvi]
was the following:
“Perhaps the most effective aspect of City Challenge was that it recognised that
individuals and school communities tend to thrive when they feel trusted, supported
and encouraged. The ethos of the programme, in which successes were celebrated
and it was recognised that if teachers are to inspire pupils they themselves need to be
motivated and inspired, was a key factor in its success.”
It is not unreasonable to read into this that the City Challenges, including the London
Challenge, were most effective when all parties worked together, including schools,
Challenge advisers and local authority school improvement staff.
In 2013 Ofsted produced a report [xvii] which summarised the previous evaluation by
Hutchings et al and also reviewed the sustainability of the school improvement that took
place. One can conclude from the Hutchings evaluation, the Ofsted report and the evidence
provided in this research on educational outcomes in London that the three objectives of the
London Challenge:
• to reduce the number of underperforming schools;
• to increase the number of good and outstanding schools;
• to improve educational outcomes for disadvantaged children;
were achieved during the lifetime of the programme, and that the successes were sustained
in the years immediately following it. However, the question still persists: was the London
Challenge responsible?
The London Challenge was not active in every London borough; in its first phase it targeted
particularly intensive support at five key boroughs (Southwark, Lambeth, Hackney, Islington
and Haringey). Many local authorities improved at different times and from different starting
points, while some, as we have seen, improved much more than others. There was
significant variation in the improvement trajectories across London: some local authorities
reached a plateau earlier and have not improved as much as others that exceeded the
national average at GCSE for the first time more recently, such as Greenwich in 2012 and
Islington in 2013.
There is a strong feeling that many London local authorities just got on with it and largely set
about raising standards by themselves, that is, working locally to improve the educational
outcomes in their schools. One local authority that documented this was Tower Hamlets.
Chart 6 of this paper illustrated that the six local authorities with the lowest GCSE
results in 1998 had been among those that made the greatest improvements by 2013.
Tower Hamlets was one of those six, and in 2013 the local authority decided to tell its
own version of how it improved educational outcomes and effectively achieved all three of the
objectives of the London Challenge. The report, 'Transforming Education for All: the Tower
Hamlets Story' [xviii], was written by Woods et al. at the Institute of Education, the same David
Woods who led the London Challenge with Tim Brighouse.
The report on Tower Hamlets provides evidence of a local authority which had a very clear
strategy for securing improvements in educational standards and which achieved them
largely through its own efforts, although Tower Hamlets council leaders also engaged fully
with the London Challenge. Christine Gilbert and Kevan Collins were in post as Director of
Education and Children’s Services and subsequently as Chief Executive for the majority of
the life of the London Challenge, and Collins makes the point in a chapter entitled 'An East
End Tale' in the book 'The Tail', edited by Paul Marshall (2013) [xix], that:
“Tower Hamlets never saw London Challenge as a threat to its leadership and
embraced the approach with many of the Borough’s Headteachers given key roles and
rightly asked to share their work and support others. The strategy thus played to the
strong local traditions of collaborative partnership working”.
Tower Hamlets council worked with the London Challenge but it also had a very clear idea of
its own role in raising educational standards in the borough and the importance of doing that
through effective collaborative partnership working.
The Tower Hamlets report [xx] identified six major factors which the council believes
explained its experience and successful approach:
 Shared values and beliefs with robust and resilient purpose and professional will.
‘Yes we can…’;
 Highly effective and ambitious leadership at all levels – Local Authority and school
leadership;
 Schools rising to the standards challenge – improved teaching and learning,
enhanced Continuing Professional Development, rigorous pupil tracking and
assessment, a relentless focus on school improvement;
 Partnership working – inward and outward facing, external and integrated
services, shared responsibility and accountability;
 Community development – building collaborative capacity and community
cohesion;
 A professional learning community – building momentum and engagement
through and across school communities, high levels of knowledge, trust and
professional relationships.
Several of these factors align with the London Challenge, but some are more specific to the
strategy adopted locally by Tower Hamlets, including the robust and resilient purpose and
professional will; rigorous pupil tracking and assessment; partnership working; and, probably
most importantly, community development: that building of collaborative capacity and
community cohesion so important in a borough as diverse as Tower Hamlets.
A recent study by Greaves et al., looking specifically at the results for disadvantaged pupils,
also contests the claims made for the London Challenge. It attributes an important part of
disadvantaged pupils' improved performance to better primary school results between 1999
and 2003 [xxi]. They argue this was a major factor in the improvement of their Key Stage 4
performance between 2004 and 2008; after subtracting for this prior attainment, the effects
of other changes in the early 2000s are greatly reduced. In other words, an important part of
the London improvement, for disadvantaged students at least, is due to factors preceding
the London Challenge, including the roll out of the National Literacy and Numeracy
Strategies nationwide from 1998–99 onwards. Greaves et al. [xxii] asked what caused the
improvement in Key Stage 2 test scores that led to the 'London effect' at Key Stage 4 and
suggested that the answer is not clear. However, they found that the explanation is likely to be
related to changes in London's primary schools in the late 1990s and early 2000s. They conclude
that:
“This means that programmes and initiatives such as the London Challenge, the
Academies Programme, Teach First or differences in resources are unlikely to be the
major explanation (as these changes either happened too late, were focused on
secondary schools or were longstanding, and therefore are unlikely to account for the
rapid improvements we see).”
But the authors do further argue that GCSE success also has to do with secondary schools;
in other parts of the country, good results for the disadvantaged at Key Stage 2 were not
continued into Key Stage 4 as impressively as they were in London. Tower Hamlets, for
example, had only a very small social class attainment gap between disadvantaged pupils
and all others at Key Stage 4 in 2013. Table 3 shows that the national gap between
disadvantaged and all other pupils was 26.9 percentage points, whereas the gap in Tower
Hamlets was only 7.5 percentage points. Not only that, results for all pupils in Tower
Hamlets were better than the national average, and 62.9% of disadvantaged pupils in Tower
Hamlets achieved 5+ A*-C (incl. English and maths) GCSEs, compared with a national
average for disadvantaged pupils of 40.9%. This is indeed some transformation from the
position in 1997, when, as the report says [xxiii], "in 1997 the Borough had been positioned
149th out of 149 local education authorities in terms of its performance."
Table 3: Comparison of the performance of disadvantaged and all other pupils based
on the % 5+ A*-C (incl. English and maths) GCSEs in 2013 – Tower Hamlets v National

The attainment gap at Key Stage 4: % 5+ A*-C (incl. English and maths) GCSEs in 2013

                                   All pupils   Disadvantaged pupils   Other pupils   Gap (% points)
England (state funded only)           60.6%            40.9%              67.8%            26.9
Tower Hamlets                         64.7%            62.9%              70.4%             7.5
Difference v England (% points)        4.1             22.0                2.6            -19.4

Source: DfE School Performance Tables 2013
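The gap figures in Table 3 are simple differences between the 'other pupils' and 'disadvantaged pupils' percentages. A minimal sketch (illustrative only, using the published 2013 percentages; not part of the original analysis) reproduces the arithmetic:

```python
# Illustrative sketch: the attainment gaps in Table 3 are differences between
# the 'other pupils' and 'disadvantaged pupils' percentages. Figures are the
# 2013 values quoted from the DfE School Performance Tables.
results = {
    # area: (% disadvantaged pupils, % other pupils) achieving
    # 5+ A*-C GCSEs including English and maths, 2013
    "England (state funded only)": (40.9, 67.8),
    "Tower Hamlets": (62.9, 70.4),
}

def attainment_gap(disadvantaged: float, other: float) -> float:
    """Gap in percentage points between other and disadvantaged pupils."""
    return round(other - disadvantaged, 1)

gaps = {area: attainment_gap(d, o) for area, (d, o) in results.items()}
# England: 26.9 percentage points; Tower Hamlets: 7.5 percentage points
```

The 19.4-point difference between the two gaps is what makes Tower Hamlets stand out nationally.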
The report by Barrs et al. [xxiv] also referenced the approaches to school improvement being
taken in Haringey and Hackney councils and said that they had 'a similar theory of action as
Tower Hamlets', based on:
 ensuring first-rate leadership of the school improvement service;
 a tough approach to the performance management of headteachers;
 a strong emphasis on the use of data;
 effective professional development both for leaders and class teachers.
There were clearly factors in common among the local authorities whose schools achieved
some of the greatest improvements, including excellent leadership, a focus on school
improvement, improved teaching and learning, pupil tracking and the use of performance
data, continuing professional development, partnership working and community
development.
Another constituency that undoubtedly contributed to London's educational success story
was the students themselves. In Greenwich, for example, Hayes et al. [xxv] carried out
quantitative research into performance by ethnic group and looked at the possibility that
White UK boys from low-income households might be becoming the group at greatest risk of
underperformance. The research found that this was the case, and further qualitative
research was carried out by Hayes et al. [xxvi] which moved beyond the negative paradigm to
investigate why some students from that background succeeded against the odds. What
emerged from the qualitative research was that the successful students from a deprived
White British background had developed a range of approaches and strategies to help them
succeed. These included a degree of ambition for their success and a level of resilience
that they developed for themselves, often leading to a capacity for self-regulation, especially
when it came to organising their own learning and even changing their friendship groups.
Research by Siraj-Blatchford [xxvii] found similar reasons why some children from deprived
backgrounds, both White and minority ethnic, were able to succeed against the odds. While
the capacity for children and young people to demonstrate personal resilience and to
succeed against the odds is not unique to London, the fact that the social class attainment
gap is smaller in London at Key Stage 2 and Key Stage 4 than elsewhere in England,
suggests that London’s students have played their part in the capital’s success.
It is sometimes claimed that academies have played a significant part in the story of
London's improving educational performance. While there is some evidence for rising
outcomes under the academies programme up to 2008/09 [xxviii], another study found that, at
least as measured by attainment at the end of primary school, the benefits of academies
were entirely concentrated among students of medium to high prior attainment [xxix].
Academies by this measure did little to raise the outcomes for disadvantaged students,
which has been such a feature of the improvements in London. At the time of writing there
were no reliable evaluations of academies' performance in the period after 2008/09.
Analysis of survey responses
In order to get the local authority perspective on the transformation in London’s educational
outcomes a survey was devised and administered to local authority research and statistics
officers to get their assessment of which factors they felt had the greatest impact. The
survey was administered during a workshop in March 2014, organised by the London
Education Research Network (LERN) and held at the Greater London Authority, City Hall
building.
The survey had 82 factors and respondents were asked to rate each one on a scale of 0 to
4, with 0 being no impact and 4 being major impact. A table at Appendix 2 lists the number
of valid responses and the average score for the perceived impact of each factor.
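The computation behind those averages is straightforward; a minimal sketch (with hypothetical ratings, not the actual survey responses) shows how a per-factor average and valid-response count could be derived when some respondents leave a factor blank:

```python
# Illustrative sketch (not the authors' code): computing the per-factor
# summaries reported in Appendix 2. Each respondent rates a factor 0-4,
# or leaves it blank (None); blanks are excluded from the average.
from statistics import mean

def factor_summary(ratings):
    """Return (number of valid responses, average score) for one factor."""
    valid = [r for r in ratings if r is not None]
    if not valid:
        return 0, None
    return len(valid), round(mean(valid), 2)

# Hypothetical responses to one factor from five respondents
n, avg = factor_summary([4, 3, None, 4, 4])
# n == 4 valid responses, avg == 3.75
```

Excluding blanks rather than treating them as zero is why the "number of responses" column varies between factors in the tables that follow.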
In total, 21 survey responses were received: 19 from local authority research and statistics
staff, one from a retired local authority school improvement lead and one from a data officer
who works in an educational charity supplying performance data to London schools. The 20
responses from local authorities covered a total of 14 London local authorities, which is
42.4% of the 33 local authorities in London. Table 4 shows the number of responses from
each local authority. There were multiple responses from four local authorities.
Table 4: Number of survey responses by local authority
Local Authority
Number of
respondents
Ealing 1
Enfield 1
Greenwich 3
Hammersmith & Fulham 1
Haringey 2
Hounslow 2
Islington 3
Kensington & Chelsea 1
Lambeth 1
Lewisham 1
Newham 1
Southwark 1
Tower Hamlets 1
Waltham Forest 1
Other (Non-LA) 1
Total 21
Source: Survey of local authority staff
Table 5 shows the number of survey responses by gender. There was quite an even
gender split among respondents.
Table 5: Number of survey responses by gender
Gender
Number of
respondents
% of
respondents
Female 11 52.4%
Male 10 47.6%
Total 21 100.0%
Source: Survey of local authority staff
Table 6 shows the length of time respondents have worked in their current or most recent
Local Authority. The majority of respondents, 57.1%, had been in their local authority for
between five and 20 years. Two of the three respondents who have been in their current
local authority for less than one year had substantial earlier experience in other London local
authorities.
Table 6: Length of time respondents have worked in their LA
Length of time in current
or most recent Local
Authority
Number of
respondents
% of
respondents
< 1 year 3 14.3%
1 to 5 years 6 28.6%
5 to 10 years 3 14.3%
10 to 15 years 5 23.8%
15 to 20 years 4 19.0%
Total 21 100.0%
Source: Survey of local authority staff
The 10 factors that the survey respondents identified as having the greatest impact on
educational outcomes in their local authority and in London overall are listed in Table 7.
Table 7: Factors identified as having the greatest impact on educational outcomes
How much do you feel these factors contributed to your Local
Authority’s and/or London’s improved educational performance at
Key Stage 2 and Key Stage 4, placing London above national?
Number of
responses
Average
score
between
0 and 4
A relentless focus on standards in your LA 18 3.72
A focus on progress to drive up attainment 19 3.53
Putting effective school improvement support in place 20 3.50
Teaching strategies for EAL children and particularly those at the lower
stages of fluency in English 14 3.50
Your LA not allowing disadvantage to be a barrier to achievement 17 3.47
Schools in your LA committing to the standards’ agenda 15 3.47
Pupil tracking in schools 19 3.42
A relentless focus on pupil groups at risk of underperformance 19 3.42
The role of headteachers in your LA 20 3.40
Your LA’s stance on schools causing concern 17 3.35
Source: Survey of local authority staff. NB This table only includes scores for impact where there
were more than 10 responses.
The factor that respondents rated as having the greatest impact on educational performance
in their local authority was a relentless focus on standards, scoring 3.72, with a similarly
high score of 3.47 for schools in your LA committing to the standards agenda. This would
suggest that local authorities and their schools were strongly committed to improving
educational outcomes for children and young people. Other factors that scored highly for
impact related more specifically to the pupils themselves, including: a focus on progress to
drive up attainment, scoring 3.53; teaching strategies for EAL children, 3.50; pupil tracking in
schools, 3.42; and a relentless focus on pupil groups at risk of underperformance, also 3.42.
Local authority research carried out in Greenwich Council by Hayes and Clay [xxx]
identified the importance of focusing on pupil progress to drive up levels of attainment and
of identifying pupil groups at risk of underperformance, to ensure that every child fulfils their
potential. In Greenwich Council the work that the local authority engaged in with its schools
around pupil-level target setting had at its core the concept that a focus on pupils' progress
was a key lever to raise standards. The impact of this could be seen in the 2011 Key Stage
2 results, when the outcomes for 2+ levels of progress moved Greenwich into the top 10 local
authorities in England for progress and attainment in English and mathematics.
Survey respondents were asked four questions about the impact of the London Challenge
and Table 8 shows the average scores for impact for each of these. The average scores for
the impact of the London Challenge were in the middle to lower range of scores, and
respondents thought that it had slightly more impact generally and across London than it did
in their own local authority, with the impact of its role in improving teacher recruitment sitting
between the other three scores. To an extent this chimes with the findings of Greaves et
al. [xxxi], who stated that 'programmes and initiatives such as the London Challenge, the
Academies Programme, Teach First or differences in resources are unlikely to be the major
explanation [for London's success]', although others have made a strong case for all of them.
Table 8: Average impact scores for factors relating to the London Challenge
How much do you feel these factors contributed to your LA’s
and/or London’s improved educational performance at Key Stage
2 and Key Stage 4, placing London above national?
Number of
responses
Average
score
London Challenge across London 12 2.83
London Challenge generally 13 2.69
London Challenge role in improving teacher recruitment in London 11 2.55
London Challenge in your LA 13 2.38
Summary of responses 49 2.61
Source: Survey of local authority staff
Survey respondents were asked 13 questions relating to the impact of performance data and
research in driving up educational standards and Table 9 shows the average scores for
impact for each of these.
Table 9: Average impact scores for factors relating to performance data
How much do you feel these factors contributed to your LA’s
and/or London’s improved educational performance at Key Stage
2 and Key Stage 4, placing London above national?
Number of
responses
Average
score
A focus on progress to drive up attainment 19 3.53
Pupil tracking in schools 19 3.42
A relentless focus on pupil groups at risk of underperformance 19 3.42
Measures to close the gaps at KS 2 & 4 between disadvantaged pupils
and others 17 3.18
Performance analysis outputs for schools from your LA’s research &
statistics team 19 3.16
Forensic analysis of performance data by schools 18 3.11
Use of educational research into school effectiveness and school
improvement 15 3.07
The use of local educational research 16 3.00
DfE School Performance Tables 20 2.85
LA traded service providing performance data for schools (if you have
one) 10 2.80
Fischer Family Trust as a tool to support school improvement 14 2.79
Forensic analysis of the performance of pupils by ethnic background 16 2.69
RAISEonline as a tool to support improvement 18 2.67
Summary of responses 220 3.07
Source: Survey of local authority staff
The data in Table 7 have already shown that a focus on progress to drive up attainment and
pupil tracking in schools both scored highly for impact, and both are aspects of school life that
are data driven. The data in Table 9 show that the majority of factors relating to data scored
3 or above for impact, with a relentless focus on pupil groups at risk of underperformance
being the next highest score at 3.42. Measures to close the gaps at Key Stage 2 & 4
between disadvantaged pupils and others scored 3.18; the production of performance
analysis outputs for schools by the local authority, 3.16; the forensic analysis of performance
data by schools, 3.11; and the use of educational research into school effectiveness and
school improvement, 3.07; that is, they all scored above 3 for impact. The scores for the
impact of data tools produced outside the school and local authority environment, that is by
DfE, Ofsted and others, were all below 3, with the DfE School Performance Tables scoring
2.85, Fischer Family Trust (FFT) scoring
2.79 and RAISEonline scoring 2.67. One of the aspects that Hutchings et al. [xxxii]
investigated in their evaluation of the London Challenge was the impact of the Families of
Schools [6] data analysis that the DfE produced for the London and other city challenges.
Their findings were not positive about the use and impact of Families of Schools; in fact, the
conclusion was:
“Across all City Challenge areas, most schools (and particularly primary schools) made
limited or no use of Families of Schools data. Most who did look at it did so mainly out
of interest; smaller numbers used it with a view to contacting other schools or
informing school improvement planning. It appeared that many were unaware of the
data, or did not understand its purpose.”
In the evaluation by Hutchings et al. [xxxiii], those schools that did not use the Families of
Schools materials were asked in the survey to indicate their reasons, and the most frequently
cited reason was that 'the LA provides data which enables schools to compare themselves
with others in the LA'. This was the response of 79% of respondents in London, the highest
of all the city challenge areas. Although the Families of Schools data were not regarded
highly by schools in London, the London local authorities were evidently providing their
schools with the type of performance data analysis that was useful to them. The other factor
not picked up by Hutchings was that the London local authorities were able to provide their
analyses to schools much sooner than the Families of Schools materials were produced.
One of the authors (Hayes) was involved with DfE in the development of the Families of
Schools materials and knows from personal experience that these were being produced in
April or May in the school year after the assessments and tests had been done, whereas the
best local authorities were producing their analyses six or seven months sooner. It is not
that the Families of Schools materials did not have validity; it is just that they arrived too late.
Somewhat ironically, the research by Barrs et al. suggests that 'this use of Families of
Schools data was identified both in the previous literature and in our interviews as a major
feature of the programme's success.' This does appear contradictory to what Hutchings et
al. found, as referenced above; however, what is clear in the research by Barrs et al. [xxxiv]
is that:
“One of the most important developments in London since 2000 has been the growth
in data use and data literacy. In our interviews with stakeholders (both groups) there
was virtual unanimity in the identification of data analysis and data literacy as key both
to powerful accountability and well targeted support. This preoccupation was not the
exclusive property of any particular group and all the major initiatives seemed to have
strong foundations in the use of educational metrics. The different actors in the
London story are therefore linked by a common preoccupation with the effective use of
educational data as an instrument for transformation.”
The production of data analysis allied to the ensuing data literacy might not have been the
exclusive property of any particular group; however, local authorities in London were major
players in the production of performance data analysis for their schools. Some of the
earliest work in this area was a legacy of the ILEA's renowned Research & Statistics Unit,
particularly in the inner London authorities that had made up the ILEA, and some of the best
work was developed from the mid-1990s onwards by most local authorities across London.
One example among many was the work of Hayes & Rutt [xxxv], produced in
Hammersmith & Fulham council. There are many examples of London local authorities
producing high quality educational performance data for their schools.
Footnote 6: The Families of Schools intervention involved the annual provision of data (in books
and online) that would enable schools to benchmark against a group of schools with similar
intakes, based on prior attainment and socio-economic factors. The rationale was that
benchmarking would potentially challenge school leaders to explore why others were doing
better in certain respects and therefore identify new strategies for raising attainment in their
own schools.
Survey respondents were asked 13 questions relating to the impact of leadership on
educational outcomes in their local authority and across London, covering school leadership,
governors, local authority officers and local political leadership. The scores are listed in
Table 10.
Table 10: Average impact scores for factors relating to leadership
How much do you feel these factors contributed to your LA’s
and/or London’s improved educational performance at Key Stage
2 and Key Stage 4, placing London above national?
Number of
responses
Average
score
Schools in your LA committing to the standards agenda 15 3.47
The role of headteachers in your LA 20 3.40
Local authority interventions in schools in challenging circumstances in
your LA 18 3.28
The role of headteachers across London 18 3.17
Having ambitious LA leadership at all levels 20 3.10
Having a coherent LA plan to raise standards, e.g. the Education
Development Plan (EDP) and/or the Children and Young Peoples Plan
(CYPP) 18 2.94
Your LA taking a resilient approach to external government policies
and pressure 15 2.87
The role of the Director of Children's Services (DCS) in your LA 17 2.71
The role that school governors in your LA played in driving up standards 14 2.43
Local political scrutiny as a lever to drive up standards 14 2.14
The role of your lead councillor for Children’s Services and/or
education 16 1.75
The role of local politicians in driving up standards 16 1.69
The role of your leader of the council 14 1.36
Summary of responses 215 2.69
Source: Survey of local authority staff
In summary, the average scores for impact were highest in relation to school leadership and
local authority officer leadership, and lowest for local authority political leadership. The
commitment of schools to the standards agenda, scoring 3.47, and the role of headteachers
in your LA, scoring 3.40, were the highest for impact. These were followed by local authority
interventions in schools in challenging circumstances, scoring 3.28; the role of headteachers
across London, 3.17; and having ambitious LA leadership at all levels, 3.10. The impact of
school governors in driving up standards was a middle-ranging score at 2.43, while scores
for the impact of individual local politicians were all below 2, and local political scrutiny as
a lever to drive up standards scored only a little higher at 2.14. These low scores for the
impact of local political leaders are at odds with the role of national politicians as cited in
Barrs et al. [xxxvi]: 'In the context of London Challenge the Prime Minister [Tony Blair] and
successive secretaries of state for education personally endorsed London school reform as
a priority.'
The main evaluation of the role of school leadership across the whole City Challenge
programme was carried out by Rudd et al. [xxxvii] at the National Foundation for Educational
Research (NFER), and the evaluation of leadership in the London Challenge was carried out
by Poet and Kettlewell [xxxviii]. A key element of the city challenges was the Leadership
Strategies, which were designed to break the cycle of under-achievement among
disadvantaged pupils in schools in urban areas. School leaders were seen as central agents
for change and, therefore, the city-wide Leadership Strategies were major elements of the
wider City Challenge initiative. Based on the concept of school-to-school support (known as
system leadership), these strategies also aimed to promote a more systemic approach to the
sharing of expertise and knowledge among school leaders, local authorities and other
stakeholders through local networks.
Poet and Kettlewell [xxxix] stated that the school leaders whom they interviewed sometimes
found it difficult to disentangle the impact of the London Leadership Strategies from other
initiatives supporting school improvement in the capital. However, the leadership provision
was perceived to have had a positive impact in a number of areas:
 enhanced quality of teaching and learning, particularly in teachers who had
participated in teaching and learning programmes;
 whole-school improvement in both supported and supporting schools;
 provision of high-quality Continuing Professional Development (CPD) for those
delivering support;
 improved quality of leadership in schools and better leadership capacity;
 enhanced collaboration and the development of a network of schools and
experienced individuals across London.
The London Challenge helped bring about improvements in school leadership in the capital,
and that improved leadership was good for the performance of London's schools.
Survey respondents were asked an open-ended question which asked them to list the five
factors which they felt had a major impact on raising educational standards in London.
Table 11 shows the top ten most frequently cited reasons; a full list of all the reasons
given is in a table at Appendix 3.
Table 11: Factors listed as having the greatest impact on educational standards in
London
Factors listed by respondents (to an open ended question) as having a
major impact on raising educational standards in London
Frequency of
responses
Provision and use of performance data 15
Demographic changes in London and schools learning how to work with
their intakes 10
LA and school collaboration 10
Leadership in schools 10
Relentless focus on standards 8
Focus on pupil tracking 7
Funding at higher levels 7
LA support for school improvement 6
Focus on pupil progress 5
Leadership in the local authority 5
Source: Survey of local authority staff
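The frequency counts in Table 11 come from tallying how often each coded factor appeared across respondents' free-text lists. A minimal sketch (using hypothetical coded responses, not the actual survey data) of how such a tally could be produced:

```python
# Illustrative sketch (not the authors' code): tallying coded free-text
# factors into the frequency counts shown in Table 11. The responses
# below are hypothetical, standing in for each respondent's list of
# up to five factors.
from collections import Counter

responses = [
    ["Provision and use of performance data", "Leadership in schools"],
    ["Provision and use of performance data", "Relentless focus on standards"],
    ["LA and school collaboration"],
]

# Flatten all respondents' lists and count each factor's occurrences
frequency = Counter(factor for listed in responses for factor in listed)
top_factors = frequency.most_common(10)  # the ten most frequently cited
```

In practice the free-text answers would first need to be coded into consistent categories, which is where most of the analytical judgement lies.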
The factor most frequently cited (15 times) by local authority research and statistics staff as
having the greatest impact on educational standards in London was the provision and use of
performance data, which in this context means the provision of performance data by the
local authority and its use by schools. This is not entirely surprising, as the provision of
performance data is a key element of their core business. This was followed by three factors
cited 10 times each: demographic changes in London and schools learning how to work with
their intakes; local authority and school collaboration; and leadership in schools. The top 10
most frequently cited factors largely mirror the individual factors getting the highest scores
for impact in the main survey, particularly in relation to performance data, the relentless
focus on standards, pupil progress and tracking, and school and local authority leadership.
The one factor that stands apart in the list in Table 11 is demographic change in London
and schools learning how to work with their intakes. The demographic profile of pupils in
London (and in some other large English cities) is very different from the mix in the rest of
England, with higher levels of deprivation and greater numbers of pupils from ethnic
minority backgrounds and pupils who do not speak English as their first language. The
evaluation of the City Challenge by Hutchings et al. [xl] provides a comparative profile of the
demographics of London's children and young people compared with the other city
challenge areas and the rest of England; however, it does not explore in detail the
hypothesis that London's schools got better at teaching, and achieving greater success for,
their more diverse pupil intakes. In fact, Hutchings makes the point that 'there is no reason to
assume that changed pupil characteristics [in London] might be responsible for the
improvement in attainment' [xli]. In some ways this seems to be missing the point, because
even if it is not about the different profile of London's school children, it is very possible that
there is something positive happening in the teaching and learning that they are exposed to.
Hutchings references the work of Wyness [xlii], who analysed the 'London advantage' [7].
Wyness used the Income Deprivation Affecting Children Index (IDACI) as well as free school
meal (FSM) eligibility, and analysed the 2010 data for different Key Stages. She showed that
the positive effect in London is small at Key Stage 1 but increases with age. Wyness
concludes:
“That the gap in attainment between pupils from London and the rest of the country
emerges over time suggests that it is schools, rather than parents that are responsible
for the relative advantage of pupils in London.”
Wyness also suggests that there are many potential explanations for the 'London advantage'
that centre on teachers, schools and pupils themselves. In relation to the quality of teaching
she suggests that 'given the large pool of graduates in London compared to the rest of the
UK, teacher quality may be driven upwards by strong competition for teaching jobs in
London.' So yes, London's school children might, on average, be exposed to a better quality
of teaching, but the research does not go as far as suggesting that London's teachers got
better at teaching, and achieving greater success for, their more diverse pupil intakes. In line
with the findings of Wyness, Greaves et al. suggest that:
The report by Baars et al.xliii picks up on schemes like Teach First and reports that:
“Teach First’s framing of the work of its teachers as being a ‘mission’ to address
‘educational disadvantage’ also contributed to the moral purpose of teaching in London
and helped counter much of the negative press coverage previously given to London
schools. As one academy chain leader put it, Teach First led to a broader ‘upgrading
of the workforce’ and made it ‘an attractive place to be for bright young teachers’.”
The respondents interviewed by Baars et al.xliv had, without exception, a highly positive view
of the overall quality of the leadership in London schools now (in 2014), after more than a
decade of effective reform. They quote a highly experienced educationalist who played a
key role in the London Challenge story and who attributed the improvement in teaching and
learning in London schools to: ‘very much better leadership of schools by headteachers who
had become very clever at enabling teachers to improve their game’. This suggests a strong
link between the quality of school leadership and its impact on the quality of teachers and
their teaching practice. Effective leadership and high-quality teaching, allied to schools
becoming better able to teach London’s more diverse pupil intakes, may together account
for at least part of the improvements in London’s educational performance over the last 10
years. Mongon and Chapmanxlv reference the importance of school leadership in their book
High Leverage Leadership, and particularly its impact on raising white working class
attainment:
7 The ‘London advantage’ can best be described as the amount by which primary school outcomes at Key Stage 2 and
secondary school outcomes at Key Stage 4 in London exceed the national average.
“The best of school leadership raises the work of adults and the attainment of young
people to levels that exceed expectations and, sometimes, even their own ambitions.
It combines relentless focus and management skill with wide professional knowledge
and profound empathy, wrapped in a bag of energy and tied with robust optimism. It
has its most remarkable expression in circumstances where poverty and culture might
otherwise corrode the potential of young people to fulfil their talent.”
Insights from Interviews
To supplement the above data analysis and findings from research, one of the present
authors (Cassen) held a number of interviews with Local Authority officers from four London
boroughs. It is notable that so many studies of London school improvement fail altogether to
refer to the local authority role. This is in part because researchers commonly look for
Randomised Control Trials (RCTs) or robust statistical modelling with causal attributions,
which are not feasible in the case of local authorities and London improvement. There is
also a dearth of published information. The few published studies of particular boroughs,
such as those for Lambeth and Tower Hamlets, do nonetheless make it clear that local
authorities often played a significant part; though one of the few studies that refers to local
authorities – Baars et al.xlvi – rightly says that: “the contribution of local authorities is probably
the least well researched aspect of the London story”, but also that the local authorities’ work
has to be set in the context of several other contributing factors.
The interviews in question took place in the spring and summer of 2014. They showed some
common features, and some different emphases. They also accord well with the survey
findings reported above. For reasons of confidentiality the boroughs and the interviewees
are not identified.
Interviewees stressed that every stage in a young person’s development matters for
educational outcomes: the earliest years and the home learning environment, parenting and
pre-school, as well as primary and secondary education. So it was evident that a range of
local authority services had a role to play, children’s services and health as well as
education.
When it came to schooling, the interviewees from all four boroughs said that the majority of
their schools in the 1990s left much to be desired. ‘In the mid-90’s we had a few good
schools and a lot of terrible ones,’ one respondent said. And they all described the same
approach. First and foremost was the use of data, either from the DfE or from their own
statistical base, or both. ‘Heads didn’t understand spreadsheets in the early 1990s,’ one
interviewee said; ‘we had to work with them and by the end of the 1990s things improved.’
The data were used to set targets for all state schools in the borough: ambitious targets, as
was frequently asserted, which had to include disadvantaged pupils. One published study
from Lambeth Council gives more detail of how this was done.xlvii
Some interviewees said
they had to counter complacency or low expectations, at least in the early days. The targets
were set in discussion with the schools, and then monitored by the LA and its School
Improvement Team, on an annual basis or more often if a school was struggling. If targets
were not being met, follow-up with headteachers and school leaders would intensify.
Interviewees said that leadership was important, and pressure was placed on headteachers
to meet the targets; heads could even be replaced if improvement was not forthcoming. Two
interviewees referred also to the importance of training support in primary schools for
headteachers and governors, and to ‘good recruitment packages’ to attract the right people.
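The monitoring cycle the interviewees described (ambitious targets agreed with each school, compared against outcomes annually, with follow-up intensifying where a target is missed) can be sketched in outline. The school names, figures and tolerance threshold below are all hypothetical:

```python
# Hypothetical sketch of LA target monitoring: flag schools whose results
# fall short of their agreed targets, so that follow-up can be prioritised.
# Figures are percentages of pupils reaching the expected standard.
schools = [
    {"name": "School A", "target": 65.0, "actual": 68.2},
    {"name": "School B", "target": 60.0, "actual": 52.5},
    {"name": "School C", "target": 70.0, "actual": 69.1},
]

def monitoring_review(schools, tolerance=2.0):
    """Classify each school: target met, just short, or needing intervention."""
    review = {}
    for s in schools:
        shortfall = s["target"] - s["actual"]
        if shortfall <= 0:
            review[s["name"]] = "target met"
        elif shortfall <= tolerance:
            review[s["name"]] = "just short - monitor"
        else:
            review[s["name"]] = "needs intensified follow-up"
    return review

print(monitoring_review(schools))
```

This is only an illustration of the logic; in practice, as the interviewees stressed, the review was a dialogue with headteachers rather than a mechanical classification.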
Teaching was emphasised as a key part of the dialogue with school-leaders, including
raising teachers’ expectations where necessary, and emphasising the quality of teaching,
not least through Continuing Professional Development. Getting teachers to share best
practice between schools was given by one interviewee as a further key factor. As another
put it: ‘We could not accept that schools were providing four good lessons out of five; it was
like saying students could stay at home one day a week.’
Resources were discussed, with the interviewees all saying they were little troubled by
resource shortages during and even before the London Challenge period. ‘With Excellence
in Cities and the Education Action Zones, we had a lot of money coming in,’ one interviewee
said.
Interviewees were also asked whether the academisation of schools had reduced their
roles; officers from three of the four boroughs said that academies were still ‘applying to
them for help’ and ‘buying into their services’; though those services had been reduced by
budget cuts. Although academies were not under local authority supervision, one
interviewee, a Head of Children’s Services, said ‘I am responsible for my children’s
education, so if an academy is not performing well, I will complain and try to get something
done.’
It was clear from the interviews that context varied greatly from one borough to another,
perhaps most significantly in the character of their pupil populations. One interviewee spoke
of an ‘entrenched white working class’, who were often in poverty and unemployment, but
did not move elsewhere, and presented difficulties of approach. In one of the boroughs
there were several ethnic minorities, each of which required an approach tailored to its
specific character and needs. In another a single large minority was the main focus; while
the fourth had to cope with a ‘very large transient population’. Clearly, one size was not
going to fit all. But a key part of addressing the issues of minorities was getting people from
those minorities to cooperate, not least by serving as mentors and teaching assistants in
the schools.
In summary, the four boroughs from which the interviewees came cannot be claimed to be
statistically representative, and four is much too small a sample to permit generalisation
when there are 33 councils in London. Nevertheless one thing stands out:
the use of data and the setting and monitoring of targets were important instruments in
improving school outcomes in the boroughs concerned. Those local authorities which have
had a successful record were very likely to have been using similar approaches in this
respect, as reflected in our survey findings. But they have had to deal with quite different
problems and contexts, and other, varying, factors account for what they have been able to
achieve.
Conclusion
There has been much written about the reasons for the ‘London advantage’ and the
transformational shift in educational outcomes in the capital in the last 10 years. Many
factors have been identified as playing a part in the improvement and it is difficult to
separate out those that had the greatest impact. Although the studies evaluating the
London Challenge acknowledged the role local authorities played in partnership with the
challenge advisers, other research has much less to say about the role and impact of local
authorities. The London Challenge would not have succeeded if the London local
authorities and their schools had not committed to its moral purpose and the relentless focus
on educational standards in those years.
The report by Baars et al.,xlviii which attempts to tell the London story, reminds us that school
reform is not a quick fix: ‘Changing professional culture is not a question of ‘flicking a switch’
or issuing a ministerial directive. It requires, to use the word of [an] expert witness, a
‘relentless’ focus over a long period of time.’ Alongside other key factors, it is this relentless
focus on standards allied to a sense of moral purpose that seems to underlie the educational
improvements in London. Educational outcomes that might have passed as being broadly
acceptable or ‘par for the course’ in London in the 1980s and 1990s were no longer
acceptable in the first decade of the new millennium. It is some transformation when the
majority of London local authorities that made the biggest gains in GCSE performance
between 1998 and 2013 were part of inner London under the ILEA.
Much has been made of the importance of performance data and of the benchmarking of
data, which made it possible to challenge underperformance on the compelling grounds that
if other schools were doing much better with a similar intake of students, then significant
improvement was possible. The use of data, therefore, generated both optimism and
urgency about the need for change and this is a valid finding; but it was not the Families of
Schools data produced by the DfE, as part of the London Challenge, which made the
difference. It was not widely used by schools and it came out too late in the school year to
make sufficient impact. Rather, it was the performance data analysis being produced by the
majority of London local authorities for their schools and the schools’ use of that data
analysis which helped drive improvements.
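The benchmarking logic here (if other schools are doing much better with a similar intake, significant improvement must be possible) can be sketched as banding schools by intake and comparing each with the best performer in its band. All school names and figures below are invented for illustration; the actual analyses used richer intake measures than a single FSM percentage:

```python
# Invented sketch of intake-based benchmarking: band schools by the share
# of FSM-eligible pupils, then measure each school against the best GCSE
# result achieved by any school in the same band.
schools = [
    {"name": "S1", "fsm_pct": 42, "gcse_pct": 55},
    {"name": "S2", "fsm_pct": 45, "gcse_pct": 38},
    {"name": "S3", "fsm_pct": 12, "gcse_pct": 72},
    {"name": "S4", "fsm_pct": 15, "gcse_pct": 70},
]

def benchmark_by_intake(schools, band_width=10):
    """Return {school name: gap to the best school in the same FSM band}."""
    bands = {}
    for s in schools:
        bands.setdefault(s["fsm_pct"] // band_width, []).append(s)
    gaps = {}
    for members in bands.values():
        best = max(m["gcse_pct"] for m in members)
        for m in members:
            gaps[m["name"]] = best - m["gcse_pct"]
    return gaps

print(benchmark_by_intake(schools))
```

A school with a large gap to the best performer in its own intake band cannot attribute its results to its intake, which is precisely the challenge the data made compelling.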
The local authority role in borough, school and pupil level target setting in the 2000s was
referenced in the interviews with school improvement officers as being an important lever in
raising standards and in holding schools to account for the performance of their pupils. As
one of the authors (Hayes) can attest from his work in Greenwich council, this was often a
data rich process laden with aspiration, ambition and challenge, driven by a sense of moral
purpose within which disadvantage should not be seen as a barrier to achievement. To
paraphrase Mongon and Chapman:xlix
‘poverty and culture [were not allowed to] corrode the
potential of young people to fulfil their talent’.
This research, and the range of other research that has investigated the reasons for
London’s success, suggests that there is a strong link between the quality of leadership and
the impact of good leadership on the quality of teachers and their teaching practice. It is
possible that these factors, allied to teachers being better able to teach the more diverse
pupil intakes in London and many London children and young people from disadvantaged
backgrounds being resilient enough to succeed against the odds, have all coalesced to drive
up the improvements in London’s educational performance over the last 10 years.
It is the authors’ view that the London Challenge bore many of the hallmarks of a successful
intervention and that it was mostly very effective over a sustained period of eight years, but it
was not the only factor that brought about the transformational shift in educational outcomes
in London. It was a significant factor and it certainly acted as a catalyst for change and a
lever to drive improvement, but the green shoots of the transformation were already
beginning to appear before the London Challenge began, and many London schools and
local authorities played crucial roles in securing the rapid improvement in outcomes over the
life of the London Challenge and beyond. The challenges that remain
now are the ability and capacity to sustain those improvements into the future and for other
regions and cities to learn the lessons from London’s success.
Appendix 1: London Local Authorities in Inner and Outer London
Local Authority    Inner / Outer London Designation8
Camden Inner
City of London Inner
Hackney Inner
Hammersmith & Fulham Inner
Haringey Inner
Islington Inner
Kensington & Chelsea Inner
Lambeth Inner
Lewisham Inner
Newham Inner
Southwark Inner
Tower Hamlets Inner
Wandsworth Inner
Westminster Inner
Barking & Dagenham Outer
Barnet Outer
Bexley Outer
Brent Outer
Bromley Outer
Croydon Outer
Ealing Outer
Enfield Outer
Greenwich Outer
Harrow Outer
Havering Outer
Hillingdon Outer
Hounslow Outer
Kingston upon Thames Outer
Merton Outer
Redbridge Outer
Richmond upon Thames Outer
Sutton Outer
Waltham Forest Outer
8 This split of local authorities between inner and outer London is based on the Department for Education’s designation as at
2013.
Appendix 2: Average Scores for impact of factors in the survey
The following table lists the 82 factors that were included in the survey instrument
administered to local authority officers, showing the number of responses and the average
score for impact. Please note that the range of possible scores for impact was from 0 (no
impact) to 4 (major impact) and that the average scores have been ranked in descending
order from those deemed to have had the most impact to those with the least impact.
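The scoring used in the table can be reproduced mechanically: each factor’s 0–4 impact ratings are averaged and the factors are sorted in descending order of average score. A minimal sketch with made-up ratings for three of the surveyed factors:

```python
# Minimal sketch of the survey scoring: average each factor's 0-4 impact
# ratings and rank factors from most to least impact. The individual
# ratings below are made up; only the method mirrors the survey.
ratings = {
    "Relentless focus on standards": [4, 4, 3, 4],
    "Pupil tracking in schools": [3, 4, 3],
    "Creation of free schools": [0, 0, 1, 0],
}

def rank_factors(ratings):
    """Return (factor, n_responses, average) tuples, highest average first."""
    rows = [(name, len(scores), sum(scores) / len(scores))
            for name, scores in ratings.items()]
    return sorted(rows, key=lambda r: r[2], reverse=True)

for name, n, avg in rank_factors(ratings):
    print(f"{name}: {n} responses, average {avg:.2f}")
```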
How much do you feel these factors contributed to your Local Authority’s and/or London’s
improved educational performance at Key Stage 2 and Key Stage 4, placing London above
national? (Columns: Number of responses; Average score between 0 and 4)
A relentless focus on standards in your LA 18 3.72
If your LA (e.g. your school improvement service) was outsourced, what
was the impact on standards? 3 3.67
A focus on progress to drive up attainment 19 3.53
Putting effective school improvement support in place 20 3.50
Teaching strategies for EAL children and particularly those at the lower
stages of fluency in English 14 3.50
Your LA not allowing disadvantage to be a barrier to achievement 17 3.47
Schools in your LA committing to the standards agenda 15 3.47
Pupil tracking in schools 19 3.42
A relentless focus on pupil groups at risk of underperformance 19 3.42
The role of headteachers in your LA 20 3.40
Your LA’s stance on schools causing concern 17 3.35
A relentless focus on standards across London 18 3.33
Higher levels of funding in your LA, compared to national 15 3.33
The recruitment of better teachers in your LA and across London 14 3.29
Local authority interventions in schools in challenging circumstances in
your LA 18 3.28
The LA provision of high quality Continuing Professional Development
(CPD) for schools 13 3.23
The role of your LA school improvement service 18 3.22
Your LA’s support for BME pupils 18 3.22
Schools across London committing to the standards agenda 14 3.21
Schools in your LA getting better at teaching the pupils in their schools,
based on their demographic profile 19 3.21
Measures to close the gaps at KS 2 & 4 between disadvantaged pupils
and others 17 3.18
The role of headteachers across London 18 3.17
Target setting at pupil level 18 3.17
Your LA’s support for EAL pupils 18 3.17
Higher levels of funding per pupil than the England average 18 3.17
Performance analysis outputs for schools from your LA’s research &
statistics team 19 3.16
Improving the quality of teachers in your LA 13 3.15
Learning how to teach the pupils you have in your LA 17 3.12
Higher levels of funding across London compared to national 18 3.11
Forensic analysis of performance data by schools 18 3.11
The role of your LA research & statistics team 19 3.11
Having ambitious LA leadership at all levels 20 3.10
Local authority interventions in schools in challenging circumstances
across London 14 3.07
Use of educational research into school effectiveness and school
improvement 15 3.07
Improving the quality of teachers in London 16 3.06
Target setting at school level 20 3.05
National pressure from the DfE 18 3.00
The use of local educational research 16 3.00
Having a coherent LA plan to raise standards, e.g. the EDP and/or the
CYPP 18 2.94
Inward migration of families from BME backgrounds with high levels of
educational aspiration 16 2.94
Education policy (national) under Labour 1997 to 2010 16 2.88
The role of School Improvement Partners (SIPs) 15 2.87
Your LA taking a resilient approach to external government policies and
pressure 15 2.87
DfE School Performance Tables 20 2.85
Target setting at LA level 19 2.84
London Challenge across London 12 2.83
If your LA was put into intervention by DfE, what was the impact on
standards? 6 2.83
LA traded service providing performance data for schools (if you have
one) 10 2.80
The Ofsted school inspection framework 19 2.79
Fischer Family Trust as a tool to support school improvement 14 2.79
Harnessing support from your LA’s local community 13 2.77
Improving the profile of teaching as a profession 13 2.77
Changing demographics in London, e.g. Increasing young population
leading to more schools being full and fewer casual admissions 17 2.76
The role of the DCS in your LA 17 2.71
London Challenge generally 13 2.69
Forensic analysis of the performance of pupils by ethnic background 16 2.69
RAISEonline as a tool to support improvement 18 2.67
The involvement of parents in schools 13 2.62
Your council topping up the education budget 10 2.60
Partnership working with the local community 17 2.59
London Challenge role in improving teacher recruitment in London 11 2.55
DfE Standards’ Meetings with your LA 11 2.55
A focus on attendance as a lever to drive up attainment 18 2.44
The role school governors in your LA played in driving up standards 14 2.43
London Challenge in your LA 13 2.38
The London economy i.e. being more resilient than the rest of the
country 16 2.38
Local authority attendance teams 18 2.28
Reducing the number of fixed term and permanent exclusions in your
LA’s schools 18 2.28
Deployment of teaching assistants 13 2.15
Local political scrutiny as a lever to drive up standards 14 2.14
The transition from being an education department to Children’s
Services 14 2.00
The role of your lead councillor for Children’s Services and/or education 16 1.75
Education policy (national) under the coalition 2010 to 2014 15 1.73
The role of local politicians in driving up standards 16 1.69
The DfE replacing Contextual Value Added (CVA) with simple Value
Added (VA) 16 1.56
Your LA’s approach to moving more pupils into Alternative Provision 13 1.54
The role of Academy Chains in your local authority (if relevant) 7 1.43
The role of your leader of the council 14 1.36
The academisation of schools 16 1.13
The discontinuation of statutory target setting in 2010 12 1.08
The increase in the number of schools becoming academies since 2010 17 1.00
The creation of free schools 13 0.15
Appendix 3: Factors listed by respondents as having a major impact
The following table summarises the five factors listed by each respondent (in response to an
open-ended question) as having a major impact on raising educational standards in London.
Factors listed by respondents (to an open ended question) as
having a major impact on raising educational standards in London
Frequency of
responses
Provision and use of performance data 15
Demographic changes in London and schools learning how to work with
their intakes 10
LA and school collaboration 10
Leadership in schools 10
Relentless focus on standards 8
Focus on pupil tracking 7
Funding at higher levels 7
LA support for school improvement 6
Focus on pupil progress 5
Leadership in the local authority 5
High aspirations 3
School improvement and Research & Statistics teams providing
challenge 3
Target setting 3
Accountability from DCS 2
Building Schools for the Future (BSF) 2
Continuing Professional Development (CPD) for teachers 2
Culture change in schools and the local authority 2
Disadvantage not a barrier to success 2
Focus on Early Years 2
Focus on schools causing concern 2
Inclusivity and moral purpose 2
Outsourcing school improvement 2
Partnership working between schools 2
Quality of teaching 2
Recruiting better headteachers 2
The London economy 2
Accountability from DfE 1
Collaboratives and federations of schools 1
Community and parental involvement 1
Diverse population with high aspirations 1
Focus on the quality of teaching and learning 1
Funding for disadvantaged 2 year olds 1
Funding for Early Years places 1
Identification of underachieving groups 1
LA funded interventions 1
LA put into intervention 1
Other LA services improving 1
Parental engagement 1
Quality of teaching in underperforming subject areas 1
Raising expectations 1
Role of Children's Centres 1
School leaders who were given enough time to raise standards 1
Schools became more ambitious 1
Schools doing it for themselves 1
Support for BME groups 1
Support for different groups of pupils 1
Support for EAL pupils 1
Support for underperforming groups 1
Total Responses 140
Notes
This paper is currently in draft format and was presented at the British Educational Research
Association Annual Conference, Institute of Education from 23 - 25 September 2014. This
paper is confidential and should only be used with the express permission of the authors,
(contact details below).
Acknowledgements
Acknowledgement is due to all of the London Children’s Services Research and Statistics
officers who completed a survey questionnaire as part of this research, and to Directors of
Education and/or of Children’s Services and other Local Authority officers who generously
gave their time to be interviewed.
Contacts for correspondence:
Sean Hayes
Freelance Educational Researcher
London
Mobile: 07729 053676
Co-author: Robert Cassen
London School of Economics
Emails: sean.hayes@carmo.myzen.co.uk
R.Cassen@lse.ac.uk
Bibliography
i Radford, A. (2009). An Enquiry into the Abolition of the Inner London Education Authority (1964 to 1988), with Particular Reference to Politics and Policy Making. (PhD Thesis, University of Bath – UK)
ii Grace, G. (Ed.). (1983). Education and the City: Theory, history and contemporary practice. (Routledge – UK)
iii Inner London Education Authority. (1982). Achievement of Leavers by Sex, Ethnic Group and Year of Final Examination. (ILEA Research and Statistics Branch – UK)
iv Gray, J. & Jesson, D. (1987). Exam Results and LEA League Tables. (Newbury Policy Journal – UK)
v Her Majesty’s Government. (1988). The Education Reform Act. (HMSO – UK)
vi Department for Education. (1995). Secondary School Performance Tables. http://www.education.gov.uk/cgi-in/schools/performance/archive/shlea1_95?lea=205&type=b
vii Marshall, P. (Ed.). (2013). The Tail: How England's schools fail one child in five – and what can be done. (Profile Books Ltd – UK)
viii Cook, C. (2013). How to explain the London success story. http://blogs.ft.com/ftdata/author/christophercook/ (Financial Times – UK)
ix Hutchings, M. (2014). The Legacy of the London Challenge. (Keynote presentation to an NUT Conference at the Institute of Education)
x Baars, S., Bernardes, E., Elwick, A., Malortie, A., McAleavy, T., McInerney, L., Menzies, L. & Riggall, A. (2014). Lessons from London schools. (CfBT and Centre for London – UK)
xi Hutchings, M., Greenwood, C., Hollingworth, S., Mansaray, A., Rose, A., Minty, S. & Glass, K. (2012). Evaluation of the City Challenge Programme. (Department for Education – UK)
xii Hutchings, M., ibid. (P 95)
xiii Hutchings, M., ibid. (P 95)
xiv Hutchings, M., ibid. (P 96)
xv Brighouse, T. (2014). London Challenge Remembered. Presentation to an NUT Conference in 2014.
xvi Hutchings, M., op cit. (P 110)
xvii Ofsted. (2013). A review of the impact of the London Challenge (2003-8) and the City Challenge (2008-11). www.ofsted.gov.uk/accessandachievement
xviii Woods, D., Husbands, C. & Brown, C. (2013). Transforming Education for All: the Tower Hamlets Story. (Tower Hamlets Council – UK)
xix Collins, K. & Keating, M. (2013). ‘An East End Tale’, in The Tail. Marshall, P. (Ed.). (Profile Books)
xx Woods, D., op cit. (P 49)
xxi Greaves, E., Macmillan, L. & Sibieta, L. (2014). Lessons from London schools for attainment gaps and social mobility. (London: Social Mobility and Child Poverty Commission)
xxii Greaves, E., ibid. (P 7)
xxiii Woods, D., op cit. (P 8)
xxiv Baars, S. et al., op cit. (P 85)
xxv Hayes, S., Shaw, H. & Osborne, K. (2007). White working class boys: is their performance at school a cause for concern? Paper presented at BERA in 2007. (British Education Index Reference: 167843)
xxvi Hayes, S., Shaw, H., McGrath, G. & Bonel, F. (2009). Using RAISEonline as a research tool to analyse the link between attainment, social class and ethnicity. Paper presented at BERA in 2009. (British Education Index Reference: 184218)
xxvii Siraj-Blatchford, I. (2009). Learning in the home and at school: how working class children ‘succeed against the odds’. British Educational Research Journal, Vol. 36, No. 3, June 2010, pp. 463–482
xxviii Machin, S. & Vernoit, J. (2011). Changing School Autonomy: Academy Schools and Their Introduction to England’s Education. CEE Discussion Paper No. 123. London School of Economics.
xxix Machin, S. & Silva, O. (2013). School Structure, School Autonomy and the Tail. CEP Special Report. Centre for Economic Performance, London School of Economics. http://cep.lse.ac.uk/pubs/download/special/cepsp29.pdf. This research and the previous citation are reported in Cassen, R., McNally, S. & Vignoles, A. (forthcoming), Making a Difference in Education: What the evidence says. (Routledge – London)
xxx Hayes, S. & Clay, J. (2008). Progression from Key Stage 2 to 4: Understanding the Context and Nature of Performance and Underperformance between the ages of 11-16. (British Education Index Reference: 167840)
xxxi Greaves, E., op cit. (P 7)
xxxii Hutchings, M., op cit. (P 93)
xxxiii Hutchings, M., op cit. (P 87)
xxxiv Baars, S. et al., op cit. (P 88)
xxxv Hayes, S. & Rutt, S. (1999). Primary Analysis for Secondary Schools: A LEA Research Officer’s Perspective on Helping Schools Interpret Assessment Data for School Improvement Purposes. Improving Schools (pp. 44–52). (Trentham Books – UK)
xxxvi Baars, S. et al., op cit. (P 100)
xxxvii Rudd, P., Poet, H., Featherstone, G., Lamont, E., Durbin, B., Bergeron, C., Bramley, G., Kettlewell, K. & Hart, R. (2011). Evaluation of City Challenge Leadership Strategies: Overview Report. (Slough: NFER)
xxxviii Poet, H. & Kettlewell, K. (2011). Evaluation of City Challenge Leadership Strategies: London Area Report. (Slough: NFER)
xxxix Poet, H. & Kettlewell, K., ibid. (P iii)
xl Hutchings, M., op cit. (P 6)
xli Hutchings, M., op cit. (P 37)
xlii Wyness, G. (2011). London schooling: lessons from the capital. (CentreForum – UK)
xliii Baars, S. et al., op cit. (P 81)
xliv Baars, S. et al., op cit. (P 99)
xlv Mongon, D. & Chapman, C. (2011). High-Leverage Leadership: Improving Outcomes in Educational Settings. (Routledge – UK)
xlvi Baars, S. et al., op cit. (P 82)
xlvii Demie, F. (2013). Using Data to Raise Achievement: good practice in schools. (Lambeth Council, Research and Statistics Unit – London)
xlviii Baars, S. et al., op cit. (P 120)
xlix Mongon, D. & Chapman, C., op cit. (P 16)

 
Resource guide for GCSE Teachers in English and Maths
Resource guide for GCSE Teachers in English and MathsResource guide for GCSE Teachers in English and Maths
Resource guide for GCSE Teachers in English and Maths
 
119065129-DepEd-K12-Basic-Education-Program.pdf
119065129-DepEd-K12-Basic-Education-Program.pdf119065129-DepEd-K12-Basic-Education-Program.pdf
119065129-DepEd-K12-Basic-Education-Program.pdf
 
Ofsted annual report 201314 south west
Ofsted annual report 201314 south westOfsted annual report 201314 south west
Ofsted annual report 201314 south west
 
Nhsc transitions report
Nhsc transitions reportNhsc transitions report
Nhsc transitions report
 
Making it happen: teaching the technology generation
Making it happen: teaching the technology generationMaking it happen: teaching the technology generation
Making it happen: teaching the technology generation
 
Why Become an NVQ Assessor? How to become an NVQ Assessor
Why Become an NVQ Assessor?  How to become an NVQ Assessor Why Become an NVQ Assessor?  How to become an NVQ Assessor
Why Become an NVQ Assessor? How to become an NVQ Assessor
 
ONS Local presents: Adult Education Outcomes in London
ONS Local presents: Adult Education Outcomes in LondonONS Local presents: Adult Education Outcomes in London
ONS Local presents: Adult Education Outcomes in London
 
Assessment
AssessmentAssessment
Assessment
 
New York City Public School Demographics
New York City Public School DemographicsNew York City Public School Demographics
New York City Public School Demographics
 
Financial Health of the Higher Education Sector 2013
Financial Health of the Higher Education Sector 2013Financial Health of the Higher Education Sector 2013
Financial Health of the Higher Education Sector 2013
 

The Transformational Shift in London Outcomes 2003 to 2013 (Sean Hayes and Robert Cassen)

The transformational shift in educational outcomes in London 2003 to 2013: the contribution of local authorities

Sean Hayes and Robert Cassen

ABSTRACT

Introduction

The paper explores the transformational shift in educational outcomes in London between 2003 and 2013. London's schools have improved rapidly over the past decade, with primary and secondary schools now outperforming the rest of the country at Key Stages 2 and 4 respectively. Improvements in many London boroughs have been staggering. England is now one of only a small number of countries in the developed world whose capital city outperforms the rest of the nation.

Focus of the enquiry

Many reasons have been put forward for the transformational shift. National education policy over the last decade has affected London as much as anywhere: investment in facilities, the growth of academies, changes to the national curriculum and testing, developments in teacher training, school accountability and the relationship between central and local government have all played some part in shaping the educational outcomes of London's schools, and the impact of the Department for Education's London Challenge, which ran from 2003 to 2010, should not be underestimated. There is little doubt that the London Challenge was an important lever in raising standards; however, this paper argues that there are many other reasons behind London's success, and it explores the role of local authorities and their perspective on, and contribution to, that success.

Research methods and mapping of the literature

The research report will review and critique the current literature on the reasons being put forward for London's success.
The main research method will be a survey of London local authority education research and statistics officers, supported by feedback from a workshop organised by the London Education Research Network (LERN) and a series of interviews with London local authority Directors of Education or Children's Services and their school improvement officers.

Analytical framework

The research will include a quantitative analysis of the survey responses and a qualitative narrative based on one author's (Hayes) lived experience of working in two London local authorities (Hammersmith & Fulham and Greenwich), the views expressed by individual local authority research and statistics staff in the workshop, and the interviews with local authority officers.

Research findings

The research shows that there are many reasons behind the transformational shift in educational outcomes in London, and that local authorities were an important part of that process of change and contributed to its success.
Introduction and background

The paper explores the transformational shift in educational outcomes in London between 2003 and 2013. London's schools have improved rapidly over the past decade, with primary and secondary schools now outperforming the rest of the country at Key Stages 2 and 4 respectively. Improvements in many London boroughs have been staggering, with London first outperforming the national average at Key Stage 4 in 2004 and at Key Stage 2 in 2009.

However, it was not always thus, and the picture of education and educational outcomes in London was historically a mixed one. Radford[i] reminds us that state education in inner London was delivered through the Inner London Education Authority (ILEA) from 1964 to 1990, the ILEA having been abolished in April 1990 by the Conservative Government through the Education Reform Act 1988. This ended the unitary system of education that had existed in inner London for over a hundred years. On reflection, and in comparison to educational outcomes in London in 2013, the results achieved by schools in the ILEA in the 1980s do not look that good, even though they were robustly defended at the time.
Frances Morrell, writing in the book 'Education and the City'[ii] in 1983, reported that:

"When we return to a review of educational achievement [in the ILEA's schools] as expressed in external examination results, it is clear that in the face of the disadvantages already outlined, and in the face of considerable institutional fluidity arising out of many factors, these results are commendable."

Peter Newsam[2], an ILEA Education Officer around the same time, writing about the 1979 and 1980 examination results in the ILEA, remarked that:

"If examination results are to be taken as the test, the suggestion that results in ILEA secondary schools have fallen, though frequently made, is unsubstantiated."

The educational outcomes reported by Radford, based on data published by the ILEA[iii] on performance by ethnicity and gender, show that in 1981 only 13.9% of White British boys achieved 5+ CSE Grade 1 or O Levels at Grade A - C, while the most recent data available on Black Caribbean boys in the same report shows that in 1979 only 10.0% of them achieved the same benchmark. At best, the judgement on those outcomes from 2014 is that they were not very good at all, no matter how much they might have been judged as acceptable or even commendable over 30 years ago.

A Sheffield University report by Gray and Jesson[iv] on local authority examination performance in the inner cities placed the ILEA about 'par for the course' in relation to other similar local authority areas in England. But it is clear from the failure of many subsequent Conservative and New Labour initiatives in the 1990s and early 2000s to raise standards in inner city schools that the problem is complex. The 1988 Education Reform Act[v], which devoted a lot of its legislative content to abolishing the ILEA, also led to the introduction of the national curriculum and a new assessment framework in England at Key Stages 1 to 5.
The closure of the ILEA in March 1990 meant that it never benefitted from any of the advantages that the national curriculum brought to England's schools thereafter. The demise of the ILEA is much lamented by many who worked in it and in its schools, and Radford concluded in his thesis that:

"The evidence does not sustain the claims made against the ILEA [that it tolerated low standards in education and failed to give value for money] and that therefore, its demise can better be explained by the polarisation of politics at the time."

Educational standards in the majority of the 12 inner London local education authorities, which were created following the closure of the ILEA in 1990, did not begin to rise rapidly in the early part of that decade. One of this paper's authors (Hayes), who was working in Hammersmith & Fulham local authority in 1995, remembers three secondary schools in the borough having fewer than 10% of students achieving 5+ A* - C
GCSEs[vi], against a national average at the time of 43.5%. By 1998 only nine London local authorities were performing above the national average for the percentage of students achieving 5+ A* - C GCSEs (inc. English & maths), and all of them were outer London local authorities.

Table 1 shows the GCSE results for the percentage of students achieving 5+ A* - C GCSEs (inc. English & maths) in 1998, 2003, 2008 and 2013 for the 12 London local authorities that had made up the ILEA.

Table 1: GCSE performance in 1998, 2003, 2008 and 2013 in the 12 London LAs that made up the ILEA (% 5+ A* - C, inc. English & maths)

Local Authority          Designation   1998   2003   2008   2013
Camden                   Inner         34.6   40.1   45.3   60.4
Greenwich                Inner         23.3   26.4   39.5   65.4
Hackney                  Inner         16.9   26.5   42.4   61.2
Hammersmith & Fulham     Inner         35.0   42.6   55.9   66.5
Islington                Inner         15.6   22.5   38.3   63.5
Kensington & Chelsea     Inner         29.5   45.1   58.1   80.2
Lambeth                  Inner         19.7   30.1   46.9   65.9
Lewisham                 Inner         22.2   30.1   45.8   58.0
Southwark                Inner         18.2   26.3   42.7   65.2
Tower Hamlets            Inner         17.7   25.5   41.2   64.7
Wandsworth               Inner         26.3   37.1   50.0   61.3
Westminster              Inner         24.8   37.1   49.6   69.6
England                                37.0   41.9   48.4   60.8
Number of LAs above
England average                           0      2      4     10

Source: DfE Statistical First Releases (SFRs)

The data in Table 1 show that none of the 12 ex-ILEA local authorities had reached the England average by 1998; only two had done so by 2003 and four by 2008. By 2013, however, 10 out of the 12 had exceeded the England average, while the two that had not, Camden and Lewisham, were within 1 and 3 percentage points of it respectively. By 2013 the London landscape, in terms of educational outcomes for young people at 16 years old, had well and truly changed from the 1980s and the 1990s.
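The "above England average" counts in Table 1 are easy to reproduce. As a quick check, a short sketch (the figures are simply copied from the table above):

```python
# Table 1 figures: % 5+ A* - C GCSEs (inc. English & maths) for the
# 12 ex-ILEA local authorities, from the DfE Statistical First Releases.
years = [1998, 2003, 2008, 2013]
las = {
    "Camden":               [34.6, 40.1, 45.3, 60.4],
    "Greenwich":            [23.3, 26.4, 39.5, 65.4],
    "Hackney":              [16.9, 26.5, 42.4, 61.2],
    "Hammersmith & Fulham": [35.0, 42.6, 55.9, 66.5],
    "Islington":            [15.6, 22.5, 38.3, 63.5],
    "Kensington & Chelsea": [29.5, 45.1, 58.1, 80.2],
    "Lambeth":              [19.7, 30.1, 46.9, 65.9],
    "Lewisham":             [22.2, 30.1, 45.8, 58.0],
    "Southwark":            [18.2, 26.3, 42.7, 65.2],
    "Tower Hamlets":        [17.7, 25.5, 41.2, 64.7],
    "Wandsworth":           [26.3, 37.1, 50.0, 61.3],
    "Westminster":          [24.8, 37.1, 49.6, 69.6],
}
england = [37.0, 41.9, 48.4, 60.8]

# For each year, count how many LAs exceeded the England average.
above = {
    year: sum(1 for scores in las.values() if scores[i] > england[i])
    for i, year in enumerate(years)
}
print(above)  # {1998: 0, 2003: 2, 2008: 4, 2013: 10}
```

This reproduces the bottom row of Table 1: 0, 2, 4 and 10 local authorities above the England average in 1998, 2003, 2008 and 2013 respectively.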
An overview of educational performance in London up to 2013

This section provides an overview of educational performance in London, focusing on Key Stage 2 and Key Stage 4, highlighting when performance in London began to outstrip the national average and where it had reached by 2013.

Key Stage 2

Performance at Key Stage 2 improved steadily year on year from 2005 to 2012 in London and nationally, with only a slight drop nationally in 2009. Chart 1 shows performance in terms of the percentage of children achieving Level 4+ in English & Maths Combined from 2005 to 2012[1], comparing London with the national average. The graph shows London first outperforming the national average at Key Stage 2 in 2009 and then moving further ahead year on year up to 2012.

[1] This series is shown up to 2012 because from 2013 it has not been possible to calculate an overall level in English: from 2013 the outturns for English are reported separately as the Reading Test Level and the Writing Teacher Assessment Level. The combined measure from 2013 is the % achieving Level 4+ in Reading, Writing (TA) and Mathematics combined.
Chart 1: Key Stage 2 % Level 4+ in English & Maths Combined from 2005 to 2012, London v National
Source: DfE Statistical First Releases (SFRs) 2005 to 2012

Chart 2 shows the Key Stage 2 results for every London local authority in 2013, based on the percentage of children achieving Level 4+ in Reading, Writing and Mathematics combined. In 2013 performance in inner, outer and greater London was above the national average, with performance in inner London now above that of outer London. Twenty-nine local authorities in London performed above the national average in 2013, while only four were below it: Barking & Dagenham, Croydon, Haringey and Waltham Forest. Appendix 1 shows which local authorities are in inner and outer London, based on the DfE's designation in 2013.
Chart 2: Key Stage 2 Level 4+ in reading, writing & maths by London LA in 2013
[Bar chart of % Level 4+ in Reading, Writing & Maths by local authority, ordered from Croydon (lowest) to City of London (highest), with inner London, outer London, London and England averages.]
Source: DfE Statistical First Release (SFR) 2013

Chart 3 shows Key Stage 2 performance in London and England by Free School Meal (FSM) eligibility in 2013, based on the percentage of children achieving Level 4+ in reading, writing and mathematics combined. The performance of children eligible for free school meals is much better in London than in England as a whole and, within London, is higher in inner London than in outer London (see Table 2).
Chart 3: Key Stage 2 Performance by FSM Eligibility in 2013, % Level 4+ in reading, writing and mathematics
Source: DfE Statistical First Release (SFR) 2013

Table 2 shows the data in Chart 3 with the addition of the percentage-point attainment gap between FSM-eligible and non-eligible children. The attainment gap in London is six percentage points lower than the national gap, and the gap in inner London is nine points lower than the national gap. Based on these outturns, primary schools in London, and particularly in inner London, appear to be very effective at maximising outcomes for their free school meal children and at addressing longstanding attainment gaps.

Table 2: Key Stage 2 Performance by FSM Eligibility in 2013, % Level 4+ in reading, writing and mathematics, with the attainment gap

Region         FSM    Not FSM   All pupils   % Point Gap (FSM v Not FSM)
Inner London   73.0   83.0      79.0         10.0
Outer London   65.0   82.0      78.0         17.0
London         69.0   82.0      79.0         13.0
England        60.0   79.0      76.0         19.0

Source: DfE Statistical First Releases (SFRs) 2013

Chart 4 shows the proportion of primary schools that were below the national Key Stage 2 floor standard[2] in 2012 and 2013. Nationally, the proportion of primary schools below the floor standard improved slightly in 2013, to 6.1% from 6.5% in 2012. There was a similar improvement in London, but the point of real significance is that in 2013 only 2.7% of London's primary schools were below the floor standard, less than half the national percentage.
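The attainment gaps in Table 2 are simple percentage-point differences between the "Not FSM" and "FSM" columns. As a quick verification sketch (figures copied from Table 2 above; variable names are ours):

```python
# 2013 Key Stage 2 % Level 4+ in reading, writing and mathematics,
# by Free School Meal eligibility (Table 2 above).
ks2_2013 = {
    "Inner London": {"fsm": 73.0, "not_fsm": 83.0},
    "Outer London": {"fsm": 65.0, "not_fsm": 82.0},
    "London":       {"fsm": 69.0, "not_fsm": 82.0},
    "England":      {"fsm": 60.0, "not_fsm": 79.0},
}

# The attainment gap is the percentage-point difference: Not FSM minus FSM.
gaps = {region: round(v["not_fsm"] - v["fsm"], 1) for region, v in ks2_2013.items()}
print(gaps)
# {'Inner London': 10.0, 'Outer London': 17.0, 'London': 13.0, 'England': 19.0}
```

The computed gaps match the final column of Table 2, confirming the claim that London's gap (13.0) is six points below the national gap (19.0) and inner London's (10.0) is nine points below it.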
[2] In the Key Stage 2 tests for 2012/13 a school was below the floor standard if fewer than 60% of its children achieved Level 4 or above in reading, writing and mathematics, and it was below the England median for progression by two or more levels in reading, writing and mathematics.

Primary schools in London are closing the social class attainment gap and also appear to be tackling the long tail of underachievement which has for many decades been associated with educational performance in England, according to Marshall et al[vii], who suggest that: 'One
child in five leaves schools in England without basic skills in literacy and numeracy [and that] it has become increasingly common to refer to these children as the tail.' The GCSE performance data in Charts 7 and 8 of this paper suggest that secondary schools in London are also starting to tackle the long tail of underachievement successfully.

Chart 4: % of Schools below the Key Stage 2 Floor Standard in 2012 and 2013
[2012: inner London 2.5%, outer London 3.6%, London 3.1%, England 6.5%; 2013: inner London 2.2%, outer London 3.0%, London 2.7%, England 6.1%.]
Source: DfE Statistical First Releases (SFRs) 2012 and 2013

Key Stage 4

Performance at Key Stage 4 improved steadily year on year from 1998 to 2013 in London and nationally. Chart 5 shows performance in terms of the percentage of students achieving 5+ GCSE grades at A* - C (inc. English & maths). The graph shows London first outperforming the national average at Key Stage 4 in 2004 and then moving further ahead year on year up to 2013.

Chart 5: Key Stage 4 % achieving 5+ A* - C (inc. English & maths) from 1998 to 2013, London v National
Source: DfE Key Stage 4 School Performance Tables 1998 to 2013
Chart 6 shows the performance of London local authorities in terms of the percentage of students achieving 5+ GCSE grades at A* - C (inc. English & maths) at four points in time, 1998, 2003, 2008 and 2013, against national performance at the same points. In 1998, 28 of the 32 London local authorities were below the national average on this measure; the number dropped to 21 in 2003 and 16 in 2008. The most dramatic improvement came in 2013, when the number of London local authorities below the national average dropped to only six. Between 1998 and 2013, national performance on this measure improved by 23.8 percentage points. Over the same period 31 of the 32 London local authorities improved by more than 23.8 percentage points, with nine of them improving by more than 40 percentage points.

Chart 6: London LAs GCSE Performance v England, % 5+ A* - C (inc. English & maths) in 1998, 2003, 2008 and 2013
Source: DfE Statistical First Releases (SFRs) 1998, 2003, 2008 and 2013

The six local authorities with the lowest results in 1998 (Islington, Hackney, Tower Hamlets, Southwark, Lambeth and Haringey), and therefore those with the greatest distance to travel to reach the national average, were among those that made the greatest improvements between 1998 and 2013. By 2013, all six were above the national average for the percentage of students achieving 5+ GCSE grades at A* - C (inc. English & maths). Five of these six, all except Haringey, were ex-ILEA councils, and the other three councils that made similarly large improvements between 1998 and 2013, Greenwich, Westminster and Kensington & Chelsea, were also ex-ILEA councils.

Chart 7 shows Key Stage 4 performance in London and England comparing disadvantaged pupils[3] with all others in 2013. The chart shows the performance of both groups and the attainment gap in inner and outer London as well as greater London and England.
More disadvantaged pupils in London achieve 5+ GCSE grades at A* - C (inc. English & maths) than do so nationally, and more disadvantaged pupils in inner London do so than in outer London. In a similar manner to primary schools in London, secondary schools are also closing the social class attainment gap.

[3] Disadvantaged pupils include all those pupils in the Key Stage 4 cohort who are eligible for free school meals and those who are Looked After Children.
Chart 7: GCSE Performance and Attainment Gaps in London in 2013
Source: DfE School Performance Tables 2013

Chart 8 shows the proportion of secondary schools that were below the national Key Stage 4 floor standard[4] in 2012 and 2013. Nationally, the proportion of secondary schools below the floor standard improved in 2013, down to 5.3% from 6.6% in 2012. There was a similar improvement in London, but the point of real significance is that in 2013 only 1.2% of London's secondary schools, equating to just five schools, were below the floor standard, a small fraction of the national percentage. Like primary schools in London, secondary schools are closing the social class attainment gap and appear to be tackling the long tail of underachievement.

Chart 8: Key Stage 4 % of schools below the Floor Standard in 2012 and 2013, London v National
[2012: inner London 2.8%, outer London 2.6%, Greater London 2.7%, England 6.6%; 2013: inner London 1.4%, outer London 1.1%, Greater London 1.2%, England 5.3%.]
Source: DfE Statistical First Releases (SFRs) 2012 and 2013

[4] At Key Stage 4 in 2012/13 a school was below the floor standard if fewer than 40% of its pupils achieved 5+ GCSE grades at A* - C (inc. English & maths), and it was below the England median for progression by three or more levels in GCSE English and in GCSE mathematics.
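The Key Stage 4 floor-standard rule described in the footnote can be expressed as a simple predicate. The following is a sketch (the function and argument names are ours, not the DfE's): a school was below the floor only if its headline attainment was under 40% and it was below the England median for expected progress in both English and mathematics.

```python
def below_ks4_floor_standard(pct_5ac_em: float,
                             below_median_english_progress: bool,
                             below_median_maths_progress: bool) -> bool:
    """2012/13 rule as described above: below the floor standard if fewer
    than 40% of pupils achieved 5+ GCSEs at A* - C (inc. English & maths)
    AND the school was below the England median for expected progress in
    both English and mathematics."""
    return (pct_5ac_em < 40.0
            and below_median_english_progress
            and below_median_maths_progress)

# A school at 35% with below-median progress in both subjects is below the
# floor; a school at 45% is not, regardless of its progress measures.
print(below_ks4_floor_standard(35.0, True, True))   # True
print(below_ks4_floor_standard(45.0, True, True))   # False
```

Meeting any one of the three thresholds was enough to keep a school above the floor, which is why the predicate requires all three conditions to fail.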
Chart 9 shows the GCSE performance of the main ethnic groupings in terms of the percentage of students achieving 5+ GCSE grades at A* - C (inc. English & maths), comparing performance in inner, outer and greater London with the national picture. The chart shows that more pupils in each of the five ethnic groups achieve 5+ GCSE grades at A* - C (inc. English & maths) in London than in the same groups nationally. The performance of White pupils more or less mirrors the performance of all pupils for each of the London and national benchmarks, while the performance of Mixed, Asian and Chinese pupils exceeds that of all pupils. The performance of Black pupils in London, at 60.1%, is within one percentage point of the 60.8% achieved by all pupils nationally. The comparative performance for Black pupils in London back in 2005/06 was 35.0%, compared with 44.0% for all pupils nationally, a gap of nine percentage points. Secondary schools in London have made significant inroads in closing the attainment gaps between Black pupils and White pupils, and between Black pupils in London and all pupils nationally. Greater proportions of Asian, Chinese and Mixed pupils in London are achieving GCSE success than in the same groups nationally and than all pupils nationally.

Chart 9: Key Stage 4 Performance by Ethnicity in 2013, % 5+ A* - C (including English & maths)
Source: DfE Statistical First Release (SFR) 2013

The analyses of performance quoted above have all been taken from the Department for Education's publicly available data. Other analytical work also shows that London is outperforming the national average at GCSE. One example is by Chris Cook[viii], a journalist at the Financial Times, who produced an analysis using the 2012 GCSE National Pupil Dataset (NPD).
Chart 10 shows one representation of Cook's analysis[5], which plots pupils' Average GCSE Points against their deprivation ranking, split by English region. This analysis shows that pupils in London are outperforming pupils in all other regions regardless of their deprivation ranking. In fact, the more deprived pupils in London are outperforming similar pupils in all other regions to an even greater extent than less deprived pupils.

[5] Cook's analysis is based on the following concepts: (1) the Average GCSE Points Score (called the "FT score") is based on attributing 8 points for an A* down to one for a G and adding up the score for English, maths and the pupil's three best other subjects; (2) this is plotted against each pupil's IDACI (Income Deprivation Affecting Children Index) score, an index of poverty which measures how poor the neighbourhood in which a child lives is; the lower the child's IDACI score, the more deprived they are likely to be; (3) the analytical method uses regression analysis; (4) the outcomes are split by English region.
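The points tally behind Cook's "FT score" can be sketched as follows. This is an illustrative reconstruction from the footnote above, not Cook's actual code: the grade-to-point mapping follows his description (8 for A* down to 1 for G), the function and pupil names are ours, and since the footnote describes "adding up" the five subjects, the sketch returns the raw sum rather than an average.

```python
# Points per GCSE grade: 8 for A* down to 1 for G, as described in the footnote.
POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def ft_score(grades: dict) -> int:
    """Sum the points for English, maths and the pupil's three best other
    subjects (the 'FT score' described above)."""
    core = POINTS[grades["English"]] + POINTS[grades["Maths"]]
    others = sorted((POINTS[g] for subject, g in grades.items()
                     if subject not in ("English", "Maths")), reverse=True)
    return core + sum(others[:3])

# A hypothetical pupil's grades:
pupil = {"English": "B", "Maths": "C", "History": "A",
         "French": "D", "Science": "B", "Art": "E"}
print(ft_score(pupil))  # 6 + 5 + (7 + 6 + 4) = 28
```

Each pupil's score would then be regressed against their IDACI deprivation score, separately by region, to produce the curves in Chart 10.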
Chart 10: GCSE Performance in 2012 based on Average GCSE Points Score plotted against pupils' deprivation ranking and split by English Region
Source: Chris Cook at the Financial Times 2013 – http://blogs.ft.com/ftdata/author/christophercook/

Ofsted Inspection Outcomes

Chart 11 compares the Ofsted inspection profile of schools in London with the rest of England as at 31 August 2013. In 2013, over 80% of London's primary and secondary schools were judged by Ofsted to be good or outstanding, while the comparable figures for the rest of England were under 80% for primary and under 70% for secondary. The better Ofsted profile of London's schools compared to the rest of England reflects the better educational outcomes at Key Stage 2 and Key Stage 4.

Chart 11: Ofsted Inspection Profile of Schools – London v National (as at 31/08/13)
[Stacked bars of Outstanding, Good, Satisfactory/Requires Improvement and Inadequate judgements for primary and secondary schools, London v rest of England.]
Source: Ofsted 2013 and Merryn Hutchings[ix], Institute for Policy Studies in Education, London Metropolitan University
What factors contributed to London's success? A review of the literature

Many individuals and organisations will want to claim that they played some part in the transformational change in London's educational outcomes in the 10 years from 2003 to 2013. These range from the most obvious, the students themselves, teachers, school leaders and governors, through local authority directors, school improvement officers and elected members, the trade unions, educational researchers and performance data analysts, the Department for Education and its initiatives such as the London Challenge, and Ofsted with its rigorous inspection frameworks, to politicians of every hue, in particular the Labour government from 1997 to 2010 and the Coalition government from 2010 onwards.

There is also the strong notion that certain propitious circumstances particular to London might have helped enable the transformational change. These include the resilience of the London economy, the demographic profile of London, with its much more ethnically and linguistically diverse population compared to the rest of England, and the ability of London's teachers to meet effectively the language, cultural and learning needs of their diverse pupil population.

In reality, it is likely that most of these individuals, organisations and London factors played some part in improving educational outcomes, and it will be near impossible to disentangle which of them made the most impact, and difficult to assess which other factors might have contributed to the success. This is one of the methodological challenges of attempting to research so complex a phenomenon as changing outcomes in schools in a city with over 8,000,000 inhabitants.
As the Centre for London report Lessons from London's Schools by Barrs et al[x] points out, there have been no randomised controlled trials to measure the impact: "none of the major London reforms were planned with a concurrent rigorous evaluative element or any randomised controlled trial (RCT) element." In practice, it would have been almost impossible to put randomised controlled trials in place to measure what made the most impact on educational outcomes in a city like London over a five to 10 year period.

In the period 2003 to 2010, the single biggest educational intervention in London was the Department for Education's London Challenge school improvement programme. The London Challenge was established in 2003 to improve outcomes in low-performing secondary schools in the capital, with primary schools included from 2008. It was led by two key players: Tim Brighouse, the ex-Birmingham Council Chief Education Officer and London Schools' Commissioner, and David Woods, the ex-Principal National Challenge Adviser for England and Chief Adviser for London Schools. It used independent, experienced education experts (London Challenge advisers) to identify need and broker support for underperforming schools. The advisers were supported by a small administrative team based in the DfE. The cost of the support and brokered services came directly from the DfE and was spent as the advisers directed.

The London Challenge had four core elements:
• a consistent message of the pressing need to improve educational standards;
• programmes of support to local authorities, managed by experienced and credible London Challenge advisers;
• a central focus on improving the quality of teaching and learning in schools;
• robust systems to track pupils' progress and the use of data to evaluate effectiveness.
The most significant evaluation of the London Challenge was carried out by Hutchings et al[xi] in a report for the Department for Education, Evaluation of the City Challenge Programme, which evaluated the City Challenge programme in London, Greater Manchester and the Black Country and included a retrospective review of the London Challenge from 2003 to 2008. Another feature of the London Challenge was that it worked with local authority school improvement advisers, and the evaluation by Hutchings et al[xii] acknowledges that: "To be effective, capacity building with local authorities has to involve working as partners". Many
local authority advisers reported that there was effective partnership and that this benefitted them and their schools. There was some evidence that working with the Challenge advisers had changed the way local authorities conducted school reviews; that is, reviews became more focused on teaching and learning. Several local authority interviewees talked about the key role that Challenge advisers had played, both in swelling the number of people working on school improvement in the borough and in developing the expertise of the local authority school improvement team. In other local authorities, however, the relationship was more limited, partly because the Challenge was focusing its work on those local authorities whose schools were most in need of improvement, and in some cases because there were barriers to partnership working, for example poor communication on the part of City Challenge[xiii].

Some local authority officers also felt that City Challenge did not recognise the work that local authorities had been doing in their schools over an extended period, and was claiming credit for improved results that also reflected previous groundwork they had undertaken. One local authority officer argued: 'they're very much airbrushing out the contribution made by local authorities to the success of the Challenge[xiv].'

This is the essence of the tension that this paper wants to explore. It is the view of the authors that the London Challenge did play a part in the transformational shift in educational outcomes in London, but it was not the only factor that contributed to the change. The Challenge provided a significant catalyst for many London local authorities and their schools to embark on a journey of rapid improvement, but it did not achieve the successful educational outcomes on its own.
It probably worked best when it operated in effective and collaborative partnerships with local authority school improvement teams; together they played an important part in improving educational standards. The view of Tim Brighousexv is that the London Challenge played a large part and that: ‘it made more good things happen and fewer bad things happen’. The main conclusion made by Hutchings et alxvi was the following: “Perhaps the most effective aspect of City Challenge was that it recognised that individuals and school communities tend to thrive when they feel trusted, supported and encouraged. The ethos of the programme, in which successes were celebrated and it was recognised that if teachers are to inspire pupils they themselves need to be motivated and inspired, was a key factor in its success.” It is not unreasonable to read into this that the City Challenges, including the London Challenge, were most effective when all parties worked together, including schools, Challenge advisers and local authority school improvement staff. In 2013 Ofsted produced a reportxvii which summarised the previous evaluation by Hutchings et al and also reviewed the sustainability of the school improvement that took place. One can conclude from Hutchings’ evaluation, the Ofsted report and the evidence provided in this research on educational outcomes in London, that the three objectives of the London Challenge:
 to reduce the number of underperforming schools;
 to increase the number of Good and Outstanding schools;
 to improve educational outcomes for disadvantaged children;
were achieved during the lifetime of the programme, and in the years immediately following it the successes were sustained. However, the question still persists: was the London Challenge responsible?
The London Challenge was not active in every London borough; in its first phase it targeted particularly intensive support at five key boroughs (Southwark, Lambeth, Hackney, Islington and Haringey). Many local authorities improved at different times and from different starting points, while some, as we have seen, improved much more than others. There was significant variation in the improvement trajectories across London and some
local authorities reached a plateau earlier and have not improved as much as others that exceeded the national average at GCSE for the first time more recently, such as Greenwich in 2012 and Islington in 2013. There is a strong feeling that many London local authorities just got on with it and largely set about raising standards by themselves, that is, working locally to improve the educational outcomes in their schools. One local authority that documented this was Tower Hamlets. Chart 6 of this paper illustrated that the six local authorities with the lowest GCSE results in 1998 had been among those that made the greatest improvements by 2013. Tower Hamlets was one of those six, and in 2013 the local authority decided to tell its own version of how it improved educational outcomes and effectively achieved all three of the objectives of the London Challenge. The report, ‘Transforming Education for All: the Tower Hamlets Story’xviii, was written by Woods et al at the Institute of Education, the same David Woods who led the London Challenge with Tim Brighouse. The report on Tower Hamlets provides evidence of a local authority which had a very clear strategy for securing improvements in educational standards and which achieved them largely through its own efforts, although Tower Hamlets council leaders also engaged fully with the London Challenge. Christine Gilbert and Kevan Collins were in post as Director of Education and Children’s Services and subsequently as Chief Executive for the majority of the life of the London Challenge, and Collins makes the point in a chapter entitled ‘An East End Tale’ in the book ‘The Tail’, edited by Paul Marshall (2013)xix, that: “Tower Hamlets never saw London Challenge as a threat to its leadership and embraced the approach, with many of the Borough’s Headteachers given key roles and rightly asked to share their work and support others.
The strategy thus played to the strong local traditions of collaborative partnership working”. Tower Hamlets council worked with the London Challenge, but it also had a very clear idea of its own role in raising educational standards in the borough and the importance of doing that through effective collaborative partnership working. The Tower Hamlets reportxx identified six major factors which the council believed explained its experience and successful approach:
 Shared values and beliefs with robust and resilient purpose and professional will: ‘Yes we can…’;
 Highly effective and ambitious leadership at all levels – local authority and school leadership;
 Schools rising to the standards challenge – improved teaching and learning, enhanced Continuing Professional Development, rigorous pupil tracking and assessment, a relentless focus on school improvement;
 Partnership working – inward and outward facing, external and integrated services, shared responsibility and accountability;
 Community development – building collaborative capacity and community cohesion;
 A professional learning community – building momentum and engagement through and across school communities, high levels of knowledge, trust and professional relationships.
Several of these factors align with the London Challenge, but some are more specific to the strategy adopted locally by Tower Hamlets, including the robust and resilient purpose and professional will; rigorous pupil tracking and assessment; partnership working; and, probably most importantly, community development – that building of collaborative capacity and community cohesion, so important in such a diverse borough as Tower Hamlets.
A recent study by Greaves et al, looking specifically at the results for disadvantaged pupils, also contests the claims made for the London Challenge. It attributes an important part of disadvantaged pupils’ improved performance to better primary school results between 1999 and 2003.xxi They argue this was a major factor in the improvement of their Key Stage 4 performance between 2004 and 2008; after accounting for this prior attainment, the effects of other changes in the early 2000s are greatly reduced. In other words, an important part of the London improvement, for disadvantaged students at least, is due to factors preceding the London Challenge, including the roll-out of the National Literacy and Numeracy Strategies nationwide from 1998–99 onwards. Greaves et alxxii asked what caused the improvement in Key Stage 2 test scores that led to the ‘London effect’ at Key Stage 4, and noted that the answer is not clear. However, they found that the explanation is likely to be related to changes in London’s primary schools in the late 1990s and early 2000s. They conclude that: “This means that programmes and initiatives such as the London Challenge, the Academies Programme, Teach First or differences in resources are unlikely to be the major explanation (as these changes either happened too late, were focused on secondary schools or were longstanding, and therefore are unlikely to account for the rapid improvements we see).” But the authors do further argue that GCSE success also has to do with secondary schools; in other parts of the country good results for the disadvantaged at Key Stage 2 were not so impressively continued into Key Stage 4 as they were in London. Tower Hamlets, for example, had only a very small social class attainment gap between disadvantaged pupils and all others at Key Stage 4 in 2013.
Table 3 shows that the national gap between disadvantaged and all other pupils was 26.9 percentage points, whereas the gap in Tower Hamlets was only 7.5 percentage points. Not only that, results for all pupils in Tower Hamlets were better than the national average, and 62.9% of disadvantaged pupils in Tower Hamlets achieved 5+ A*-C (inc. English and maths) GCSEs, compared with a national average for disadvantaged pupils of 40.9%. This is indeed some transformation from the position in 1997, when, as the report saysxxiii: “in 1997 the Borough had been positioned 149th out of 149 local education authorities in terms of its performance.”

Table 3: Comparison of the performance of disadvantaged and all other pupils based on the % 5+ A*-C (incl. English and maths) GCSEs in 2013 – Tower Hamlets v National

The attainment gap at Key Stage 4, % 5+ A*-C (incl. English and maths) GCSEs in 2013:
                                  All pupils   Disadvantaged   Other pupils   Gap (% points)
England (state funded only)         60.6%         40.9%           67.8%          26.9
Tower Hamlets                       64.7%         62.9%           70.4%           7.5
Difference v England (% points)      4.1           22.0             2.6         -19.4

Source: DfE School Performance Tables 2013

The report by Barrs et alxxiv also referenced the approaches to school improvement being taken in Haringey and Hackney councils and said that they had ‘a similar theory of action as Tower Hamlets’, based on:
 ensuring first-rate leadership of the school improvement service;
 a tough approach to the performance management of headteachers;
 a strong emphasis on the use of data;
 effective professional development both for leaders and class teachers.
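The gap figures in Table 3 reduce to simple percentage-point arithmetic (the gap is the rate for other pupils minus the rate for disadvantaged pupils). As a minimal sketch, not taken from the report, the published 2013 figures can be checked as follows; the function name `gap_pp` is our own:

```python
# Sketch (not from the report): reproducing the Table 3 gap arithmetic.
# Figures are the published 2013 "% 5+ A*-C (incl. English and maths)" rates.

def gap_pp(other: float, disadvantaged: float) -> float:
    """Attainment gap in percentage points: other pupils minus disadvantaged."""
    return round(other - disadvantaged, 1)

england = {"all": 60.6, "disadvantaged": 40.9, "other": 67.8}
tower_hamlets = {"all": 64.7, "disadvantaged": 62.9, "other": 70.4}

england_gap = gap_pp(england["other"], england["disadvantaged"])        # 26.9
th_gap = gap_pp(tower_hamlets["other"], tower_hamlets["disadvantaged"])  # 7.5

# Tower Hamlets' gap is 19.4 percentage points narrower than the national gap.
print(england_gap, th_gap, round(th_gap - england_gap, 1))
```

This simply confirms the internal consistency of the published table: 67.8 − 40.9 = 26.9 and 70.4 − 62.9 = 7.5.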
There were clearly factors in common among the local authorities whose schools achieved some of the greatest improvements, including: excellent leadership, a focus on school improvement, improved teaching and learning, pupil tracking and the use of performance data, continuing professional development, partnership working and community development. Another constituency that undoubtedly contributed to London’s educational success story was the students themselves. In Greenwich, for example, Hayes et alxxv carried out quantitative research into performance by ethnic group, and looked at the possibility that White UK boys from low-income households might be becoming the group at greatest risk of underperformance. The research found that this was the case, and further qualitative research was carried out by Hayes et alxxvi which moved beyond the negative paradigm to investigate why some students from that background succeeded against the odds. What emerged from the qualitative research was that the successful students from a deprived White British background had developed a range of approaches and strategies to help them succeed. These included a degree of ambition for their success and a level of resilience that they developed for themselves, often leading to a capacity for self-regulation, especially when it came to organising their own learning and even changing their friendship groups. Research by Siraj-Blatchfordxxvii found similar reasons why some children from deprived backgrounds, both White and minority ethnic, were able to succeed against the odds. While the capacity for children and young people to demonstrate personal resilience and to succeed against the odds is not unique to London, the fact that the social class attainment gap is smaller in London at Key Stage 2 and Key Stage 4 than elsewhere in England suggests that London’s students have played their part in the capital’s success.
It is sometimes claimed that academies have played a significant part in the story of London’s improving educational performance. While there is some evidence for rising outcomes under the academies programme up to 2008/09,xxviii another study found that, at least as measured by attainment at the end of primary school, the benefits of academies were entirely concentrated among students of medium to high prior attainment.xxix Academies by this measure did little to raise the outcomes for disadvantaged students, which has been such a feature of the improvements in London. At the time of writing there were no reliable evaluations of academies’ performance in the period after 2008/09.

Analysis of survey responses

In order to get the local authority perspective on the transformation in London’s educational outcomes, a survey was devised and administered to local authority research and statistics officers to get their assessment of which factors they felt had the greatest impact. The survey was administered during a workshop in March 2014, organised by the London Education Research Network (LERN) and held at the Greater London Authority’s City Hall building. The survey had 82 factors and respondents were asked to rate each one on a scale of 0 to 4, with 0 being no impact and 4 being major impact. A table at Appendix 2 lists the number of valid responses and the average score for the perceived impact of each factor. In total, 21 survey responses were received: 19 from local authority research and statistics staff, one from a retired local authority school improvement lead and one from a data officer who works in an educational charity supplying performance data to London schools. The 20 responses from local authorities covered a total of 14 London local authorities, which is 42.4% of the 33 local authorities in London. Table 4 shows the number of responses from each local authority. There were multiple responses from four local authorities.
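The scoring procedure just described (ratings of 0 to 4, averaged per factor, with factors reported only when they attract more than 10 valid responses) can be sketched in a few lines. This is our own illustration, not the authors' code, and the ratings shown are made up:

```python
# Sketch, not the authors' code: deriving per-factor impact scores as in the
# paper's tables. Respondents rate each factor 0-4 (or skip it); a factor is
# reported only when it has more than 10 valid responses.
from statistics import mean

def factor_averages(ratings: dict[str, list[int]], min_responses: int = 11):
    """Return {factor: (n_valid, mean score)} for factors with enough
    responses, ranked by mean score, highest first."""
    summary = {
        factor: (len(scores), round(mean(scores), 2))
        for factor, scores in ratings.items()
        if len(scores) >= min_responses
    }
    return dict(sorted(summary.items(), key=lambda kv: kv[1][1], reverse=True))

# Illustrative (made-up) ratings for two factors:
ratings = {
    "Relentless focus on standards": [4, 4, 3, 4, 4, 4, 3, 4, 4, 4, 3, 4],
    "Role of local politicians": [2, 1, 2, 1, 2, 2, 1, 2, 2, 1, 2],
}
print(factor_averages(ratings))
```

Because respondents could skip factors, each factor carries its own response count, which is why the tables that follow report both a number of responses and an average score.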
Table 4: Number of survey responses by local authority

Local Authority           Number of respondents
Ealing                    1
Enfield                   1
Greenwich                 3
Hammersmith & Fulham      1
Haringey                  2
Hounslow                  2
Islington                 3
Kensington & Chelsea      1
Lambeth                   1
Lewisham                  1
Newham                    1
Southwark                 1
Tower Hamlets             1
Waltham Forest            1
Other (Non-LA)            1
Total                     21

Source: Survey of local authority staff

Table 5 shows the number of survey responses by gender. There was quite an even gender split among respondents.

Table 5: Number of survey responses by gender

Gender    Number of respondents    % of respondents
Female    11                       52.4%
Male      10                       47.6%
Total     21                       100.0%

Source: Survey of local authority staff

Table 6 shows the length of time respondents have worked in their current or most recent local authority. The majority of respondents, 57.1%, had been in their local authority for between five and 20 years. Two of the three respondents who had been in their current local authority for less than one year had substantial earlier experience in other London local authorities.

Table 6: Length of time respondents have worked in their LA

Length of time in current or most recent Local Authority    Number of respondents    % of respondents
< 1 year                                                    3                        14.3%
1 to 5 years                                                6                        28.6%
5 to 10 years                                               3                        14.3%
10 to 15 years                                              5                        23.8%
15 to 20 years                                              4                        19.0%
Total                                                       21                       100.0%

Source: Survey of local authority staff
The 10 factors that the survey respondents identified as having the greatest impact on educational outcomes in their local authority and in London overall are listed in Table 7.

Table 7: Factors identified as having the greatest impact on educational outcomes

How much do you feel these factors contributed to your Local Authority’s and/or London’s improved educational performance at Key Stage 2 and Key Stage 4, placing London above national?

Factor                                                              Number of responses    Average score (0 to 4)
A relentless focus on standards in your LA                          18                     3.72
A focus on progress to drive up attainment                          19                     3.53
Putting effective school improvement support in place               20                     3.50
Teaching strategies for EAL children, particularly those at
the lower stages of fluency in English                              14                     3.50
Your LA not allowing disadvantage to be a barrier to achievement    17                     3.47
Schools in your LA committing to the standards agenda               15                     3.47
Pupil tracking in schools                                           19                     3.42
A relentless focus on pupil groups at risk of underperformance      19                     3.42
The role of headteachers in your LA                                 20                     3.40
Your LA’s stance on schools causing concern                         17                     3.35

Source: Survey of local authority staff. NB This table only includes scores for impact where there were more than 10 responses.

The factor that respondents rated as having the greatest impact on educational performance in their local authority was a relentless focus on standards, scoring 3.72, with a similarly high score of 3.47 for schools in your LA committing to the standards agenda. This would suggest that local authorities and their schools were strongly committed to improving educational outcomes for children and young people. Other factors that scored highly for impact related more specifically to the pupils themselves, including: a focus on progress to drive up attainment, scoring 3.53; teaching strategies for EAL children, 3.50; pupil tracking in schools, 3.42; and a relentless focus on pupil groups at risk of underperformance, 3.42.
Research carried out in Greenwich Council by Hayes and Clayxxx identified the importance of focusing on pupil progress to drive up levels of attainment, and of identifying pupil groups at risk of underperformance, to ensure that every child fulfils their potential. In Greenwich Council the work that the local authority engaged in with its schools around pupil-level target setting had at its core the concept that a focus on pupils’ progress was a key lever to raise standards. The impact of this could be seen in the 2011 Key Stage 2 results, when the outcomes for 2+ levels of progress moved Greenwich into the top 10 local authorities in England for progress and attainment in English and mathematics. Survey respondents were asked four questions about the impact of the London Challenge and Table 8 shows the average scores for impact for each of these. The average scores for the impact of the London Challenge were in the middle to lower range of scores, and respondents thought that it had slightly more impact generally and across London than it did in their own local authority, with the impact of its role in improving teacher recruitment falling in between. To an extent this chimes with the findings of Greaves et alxxxi who stated that: ‘programmes and initiatives such as the London Challenge, the Academies Programme, Teach First or differences in resources are unlikely to be the major explanation [for London’s success]’, although others have made a strong case for all of them.
Table 8: Average impact scores for factors relating to the London Challenge

How much do you feel these factors contributed to your LA’s and/or London’s improved educational performance at Key Stage 2 and Key Stage 4, placing London above national?

Factor                                                          Number of responses    Average score
London Challenge across London                                  12                     2.83
London Challenge generally                                      13                     2.69
London Challenge role in improving teacher recruitment
in London                                                       11                     2.55
London Challenge in your LA                                     13                     2.38
Summary of responses                                            49                     2.61

Source: Survey of local authority staff

Survey respondents were asked 13 questions relating to the impact of performance data and research in driving up educational standards and Table 9 shows the average scores for impact for each of these.

Table 9: Average impact scores for factors relating to performance data

How much do you feel these factors contributed to your LA’s and/or London’s improved educational performance at Key Stage 2 and Key Stage 4, placing London above national?

Factor                                                          Number of responses    Average score
A focus on progress to drive up attainment                      19                     3.53
Pupil tracking in schools                                       19                     3.42
A relentless focus on pupil groups at risk of
underperformance                                                19                     3.42
Measures to close the gaps at KS 2 & 4 between
disadvantaged pupils and others                                 17                     3.18
Performance analysis outputs for schools from your LA’s
research & statistics team                                      19                     3.16
Forensic analysis of performance data by schools                18                     3.11
Use of educational research into school effectiveness and
school improvement                                              15                     3.07
The use of local educational research                           16                     3.00
DfE School Performance Tables                                   20                     2.85
LA traded service providing performance data for schools
(if you have one)                                               10                     2.80
Fischer Family Trust as a tool to support school improvement    14                     2.79
Forensic analysis of the performance of pupils by ethnic
background                                                      16                     2.69
RAISEonline as a tool to support improvement                    18                     2.67
Summary of responses                                            220                    3.07

Source: Survey of local authority staff

The data in Table 7 have already shown that a focus on progress
to drive up attainment and pupil tracking in schools both scored highly for impact, and both are aspects of school life that are data driven. The data in Table 9 show that the majority of factors relating to data scored 3 or above for impact, with a relentless focus on pupil groups at risk of underperformance being the next highest score at 3.42. Measures to close the gaps at Key Stage 2 & 4 between disadvantaged pupils and others scored 3.18; the production of performance analysis outputs for schools by the local authority, 3.16; the forensic analysis of performance data by schools, 3.11; and the use of educational research into school effectiveness and school improvement, 3.07. The scores for the impact of data tools produced outside the school and local authority environment, that is by DfE, Ofsted and others, were all below 3, with the DfE School Performance Tables scoring 2.85, Fischer Family Trust (FFT) 2.79 and RAISEonline 2.67. One of the aspects that Hutchings et alxxxii investigated
in their evaluation of the London Challenge was the impact of the Families of Schools6 data analysis that the DfE produced for the London and other city challenges. Their findings were not positive about the use and impact of Families of Schools; in fact the conclusion was: “Across all City Challenge areas, most schools (and particularly primary schools) made limited or no use of Families of Schools data. Most who did look at it did so mainly out of interest; smaller numbers used it with a view to contacting other schools or informing school improvement planning. It appeared that many were unaware of the data, or did not understand its purpose.” In the evaluation by Hutchings et alxxxiii, those schools that did not use the Families of Schools materials were asked in the survey to indicate their reasons, and the most frequently cited reason was that: ‘the LA provides data which enables schools to compare themselves with others in the LA’. This was the response of 79% of respondents in London, the highest of all the city challenge areas. Although the Families of Schools data were not regarded highly by schools in London, the London local authorities were evidently providing their schools with the type of performance data analysis that was useful to them. The other factor not picked up by Hutchings was that the London local authorities were able to provide their analyses to schools much sooner than the Families of Schools materials were produced. One of the authors (Hayes) was involved with DfE in the development of the Families of Schools materials and knows from personal experience that these were being produced in April or May of the school year after the assessments and tests had been done, whereas the best local authorities were producing their analyses six or seven months earlier. It is not that the Families of Schools materials lacked validity; it is just that they arrived too late.
Somewhat ironically, the research by Barrs et al suggests that: ‘this use of Families of Schools data was identified both in the previous literature and in our interviews as a major feature of the programme’s success.’ This does appear to contradict what Hutchings et al found, as referenced above; however, what is clear in the research by Barrs et alxxxiv is that: “One of the most important developments in London since 2000 has been the growth in data use and data literacy. In our interviews with stakeholders (both groups) there was virtual unanimity in the identification of data analysis and data literacy as key both to powerful accountability and well targeted support. This preoccupation was not the exclusive property of any particular group and all the major initiatives seemed to have strong foundations in the use of educational metrics. The different actors in the London story are therefore linked by a common preoccupation with the effective use of educational data as an instrument for transformation.” The production of data analysis, allied to the ensuing data literacy, might not have been the exclusive property of any particular group; however, local authorities in London were major players in the production of performance data analysis for their schools. Some of the earliest work in this area was a legacy of the ILEA’s renowned Research & Statistics Unit, particularly in the inner London authorities that made up the ILEA, and some of the best work was developed from the mid-1990s onwards by most local authorities across London. One example among many was the work by Hayes & Ruttxxxv, produced in Hammersmith & Fulham council. There are many examples of London local authorities producing high quality educational performance data for their schools.
6 The Families of Schools intervention involved the annual provision of data (in books and online) that would enable schools to benchmark against a group of schools with similar intakes based on prior attainment and socio-economic factors. The rationale was that benchmarking would potentially challenge school leaders to explore why others were doing better in certain respects and therefore identify new strategies for raising attainment in their own schools.
Survey respondents were asked 13 questions relating to the impact of leadership on educational outcomes in their local authority and across London, covering school leadership, governors, local authority officers and local political leadership. The scores are listed in Table 10.

Table 10: Average impact scores for factors relating to leadership

How much do you feel these factors contributed to your LA’s and/or London’s improved educational performance at Key Stage 2 and Key Stage 4, placing London above national?

Factor                                                          Number of responses    Average score
Schools in your LA committing to the standards agenda           15                     3.47
The role of headteachers in your LA                             20                     3.40
Local authority interventions in schools in challenging
circumstances in your LA                                        18                     3.28
The role of headteachers across London                          18                     3.17
Having ambitious LA leadership at all levels                    20                     3.10
Having a coherent LA plan to raise standards, e.g. the
Education Development Plan (EDP) and/or the Children and
Young People’s Plan (CYPP)                                      18                     2.94
Your LA taking a resilient approach to external government
policies and pressure                                           15                     2.87
The role of the Director of Children’s Services (DCS) in
your LA                                                         17                     2.71
The role school governors in your LA played in driving up
standards                                                       14                     2.43
Local political scrutiny as a lever to drive up standards       14                     2.14
The role of your lead councillor for Children’s Services
and/or education                                                16                     1.75
The role of local politicians in driving up standards           16                     1.69
The role of your leader of the council                          14                     1.36
Summary of responses                                            215                    2.69

Source: Survey of local authority staff

In summary, the average scores for impact were highest in relation to school leadership and local authority officer leadership, and lowest for local authority political leadership. The commitment of schools to the standards agenda, scoring 3.47, and the role of headteachers in your LA, scoring 3.40, were the highest for impact.
These were followed by local authority interventions in schools in challenging circumstances, scoring 3.28, the role of headteachers across London, 3.17, and having ambitious LA leadership at all levels, 3.10. The impact of school governors in driving up standards was a middle-ranging score at 2.43, while scores for the impact of local politicians were all below 2, with the impact of local political scrutiny as a lever to drive up standards not scoring much higher at 2.14. These low scores for the impact of local political leaders are at odds with the role of national politicians as cited in Barrs et alxxxvi: ‘In the context of London Challenge the Prime Minister [Tony Blair] and successive secretaries of state for education personally endorsed London school reform as a priority.’ The main evaluation of the role of school leadership across the whole City Challenge programme was carried out by Rudd et alxxxvii at the National Foundation for Educational Research (NFER), and the evaluation of leadership in the London Challenge was carried out by Poet and Kettlewellxxxviii. A key element of the city challenges was the Leadership Strategies, which were designed to break the cycle of under-achievement among disadvantaged pupils in schools in urban areas. School leaders were seen as central agents for change and, therefore, the city-wide Leadership Strategies were major elements of the wider City Challenge initiative. Based on the concept of school-to-school support (known as system leadership), these strategies also aimed to promote a more systemic approach to the sharing of expertise and knowledge among school leaders, local authorities and other stakeholders through local networks.
Poet and Kettlewellxxxix stated that the school leaders whom they interviewed sometimes found it difficult to disentangle the impact of the London Leadership Strategies from other initiatives supporting school improvement in the capital. However, the leadership provision was perceived to have had a positive impact in a number of areas:
 enhanced quality of teaching and learning, particularly in teachers who had participated in teaching and learning programmes;
 whole-school improvement in both supported and supporting schools;
 provision of high-quality Continuing Professional Development (CPD) for those delivering support;
 improved quality of leadership in schools and better leadership capacity;
 enhanced collaboration and the development of a network of schools and experienced individuals across London.
The London Challenge helped bring about improvements in school leadership in the capital, and the improved leadership was good for the performance of London’s schools. Survey respondents were asked an open-ended question which asked them to list the five factors which they felt had a major impact on raising educational standards in London. Table 11 shows the top ten most frequently cited reasons, and a full list of all the reasons given is in a table at Appendix 3.
Table 11: Factors listed as having the greatest impact on educational standards in London

Factors listed by respondents (to an open-ended question) as having a major impact on raising educational standards in London:

Factor                                                          Frequency of responses
Provision and use of performance data                           15
Demographic changes in London and schools learning how to
work with their intakes                                         10
LA and school collaboration                                     10
Leadership in schools                                           10
Relentless focus on standards                                   8
Focus on pupil tracking                                         7
Funding at higher levels                                        7
LA support for school improvement                               6
Focus on pupil progress                                         5
Leadership in the local authority                               5

Source: Survey of local authority staff

The most frequently cited factor (15 times) by local authority research and statistics staff as having the greatest impact on educational standards in London was the provision and use of performance data, which in this context means the provision of performance data by the local authority and its use by schools. This is not entirely surprising, as the provision of performance data is a key element of their core business. This was followed by three factors cited 10 times each: demographic changes in London and schools learning how to work with their intakes; local authority and school collaboration; and leadership in schools. The top 10 most frequently cited factors largely mirror the individual factors getting the highest scores for impact in the main survey, particularly in relation to performance data, the relentless focus on standards, pupil progress and tracking, and school and local authority leadership. The one factor that stands apart in the list in Table 11 is the demographic changes in London and schools learning how to work with their intakes.
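The frequency table above comes from an open-ended question, so each respondent's free-text answers first have to be coded into common categories and then tallied. A minimal sketch of that tallying step, assuming the coding has already been done (the response lists below are illustrative, not real survey data):

```python
# Sketch (assumed workflow, not the authors' code): tallying an open-ended
# survey question. Each respondent lists up to five free-text factors; once
# coded into common categories, a frequency table like Table 11 falls out
# of a simple counter.
from collections import Counter

responses = [  # illustrative coded responses from three respondents
    ["performance data", "leadership in schools", "pupil tracking"],
    ["performance data", "LA and school collaboration"],
    ["performance data", "leadership in schools"],
]

# Flatten all respondents' lists and count how often each category appears.
counts = Counter(factor for r in responses for factor in r)
for factor, freq in counts.most_common():
    print(f"{factor}: {freq}")
```

The substantive work lies in the coding of the free-text answers, not the counting; the tally itself is mechanical once categories are agreed.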
The demographic profile of pupils in London (and in some other large English cities) is very different from the mix in the rest of England, with higher levels of deprivation and greater numbers of pupils from ethnic minority backgrounds and pupils who do not speak English as their first language. The evaluation of the City Challenge by Hutchings et al.xl provides a comparative profile of the demographics of London’s children and young people against those of the other City Challenge areas and the rest of England; however, it does not explore in detail the hypothesis that London’s schools got better at teaching, and achieving greater success for, their more diverse pupil intakes. Indeed, Hutchings makes the point that: ‘there is no reason to assume that changed pupil characteristics [in London] might be responsible for the improvement in attainment.’xli In some ways this seems to miss the point, because even if it is not about the different profile of London’s school children, it is very possible that there is something positive happening in the teaching and learning to which they are exposed.

The work of Wynessxlii is referenced by Hutchings as an analysis of the ‘London advantage’7. Wyness used Income Deprivation Affecting Children Index (IDACI) figures as well as free school meal (FSM) eligibility, and analysed the 2010 data for different Key Stages. She showed that the positive effect in London is small at Key Stage 1, but increases with age. Wyness concludes:

“That the gap in attainment between pupils from London and the rest of the country emerges over time suggests that it is schools, rather than parents, that are responsible for the relative advantage of pupils in London.”

Wyness also suggests that there are many potential explanations for the ‘London advantage’ that centre on teachers, schools and pupils themselves.
In relation to the quality of teaching, she suggests that: ‘given the large pool of graduates in London compared to the rest of the UK, teacher quality may be driven upwards by strong competition for teaching jobs in London.’ So yes, London’s school children might, on average, be exposed to a better quality of teaching, but the research does not go as far as suggesting that London’s teachers got better at teaching, and achieving greater success for, their more diverse pupil intakes. In line with the findings of Wyness, Greaves et al. suggest that:

“student demographics can explain some of the higher level of performance of FSM-eligible students, based on a threshold measure and for a measure of high educational attainment, but certainly not all of it.”

The report by Barrs et al.xliii picks up on schemes like Teach First and reports that:

“Teach First’s framing of the work of its teachers as being a ‘mission’ to address ‘educational disadvantage’ also contributed to the moral purpose of teaching in London and helped counter much of the negative press coverage previously given to London schools. As one academy chain leader put it, Teach First led to a broader ‘upgrading of the workforce’ and made it ‘an attractive place to be for bright young teachers’.”

The respondents interviewed by Barrs et al.xliv had, without exception, a highly positive view of the overall quality of leadership in London schools now (in 2014), after more than a decade of effective reform. They quote a highly experienced educationalist who played a key role in the London Challenge story and who attributed the improvement in teaching and learning in London schools to: ‘very much better leadership of schools by headteachers who had become very clever at enabling teachers to improve their game’. This suggests a strong link between the quality of leadership and the impact of good leadership on the quality of teachers and their teaching practice.
There may be something about effective leadership and the quality of teaching, allied to being better able to teach the more diverse pupil intakes in London, which have all coalesced so as to be responsible for at least part of the improvements in London’s educational performance over the last 10 years.

7 The ‘London advantage’ can best be described as the amount by which primary school outcomes at Key Stage 2 and secondary school outcomes at Key Stage 4 in London exceed the national average.

Mongon and Chapmanxlv reference the importance of school leadership in their book High-Leverage Leadership, and particularly its impact on raising white working class attainment:
“The best of school leadership raises the work of adults and the attainment of young people to levels that exceed expectations and, sometimes, even their own ambitions. It combines relentless focus and management skill with wide professional knowledge and profound empathy, wrapped in a bag of energy and tied with robust optimism. It has its most remarkable expression in circumstances where poverty and culture might otherwise corrode the potential of young people to fulfil their talent.”

Insights from Interviews

To supplement the above data analysis and findings from research, one of the present authors (Cassen) held a number of interviews with Local Authority officers from four London boroughs. It is notable that so many studies of London school improvement fail altogether to refer to the local authority role. This is in part because researchers commonly look for Randomised Control Trials (RCTs) or robust statistical modelling with causal attributions, which are not feasible in the case of local authorities and London improvement. But there is also a dearth of published information. The few published studies of particular boroughs, such as those for Lambeth and Tower Hamlets, do nonetheless make it clear that local authorities often played a significant part; though one of the few studies that refers to local authorities – Barrs et al.xlvi – rightly says that: “the contribution of local authorities is probably the least well researched aspect of the London story”, but also that the local authorities’ work has to be set in a context of several other contributing factors.

The interviews in question took place in the spring and summer of 2014. They showed some common features and some different emphases. They also accord well with the survey findings reported above. For reasons of confidentiality the boroughs and the interviewees are not identified.
Interviewees stressed that every stage in a young person’s development matters for educational outcomes: the earliest years and the home learning environment, parenting and pre-school, as well as primary and secondary education. So it was evident that a range of local authority services had a role to play, children’s services and health as well as education.

When it came to schooling, the interviewees from all four boroughs said that the majority of their schools in the 1990s left much to be desired. ‘In the mid-90s we had a few good schools and a lot of terrible ones,’ one respondent said. And they all described the same approach. First and foremost was the use of data, either from the DfE or from their own statistical base, or both. ‘Heads didn’t understand spreadsheets in the early 1990s,’ one interviewee said; ‘we had to work with them and by the end of the 1990s things improved.’ The data were used to set targets for all state schools in the borough: ambitious targets, as was frequently asserted, which had to include disadvantaged pupils. One published study from Lambeth Council gives more detail of how this was done.xlvii Some interviewees said they had to counter complacency or low expectations, at least in the early days.

The targets were set in discussion with the schools, and then monitored by the LA and its School Improvement Team, on an annual basis or more often if a school was struggling. If targets were not being met, follow-up with headteachers and school leaders would intensify. Interviewees said that leadership was important, and pressure was placed on headteachers to meet the targets; heads could even be replaced if improvement was not forthcoming. Two interviewees referred also to the importance of training support in primary schools for headteachers and governors, and to ‘good recruitment packages’ to attract the right people.
Teaching was emphasised as a key part of the dialogue with school leaders, including raising teachers’ expectations where necessary, and stressing the quality of teaching, not least through Continuing Professional Development. Getting teachers to share best practice between schools was given by one interviewee as a further key factor. As another put it: ‘We could not accept that schools were providing four good lessons out of five; it was like saying students could stay at home one day a week.’
Resources were discussed, with the interviewees all saying they were little troubled by resource shortages during, and even before, the London Challenge period. ‘With Excellence in Cities and the Education Action Zones, we had a lot of money coming in,’ one interviewee said. Interviewees were also asked whether the academisation of schools had reduced their roles; officers from three of the four boroughs said that academies were still ‘applying to them for help’ and ‘buying into their services’, though those services had been reduced by budget cuts. Although academies were not under local authority supervision, one interviewee, a Head of Children’s Services, said ‘I am responsible for my children’s education, so if an academy is not performing well, I will complain and try to get something done.’

It was clear from the interviews that context varied greatly from one borough to another, perhaps most significantly in the character of their pupil populations. One interviewee spoke of an ‘entrenched white working class’, often in poverty and unemployment, who did not move elsewhere and presented difficulties of approach. In one of the boroughs there were several ethnic minorities, each of which required an approach tailored to its specific character and needs. In another a single large minority was the main focus, while the fourth had to cope with a ‘very large transient population’. Clearly, one size was not going to fit all. But a key part of addressing the issues of minorities was getting people from those minorities to cooperate, not least by serving as mentors and teaching assistants in the schools.

In summary, it cannot be suggested that the four boroughs from which interviewees came were statistically representative; four is much too small a sample to permit generalisation when there are 33 councils in London.
Nevertheless one thing stands out: the use of data and the setting and monitoring of targets were important instruments in improving school outcomes in the boroughs concerned. Those local authorities with a successful record were very likely to have been using similar approaches, as reflected in our survey findings. But they have had to deal with quite different problems and contexts, and other, varying, factors account for what they have been able to achieve.

Conclusion

Much has been written about the reasons for the ‘London advantage’ and the transformational shift in educational outcomes in the capital in the last 10 years. Many factors have been identified as playing a part in the improvement and it is difficult to separate out those that had the greatest impact. Although the studies evaluating the London Challenge acknowledged the role local authorities played in partnership with the Challenge advisers, other research has much less to say about the role and impact of local authorities. The London Challenge would not have succeeded if the London local authorities and their schools had not committed to its moral purpose and the relentless focus on educational standards in those years.

The report by Barrs et al.,xlviii which attempts to tell the London story, reminds us that school reform is not a quick fix:

‘Changing professional culture is not a question of ‘flicking a switch’ or issuing a ministerial directive. It requires, to use the word of [an] expert witness, a ‘relentless’ focus over a long period of time.’

Alongside other key factors, it is this relentless focus on standards, allied to a sense of moral purpose, that seems to underlie the educational improvements in London. Educational outcomes that might have passed as broadly acceptable or ‘par for the course’ in London in the 1980s and 1990s were no longer acceptable in the first decade of the new millennium.
It is a measure of the transformation that the majority of the London local authorities making the biggest gains in GCSE performance between 1998 and 2013 were formerly part of inner London under the ILEA.
Much has been made of the importance of performance data and of the benchmarking of data, which made it possible to challenge underperformance on the compelling grounds that if other schools were doing much better with a similar intake of students, then significant improvement was possible. The use of data therefore generated both optimism and urgency about the need for change, and this is a valid finding; but it was not the Families of Schools data produced by the DfE, as part of the London Challenge, which made the difference. That data was not widely used by schools and it came out too late in the school year to make sufficient impact. Rather, it was the performance data analysis being produced by the majority of London local authorities for their schools, and the schools’ use of that analysis, which helped drive improvements.

The local authority role in borough-, school- and pupil-level target setting in the 2000s was referenced in the interviews with school improvement officers as an important lever in raising standards and in holding schools to account for the performance of their pupils. As one of the authors (Hayes) can attest from his work in Greenwich council, this was often a data-rich process laden with aspiration, ambition and challenge, driven by a sense of moral purpose within which disadvantage should not be seen as a barrier to achievement. To paraphrase Mongon and Chapman:xlix ‘poverty and culture [were not allowed to] corrode the potential of young people to fulfil their talent’.

This research, and the range of other research that has investigated the reasons for London’s success, suggests that there is a strong link between the quality of leadership and the impact of good leadership on the quality of teachers and their teaching practice.
It is possible that these factors, allied to teachers being better able to teach the more diverse pupil intakes in London, and to many London children and young people from disadvantaged backgrounds being resilient enough to succeed against the odds, have all coalesced to drive the improvements in London’s educational performance over the last 10 years.

It is the authors’ view that the London Challenge bore many of the hallmarks of a successful intervention and that it was mostly very effective over a sustained period of eight years, but it was not the only factor that brought about the transformational shift in educational outcomes in London. It was a significant factor, and it certainly acted as a catalyst for change and a lever to drive improvement, but the green shoots of the transformation were already beginning to appear before the London Challenge began, and many London schools and local authorities played crucial roles in securing the rapid improvement in outcomes over the life of the London Challenge and beyond. The challenges that remain are the ability and capacity to sustain those improvements into the future, and for other regions and cities to learn the lessons from London’s success.
Appendix 1: London Local Authorities in Inner and Outer London

Local Authority | Inner/Outer London Designation8
Camden | Inner
City of London | Inner
Hackney | Inner
Hammersmith & Fulham | Inner
Haringey | Inner
Islington | Inner
Kensington & Chelsea | Inner
Lambeth | Inner
Lewisham | Inner
Newham | Inner
Southwark | Inner
Tower Hamlets | Inner
Wandsworth | Inner
Westminster | Inner
Barking & Dagenham | Outer
Barnet | Outer
Bexley | Outer
Brent | Outer
Bromley | Outer
Croydon | Outer
Ealing | Outer
Enfield | Outer
Greenwich | Outer
Harrow | Outer
Havering | Outer
Hillingdon | Outer
Hounslow | Outer
Kingston upon Thames | Outer
Merton | Outer
Redbridge | Outer
Richmond upon Thames | Outer
Sutton | Outer
Waltham Forest | Outer

8 This split of local authorities between inner and outer London is based on the Department for Education’s designation as at 2013.
Appendix 2: Average Scores for impact of factors in the survey

The following table lists the 82 factors that were included in the survey instrument administered to local authority officers, showing the number of responses and the average score for impact. Please note that the range of possible scores for impact was from 0 (no impact) to 4 (major impact) and that the average scores have been ranked in descending order from those deemed to have had the most impact to those with the least impact.

How much do you feel these factors contributed to your Local Authority’s and/or London’s improved educational performance at Key Stage 2 and Key Stage 4, placing London above national? | Number of responses | Average score between 0 and 4
A relentless focus on standards in your LA | 18 | 3.72
If your LA (e.g. your school improvement service) was outsourced, what was the impact on standards? | 3 | 3.67
A focus on progress to drive up attainment | 19 | 3.53
Putting effective school improvement support in place | 20 | 3.50
Teaching strategies for EAL children and particularly those at the lower stages of fluency in English | 14 | 3.50
Your LA not allowing disadvantage to be a barrier to achievement | 17 | 3.47
Schools in your LA committing to the standards agenda | 15 | 3.47
Pupil tracking in schools | 19 | 3.42
A relentless focus on pupil groups at risk of underperformance | 19 | 3.42
The role of headteachers in your LA | 20 | 3.40
Your LA’s stance on schools causing concern | 17 | 3.35
A relentless focus on standards across London | 18 | 3.33
Higher levels of funding in your LA, compared to national | 15 | 3.33
The recruitment of better teachers in your LA and across London | 14 | 3.29
Local authority interventions in schools in challenging circumstances in your LA | 18 | 3.28
The LA provision of high quality Continuing Professional Development (CPD) for schools | 13 | 3.23
The role of your LA school improvement service | 18 | 3.22
Your LA’s support for BME pupils | 18 | 3.22
Schools across London committing to the standards agenda | 14 | 3.21
Schools in your LA getting better at teaching the pupils in their schools, based on their demographic profile | 19 | 3.21
Measures to close the gaps at KS 2 & 4 between disadvantaged pupils and others | 17 | 3.18
The role of headteachers across London | 18 | 3.17
Target setting at pupil level | 18 | 3.17
Your LA’s support for EAL pupils | 18 | 3.17
Higher levels of funding per pupil than the England average | 18 | 3.17
Performance analysis outputs for schools from your LA’s research & statistics team | 19 | 3.16
Improving the quality of teachers in your LA | 13 | 3.15
Learning how to teach the pupils you have in your LA | 17 | 3.12
Higher levels of funding across London compared to national | 18 | 3.11
Forensic analysis of performance data by schools | 18 | 3.11
The role of your LA research & statistics team | 19 | 3.11
Having ambitious LA leadership at all levels | 20 | 3.10
Local authority interventions in schools in challenging circumstances across London | 14 | 3.07
Use of educational research into school effectiveness and school improvement | 15 | 3.07
Improving the quality of teachers in London | 16 | 3.06
Target setting at school level | 20 | 3.05
National pressure from the DfE | 18 | 3.00
The use of local educational research | 16 | 3.00
Having a coherent LA plan to raise standards, e.g. the EDP and/or the CYPP | 18 | 2.94
Inward migration of families from BME backgrounds with high levels of educational aspiration | 16 | 2.94
Education policy (national) under Labour 1997 to 2010 | 16 | 2.88
The role of School Improvement Partners (SIPs) | 15 | 2.87
Your LA taking a resilient approach to external government policies and pressure | 15 | 2.87
DfE School Performance Tables | 20 | 2.85
Target setting at LA level | 19 | 2.84
London Challenge across London | 12 | 2.83
If your LA was put into intervention by DfE, what was the impact on standards? | 6 | 2.83
LA traded service providing performance data for schools (if you have one) | 10 | 2.80
The Ofsted school inspection framework | 19 | 2.79
Fischer Family Trust as a tool to support school improvement | 14 | 2.79
Harnessing support from your LA’s local community | 13 | 2.77
Improving the profile of teaching as a profession | 13 | 2.77
Changing demographics in London, e.g. increasing young population leading to more schools being full and fewer casual admissions | 17 | 2.76
The role of the DCS in your LA | 17 | 2.71
London Challenge generally | 13 | 2.69
Forensic analysis of the performance of pupils by ethnic background | 16 | 2.69
RAISEonline as a tool to support improvement | 18 | 2.67
The involvement of parents in schools | 13 | 2.62
Your council topping up the education budget | 10 | 2.60
Partnership working with the local community | 17 | 2.59
London Challenge role in improving teacher recruitment in London | 11 | 2.55
DfE Standards’ Meetings with your LA | 11 | 2.55
A focus on attendance as a lever to drive up attainment | 18 | 2.44
The role school governors in your LA played in driving up standards | 14 | 2.43
London Challenge in your LA | 13 | 2.38
The London economy i.e. being more resilient than the rest of the country | 16 | 2.38
Local authority attendance teams | 18 | 2.28
Reducing the number of fixed term and permanent exclusions in your LA’s schools | 18 | 2.28
Deployment of teaching assistants | 13 | 2.15
Local political scrutiny as a lever to drive up standards | 14 | 2.14
The transition from being an education department to Children’s Services | 14 | 2.00
The role of your lead councillor for Children’s Services and/or education | 16 | 1.75
Education policy (national) under the coalition 2010 to 2014 | 15 | 1.73
The role of local politicians in driving up standards | 16 | 1.69
The DfE replacing Contextual Value Added (CVA) with simple Value Added (VA) | 16 | 1.56
Your LA’s approach to moving more pupils into Alternative Provision | 13 | 1.54
The role of Academy Chains in your local authority (if relevant) | 7 | 1.43
The role of your leader of the council | 14 | 1.36
The academisation of schools | 16 | 1.13
The discontinuation of statutory target setting in 2010 | 12 | 1.08
The increase in the number of schools becoming academies since 2010 | 17 | 1.00
The creation of free schools | 13 | 0.15
Appendix 3: The following table lists the summary of the five factors listed by respondents (to an open-ended question) as having a major impact on raising educational standards in London.

Factor listed by respondents (to an open-ended question) as having a major impact on raising educational standards in London | Frequency of responses
Provision and use of performance data | 15
Demographic changes in London and schools learning how to work with their intakes | 10
LA and school collaboration | 10
Leadership in schools | 10
Relentless focus on standards | 8
Focus on pupil tracking | 7
Funding at higher levels | 7
LA support for school improvement | 6
Focus on pupil progress | 5
Leadership in the local authority | 5
High aspirations | 3
School improvement and Research & Statistics teams providing challenge | 3
Target setting | 3
Accountability from DCS | 2
Building Schools for the Future (BSF) | 2
Continuing Professional Development (CPD) for teachers | 2
Culture change in schools and the local authority | 2
Disadvantage not a barrier to success | 2
Focus on Early Years | 2
Focus on schools causing concern | 2
Inclusivity and moral purpose | 2
Outsourcing school improvement | 2
Partnership working between schools | 2
Quality of teaching | 2
Recruiting better headteachers | 2
The London economy | 2
Accountability from DfE | 1
Collaboratives and federations of schools | 1
Community and parental involvement | 1
Diverse population with high aspirations | 1
Focus on the quality of teaching and learning | 1
Funding for disadvantaged 2 year olds | 1
Funding for Early Years places | 1
Identification of underachieving groups | 1
LA funded interventions | 1
LA put into intervention | 1
Other LA services improving | 1
Parental engagement | 1
Quality of teaching in underperforming subject areas | 1
Raising expectations | 1
Role of Children's Centres | 1
School leaders who were given enough time to raise standards | 1
Schools became more ambitious | 1
Schools doing it for themselves | 1
Support for BME groups | 1
Support for different groups of pupils | 1
Support for EAL pupils | 1
Support for underperforming groups | 1
Total Responses | 140
Notes

This paper is currently in draft format and was presented at the British Educational Research Association Annual Conference, Institute of Education, 23–25 September 2014. This paper is confidential and should only be used with the express permission of the authors (contact details below).

Acknowledgements

Acknowledgement is due to all of the London Children’s Services Research and Statistics officers who completed a survey questionnaire as part of this research, and to Directors of Education and/or of Children’s Services and other Local Authority officers who generously gave their time to be interviewed.

Contacts for correspondence:

Sean Hayes
Freelance Educational Researcher, London
Mobile: 07729 053676
Email: sean.hayes@carmo.myzen.co.uk

Robert Cassen (co-author)
London School of Economics
Email: R.Cassen@lse.ac.uk
Bibliography

i Radford, A. (2009). An enquiry into the Abolition of the Inner London Education Authority (1964 to 1988), with Particular Reference to Politics and Policy Making. (PhD Thesis, University of Bath – UK)
ii Grace, G. (Ed.). (1983). Education and the City: Theory, history and contemporary practice. (Routledge – UK)
iii Inner London Education Authority. (1982). Achievement of Leavers by Sex, Ethnic Group and Year of Final Examination. (ILEA Research and Statistics Branch – UK)
iv Gray, J. & Jesson, D. (1987). Exam Results and LEA League Tables. (Newbury Policy Journal – UK)
v Her Majesty’s Government. (1988). The Education Reform Act. (HMSO – UK)
vi Department for Education. (1995). Secondary School Performance Tables. http://www.education.gov.uk/cgi-in/schools/performance/archive/shlea1_95?lea=205&type=b
vii Marshall, P. (Ed.). (2013). The Tail: How England's schools fail one child in five - and what can be done. (Profile Books Ltd – UK)
viii Cook, C. (2013). How to explain the London success story. http://blogs.ft.com/ftdata/author/christophercook/ (Financial Times – UK)
ix Hutchings, M. (2014). The Legacy of the London Challenge. (Keynote presentation to an NUT Conference at the Institute of Education)
x Baars, S., Bernardes, E., Elwick, Malortie, A., McAleavy, T., McInerney, L., Menzies, L. & Riggall, A. (2014). Lessons from London schools. (CfBT and Centre for London – UK)
xi Hutchings, M., Greenwood, C., Hollingworth, S., Mansaray, A., Rose, A., Minty, S. & Glass, K. (2012). Evaluation of the City Challenge Programme. (Department for Education – UK)
xii Hutchings, M., ibid. (P 95)
xiii Hutchings, M., ibid. (P 95)
xiv Hutchings, M., ibid. (P 96)
xv Brighouse, T. (2014). London Challenge Remembered. Presentation to an NUT Conference in 2014.
xvi Hutchings, M., op cit. (P 110)
xvii Ofsted. (2013). A review of the impact of the London Challenge (2003-8) and the City Challenge (2008-11). www.ofsted.gov.uk/accessandachievement
xviii Woods, D., Husbands, C. & Brown, C. (2013). Transforming Education for All: the Tower Hamlets Story. (Tower Hamlets Council – UK)
xix Collins, K. & Keating, M. (2013). ‘An East End Tale’, in The Tail. Marshall, P. (Ed.). (Profile Books)
xx Woods, D., op cit. (P 49)
xxi Greaves, E., Macmillan, L. & Sibieta, L. (2014). Lessons from London schools for attainment gaps and social mobility. (London: Social Mobility and Child Poverty Commission)
xxii Greaves, E., ibid. (P 7)
xxiii Woods, D., op cit. (P 8)
xxiv Barrs, S. et al., op cit. (P 85)
xxv Hayes, S., Shaw, H. & Osborne, K. (2007). White working class boys: is their performance at school a cause for concern? Paper presented at BERA in 2007. (British Education Index Reference: 167843)
xxvi Hayes, S., Shaw, H., McGrath, G. & Bonel, F. (2009). Using RAISEonline as a research tool to analyse the link between attainment, social class and ethnicity. Paper presented at BERA in 2009. (British Education Index Reference: 184218)
xxvii Siraj-Blatchford, I. (2009). Learning in the home and at school: how working class children ‘succeed against the odds’. British Educational Research Journal, Vol. 36, No. 3, June 2010, pp. 463–482
xxviii Machin, S. and Vernoit, J. (2011). Changing School Autonomy: Academy Schools and Their Introduction to England’s Education. CEE Discussion Paper No. 123. London School of Economics.
xxix Machin, S. and Silva, O. (2013). School Structure, School Autonomy and the Tail. CEP Special Report. Centre for Economic Performance, London School of Economics. http://cep.lse.ac.uk/pubs/download/special/cepsp29.pdf. This research and the previous citation are reported in Cassen, R., McNally, S. and Vignoles, A. (forthcoming), Making a Difference in Education: What the evidence says. (Routledge – London)
xxx Hayes, S. & Clay, J. (2008). Progression from Key Stage 2 to 4: Understanding the Context and Nature of Performance and Underperformance between the ages of 11-16. (British Education Index Reference: 167840)
xxxi Greaves, E., op cit. (P 7)
xxxii Hutchings, M., op cit. (P 93)
xxxiii Hutchings, M., op cit. (P 87)
xxxiv Barrs, S. et al., op cit. (P 88)
xxxv Hayes, S. & Rutt, S. (1999). Primary Analysis for Secondary Schools: A LEA Research Officer’s Perspective on Helping Schools Interpret Assessment Data for School Improvement Purposes. Improving Schools (pp. 44–52). (Trentham Books – UK)
xxxvi Barrs, S. et al., op cit. (P 100)
xxxvii Rudd, P., Poet, H., Featherstone, G., Lamont, E., Durbin, B., Bergeron, C., Bramley, G., Kettlewell, K. & Hart, R. (2011). Evaluation of City Challenge Leadership Strategies: Overview Report. (Slough: NFER)
xxxviii Poet, H. & Kettlewell, K. (2011). Evaluation of City Challenge Leadership Strategies: London Area Report. (Slough: NFER)
xxxix Poet, H. & Kettlewell, K., ibid. (P iii)
xl Hutchings, M., op cit. (P 6)
xli Hutchings, M., op cit. (P 37)
xlii Wyness, G. (2011). London schooling: lessons from the capital. (CentreForum – UK)
xliii Barrs, S. et al., op cit. (P 81)
xliv Barrs, S. et al., op cit. (P 99)
xlv Mongon, D. & Chapman, C. (2011). High-Leverage Leadership: Improving Outcomes in Educational Settings. (Routledge – UK)
xlvi Barrs, S. et al., op cit. (P 82)
xlvii Demie, F. (2013). Using Data to Raise Achievement: good practice in schools. (Lambeth Council, Research and Statistics Unit – London)
xlviii Barrs, S. et al., op cit. (P 120)
xlix Mongon, D. & Chapman, C., op cit. (P 16)