Are Schools Equipped to Address Online Safety in the Curriculum and Beyond?



Authors: Andy Phippen, David Wright, Tanya Ovenden-Hope.
This paper explores the data provided by over 1000 schools in the UK related to their online safety policy and practice. By comparing with data from the previous year, we consider the current state of practice among UK schools and analyse progress over a 12-month period.



In-depth

Are schools equipped to address online safety in the curriculum and beyond?

Authors

Andy Phippen
School of Management, Plymouth University, UK

David Wright
South West Grid for Learning Trust, UK

Dr Tanya Ovenden-Hope
School of Social Science and Social Work, Plymouth University, UK
Tanya.ovenden-hope@

Tags: media education, online safety, school self-assessment

This paper explores the data provided by over 1000 schools in the UK related to their online safety policy and practice. By comparing with data from the previous year, we consider the current state of practice among UK schools and analyse progress over a 12-month period.

What is clear from this analysis is that the aspects that either use technological intervention (i.e. filtering) or policy development are generally performing better than those that require long-term resource investment (such as training) or whole school involvement (such as parental education or community understanding). Monitoring and reporting also perform badly. It is interesting to note that even with an almost doubling of the number of participating establishments, the strongest and weakest performing aspects remain almost constant across 2010 and 2011, with only slight improvement.

The analytical tool used to gather this data is now being used in pilot projects in the US and Australia. Once it is in full use in these regions, detailed analysis of international performance will be available for the first time. This presents some exciting opportunities to understand, at an international level, how schools engage with online safety and ensure protection of their pupils, staff and wider community.

1. Introduction

The issue of online safety is never out of the media and is a constant concern for schools, who have a duty of care to both staff and pupils, as well as a need to ensure policy is in place to show due diligence related to different aspects of online safety. However, while the focus of much media coverage is on the sensational aspects of the issues (for example, predatory behaviour, cyberbullying), the reality of online safety in schools is far broader, ranging from technical countermeasures such as effective password strategy and content filtering, to ensuring policy is in place to deal with incidents if they arise.

In the UK, a lack of national strategy on online safety has meant that many schools have adopted their own approaches, which the institutions themselves have identified as, in many cases, incomplete and inconsistent. A review of online issues by Tanya Byron (the 2008 Byron Review) proposed a holistic approach to online safety, comprising a broad range of issues from technical matters to wider parental and community education. It called for a "whole school" approach where all staff were involved and engaged in all aspects of online safety and provided with regular training to ensure their knowledge and practice is up to date with the ever-changing field. OFSTED's Safe Use of New Technologies report built on the recommendations of the Byron review, concluding that outstanding online safety had to have a whole school approach, including pupils, staff, governors, parents and the community in policy and practice, and did not use technology in a locked down manner.

However, while these important policy documents were welcomed, they also presented schools with a challenge: how to transform this strategic vision of what online safety should be into operational terms.

360 degree safe was launched by South West Grid for Learning Trust in November 2009 as a means to allow schools to self-evaluate their own online safety provision; benchmark that provision against others; prioritise areas for improvement; and find advice and support to move forward. It provided a tool for schools to firstly understand the breadth of issues associated with online safety, and then review their own performance and identify how to improve. It provided summary reports of progression, which helped all staff (not just those charged with the job of implementing an online safety policy) to understand the scope of online safety and what the school is doing about the issue.

In operationalising an online safety "vision", the tool provided a prioritised action plan, suggesting not just what needs to be done, but also in what order it needs to be done. This is a vital bonus for teachers and managers who approach the issue of online safety for the first time, in a school which has no (or only a very rudimentary) policy.

Understanding Online Safety Policy and Practice with 360 Degree Safe Data

As well as providing a tool for schools to understand and develop their own online safety policy and practice, the tool also collects all submissions into a central database. In building a picture of practice across the UK, this resource is unique in holding data on every school that has engaged with the tool. In September 2010, the first analysis of the 360 degree safe database was published by the South West Grid for Learning, based upon data returned from 547 establishments across England. The tool has been adopted by many more organisations since this publication, and the data presented here is based upon returns from 1055 educational establishments.

In this paper we present analysis "a year on" from this first report, comparing development over the 12 month period and for the first time allowing a comparison of progress to understand how institutions, and online safety policy and practice, have developed in the UK up to September 2011.

2. Methodology

An overview of the 360 degree safe structure, detailing the aspects covered, can be found in the tool's Structure Map. In total 28 aspects are detailed by the tool, from technical measures such as filtering and technical security, through policy development, to training and community education. For each aspect a school can give themselves a rating from 5 to 1 (5 being worst, 1 being best). For each rating in each aspect, clear definitions are provided for each level to help the self review process. Establishments carry out the self review via a web interface, and submitted data is stored in a relational database structure which holds the information in a collection of related "tables", each table related to a specific data element within the system. The three data tables which provide the core for analysis relate to establishments, 360 degree safe aspects, and individual ratings, which detail an entry that an establishment has made against a specific aspect.

Each establishment's "profile" comprises a number of entries in the rating table, each related to a specific aspect. It is possible for an establishment to have more than one entry in the rating table associated with a specific aspect, which would reflect that they are using the tool for school improvement around online safety practice. An establishment's profile will also reflect their current stage of review.

Figure 1: 360 data structure

Given the relational structure of the 360 degree safe data, the primary approach to analysis is through the querying of this data structure using SQL. In addition, summary data was loaded into Microsoft Excel for further statistical analysis and graphing. Analysis of the data focuses on establishments' self review of their online safety policy and practice, exploring their ratings against the 28 aspects of 360 degree safe. Aspect exploration allows the measurement of degrees of progression and improvement in the self review, and of those areas where, in general, policy and practice among UK educational establishments require support to deliver further progress.

It should be noted that the international data (US and Australia) has a slightly different, extended structure for the review aspects, and this will be discussed in more detail later in this report.

It is acknowledged that the data being explored is self reviewed – the establishments give themselves ratings against the aspects and level definitions. However, self review is well established practice within the UK school system and level descriptors are very clearly defined. In addition, accreditation visits to date have demonstrated that in the instances of inspection that have occurred, self review ratings have been generally accurate. Indeed, many schools are generally conservative with their assessments. We also now have a sufficiently large database that "anomalous" returns are very apparent and can be followed up with the school or its local authority.

3. Details of the Establishments Analysed

The vast majority of the data is drawn from English schools, although there are a few from Wales. There are almost three times as many schools now registered to use the tool as there were last year. However, we should acknowledge that not all schools who have registered have embarked on their self review.

Based upon the local authority specified by each establishment, figure 2 details the proportion of establishments from different regions. As we can see, while there is a large proportion from the south west, over half are from other regions. The Midlands also has a strong representation, and there are good spreads across other regions. SCE refers to Service Children's Education, an agency of the UK's Ministry of Defence, which provides education for MoD employees' children overseas.

Figure 2: Establishment geography

The "phase" of the establishment responses shows the breakdown between primary, secondary, post-16 and nursery. As can be seen from figure 3, the majority of those registered are from primary schools. It is encouraging, given last year's analysis showing that primary schools consistently underperform against secondary schools in online safety policy and practice, that the largest area of growth is registrations from that phase of school. While the number of secondary schools has more than doubled, the number of primaries has more than trebled in 12 months.

Figure 3: Establishment "phase"

eLearning Papers • ISSN: 1887-1542 • www.elearningpapers.eu • n.º 28 • April 2012
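The methodology describes three core tables (establishments, aspects, ratings) queried with SQL. The actual 360 degree safe schema is not published in this paper, so the table and column names below are assumptions for illustration only; the sketch shows how a "best rating per establishment per aspect" query of the kind used in the aspect analysis could be expressed.

```python
import sqlite3

# Hypothetical schema mirroring the three tables described in the methodology.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE establishment (id INTEGER PRIMARY KEY, name TEXT, phase TEXT);
CREATE TABLE aspect        (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE rating        (establishment_id INTEGER, aspect_id INTEGER,
                            level INTEGER,      -- 5 (weakest) to 1 (strongest)
                            entered_on TEXT);   -- time/date of entry
""")
con.executemany("INSERT INTO establishment VALUES (?,?,?)",
                [(1, "School A", "primary"), (2, "School B", "secondary")])
con.executemany("INSERT INTO aspect VALUES (?,?)",
                [(1, "Filtering"), (2, "Staff training")])
# School A has re-rated Filtering over time, reflecting use of the tool
# for improvement as well as evaluation.
con.executemany("INSERT INTO rating VALUES (?,?,?,?)", [
    (1, 1, 3, "2010-09-01"), (1, 1, 2, "2011-06-01"),
    (2, 1, 2, "2011-05-01"), (1, 2, 4, "2011-06-01"),
])
# Best (lowest) rating per establishment per aspect, then the mean per aspect.
rows = con.execute("""
    SELECT a.name, ROUND(AVG(best), 2)
    FROM (SELECT establishment_id, aspect_id, MIN(level) AS best
          FROM rating GROUP BY establishment_id, aspect_id) r
    JOIN aspect a ON a.id = r.aspect_id
    GROUP BY a.name ORDER BY a.name
""").fetchall()
print(rows)  # [('Filtering', 2.0), ('Staff training', 4.0)]
```

Taking `MIN(level)` per establishment matches the paper's use of each establishment's "best" rating; replacing it with the most recent entry per aspect would instead track improvement over time.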
4. Analysis of Aspect Performance

Top level analysis of practice and policy performance explores the responses to different aspects given by each establishment. The initial analysis explores the "best" rating any establishment has provided, given that this reflects where establishments currently stand in their self review. However, given that 360 degree safe is intended for use to improve as well as evaluate practice, a feature of the 360 degree safe database is that it records any evaluation on a particular aspect made by an establishment, with the time and date of entry. This data can be used to explore which areas are showing improvement in schools.

It should also be noted that it is not necessary for an establishment to have completed the full self review to have its data logged in the tool. Therefore, different aspects have been rated by different numbers of establishments. In total, 559 establishments from our population have carried out the full self review, and 496 additional schools have reviewed at least one aspect. Of those establishments that have not completed a full review, figure 4 illustrates the variety of levels of completion to date. It details the number of establishments that have achieved each given number of aspects to show the range of completion.

Figure 4: The number of aspects completed by any establishment that has not completed the full review

This breakdown shows a spread of responses from those still in the early stages of self review to those nearing completion of the full set of aspects. It is interesting to note that, as with last year, there is a large concentration of establishments who have completed 15 aspects. We would observe that, if the tool was being used in a linear manner, the 16th aspect is Password Security, arguably the first of the aspects being reviewed that might require specialist technical input to make a judgement on the levels. We might hypothesise (but cannot test at the present time) that this may be a reason why some reviews seem to stall at this stage.

In further exploring which aspects are more "popular" with establishments, we can examine each aspect and the number of establishments who have completed a self review of that element. This is detailed in figure 5 and again supports our hypothesis that aspects requiring technical input (or those following aspects requiring technical input) are less "popular" than other aspects. We can see the two largest drops in aspect completions are around Password Security and Technical Security.

Figure 5: Aspect frequency

The aspects are ordered as they appear in the self review tool, and the pattern presented shows that most establishments undertake a linear approach to completing the self review. It should be noted that the tool can be used in a non-linear manner, but figure 5 suggests that this is not the case in the majority of establishments.

We will now move from the top level, quantity based returns to look in more detail at each aspect, in order to explore areas of strength and weakness across our establishments. We present this data as an approximate "state of the nation" report related to online safety policy and practice in the UK. However, we acknowledge that the respondents who have embarked on an online safety self review are likely to be more engaged "early adopters" than those who have not. Therefore, we might make the assumption that the data presented may be better than average if it were possible to analyse performance in all educational establishments in the country. However, in comparing the results from this year's analysis with those from the previous year, we will highlight a fairly consistent pattern, even with the addition of a significant number of new establishments. Therefore, this year we can say with higher confidence than last that this does represent a national picture.

Each aspect can be rated by the self reviewing establishments on a progressive maturity scale from 5 (lowest rating) to 1 (highest). In all cases, analysis of the aspect ratings shows an across-establishment maximum rating of 1 and minimum of 5. Therefore, in order to determine cross-establishment performance, average scores for each rating are used to measure areas of strength and weakness in online safety policy and practice. Figure 6 illustrates overall averages across aspects.

Figure 6: Average ratings per aspect
The top 5 aspects across establishments are exactly the same as last year. In 2010 the strongest aspects were:

• Filtering (2.57)
• Acceptable Use Policies (2.78)
• Policy Scope (2.8)
• Digital and video images (2.93)
• Policy development (3.02)

In 2011 they are:

• Filtering (2.5)
• Policy Scope (2.65)
• Acceptable Use Policies (2.71)
• Digital and video images (2.83)
• Policy development (2.88)

There are two points to note in comparing the two sets of aspects. Firstly, there has been a slight change in that Policy Scope is now ahead of Acceptable Use Policies. More significantly, all of the aspects have improved on last year's scores. While increases are not huge, all aspects have improved by some degree. And while that is encouraging, as we remarked upon last year the strongest aspects are all either documentary in nature (i.e. putting a policy in place, possibly derived from a local authority or regional broadband consortium) or technical (which again is generally provided by an outside agency or off the shelf solution).

We see a similarly established trend with the five lowest rated aspects. As we identified last year, these all focus on education and long term resource commitment, and the 2011 weakest aspects are exactly the same as in 2010:

• Community understanding (4.03)
• Governor training (4.03)
• Monitoring the impact of policy and practice (3.96)
• E-Safety Committee (3.94)
• Staff training (3.84)

And this is the same in 2011:

• Community understanding (4)
• Governor training (3.93)
• Monitoring the impact of the e-safety policy and practice (3.9)
• E-Safety Committee (3.82)
• Staff training (3.76)

All of these aspects require long term development and commitment of resources (for example, regular and up to date training, or monitoring). As with the strongest aspects, all have improved to some degree, which is encouraging to see. It is interesting to note that even with more than double the population size, the strongest and weakest aspects have remained very similar. This, again, gives us confidence in the representative nature of our population data and the consistency of the self review process.

Standard deviation is also used to explore the "spread" of ratings in the self review process. This is a useful measure to consider whether an aspect is consistently strong or weak across all schools, or whether there is variance in the evaluation. A high standard deviation would mean that different establishments were using a broad range of scores for self review. Figure 7 shows the standard deviations across the aspects.

Figure 7: Standard deviation of aspects

As with last year, "Filtering" has a high average and low standard deviation, which shows that filtering is consistently highly rated across establishments. Also similarly to last year, other "strong" aspects have a broader standard deviation. For example, Digital and video images and Policy Scope show that these practices have a greater variance across schools.

In considering the weakest aspects, we can see that both Staff Training and Monitoring and Reporting Incidents have comparatively narrow standard deviations, which would suggest that these aspects have consistently poor performance across schools. Another of the weaker areas of practice – Information Literacy – also has a low deviation, again reflecting consistently poor performance.

5. 2010/2011 comparison

While we have used some comparison to last year's data to show that there is a consistency and robustness to our data set, we now consider the comparison in more detail. As has been discussed above, the 2011 data set included considerably more establishments than the previous year. Figure 8 shows the comparison between the two sets of averages, and shows a very similar pattern but an improvement across all aspects.

Figure 8: 2010/2011 average rating comparison

We can compare this and last year's scores and see there is variation in the level of change. The "most improved" aspects are as follows:

• Governors (0.16)
• E-Safety Committee (0.14)
• Policy development (0.13)
• Policy Scope (0.13)
• The contribution of young people (0.12)

It is positive to see improvement in some areas that are outside of the "policy or technical" areas. In particular, the improvement in the role of Governors in the online safety context is particularly encouraging, given their stewardship of school strategy and the potential for more aware governors to ask questions of senior management around these issues.

However, the least improved areas are:

• Information literacy (0.01)
• Parental education (0.01)
• Community understanding (0.03)
• M&R Incidents (0.04)
• Personal data (0.04)

The majority, again, are those that require long term investment. Recent research around the abuse of professionals by students and other members of the school community highlights how important strong community and parental engagement are in matters of online safety. However, our data would suggest these are still weak areas showing little sign of improvement. We would also observe that personal data is a key area of concern for those working with schools, where establishments might be opening themselves up to potential data protection prosecution. Our data would show that this is an area of weakness that is not improving.

Figure 9 shows a comparison between 2010 and 2011 standard deviations. Again we see a consistent shape to the spread of data, and this time greater variance in increases and decreases. A change in standard deviation does not mean something has become "better" or "worse", but it can show whether something has become more dispersed in terms of practice. For example, we can see a slight increase in spread for filtering, personal data and information literacy, while observing a reduction for staff training and parental education, both areas of concern from the broad exploration of online safety. Community understanding, again highlighted as an area of concern, has also experienced a narrowing of standard deviation (therefore another area of consistently poor practice).

Figure 9: 2010/2011 standard deviation comparison

6. Primary Improvement, Secondary Stationary

As previously, the comparison of performance for primary and secondary establishments presents us with some very interesting comparisons. Figure 10 shows the difference between average ratings in primary and secondary populations in 2011. We can see that, in general, primary establishments still report their performance as consistently weaker than their secondary counterparts. This is not surprising given the difference in resource available in a lot of primary schools. However, one of the most interesting things to draw from this comparison is that primary schools are "catching up" in terms of their policy and practice.
Figure 10: Primary/secondary comparison 2011

Figure 11: Primary/secondary comparison 2010
If we consider the 2010 data, in some aspects the average rating differed by more than a whole level:

• Whole School (1.5 difference)
• Community understanding (1.23)
• Mobile phones and personal hand held devices (0.96)
• Password security (0.93)
• Technical Security (0.81)

However, with the 2011 data these differences have greatly reduced:

• Mobile phones and personal hand held devices (0.78)
• Password security (0.64)
• Email, chat, social networking, instant messaging (0.54)
• E-safety education (0.46)
• Technical Security (0.42)

If we break the 2010/2011 comparison down between primary and secondary schools, as detailed in figures 12 and 13, we can see clearly that there is a far more dramatic increase in performance in primary schools. In some of the strongest areas of improvement, almost a quarter of a level has been gained over the last year:

• Whole School (0.28)
• Technical Security (0.26)
• Professional standards (0.26)
• Governors (0.23)
• Password security (0.22)

In contrast, secondary schools, when their data is isolated, show little, if any, improvement. In some cases, there has been a reduction in performance:

• Technical Security (-0.13)
• Professional standards (-0.11)
• Governor training (-0.09)
• Password security (-0.07)
• Information literacy (-0.07)
• Community understanding (-0.07)

Figure 12: Comparison of primary school averages 2010 - 2011
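The improvement figures quoted for each phase are simple differences between 2010 and 2011 average ratings, where a falling average means movement toward the stronger end of the 5-to-1 scale. A sketch with hypothetical numbers (chosen to mirror, not reproduce, the reported values):

```python
# Hypothetical per-phase average ratings (lower is better); these figures
# are invented for illustration, not taken from the 360 degree safe data.
averages = {
    ("primary", 2010):   {"Whole School": 4.1,  "Password security": 3.9},
    ("primary", 2011):   {"Whole School": 3.82, "Password security": 3.68},
    ("secondary", 2010): {"Whole School": 2.6,  "Password security": 3.0},
    ("secondary", 2011): {"Whole School": 2.6,  "Password security": 3.07},
}

def improvement(phase, aspect):
    # Positive = improvement: the average rating fell (stronger practice);
    # negative = a reduction in performance.
    return round(averages[(phase, 2010)][aspect]
                 - averages[(phase, 2011)][aspect], 2)

print(improvement("primary", "Whole School"))        # 0.28
print(improvement("secondary", "Password security")) # -0.07
```

Computed this way, the primary figures come out positive and the isolated secondary figures flat or negative, which is the "primary improvement, secondary stationary" pattern the section describes.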
Figure 13: Comparison of secondary school averages 2010-2011

7. Conclusions

This paper has explored a number of aspects of the data provided by over 1000 schools in the UK related to their online safety policy and practice. By comparing with similar analysis from the previous year, we were able both to consider the current state of practice among UK schools and to analyse progress over a 12 month period.

What is clear from this analysis is that those aspects that either use technological intervention (i.e. filtering) or policy development are generally better performing than those aspects that require long term resource investment (such as training) or whole school involvement (such as parental education or community understanding). Monitoring and reporting also perform badly. It is interesting to note that even with an almost doubling in the number of participating establishments, the strongest and weakest performing aspects remain almost constant across 2010 and 2011, with only slight improvement.

However, more in depth analysis of the data shows a more interesting picture, which presents evidence that primary schools are clearly improving in their performance, while secondaries are remaining stationary or, in some cases, showing a slight degradation in performance.

This analysis does, however, only present a very high level overview of what is possible with this unique resource. Further analysis is possible at any level of comparison, from a national picture to regional analysis, and even consideration of different institutions in the same area. Since this analysis was performed, significantly more schools have engaged with the tool, almost 1,000 having now carried out a full profile. In addition, the tool is now being piloted in the US and Australia, through the Generation Safe project. Once the tool is in full use in these regions, detailed analysis of international performance will be possible for the first time. This presents some exciting opportunities for understanding how schools internationally engage with online safety and ensure protection of their pupils, staff and wider community.

Edition and production
Name of the publication: eLearning Papers
ISSN: 1887-1542
Edited by: P.A.U. Education, S.L.
Postal address: c/Muntaner 262, 3r, 08021 Barcelona (Spain)
Phone: +34 933 670 400

Copyrights: The texts published in this journal, unless otherwise indicated, are subject to a Creative Commons Attribution-Noncommercial-NoDerivativeWorks 3.0 Unported licence. They may be copied, distributed and broadcast provided that the author and the e-journal that publishes them, eLearning Papers, are cited. Commercial use and derivative works are not permitted. The full licence can be consulted at creativecommons.org/licenses/by-nc-nd/3.0/