Understanding online audiences
Planning and implementing research into online audiences

 Creating Capacity 19 June 2012


         Martin Bazley
 Online experience consultant
  Martin Bazley & Associates
Martin Bazley
Previously
• Teaching (7 yrs)
• Science Museum, London,
  Internet Projects (7yrs)
• E-Learning Officer, MLA South East (3yrs)
Martin Bazley
• Current
• Vice Chair of Digital Learning Network
  DLNET
• Developing online resources, websites,
  user testing, evaluation, training,
  consultancy…
  Martin Bazley & Associates
  www.martinbazley.com

 Slides and notes available afterwards
Note to self: check stats tomorrow to see if anyone looked up the website

www.martinbazley.com
How can we get a sense of who our online
visitors are and what they do with our online
content?
How do we gather data to help us improve
what we do?

How do we measure success from the
user's point of view, and against our own
objectives and constraints?

For example, how do we justify investment (or
lack of it) in social networks etc?
Reasons for doing audience research:
               Evaluation
• Did your project/product/service do what
  you wanted it to do?
• Provide information for stakeholders
• Gauge audience satisfaction
Reasons for doing audience research:
               Promotion
• Improve your offer for your target
  audiences
• Increase usage
• Widen access
Reasons for doing audience research:
                 Planning
• Inform development of a new
  product/service
• Inform business planning
• Prove interest in a related activity
Tools available

• Qualitative – focus groups, “free text”
  questions in surveys, interviews
• Quantitative – web statistics, “multiple
  choice” questions in surveys, visitor
  tracking
• Observational – user testing,
  ethnographic
Define audience research goal → Plan methodology → Collect data →
Analyse data → Use results to guide changes → back to Define audience research goal
When to evaluate or test and why

• Before funding approval – project planning

• Post-funding - project development

• Post-project – summative evaluation
Testing is an iterative process

Testing isn’t something you do once

Make something
 => test it
    => refine it
        => test it again
Before funding – project planning
• *Evaluation of other websites
  – Who for? What for? How use it? etc
  – awareness raising: issues, opportunities
  – contributes to market research              Research
  – possible elements, graphic feel etc

• *Concept testing
  – check idea makes sense with audience
  – reshape project based on user feedback
                                               Focus group
Post-funding - project development
• *Concept testing
  – refine project outcomes based on
    feedback from intended users
                                              Focus group

• Refine website structure
  – does it work for users?
                               One-to-one tasks

• *Evaluate initial look and feel
  – graphics, navigation etc

                                              Focus group
Post-funding - project development 2
• *Full evaluation of a draft working
  version
    – usability AND content: do activities work, how
      engaging is it, what else could be offered, etc




Observation of actual use of website
by intended users,
using it for intended purpose,
in intended context – workplace, classroom, library, home, etc
Post-funding - project development 3

• Acceptance testing of ‘finished’
  website
  – last minute check, minor corrections only
  – often offered by web developers



• Summative evaluation
  – report for funders, etc
  – learn lessons at project level for next time
Website evaluation and testing
Need to think ahead a bit:
  – what are you trying to find out?

  – how do you intend to test it?

  – why? what will you do as a result?



  The Why? should drive this process
Evaluating online learning
resources in the classroom
Martin Bazley
Online experience consultant

Key point:

For a site designed for schools, the most effective user testing
observations will be made in a real classroom situation
National Archives Moving Here project


 For teachers of 8 – 14 yr olds

 History, Geography and Citizenship



 Features: Interactives, activity sheets, audio and video clips
Moving Here – Schools
For 8 – 14 yr olds studying History, Geography and Citizenship
Features: interactives, activity sheets, audio and video clips
1. Preliminary testing sessions – conventional user testing with teachers (at TNA)

2. In-class testing – teachers used the Moving Here Schools site with pupils in their own classrooms

This meant sitting at the back of the classroom
Evaluation: 2-phase approach
Site ready in parts – but not too ready:
The environment had a significant
 impact on how the site was
 used.

The class dynamic within the
 different groups contributed to
 how much the students learned.
The environment and social dynamics
In-class testing picked up elements not present in conventional user testing.

Teachers in preliminary user testing did not spot some problems until actually in the classroom. For example…
Interactive activities:
looked big enough when viewed on a screen nearby…

…but text/images too small for some children to see from the back of the class…

…so interactives needed to be viewable full-screen
Only spotted during in-class testing:



…so interactives needed to be viewable full-screen
Content:
when students tried to read text out loud, teachers realised some text was too difficult or complex

Activity sheets:
some sheets did not have spaces for students to put their names - caused confusion when printing 30 at same time…
Manchester Art Gallery art interactive


 For teachers of 8 – 11 yr olds, and for pupils

 History, Art and Citizenship



 Features: interactive with built in video, quiz, etc,

 plus activity sheets and background info
'This classroom user testing is all very well, but...'

How can you see everything in a class of 30 children – don't you miss things?
    You see things in a classroom that don't arise in one-to-one testing
    They are the real…
'This classroom user testing is all very well, but...'

Doesn't using a specific class with particular needs skew the results?
  » For example, low ability, poor English, equipment not working, behaviour issues, etc - are results as reliable as those in a 'neutral' environment?
  » 'Neutral environment'? – no such thing - any test will be subjective, and in any case:
  » Testing is to make the website work well in the classroom - need to see effects of factors like those.
'This classroom user testing is all very well, but...'

Can't my Web developer do the testing for us?
  » best not to use external developer to do user testing - conflict of interest
  » also likely to focus more on the technical aspects of the site than on effect on the teacher and pupils
  » observe classes yourself
'This classroom user testing is all very well, but...'

I don't have the time or budget to do this!
  » need cost no more than conventional user testing: one person could attend a one-hour class session in a school, giving the teacher the same small token payment
  » This programme had evaluation built into the project: 6.7% of total Schools site budget
  » Allow 5-10% of total project budget for user testing   => videos
Video clips
• Moving Here key ideas not lesson plans
  etc http://www.vimeo.com/18888798
• http://www.vimeo.com/18892401 Lesson
  starter
• Time saver
  http://www.vimeo.com/18867252
User test early

Testing one user early on in the project…

…is better than testing 50 near the end
Two usability testing techniques

“Get it” testing
- do they understand the purpose, how it
  works, etc

Key task testing
- ask the user to do something, watch how
  well they do

Ideally, do a bit of each, in that order
User testing – who should do it?
• The worst person to conduct (or interpret)
  user testing of your own site is…
  – you!
• Beware of hearing what you want to
  hear…
• Useful to have an external viewpoint
• First 5mins in a genuine setting tells you
  80% of what’s wrong with the site
Strengths and weaknesses of different data
           gathering techniques
Data gathering techniques
User testing
 - early in development and again near end
Online questionnaires
 – emailed to people or linked from website
Focus groups
 - best near beginning of project, or at
 redevelopment stage
Visitor surveys
 - link online and real visits
Web stats
 - useful for long term trends /events etc
Need to distinguish between:

Diagnostics
  – making a project or service better

Reporting
– to funders, or for advocacy
Online questionnaires
(+) once set up they gather numerical and
  qualitative data with no further effort –
   given time can build up large datasets
(+) the datasets can be easily exported and
  manipulated, can be sampled at various times,
  and structured queries can yield useful results
(–) respondents are self-selected and this will skew
  results – best to compare with similar data from
  other sources, like visitor surveys
(–) the number and nature of responses may
  depend on how the online questionnaire is
  displayed and promoted on the website
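
To illustrate the "easily exported and manipulated" point, here is a minimal sketch of querying a questionnaire export with pandas. The file name and column names (submitted_at, age_group, satisfaction) are assumptions for illustration, not from any particular survey tool:

```python
# Minimal sketch: querying an exported questionnaire dataset.
# Assumes a hypothetical CSV with columns: submitted_at, age_group, satisfaction (1-5).
import pandas as pd

responses = pd.read_csv("questionnaire_export.csv", parse_dates=["submitted_at"])

# Sample a particular period, e.g. responses gathered since a redesign went live.
recent = responses[responses["submitted_at"] >= "2012-01-01"]

# A structured query: response count and mean satisfaction per audience segment.
summary = recent.groupby("age_group")["satisfaction"].agg(["count", "mean"])
print(summary.sort_values("mean", ascending=False))
```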
Focus groups
(+) can explore specific issues in more
  depth, yielding rich feedback
(+) possible to control participant
  composition to ensure a representative sample
(–) comparatively time-consuming
  (expensive) to organise and analyse
(–) yield qualitative data only - small
  numbers mean numerical comparisons are
  unreliable
Visitor surveys
(+) possible to control participant
  composition to ensure a representative sample
(–) comparatively time-consuming
  (expensive) to organise and analyse
(–) responses can be affected by various
  factors including interviewer, weather on
  the day, day of the week, etc, reducing
  validity of numerical comparisons between
  museums
Web stats
(+) Easy to gather data – can decide what
  to do with it later
(+) Person-independent data generated - it
  is the interpretation, rather than the data
  themselves, which is subjective. This
  means others can review the same data
  and verify or amend initial conclusions
  reached
Web stats
(–) Different systems generate different
  data for the same web activity – for
  example no of unique visits measured via
  Google Analytics is generally lower than
  that derived via server log files
(–) Metrics are complicated and require
  specialist knowledge to appreciate them
  fully
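
One way to see why the two figures rarely agree is to derive a rough visitor count straight from a server log. The sketch below assumes a simplified common-log-format file (access.log is a hypothetical name) and counts distinct IP addresses per day - a much cruder notion of "unique visitor" than the cookie-based counting Google Analytics uses, which is one reason log-derived numbers usually come out higher:

```python
# Minimal sketch: distinct IP addresses per day from a common-log-format file.
# Not equivalent to Google Analytics' cookie-based unique visitors - shared IPs,
# proxies and bots all inflate this count, which is one source of the discrepancy.
from collections import defaultdict

daily_ips = defaultdict(set)

with open("access.log") as log:
    for line in log:
        parts = line.split()
        if len(parts) < 4:
            continue
        ip = parts[0]
        # Timestamp field looks like [19/Jun/2012:10:15:32 +0000]; keep the date part.
        date = parts[3].lstrip("[").split(":")[0]
        daily_ips[date].add(ip)

for date, ips in daily_ips.items():
    print(date, len(ips))
```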
Web stats
(–) As the amount of off-website web
  activity increases (e.g. Web 2.0 style
  interactions) the validity of website stats
  decreases, especially for reporting
  purposes, but also for diagnostics
 (–) Agreeing a common format for
  presentation of data and analysis requires
  collaborative working to be meaningful
More information / advice /
          ideas


     Martin Bazley
     0780 3580 737
  www.martinbazley.com
SCA guidance
http://sca.jiscinvolve.org/wp/audience-publications/

                      Good overview
                   Step by step approach



                 Culture 24 Let’s Get Real
            http://weareculture24.org.uk/projects/action-research/
Crit room
‘simulated user testing’
Crit room protocol
Simulating user testing – usually one-to-one
  in quiet room
No one (especially site stakeholders) other
  than the tester says anything for the first
  part of the session
In this simulation we will focus on
- Look and feel of site
- Usability
- Content
Crit room websites
Web stats
Google Analytics
Short intro videos:
http://services.google.com/analytics/breeze/en/ga_intro/index.html
Focus on trends rather than absolute values
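
A sketch of what "focus on trends" can mean in practice: smooth weekly figures with a rolling mean and compare like-for-like periods, rather than reading too much into any single number. The file and column names (weekly_visits.csv with week and visits) are assumptions for illustration:

```python
# Minimal sketch: reading the trend in weekly visits rather than raw values.
# Assumes a hypothetical CSV export with columns: week, visits.
import pandas as pd

weekly = pd.read_csv("weekly_visits.csv", parse_dates=["week"]).set_index("week")

# A 4-week rolling mean smooths out one-off spikes (press coverage, holidays).
weekly["trend"] = weekly["visits"].rolling(window=4).mean()
print(weekly.tail(12))

# Compare like with like: the same quarter this year against last year.
this_q = weekly.loc["2012-04":"2012-06", "visits"].sum()
last_q = weekly.loc["2011-04":"2011-06", "visits"].sum()
print(f"Year-on-year change: {(this_q - last_q) / last_q:+.1%}")
```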
The ‘long tail’




An example of a power law graph showing popularity
ranking. To the right is the long tail; to the left are the
few that dominate. Notice that the areas of both
regions match. [Wikipedia: Long Tail]
The ‘long tail’




The tail becomes bigger and longer in new markets (depicted in
red). In other words, whereas traditional retailers have focused
on the area to the left of the chart, online bookstores derive
more sales from the area to the right.[Wikipedia: Long Tail]
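
To make the long-tail idea concrete, here is a small illustrative calculation with synthetic figures (a Zipf-style 1/rank distribution is an assumption, not data from any real site), showing how total page views split between a handful of popular pages and the long tail of rarely viewed ones:

```python
# Minimal sketch: how page views split under a Zipf-like (power-law) distribution.
# Purely synthetic figures for illustration.
N = 10_000                                         # pages on the site
views = [1000 / rank for rank in range(1, N + 1)]  # views proportional to 1/rank

total = sum(views)
head = sum(views[:100])   # the 100 most popular pages
tail = total - head       # everything else: the long tail

print(f"Head (top 100 pages): {head / total:.0%} of all views")
print(f"Long tail ({N - 100} pages): {tail / total:.0%} of all views")
# With these assumptions the two areas come out roughly equal,
# echoing the point in the caption above.
```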
The dashboard
• An overview of key metrics
• Can be customised for quick views
• To the left is the main navigation
• Detail can be per day, week or month
Setting time scale
• Click on the date range top right
• A panel opens with options
• Select by calendar or timeline
• Two periods can be compared
Visitors section
• Visits equate to sessions on site
• Unique visitors are individual people
• Non-human traffic is excluded
• You can segment the visitors by type
Map overlay
• See where visitors came from
• Google stats are reliable
• You can’t zoom in much
• Compare clusters to population of UK
Traffic Sources
• 3 types - direct, search engine and link
• Can tell you a lot about audiences
• Referrers are sites that link to you
• Approx 50% search is common
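
As a rough sketch of splitting traffic into the three types, the snippet below classifies visits by referrer. The export format (visits.csv with a referrer column) and the short search-engine list are assumptions; Google Analytics does this classification for you, this just shows the idea:

```python
# Minimal sketch: classifying visits as direct / search / referral by referrer.
# Assumes a hypothetical CSV export with a "referrer" column (blank = direct).
from urllib.parse import urlparse
import pandas as pd

visits = pd.read_csv("visits.csv")

SEARCH_ENGINES = ("google", "bing", "yahoo")   # illustrative, not exhaustive

def classify(referrer):
    if not isinstance(referrer, str) or not referrer.strip():
        return "direct"
    host = urlparse(referrer).netloc.lower()
    if any(engine in host for engine in SEARCH_ENGINES):
        return "search"
    return "referral"

# Share of each traffic type - around half being search is common.
print(visits["referrer"].apply(classify).value_counts(normalize=True))
```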
Keywords
• Find out what people searched for
• Keyword clusters indicate audiences
• Always check the bounce rate
• But take it with a pinch of salt!
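
A hedged sketch of "always check the bounce rate" for keywords: with a hypothetical export of keyword-level sessions (the keyword and bounced columns are assumptions), the bounce rate is just the share of single-page visits per keyword:

```python
# Minimal sketch: bounce rate per search keyword from a hypothetical export.
# Assumes columns: keyword, bounced (True if the visitor left after one page).
import pandas as pd

sessions = pd.read_csv("keyword_sessions.csv")

# Mean of a boolean column is the proportion of True values, i.e. the bounce rate.
by_keyword = sessions.groupby("keyword")["bounced"].agg(visits="count", bounce_rate="mean")

# High-volume keywords with high bounce rates suggest searchers did not find
# what they expected - but, as above, take it with a pinch of salt.
print(by_keyword.sort_values("visits", ascending=False).head(20))
```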
Content section
• See how each page performs
• Follow navigation from page to page
• Find out where people enter and exit
• Look for unusual patterns
Content Detail
• Get key metrics for just this page
• Check keywords that got people here
Site overlay
• See clicks on links over a page
• %s are proportion of clicks for this page
• Values are small when lots of links
• Correlate visibility with popularity
Navigation summary
• In the middle is the page
• Entrances and previous pages on left
• Exits and next pages on right
• See which pathways are most used
Site search
• This represents internal search
• Needs a bit of configuration
• Track search terms people use in site
• See how long spent after search
Goals section
• Typically used to track sales
• But can be used for non-profit goals
• eg: track sign ups to newsletter
• or user contributions to site
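
Underneath, a goal is just a conversion rate: completions divided by visits. A minimal sketch with made-up figures (all three numbers are assumptions) for a newsletter sign-up goal and a user-contribution goal:

```python
# Minimal sketch: goals as conversion rates - completions / visits.
# All figures below are made up for illustration.
visits = 12_500              # sessions in the reporting period
goals = {
    "newsletter sign-up": 310,   # sessions reaching the "thanks for signing up" page
    "user contribution": 85,     # sessions where a story or image was submitted
}

for name, completions in goals.items():
    print(f"{name}: {completions / visits:.2%} conversion rate")
```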

More Related Content

Similar to Understanding online audiences creating capacity 19 june 2012

Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...Martin Bazley
 
Digital learning martin bazley gem conference swansea
Digital learning martin bazley gem conference swanseaDigital learning martin bazley gem conference swansea
Digital learning martin bazley gem conference swanseaMartin Bazley
 
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan BerlinExperience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan BerlinMad*Pow
 
Experience Research Best Practices
Experience Research Best PracticesExperience Research Best Practices
Experience Research Best PracticesDan Berlin
 
Bazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online ResourcesBazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online ResourcesMartin Bazley
 
090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview Presentation090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview PresentationMartin Bazley
 
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09Martin Bazley
 
Introduction to Marketing Research
Introduction to Marketing ResearchIntroduction to Marketing Research
Introduction to Marketing ResearchSoumitra Roy
 
Digital learning: an overview
Digital learning: an overviewDigital learning: an overview
Digital learning: an overviewMartin Bazley
 
Developing Audience Insight: Arts and Entertainment Experience (Un)marketing
Developing Audience Insight: Arts and Entertainment Experience (Un)marketingDeveloping Audience Insight: Arts and Entertainment Experience (Un)marketing
Developing Audience Insight: Arts and Entertainment Experience (Un)marketingKelly Page
 
Pre-Launch Content Evaluation of an Animated Movie
Pre-Launch Content Evaluation of an Animated Movie    Pre-Launch Content Evaluation of an Animated Movie
Pre-Launch Content Evaluation of an Animated Movie Sanika Deshpande
 
Introduction to Usability Testing for Survey Research
Introduction to Usability Testing for Survey ResearchIntroduction to Usability Testing for Survey Research
Introduction to Usability Testing for Survey ResearchCaroline Jarrett
 
Learning-Based Evaluation of Visual Analytic Systems.
Learning-Based Evaluation of Visual Analytic Systems.Learning-Based Evaluation of Visual Analytic Systems.
Learning-Based Evaluation of Visual Analytic Systems.BELIV Workshop
 
Qualitative and quantitative analysis
Qualitative and quantitative analysisQualitative and quantitative analysis
Qualitative and quantitative analysisNellie Deutsch (Ed.D)
 
Aligning Learning Analytics with Classroom Practices & Needs
Aligning Learning Analytics with Classroom Practices & NeedsAligning Learning Analytics with Classroom Practices & Needs
Aligning Learning Analytics with Classroom Practices & NeedsSimon Knight
 
Formulation of the research probleme Dr. SC sharma
Formulation of the research probleme Dr. SC sharmaFormulation of the research probleme Dr. SC sharma
Formulation of the research probleme Dr. SC sharmaDr Kirpa Ram Jangra
 

Similar to Understanding online audiences creating capacity 19 june 2012 (20)

Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...Martin bazley evaluating digital learning resources leicester reduced for upl...
Martin bazley evaluating digital learning resources leicester reduced for upl...
 
Digital learning martin bazley gem conference swansea
Digital learning martin bazley gem conference swanseaDigital learning martin bazley gem conference swansea
Digital learning martin bazley gem conference swansea
 
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan BerlinExperience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
Experience Research Best Practices - UX Meet Up Boston 2013 - Dan Berlin
 
Experience Research Best Practices
Experience Research Best PracticesExperience Research Best Practices
Experience Research Best Practices
 
Bazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online ResourcesBazley Developing And Evaluating Online Resources
Bazley Developing And Evaluating Online Resources
 
090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview Presentation090511 Appleby Magna Overview Presentation
090511 Appleby Magna Overview Presentation
 
Main
MainMain
Main
 
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
Understanding Online Audiences Bazley Ma Wonder Web 10 Jun09
 
Part III. Project evaluation
Part III. Project evaluationPart III. Project evaluation
Part III. Project evaluation
 
Chapter 2.pptx
Chapter 2.pptxChapter 2.pptx
Chapter 2.pptx
 
Dlf 2012
Dlf 2012Dlf 2012
Dlf 2012
 
Introduction to Marketing Research
Introduction to Marketing ResearchIntroduction to Marketing Research
Introduction to Marketing Research
 
Digital learning: an overview
Digital learning: an overviewDigital learning: an overview
Digital learning: an overview
 
Developing Audience Insight: Arts and Entertainment Experience (Un)marketing
Developing Audience Insight: Arts and Entertainment Experience (Un)marketingDeveloping Audience Insight: Arts and Entertainment Experience (Un)marketing
Developing Audience Insight: Arts and Entertainment Experience (Un)marketing
 
Pre-Launch Content Evaluation of an Animated Movie
Pre-Launch Content Evaluation of an Animated Movie    Pre-Launch Content Evaluation of an Animated Movie
Pre-Launch Content Evaluation of an Animated Movie
 
Introduction to Usability Testing for Survey Research
Introduction to Usability Testing for Survey ResearchIntroduction to Usability Testing for Survey Research
Introduction to Usability Testing for Survey Research
 
Learning-Based Evaluation of Visual Analytic Systems.
Learning-Based Evaluation of Visual Analytic Systems.Learning-Based Evaluation of Visual Analytic Systems.
Learning-Based Evaluation of Visual Analytic Systems.
 
Qualitative and quantitative analysis
Qualitative and quantitative analysisQualitative and quantitative analysis
Qualitative and quantitative analysis
 
Aligning Learning Analytics with Classroom Practices & Needs
Aligning Learning Analytics with Classroom Practices & NeedsAligning Learning Analytics with Classroom Practices & Needs
Aligning Learning Analytics with Classroom Practices & Needs
 
Formulation of the research probleme Dr. SC sharma
Formulation of the research probleme Dr. SC sharmaFormulation of the research probleme Dr. SC sharma
Formulation of the research probleme Dr. SC sharma
 

More from Martin Bazley

Digital learning resources
Digital learning resourcesDigital learning resources
Digital learning resourcesMartin Bazley
 
Digital supporting pre post visit and classroom martin bazley upload version
Digital supporting pre post visit and classroom martin bazley upload versionDigital supporting pre post visit and classroom martin bazley upload version
Digital supporting pre post visit and classroom martin bazley upload versionMartin Bazley
 
Understanding online audiences ara conf 28 aug 15 martin bazley upload version
Understanding online audiences ara conf 28 aug 15 martin bazley upload versionUnderstanding online audiences ara conf 28 aug 15 martin bazley upload version
Understanding online audiences ara conf 28 aug 15 martin bazley upload versionMartin Bazley
 
E learning getting started with online learning reduced for uploading
E learning getting started with online learning reduced for uploadingE learning getting started with online learning reduced for uploading
E learning getting started with online learning reduced for uploadingMartin Bazley
 
Digital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploadingDigital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploadingMartin Bazley
 
Developing online learning resources for schools on a budget
Developing online learning resources for schools on a budgetDeveloping online learning resources for schools on a budget
Developing online learning resources for schools on a budgetMartin Bazley
 
Creating online learning resources for schools for uploading
Creating online learning resources for schools   for uploadingCreating online learning resources for schools   for uploading
Creating online learning resources for schools for uploadingMartin Bazley
 
Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...Martin Bazley
 
Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11Martin Bazley
 
Creating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced imagesCreating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced imagesMartin Bazley
 
10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley boden10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley bodenMartin Bazley
 
MyLearning and funding ukmw10
MyLearning and funding ukmw10MyLearning and funding ukmw10
MyLearning and funding ukmw10Martin Bazley
 
Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010Martin Bazley
 
Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010Martin Bazley
 
Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010Martin Bazley
 
Grace Kimble NHM Intro
Grace Kimble NHM IntroGrace Kimble NHM Intro
Grace Kimble NHM IntroMartin Bazley
 
Linda Spurdle Pre Raphaelite Online Resource
Linda Spurdle Pre Raphaelite Online ResourceLinda Spurdle Pre Raphaelite Online Resource
Linda Spurdle Pre Raphaelite Online ResourceMartin Bazley
 
Fitzwilliam Museum Online Exhibitions
Fitzwilliam Museum Online ExhibitionsFitzwilliam Museum Online Exhibitions
Fitzwilliam Museum Online ExhibitionsMartin Bazley
 

More from Martin Bazley (20)

Digital learning resources
Digital learning resourcesDigital learning resources
Digital learning resources
 
Digital supporting pre post visit and classroom martin bazley upload version
Digital supporting pre post visit and classroom martin bazley upload versionDigital supporting pre post visit and classroom martin bazley upload version
Digital supporting pre post visit and classroom martin bazley upload version
 
Understanding online audiences ara conf 28 aug 15 martin bazley upload version
Understanding online audiences ara conf 28 aug 15 martin bazley upload versionUnderstanding online audiences ara conf 28 aug 15 martin bazley upload version
Understanding online audiences ara conf 28 aug 15 martin bazley upload version
 
E learning getting started with online learning reduced for uploading
E learning getting started with online learning reduced for uploadingE learning getting started with online learning reduced for uploading
E learning getting started with online learning reduced for uploading
 
Digital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploadingDigital technology for museum learning oxford 2 mar 12 reduced for uploading
Digital technology for museum learning oxford 2 mar 12 reduced for uploading
 
Developing online learning resources for schools on a budget
Developing online learning resources for schools on a budgetDeveloping online learning resources for schools on a budget
Developing online learning resources for schools on a budget
 
Creating online learning resources for schools for uploading
Creating online learning resources for schools   for uploadingCreating online learning resources for schools   for uploading
Creating online learning resources for schools for uploading
 
Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...Martin Bazley - using simple technologies with different audiences (reduced f...
Martin Bazley - using simple technologies with different audiences (reduced f...
 
Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11Martin bazley Creating effective content 15 Mar 11
Martin bazley Creating effective content 15 Mar 11
 
Creating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced imagesCreating online learning resources royal collection 18 jan 2011 reduced images
Creating online learning resources royal collection 18 jan 2011 reduced images
 
10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley boden10 11 25 univ of brighton usability and evaluation module shelley boden
10 11 25 univ of brighton usability and evaluation module shelley boden
 
MyLearning and funding ukmw10
MyLearning and funding ukmw10MyLearning and funding ukmw10
MyLearning and funding ukmw10
 
Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010Developing online resources fleet air arm museum 18 oct 2010
Developing online resources fleet air arm museum 18 oct 2010
 
Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010Online exhibitions southampton 22 may 2010
Online exhibitions southampton 22 may 2010
 
Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010Writing for the web highpoint leicester may 2010
Writing for the web highpoint leicester may 2010
 
Peter Pavement
Peter PavementPeter Pavement
Peter Pavement
 
Grace Kimble NHM Intro
Grace Kimble NHM IntroGrace Kimble NHM Intro
Grace Kimble NHM Intro
 
Jane Devine Mejia
Jane Devine MejiaJane Devine Mejia
Jane Devine Mejia
 
Linda Spurdle Pre Raphaelite Online Resource
Linda Spurdle Pre Raphaelite Online ResourceLinda Spurdle Pre Raphaelite Online Resource
Linda Spurdle Pre Raphaelite Online Resource
 
Fitzwilliam Museum Online Exhibitions
Fitzwilliam Museum Online ExhibitionsFitzwilliam Museum Online Exhibitions
Fitzwilliam Museum Online Exhibitions
 

Recently uploaded

Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docxPoojaSen20
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Celine George
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxRoyAbrique
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfakmcokerachita
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting DataJhengPantaleon
 

Recently uploaded (20)

Staff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSDStaff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSD
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docx
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdf
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
 

Understanding online audiences creating capacity 19 june 2012

  • 1. Understanding online audiences Planning and implementing research into online audiences Creating Capacity 19 June 2012 Martin Bazley Online experience consultant Martin Bazley & Associates
  • 2. Martin Bazley Previously • Teaching (7 yrs) • Science Museum, London, Internet Projects (7yrs) • E-Learning Officer, MLA South East (3yrs)
  • 3. Martin Bazley • Current • Vice Chair of Digital Learning Network DLNET • Developing online resources, websites, user testing, evaluation, training, consultancy… Martin Bazley & Associates www.martinbazley.com Slides and notes available afterwards
  • 4. Note to self: check stats tomorrow to see if anyone looked up the website www.martinbazley.com
  • 5. How can we get a sense of who our online visitors are and what they do with our online content? How do we gather data to help us improve what we do? How do we measure success from the user's point of view, and against our own objectives and constraints? For example, how justify investment (or lack of it) in social networks etc?
  • 6. Reasons for doing audience research: Evaluation • Did your project/product/service do what you wanted it to do? • Provide information for stakeholders • Gauge audience satisfaction
  • 7. Reasons for doing audience research: Promotion • Improve your offer for your target audiences • Increase usage • Widen access
  • 8. Reasons for doing audience research: Planning • Inform development of a new product/service • Inform business planning • Prove interest in a related activity
  • 9. Tools available • Qualitative – focus groups, “free text” questions in surveys, interviews • Quantitative – web statistics, “multiple choice” questions in surveys, visitor tracking • Observational – user testing, ethnographic
  • 10. Define audience Plan methodology research goal Use results to guide Collect data changes Analyse data
  • 11. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 12. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 13. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 14. Define audience research Plan methodology goal Use results to guide Collect data changes Analyse data
  • 15. Define audience Plan methodology research goal Use results to guide Collect data changes Analyse data
  • 16. When to evaluate or test and why • Before funding approval – project planning • Post-funding - project development • Post-project – summative evaluation
  • 17. Testing is an iterative process Testing isn’t something you do once Make something => test it => refine it => test it again
  • 18. Before funding – project planning • *Evaluation of other websites – Who for? What for? How use it? etc – awareness raising: issues, opportunities – contributes to market research Research – possible elements, graphic feel etc • *Concept testing – check idea makes sense with audience – reshape project based on user feedback Focus group
  • 19.
  • 20. Post-funding - project development • *Concept testing – refine project outcomes based on feedback from intended users Focus group • Refine website structure – does it work for users? One-to-one tasks • *Evaluate initial look and feel – graphics,navigation etc Focus group
  • 21.
  • 22.
  • 23.
  • 24.
  • 25. Post-funding - project development 2 • *Full evaluation of a draft working version – usability AND content: do activities work, how engaging is it, what else could be offered, etc Observation of actual use of website by intended users, using it for intended purpose, in intended context – workplace, classroom, library, home, etc
  • 26.
  • 27.
  • 28.
  • 29.
  • 30.
  • 31.
  • 32.
  • 33.
  • 34.
  • 35.
  • 36.
  • 37. Post-funding - project development 3 • Acceptance testing of ‘finished’ website – last minute check, minor corrections only – often offered by web developers • Summative evaluation – report for funders, etc – learn lessons at project level for next time
  • 38. Website evaluation and testing Need to think ahead a bit: – what are you trying to find out? – how do you intend to test it? – why? what will do you do as a result? The Why? should drive this process
  • 39. Evaluating online learning resources in the classroom M a r t in B a z le y O n lin e e x p e r ie n c e c o n s u lt a n t
  • 40. Key point: f o r a s it e d e s ig n e d f o r s c h o o ls , t h e m o s t e f f e c t iv e u s e r t e s t in g o b s e r v a t io n s w ill b e m a d e in a r e a l c la s s r o o m s it u a t io n
  • 41. National Archives Moving Here project For teachers of 8 – 14 yr olds History Geography and Citizenship Features: Interactives, activity sheets, audio and video clips
  • 42. M o v in g H e r e For 8 – 14 yr olds studying: l s : S c hoo History Geography and Citizenship Features: Interactives, activity sheets, audio and video clips
  • 43. 1. p r e l i m i n a r y t e s t in g s e s s io n s – conventional user-testing with teachers (at TNA)
  • 44. 2 . in -c la s s t e s t in g – teachers used the Moving Here Schools site with pupils in their own classrooms This meant sitting at the back of the classroom
  • 46. Site ready in parts – but not too ready:
  • 47. The environment had a significant impact on how the site was used. The class dynamic within the different groups contributed to how much the students learned.
  • 48. The environment and social dynamics The environment had a significant impact on how the site was used. The class dynamic within the different groups contributed to how much the students learned.
  • 49. in-class testing picked up elements not there in conventional user testing. teachers in preliminary user testing did not spot some problems until actually in the classroom. For example…
  • 50. in t e r a c t iv e a c t iv it ie s : looked big enough when viewed on a screen nearby…
  • 51.
  • 52. … but text/images too small for some children to see from the back of the class…
  • 53.
  • 54. …so interactives needed to be viewable full- screen
  • 55. Only spotted during in-class testing: …so interactives needed to be viewable full- screen
  • 56. c o nte nt: when students tried to read text out loud, teachers realised some text was too difficult or complex
  • 57. a c t iv it y s h e e t s : some sheets did not have spaces for students to put their names - caused confusion when printing 30 at same time…
  • 58. Manchester Art Gallery art interactive For teachers of 8 – 11 yr olds, and for pupils History Art and Citizenship Features: interactive with built in video, quiz, etc, plus activity sheets and background info
  • 59.
  • 64. 'This classroom user testing is all very well, but...' Ho w c a n y o u s e e e v e r y t h in g in a c l a s s o f 3 0 c h il d r e n – d o n 't y o u mis s t h in g s ? Y o u s e e t h in g s in a c l a s s r o o m t h a t d o n 't a r is e in o n e - t o - o n e t e s t in g Th e y a r e t h e r e a l
  • 65. 'This classroom user testing is all very well, but...' Ho w c a n y o u s e e e v e r y t h in g in a c l a s s o f 3 0 c h il d r e n – d o n 't y o u mis s t h in g s ? Y o u s e e t h in g s in a c l a s s r o o m t h a t d o n 't a r is e in o n e - t o - o n e t e s t in g Th e y a r e t h e r e a l
  • 66. 'This classroom user testing is all very well, but...' but...' 'This classroom user testing is all very well, Do e s n 't u s in g a s p e c if ic c l a s s wit h p a r t ic u l a r needs s kew t he r es ul t s ? » F o r e x a mp l e , l o w a b il it y , p o o r E n g l is h , e q u ip me n t n o t wo r k in g , b e h a v io u r is s u e s , e t c - a r e r e s u l t s a s r e l ia b l e a s t h o s e in a 'n e u t r a l ' e n v ir o n me n t ? » ‘n e u t r a l e n v ir o n me n t ’ ? – n o s u c h t h in g - a n y t e s t wil l b e s u b j e c t iv e , a n d in a n y c as e: » T e s t in g is t o ma k e we b s it e wo r k we l l in c l a s s r o o m, - n e e d t o s e e e f f e c t s o f f a c t o r s l ik e t h o s e .
  • 67. 'This classroom user user testing isall very well, but...' 'This classroom testing is all very well, but...' Do e s n 't u s in g a s p e c if ic c l a s s wit h p a r t ic u l a r needs s kew t he r es ul t s ? » F o r e x a mp l e , l o w a b il it y , p o o r E n g l is h , e q u ip me n t n o t wo r k in g , b e h a v io u r is s u e s , e t c - a r e r e s u l t s a s r e l ia b l e a s t h o s e in a 'n e u t r a l ' e n v ir o n me n t ? » ‘n e u t r a l e n v ir o n me n t ’ ? – n o s u c h t h in g - a n y t e s t wil l b e s u b j e c t iv e , a n d in a n y c as e: » T e s t in g is t o ma k e we b s it e wo r k we l l in c l a s s r o o m, - n e e d t o s e e e f f e c t s o f f a c t o r s l ik e t h o s e .
  • 68. 'This classroom user user testing isall very well, but...' 'This classroom testing is all very well, but...' C a n 't my W e b d e v e l o p e r d o t h e t e s t in g f o r u s ? » bes t not t o us e ext er nal de v e l ope r t o do us e r t e s t in g - c o n f l ic t o f in t e r e s t » a l s o l ik e l y t o f o c u s mo r e o n t h e t e c h n ic a l a s p e c t s o f t h e s it e t h a n o n e f f e c t o n t h e t e a c h e r a n d p u p il s . » obs er v e c l as s es y our s el f
  • 69. 'This classroom user user testing isall very well, but...' 'This classroom testing is all very well, but...' C a n 't my W e b d e v e l o p e r d o t h e t e s t in g f o r u s ? » bes t not t o us e ext er nal de v e l ope r t o do us e r t e s t in g - c o n f l ic t o f in t e r e s t » a l s o l ik e l y t o f o c u s mo r e o n t h e t e c h n ic a l a s p e c t s o f t h e s it e t h a n o n e f f e c t o n t h e t e a c h e r a n d p u p il s . » v is it a c l a s s r o o m y o u r s e l f
  • 70. 'This classroom user testing is all very well, but...' but...' 'This classroom user testing is all very well, I d o n 't h a v e t h e t ime o r b u d g e t t o d o t h is ! » n e e d c o s t n o mo r e t h a n c o n v e n t io n a l u s e r t e s t in g . o n e p e r s o n c o u l d a t t e n d a o n e - h o u r c l a s s s e s s io n in a s c h o o l , g iv in g t h e t e a c h e r t h e s a me s ma l l t o k e n p a y me n t » T h is p r o g r a mme h a d e v a l u a t io n b u il t in t o p r o j e c t : 6 .7 % o f t o t a l S c h o o l s s it e b u d g e t . » Al l o w 5 - 1 0 % o f t o t a l pr o j e c t b u d g e t f o r u s e r t e s t in g => videos
  • 71. Video clips • Moving Here key ideas not lesson plans etc http://www.vimeo.com/18888798 • http://www.vimeo.com/18892401 Lesson starter • Time saver http://www.vimeo.com/18867252 S
  • 72. User test early Testing one user early on in the project… …is better than testing 50 near the end
  • 73. Two usability testing techniques “Get it” testing - do they understand the purpose, how it works, etc Key task testing - ask the user to do something, watch how well they do Ideally, do a bit of each, in that order
  • 74.
  • 75. User testing – who should do it? • The worst person to conduct (or interpret) user testing of your own site is… – you! • Beware of hearing what you want to hear… • Useful to have an external viewpoint • First 5mins in a genuine setting tells you 80% of what’s wrong with the site
  • 76. Strengths and weaknesses of different data gathering techniques
  • 77.
  • 78.
  • 79.
  • 80.
  • 81. Data gathering techniques User testing - early in development and again near end Online questionnaires – emailed to people or linked from website Focus groups - best near beginning of project, or at redevelopment stage Visitor surveys - link online and real visits Web stats - useful for long term trends /events etc
  • 82. Need to distinguish between: Diagnostics – making a project or service better Reporting – to funders, or for advocacy
  • 83. Online questionnaires (+) once set up they gather numerical and qualitative data with no further effort – given time can build up large datasets (+) the datasets can be easily exported and manipulated, can be sampled at various times, and structured queries can yield useful results (–) respondents are self-selected and this will skew results – best to compare with similar data from other sources, like visitor surveys (–) the number and nature of responses may depend on how the online questionnaire is displayed and promoted on the website
  • 84. Focus groups (+) can explore specific issues in more depth, yielding rich feedback (+) possible to control participant composition to ensure representative (–) comparatively time-consuming (expensive) to organise and analyse (–) yield qualitative data only - small numbers mean numerical comparisons are unreliable
  • 85. Visitor surveys (+) possible to control participant composition to ensure representative (–) comparatively time-consuming (expensive) to organise and analyse (–) responses can be affected by various factors including interviewer, weather on the day, day of the week, etc, reducing validity of numerical comparisons between museums
  • 86. Web stats (+) Easy to gather data – can decide what to do with it later (+) Person-independent data generated - it is the interpretation, rather than the data themselves, which is subjective. This means others can review the same data and verify or amend initial conclusions reached
  • 87. Web stats (–) Different systems generate different data for the same web activity – for example no of unique visits measured via Google Analytics is generally lower than that derived via server log files (–) Metrics are complicated and require specialist knowledge to appreciate them fully
  • 88. Web stats (–) As the amount of off-website web activity increases (e.g. Web 2.0 style interactions) the validity of website stats decreases, especially for reporting purposes, but also for diagnostics (–) Agreeing a common format for presentation of data and analysis requires collaborative working to be meaningful
  • 89. More information / advice / ideas Martin Bazley 0780 3580 737 www.martinbazley.com
  • 90. SCA guidance http://sca.jiscinvolve.org/wp/audience-publications/ Good overview Step by step approach Culture 24 Let’s Get Real http://weareculture24.org.uk/projects/action-research/
• 92. Crit room protocol Simulating user testing – usually one-to-one in a quiet room No one (especially site stakeholders) other than the tester says anything for the first part of the session In this simulation we will focus on - Look and feel of site - Usability - Content
  • 96. Focus on trends rather than absolute values
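A small, invented Python example of reading the trend rather than the raw numbers: smoothing weekly visit counts with a moving average makes the underlying direction easier to see, even when individual weeks bounce around.

# Minimal sketch; figures invented.
weekly_visits = [1200, 900, 1500, 1100, 1800, 1600, 2100, 1900]

def moving_average(values, window=4):
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

print(moving_average(weekly_visits))
# Steadily rising smoothed values suggest growth despite week-to-week noise.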
  • 97. The ‘long tail’ An example of a power law graph showing popularity ranking. To the right is the long tail; to the left are the few that dominate. Notice that the areas of both regions match. [Wikipedia: Long Tail]
  • 98. The ‘long tail’ The tail becomes bigger and longer in new markets (depicted in red). In other words, whereas traditional retailers have focused on the area to the left of the chart, online bookstores derive more sales from the area to the right.[Wikipedia: Long Tail]
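As a rough, invented illustration of the same idea, a simple 1/rank popularity curve over 1,000 items splits views roughly evenly between the few top items and the long tail:

# Minimal worked example of the 'long tail'; numbers illustrative only.
ranks = range(1, 1001)                 # 1,000 items ranked by popularity
views = [1000 / r for r in ranks]      # simple 1/rank power law

head = sum(views[:25])                 # the few items that dominate
tail = sum(views[25:])                 # everything in the long tail
total = head + tail

print(f"top 25 items: {head / total:.0%} of views")
print(f"remaining 975 items: {tail / total:.0%} of views")
# The two shares come out roughly equal - the 'areas match' point above.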
  • 99. The dashboard • An overview of key metrics • To the left is the main navigation • Can be customised for quick views • Detail can be per day, week or month
  • 100. Setting time scale • Click on the date range top right • Select by calendar or timeline • A panel opens with options • Two periods can be compared
• 101. Visitors section • Visits equate to sessions on site • Non-human traffic is excluded • Unique visitors are individual people • You can segment the visitors by type
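The visits/unique visitors distinction can be sketched in a few lines of Python (timestamps invented; the 30-minute session timeout mirrors the common analytics convention, not a documented Google Analytics internal):

# Minimal sketch: the same person returning after a gap is a new visit
# (session) but not a new unique visitor.
from datetime import datetime, timedelta

hits = [
    ("visitor-a", datetime(2012, 6, 19, 9, 0)),
    ("visitor-a", datetime(2012, 6, 19, 9, 10)),   # same visit as above
    ("visitor-a", datetime(2012, 6, 19, 15, 0)),   # new visit, same visitor
    ("visitor-b", datetime(2012, 6, 19, 11, 0)),
]

TIMEOUT = timedelta(minutes=30)
visits, last_seen = 0, {}
for visitor, when in sorted(hits):
    if visitor not in last_seen or when - last_seen[visitor] > TIMEOUT:
        visits += 1
    last_seen[visitor] = when

print("visits (sessions):", visits)        # 3
print("unique visitors:", len(last_seen))  # 2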
  • 102. Map overlay • See where visitors came from • You can’t zoom in much • Google stats are reliable • Compare clusters to population of UK
  • 103. Traffic Sources • 3 types - direct, search engine and link • Referrers are sites that link to you • Can tell you a lot about audiences • Approx 50% search is common
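A minimal Python sketch of how that three-way split can be made from referrer data (referrer list and search-engine hosts are invented for the example):

# Classify each visit's referrer as direct, search or referral.
from urllib.parse import urlparse

SEARCH_ENGINES = {"www.google.com", "www.google.co.uk", "www.bing.com", "search.yahoo.com"}

def classify(referrer):
    if not referrer:
        return "direct"
    host = urlparse(referrer).netloc
    return "search" if host in SEARCH_ENGINES else "referral"

referrers = ["", "http://www.google.co.uk/search?q=migration+history",
             "http://weareculture24.org.uk/projects/", ""]
print([classify(r) for r in referrers])
# ['direct', 'search', 'referral', 'direct']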
  • 104. Keywords • Find out what people searched for • Always check the bounce rate • Keyword clusters indicate audiences • But take it with a pinch of salt!
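One invented Python example of why bounce rate matters when reading keywords: a keyword whose visits mostly view a single page is probably attracting the wrong audience.

# Minimal sketch; keywords and page counts invented.
from collections import defaultdict

# (search keyword that brought the visit, number of pages viewed in that visit)
visits = [
    ("victorians ks2", 5),
    ("victorians ks2", 3),
    ("moving here", 1),
    ("moving here", 1),
    ("moving here", 4),
]

pages = defaultdict(list)
for keyword, page_count in visits:
    pages[keyword].append(page_count)

for keyword, counts in pages.items():
    bounces = sum(1 for c in counts if c == 1)   # single-page visits
    print(keyword, f"bounce rate {bounces / len(counts):.0%}")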
  • 105. Content section • See how each page performs • Find out where people enter and exit • Follow navigation from page to page • Look for unusual patterns
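A short, invented Python sketch of how entrance and exit pages can be counted from per-visit page sequences, which is essentially what the Content report summarises:

# Minimal sketch; sessions invented.
from collections import Counter

sessions = [
    ["/", "/schools/", "/schools/victorians/"],
    ["/stories/", "/"],
    ["/schools/victorians/", "/schools/"],
]

entrances = Counter(s[0] for s in sessions)
exits = Counter(s[-1] for s in sessions)

print("entrances:", dict(entrances))
print("exits:", dict(exits))
# Pages with many entrances but quick exits are good candidates for a closer look.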
  • 106. Content Detail • Get key metrics for just this page • Check keywords that got people here
  • 107. Site overlay • See clicks on links over a page • Values are small when lots of links • %s are proportion of clicks for this page • Correlate visibility with popularity
  • 108. Navigation summary • In the middle is the page • Exits and next pages on right • Entrances and previous pages on left • See which pathways are most used
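The same per-visit sequences can be used to sketch a navigation summary for one page (again an invented example, not how Google Analytics computes it internally):

# Minimal sketch: previous and next pages around one chosen page.
from collections import Counter

sessions = [
    ["/", "/schools/", "/schools/victorians/"],
    ["/stories/", "/schools/", "/"],
    ["/", "/schools/"],
]

PAGE = "/schools/"
previous_pages, next_pages = Counter(), Counter()
for s in sessions:
    for i, page in enumerate(s):
        if page != PAGE:
            continue
        if i > 0:
            previous_pages[s[i - 1]] += 1
        if i + 1 < len(s):
            next_pages[s[i + 1]] += 1
        else:
            next_pages["(exit)"] += 1

print("previous pages:", dict(previous_pages))
print("next pages:", dict(next_pages))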
  • 109. Site search • This represents internal search • Track search terms people use in site • Needs a bit of configuration • See how long spent after search
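The 'bit of configuration' boils down to telling the tool which query parameter carries the internal search term. A hypothetical Python sketch (the parameter name 'q' is an assumption; check your own site's search URLs):

# Minimal sketch: pull internal search terms out of page URLs.
from urllib.parse import urlparse, parse_qs
from collections import Counter

page_urls = [
    "/search?q=victorians",
    "/search?q=windrush",
    "/search?q=victorians",
    "/schools/victorians/",   # not a search page - ignored
]

terms = Counter()
for url in page_urls:
    params = parse_qs(urlparse(url).query)
    if "q" in params:
        terms[params["q"][0].lower()] += 1

print(terms.most_common())
# [('victorians', 2), ('windrush', 1)]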
  • 110. Goals section • Typically used to track sales • eg: track sign ups to newsletter • But can be used for non-profit goals • or user contributions to site
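The underlying idea of a goal can be sketched quickly: define a 'success' page, such as a hypothetical newsletter thank-you page, and count the proportion of visits that reach it (figures invented):

# Minimal sketch of goal conversion; URLs and sessions invented.
GOAL_PAGE = "/newsletter/thank-you/"   # hypothetical goal URL

sessions = [
    ["/", "/newsletter/", GOAL_PAGE],
    ["/", "/schools/"],
    ["/stories/", "/newsletter/"],
    ["/", "/newsletter/", GOAL_PAGE],
]

conversions = sum(1 for s in sessions if GOAL_PAGE in s)
print(f"goal conversion rate: {conversions / len(sessions):.0%}")   # 50%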

Editor's Notes

  1. The Moving Here site, launched in 2003, is the product of collaboration between 30 local, regional and national archives, museums and libraries across the UK, headed by the National Archives. The site explores, records and illustrates why people came to England over the last 200 years, and what their experiences were and continue to be. It holds a database of on-line versions of 200,000 original documents and images recording migration history, all free to access for personal and educational use. The documents include photographs, personal papers, government documents, maps, and images of art objects, as well as a collection of sound recordings and video clips, all accessible through a search facility. The site also includes a Migration Histories section focusing on four communities – Caribbean, Irish, Jewish and South Asian – as well as a gallery of selected images from the collection, a section about tracing family roots, and a Stories section allowing users to submit stories and photographs about their own experiences of migration to England. The site was funded by the BLF (Big Lottery Fund). The Moving Here Schools site is a subsection of the greater Moving Here site, and was designed during a second phase (2005-07) of the Moving Here project. One aim of this phase is to ensure that stories of migration history are passed down to younger generations through schools. The Schools section therefore focuses on History, Citizenship and Geography for Key Stages 2 and 3 of the National Curriculum (ages 8 to 14), and includes four modules: The Victorians, Britain Since 1948, The Holocaust, and People and Places. Designed for use with an interactive whiteboard, the resources on Moving Here Schools include images and documents, audio and video clips, downloadable activity sheets, on-line interactive activities, a gallery of images, and links to stories of immigration experiences that have been collected by the Community Partnerships strand of the project. Funding for the Schools section is provided by HLF (Heritage Lottery Fund). The Schools site launches in March 2007 as a new section of the Moving Here site. Most on-line resource testing involves a potential user being observed in an environment chosen and closely controlled by an evaluator who guides the user through pre-set questions and assigned actions. Although this method of assessment can be useful in addressing top level issues and can give insight into some of the design, navigation and content changes that may need to be made, it does not replicate the conditions under which the Web site would normally be used. In this presentation we contend that the primary aim of testing Web sites for use in schools should be to capture feedback, not only on the usability, overall design, content and other aspects of the Web site itself, but also on the ways in which the Web site supports, or hinders, enjoyment and learning on the part of teachers and students in a real classroom environment. A range of issues, including group dynamics in the classroom, teachers' prior experience of using an electronic whiteboard, their background knowledge of the subject, and other issues, can all have an impact on the overall experience. The feedback generated through 'real life testing' – or 'habitat testing' – is rich and highly informative for refining the site to improve the end user experience – and it need not be expensive or overly time consuming. 
In this presentation we will discuss the expectations, challenges and opportunities that arose during 25 in-class testing sessions undertaken as part of our evaluation programme. We also make suggestions about how this type of testing can enhance Web site evaluation programmes, not only with regard to gaining feedback on specific resources, but also as an awareness-raising experience for museum practitioners.
  2. A comprehensive evaluation for Moving Here Schools was built into the project from the beginning, and was allocated £20,000 of the entire project's budget, which came to approximately 6.7 % of the £300,000 budget that was allotted to the Schools portion of the project (or 1.7 % of the total project budget of £1.2 million). The evaluation programme included two distinct phases: a period of preliminary testing sessions, during which teachers participated in conventional user-testing, and a period of in-class testing, during which teachers used the Moving Here Schools site directly with their pupils in their own classrooms. In planning the evaluation process, the team felt that a combination of methodologies might produce more fruitful results than just one methodology alone. The same concept is suggested in Haley Goldman and Bendoly's study of heuristic evaluation used in tandem with other types of evaluation to test museum Web sites (2003). Six teachers from the London area formed the evaluation team, selected and partially supported by the LGFL (London Grid for Learning). Four of them participated in both the preliminary testing sessions and the in-class sessions. Two preliminary testing sessions were held at the National Archives in February and June 2006 to review the first and second drafts of the Schools site. The Education Resources Manager led the sessions and two other members of the Moving Here team participated, mainly recording information. The user environment (the ICT suite at the National Archives), as well as the methodology, was based on conventional user testing, bringing the testers into a closed environment and observing them as they interacted with the draft site. For each of the two observation sessions, the teachers spent a full day looking at the draft versions of the module they had been assigned and commenting on navigation, design, subject coverage, style, tone and other elements of the site, observed by three Moving Here team members. They also participated in a group discussion at the end of the day. Their feedback was written into reports and used to make improvements to the site. These preliminary testing sessions proved invaluable to improving the quality and usability of the site. Between the February session and the June session, major changes were made as a direct result of teacher feedback, substantially improving the site. The main changes made were to reorganize the materials into shorter, blockier lessons rather than longer, linear lessons; to merge two related lessons into one; to shorten most lessons and lesson pages; and to redevelop some of the interactive activity specifications. In the June session teachers noted that their input had been followed up and commented favourably upon the fact that their feedback had been used to improve the site (one teacher said that even though she had been asked for her opinion about Web sites before, she had never seen her suggestions put into practice before this particular project). After the June session, changes were again made to the site, including design modifications, changes to interactive activities that had already been programmed, and a few more changes (including additions) to content. Although this amount of testing could be considered enough – especially since it yielded such fruitful results – Moving Here also included in-class testing as part of its programme. 
This approach incorporates the advantages of 'ordinary' user testing, but builds on it by taking account of the social dynamics and practical problems that influence the use of the site, so as to ensure the site is usable in the classroom – as opposed to ensuring that it is usable in a controlled user testing environment. This unique addition to the evaluation programme proved even more useful than the conventional user testing. We were hired as the evaluation team to carry out the in-class testing programme. In classroom testing sessions, teachers were observed, at their schools, using the Moving Here site with their students. The schools included two primary schools in the borough of Newham, London, a secondary school in Bethnal Green, London, and a secondary school in Peckham, London – all neighbourhoods with culturally diverse communities, a high proportion of immigrants and a large number of people whose first language is not English. Four of the original six teachers were involved, and the evaluation team went into their classrooms 25 times between October and December 2006 – 5 times per teacher, with one teacher doing a double load and testing two modules instead of a single one, for a total of 10 sessions. The teachers were paid £300 each for five sessions of in-class testing, working out at £60 per session. This closely follows the standard amount of between $50 and $100 US for one session, as suggested by Steve Krug in his seminal work on user testing, 'Don't Make Me Think!'(2000). The teacher who tested two modules received double payment. The total spent on teachers was £1500. The evaluation team was paid approximately £18,500 for the 5 sets of observations and session reports, plus a final report. The evaluation team met with each of the participating teachers in advance to agree which lessons to use for the in-classroom evaluation, to brief the teachers on what was required during the session and to agree on the procedure for arriving during the school day. In consultation with the Education Resources Manager, the evaluators produced an evaluation plan with a set of questions to ask each teacher to make sure they had covered all areas and issues, plus an in-class observation checklist. The questions covered learning outcomes, tone, length of lesson, order of pages, images, design, navigation, accessibility, activity sheet issues, issues with interactives, children's engagement, and other improvements that teachers thought might be useful. The teachers submitted lesson plans with intended learning outcomes for each session according to the National Curriculum. Immediately after each session a written session report was sent to the Education Resources Manager, who used the findings in each report to implement changes to the site while the series of observations were still going on. In some cases, changes requested by a teacher were in place in time for the next observation.
  4. Besides the elements that originated from the site and its contents, the environment had a significant impact on how the site was used. For example, various disturbances in the classroom (excess noise, students coming in late, interruptions at the door, etc), as well as logistical issues (time taken to turn on laptops and log in, time taken to log in at the ICT suite, time taken to find the correct Web site and the page within the Web site, difficulties with saving documents to students' folders and printing them out) all affected how well students worked with the Web site. As an example, a class of year 7 students became overly excited when visiting the ICT suite for the first time with the teacher, could not concentrate on the activity because there were other classes in the ICT suite, and was unable to access a video because the school's firewall blocked it. This resulted in a relatively uneven learning experience, but also was instrumental in indicating to the evaluators which of the activities that had been attempted during that class were the most engaging, and would therefore be the most likely to hold students' attention during periods of high disturbance Another important element that the evaluators were able to capture only in a classroom setting was how students worked with each other. The class dynamic within the different groups contributed to how much the students learned, and while this issue will affect not only the group's learning from a specific Web site but also how well they will learn in all of their classroom endeavours, it is important to note how it affected the testing. For example, some groups worked extremely well together on an activity sheet, but this may have been due not only to the intrinsic interest they took in the activity but also to external issues (threats of detention if they talked too much, possibility of bad marks if they did not complete the activity, incentive of a free lunch to the group that handed in the best activity sheet). Interestingly, the evaluators' presence did not seem to distract students. Initially, the evaluators thought that their presence, even sitting discreetly at the back of the room, might cause students to react differently than they might normally have done. However, during the in-class testing sessions, evaluators found that their presence was either ignored or considered normal by the children. Reasons for this might be that students are accustomed to having more than one adult in the classroom at a time – teaching assistants might be a constant presence, other teachers might interrupt the class, and OFSTED inspectors (the Office for Standards in Education, the UK's official school inspection system) might visit classes to conduct inspections.
6. The in-class testing was useful in picking up elements that were not, and would not have been, flagged in the conventional user testing scenario. Even though the testers were the same teachers who had participated in the two preliminary user testing sessions, they did not pick up on some of the elements that needed to be fixed or changed until they were actually using the site in the classroom. It was only when practical implications became apparent that they noticed these items needed to be changed. All of the following issues had been present during the conventional user testing sessions but had not been singled out as needing modification until the in-class testing sessions:
- content: when they read the text out loud or asked students to read the text on the pages, teachers realized that the tone of some of the text was too difficult or complex, even though it had seemed fine when it was read on the screen
- images: teachers realized that some of the images they had seen on lesson pages were not actually useful or pertinent to their teaching, and so should be removed or moved to different pages
- activity sheets: activity sheets did not have spaces for students to put their names, which caused confusion when they were printing out their work – something teachers hadn't noticed when looking at the content of the sheets
- interactive activities: although they took up a fairly large amount of screen space when they were being viewed by a single user on a single screen, interactive activities were too small for some children to see from the back of the class and needed to be expandable to full-screen size
- navigation: the breadcrumb trail needed to go down one more level in order for teachers and students to immediately recognize where they were within the site
- navigation: the previous/next buttons and the page numbers only appeared at the bottom of the screen, sometimes after the 'fold line', which made it difficult for users to know how to get from one page to the others
In fact, sometimes the issues that came up during classroom testing directly opposed what teachers had told us in initial user testing sessions. For example, during user testing teachers had said that the breadcrumb trails were easy to use and helped with navigation, but when they began using the site in the classroom, they found that this was not always the case. Teachers needed to try out the site with their pupils to see what really worked.
  7. In other words, if you are observing in a specific class with, say, low ability or poor levels of English, equipment not working, behaviour issues, and so on, how can you be sure your results are as reliable as those obtained in a 'neutral' environment? First, there is no such thing as a neutral environment, and any test will be subjective, not least because of the particular interests and abilities of the subjects themselves. Secondly, and more importantly, the overall aim of this testing is to ensure the Web site works well in classrooms, and this means seeing the effect that factors like those mentioned above have on the way the Web site is used. Although ideally one would test in more than one classroom (as in this project), just one session in one classroom, however unique the setting might be, reveals more about the required changes than one session in a neutral environment, because the social dynamics and educational imperatives are simply not there to be observed in neutral surroundings.
9. If you are using an external Web developer, it is probably best to avoid getting them to do the user testing, as there is an inherent conflict of interest leading to a likelihood of minimising the changes required, and also because they are likely to focus more on the technical aspects of the site than on its effect on the teacher and pupils themselves. For the same reason, it would be preferable to get an external or unrelated evaluator for the project rather than visit a classroom yourself if you are the producer or writer of the content. It is always difficult to take criticism and a neutral party will not have any issues surrounding ownership of the material.
  11. It's true that it can cost money to conduct user testing in a classroom – but then again, it need cost no more than conventional user testing. In conventional user testing, an acceptably budget-conscious way of conducting the testing is to have (preferably neutral) staff administer it, hand-writing the notes or using in-house recording equipment to record the user's experience, and to give the tester a small token of appreciation such as a gift voucher. In an in-class testing scenario, one person could attend a one-hour class session in a school, giving the teacher the same small token payment and taking notes or using recording equipment (taking care not to breach rules about the recording of children) to make notes of the issues uncovered. The Moving Here Schools evaluation programme was built into the project plan, but still only used 6.7% of the total spending on the Schools site. The team would recommend spending between 5 and 10% of your total project budget on user testing – especially a combination of conventional and 'real habitat' testing – and planning it into your project from the start. When taking into account the cost of not conducting effective user testing, then the cost of user testing is usually worth every penny. If you can only afford one test, do one. Krug makes the point best when he says 'Testing one user is 100 percent better than testing none. Testing always works. Even the wrong test with the wrong user will show you things you can do to improve your site' (2000).