Critical Analysis: Survey Design

Prepared by Nicole Brown
September 2013
  
Executive Summary

This report provides guidance to a student at Central Queensland University (CQU) to support his research of the uses and gratifications of Facebook among Australian university students. To improve the quality of the survey data, the wording and layout of the survey have been logically deconstructed, and key suggestions include:
• Creating a more meaningful title.
• Adding an introductory paragraph.
• Correcting an overlapping scale.
• Re-wording double-barrelled questions and questions that contain extreme absolutes, ambiguity or grammatical errors.
	
  
The target population is defined as students attending Australian universities, with a sampling unit comprising one Australian university student. The sample frame is based on a de-identified listing of students obtained from the administrative records of participating Australian universities.

The sample size is calculated as 9,250 respondents based on a ±3% allowable sample error at the 99% confidence level, maximum variability, and a target response rate of 20%. To validate the sample as representative of the Australian university student population, a method is suggested that compares the sample's demographic profile with the profile of students in the sample frame.
	
  
Given the significant differences between paper and online surveys, important design and implementation considerations are explored. Recommendations are provided to overcome key technological barriers, including:
• Designing the survey with specialised online survey software.
• Sending survey invitations to students directly from the university.
• Enabling prospective participants to opt out of receiving further email communication.
	
  
Other design principles are suggested to reduce drop-out rates, including a white background, using a maximum of ten questions per screen, and providing a realistic estimate of the time required to complete the survey. Prior to implementation, a validated four-stage pilot testing process is suggested to reduce drop-out rates and improve response rates.
	
  
Since a low response rate can compromise survey quality, key strategies are suggested to increase online survey responses. These suggestions include a photo of the researcher in the email invitation, not using 'survey' in the email subject line, and sending email reminders to non-respondents after two days.
	
  
In recognition of the significant advantages of pluralistic research over quantitative methods alone, a qualitative online focus group is suggested to accompany the online survey. This research method suits both the geographical dispersion and the technical abilities of Australian university students. A validated methodology based on online focus groups for university students in the United States is provided to inform selection of participants, instruction development, monitoring of focus group dialogue and analysis of the results.
Table of Contents

Executive Summary
Introduction
Part 1: Questionnaire Critical Analysis
Part 2: Sampling Plan
Part 3: Online Survey Design and Implementation Considerations
Part 4: Strategies to Improve Response Rate
Part 5: Complementary Qualitative Research Design
Conclusion
Reference List
Introduction

Darren is a student at Central Queensland University (CQU) who has designed a survey to support his research of the uses and gratifications of Facebook among Australian university students. This report will provide Darren with guidance to implement a high-quality online survey with a representative sample of students from most Australian universities.
	
  
The first section of the report will critically analyse Darren's survey, a process that will logically deconstruct the survey and suggest ways the survey can be improved. The suggestions will mainly focus on the wording or layout of questions, and ultimately improve the quality of Darren's survey data.
	
  
The second section of the report will suggest a detailed sampling plan that precisely defines the population, describes the sample frame and calculates the sample size. The sampling plan will also suggest an appropriate sampling method and explain how to validate the sample, thereby ensuring Darren's online survey respondents are representative of the Australian university student population.
	
  
Important considerations for transforming Darren's paper survey into an online format will be discussed in the third section of the report. Key recommendations will be provided about technological, demographic and response rate characteristics that influence how Darren's survey should be designed and how the survey can be implemented.
	
  
Since a low response rate can compromise survey quality, the fourth section of this report will discuss strategies to increase the response rate for Darren's online survey. In recognition of the benefits of pluralistic research, the final section of this report will propose a qualitative research design to accompany Darren's quantitative online survey.
	
  
	
  
	
  
	
  
Part 1: Questionnaire Critical Analysis

The first part of this report will critically analyse Darren's survey and provide guidance to overcome design flaws. Survey design directly affects the quality of data collected because the wording or format of questions can create a bias that influences respondents' answers (Burns & Bush 2010).
	
  
Darren’s	
  survey	
  has	
  incorporated	
  several	
  good	
  questionnaire	
  design	
  principles.	
  For	
  instance:	
  
• Questions	
  at	
  the	
  beginning	
  of	
  the	
  survey	
  explicitly	
  address	
  the	
  survey	
  topic,	
  questions	
  
about	
  similar	
  topics	
  have	
  been	
  grouped	
  together,	
  and	
  sensitive	
  questions	
  about	
  
respondent	
  demographics	
  appear	
  at	
  the	
  end	
  of	
  the	
  survey	
  (Marsden	
  &	
  Wright	
  2010).	
  
• Alternate	
  shadings	
  of	
  questions	
  and	
  simple	
  headings	
  make	
  it	
  easier	
  for	
  respondents	
  to	
  
navigate	
  the	
  survey	
  (Wiggins	
  &	
  Bowers	
  n.d.).	
  	
  
• The	
  statements	
  ‘strongly	
  agree’	
  and	
  ‘strongly	
  disagree’	
  on	
  opposite	
  ends	
  of	
  the	
  semantic	
  
differential	
  scale	
  are	
  short	
  and	
  precise	
  (Survey	
  Monkey	
  2008).	
  
	
  
Despite the positive features discussed above, several flaws are also evident in Darren's survey design. For instance, the survey title "Survey Measures" is not meaningful to respondents, and the survey instructions provided are inadequate (Deggs, Grover & Kacirek 2010). It is therefore recommended that Darren creates a meaningful title for the survey, and includes an introductory paragraph, as it is good practice to explain the survey's purpose, identify the organisation conducting the survey, assure respondents of confidentiality and describe how the collected data will be used (Survey Monkey 2008).
  
	
  
Another common error that appears in Darren's survey is the overlapping scale in the response options for question twenty (Deggs, Grover & Kacirek 2010). For example, the time thirty minutes appears in two separate options. It is recommended that Darren re-designs the response options to question twenty so that the scales do not overlap, and considers using consistent time increments. For example, the first option could be 0-9 minutes; the second option could be 10-19 minutes, and so on.

The word 'most' is an extreme absolute that puts survey respondents in an uncomfortable situation (Burns & Bush 2010). On this basis, it is recommended that Darren removes the word 'most' from his survey instructions and from question twenty-one. For example, the instructions can be changed to "Please select the responses that apply to you." Instead of using the word 'most' in question twenty-one, Darren could use a scale to objectively measure the frequency with which respondents use a tablet, computer or smartphone to access Facebook.
	
  
Contrary to the survey design principles advocated by Marsden and Wright (2010), Darren's survey contains double-barrelled survey questions that simultaneously address two separate issues. This was evident in questions one, eleven and sixteen. It is recommended that Darren resolves this design flaw by separating the double-barrelled questions into two separate questions that address each issue individually. For example, question one could be replaced with the statements "I like to share my status with friends" and "I like to share my photos with friends".
	
  
Marsden and Wright (2010) advise that questionnaires should avoid words with ambiguous meanings. However, several questions in Darren's survey contain ambiguous wording. Examples of this problem are listed in Table 3, along with suggested changes to the question wording to improve clarity.
	
  
Table 3: Ambiguous Survey Questions

Question | Specific Design Problem | Example of Proposed Solution
4 | This question does not specify Facebook as the source of the friend requests, and could be mistaken for other social media. | 'I like to receive friend requests on Facebook.'
5 | This question does not specify that Facebook is the method of finding people. | 'I like to find people on Facebook that I have not seen recently.'
6 | The word 'old' could be interpreted either as the friend's age or the duration of the friendship. | 'I like to find out on Facebook what long-standing friends are doing now.'
7 | This question does not specify that Facebook is the method of making contact. | 'I like the ability to contact friends on Facebook that live far away.'
19 | A respondent could write 'it varies' instead of providing a numerical response. | Change the response to this question to a list of tick-box options with consistent increments, e.g. 'less than once', 'once', 'twice', 'three times', 'more than four times'.
25 | This question could be interpreted as asking for the respondent's occupation. | 'What is your employment status?'
	
  
The following three grammatical errors compromise the clarity of Darren's survey and could potentially create a bias (Burns & Bush 2010):

1. The word 'friend' in question should be plural.

2. Question three ends with a preposition, and could be replaced with the wording 'I use Facebook to reconnect when I've lost contact with people.'

3. Both questions seventeen and eighteen mix the present and future tenses. It is recommended that Darren changes the wording in question seventeen to 'My friends think I am very active in the group when I am being active on Facebook'. Similarly, the following wording is suggested for question eighteen: 'I become more famous among my friends when I am being active on Facebook'.
	
  
Dennis (2010) contends survey layout and wording are equally important. Although the following examples describe layout problems, these issues will be resolved when Darren's survey is redesigned for the online setting:
	
  
1. Question twelve in Darren's survey appears on a separate page to other questions relating to social investigation, and Dennis (2010) advises against splitting questions or answers across pages.

2. The tick boxes for the responses to question twenty-one appear on the right side of the response options, which is inconsistent with the layout for the tick-box responses for questions twenty, twenty-five and twenty-six.

3. The line to write responses to question twenty-two is not aligned with the lines to respond to questions nineteen and twenty-four.

4. The numbers one and two in the semantic differential scale are not equidistant and could be a potential source of bias (Deggs, Grover & Kacirek 2010).
Part 2: Sampling Plan

Sampling is a process used by researchers to select a representative segment of a specific population they are investigating (Burns & Bush 2010). The Australian Council for Educational Research (ACER) contends "a well-designed sample can more efficiently yield results which are as good as those provided by a census" (2009, p. 8).
  
	
  
The second section of this report provides Darren with a sampling plan that will ensure online survey respondents are representative of the Australian university student population. The sampling plan will precisely define the population and sampling unit, describe the sample frame, calculate the sample size, suggest an appropriate sampling method and explain how to validate the sample.
  
	
  
Population and Sampling Unit Definitions

Defining the target population involves specifying the whole group being investigated (McMurray, Pace & Scott 2004). Based on the objective of Darren's research, the specific target population is students attending Australian universities. Sampling units are the most basic elements that can be selected in the sample (Zikmund & Babin 2013). The sampling unit for Darren's survey comprises one university student.
  
	
  
Sample Frame

A sample frame is the source material that lists all members of a target population from which the sample is drawn (Ritchie & Lewis 2005). Administrative records from Australian universities will be the most convenient type of sample frame for Darren's survey. Access to these records will need to be negotiated with each university (Ritchie & Lewis 2005, p. 89).
  
Sample Size Calculation

The sample size for Darren's survey has been initially calculated with the standard sample size formula shown in Figure 1 below. Darren's sample size of 1,850 respondents assumes a ±3% allowable sample error (e = 0.03) at the 99% confidence level (z = 2.58) and maximum variability (p = 0.5 and q = 0.5).

Figure 1: Standard sample size formula (adapted from Burns & Bush 2010, p. 409)
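The formula itself was rendered as an image in the original document. A minimal sketch of the standard calculation it refers to, using only the parameters stated above:

```python
# Standard sample size formula (Burns & Bush 2010): n = (z**2 * p * q) / e**2
z = 2.58   # z-value for the 99% confidence level
p = 0.5    # maximum variability
q = 1 - p
e = 0.03   # +/-3% allowable sample error

n = (z ** 2 * p * q) / e ** 2
print(round(n))  # 1849, i.e. the 1,850 respondents quoted in the report once rounded up
```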
  
	
  
However, Darren is strongly encouraged to intentionally over-sample to avoid complex follow-up of replacement samples (ACER 2009). The amount of over-sampling required to attain a valid sample size at the 99% confidence level can be estimated by pilot testing Darren's online surveys to reveal undeliverable email, declined and completed survey rates (Deggs, Grover & Kacirek 2010).

Over-sampling estimates can also be "based on the researcher's knowledge of incidence rates, nonresponse rates, and unusable responses" (Burns & Bush 2010, p. 391). It is recommended that Darren's over-sampling estimate is initially guided by the 20% target response rate used in an online survey of students attending 23 Australian universities (ACER 2009). On the basis of this target response rate, Darren will require 9,250 respondents, which is five times the initial sample size estimate of 1,850 respondents.

Darren is also encouraged to specify different sample sizes for each Australian university (ACER 2009), given the size of Australian universities varies from 6,554 to 53,612 students (Australian Education Network 2013).
	
  
	
  
Sampling Method

To achieve a representative sample, Darren is encouraged to include in the sample at least half of the universities in each Australian state and territory. This sampling strategy was adopted by ACER (2009) in an online survey of students attending 23 Australian universities, with the exception of the Northern Territory, which did not commence teaching until 2011 (Charles Darwin University 2011).
	
  
In accordance with the methodology described by ACER (2009), Darren is encouraged to obtain de-identified lists of students from Australian universities, validate each list, draw a sample and then return the sampled list to each university. At this point, each university will be required to re-attach student contact details so students can be contacted by email to participate in Darren's online survey.
	
  
Probability sampling is regarded as the most rigorous approach to producing a statistically representative sample of the population from which it is drawn (Ritchie & Lewis 2005). Probabilistic sampling will also allow Darren to measure the response rate to his online survey (Deggs, Grover & Kacirek 2010).
  
	
  
Since the size of Australian universities varies from 6,554 to 53,612 students and the proportion of international students varies from 6.7% to 40.5% (Australian Education Network 2013), it can be concluded the Australian university population has a skewed distribution best suited to stratified sampling (Burns & Bush 2010). In populations with a skewed distribution, systematic stratified sampling produces "powerful, generalisable and representative estimates" (ACER 2009, p. 8).
  
	
  
Stratified sampling is a two-step process that partitions the population into homogenous subgroups and then draws random samples from each subgroup (McMurray, Pace & Scott 2004). Stratified sampling also reduces the sample size needed to provide a representative sample (McMurray, Pace & Scott 2004).
  
	
  
It is suggested that Darren adopts a technique that was used by Thomson, Rosenthal and Russell (2006) to randomise a sample for an online survey of students attending Australian universities. These researchers used an alphabetical listing to create a random starting point, and defined the skip interval as every third student.
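A minimal sketch of that systematic selection step, assuming each university's de-identified list can be sorted alphabetically (the identifiers below are illustrative):

```python
import random

def systematic_sample(students, skip_interval=3):
    """Select every skip_interval-th student from an alphabetically sorted list,
    starting at a randomly chosen position within the first interval
    (Thomson, Rosenthal & Russell 2006 used every third student)."""
    ordered = sorted(students)               # alphabetical listing
    start = random.randrange(skip_interval)  # random starting point
    return ordered[start::skip_interval]

# Illustrative use with one university's de-identified student identifiers.
print(systematic_sample(["s-0001", "s-0002", "s-0003", "s-0004", "s-0005", "s-0006"]))
```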
  
	
  
Sample Validation

The final part of this sample plan will explain how Darren can demonstrate the sample is representative of the Australian university student population. Darren can validate his sample by comparing the sample's demographic profile with a known profile (Burns & Bush 2010). In Darren's case, the known profile would be obtained by compiling de-identified student lists supplied by each Australian university. Post-stratification weighting of survey data (for example by academic year level, attendance type, and respondent gender) will also affirm the survey responses represent the target population (ACER 2009).
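A minimal sketch of the post-stratification weighting idea, assuming Darren knows each stratum's share of the population (from the compiled university lists) and its share among his respondents (the strata and proportions below are illustrative):

```python
# Post-stratification weight for each stratum = population share / sample share,
# so respondents in under-represented strata count for more, and vice versa.
population_share = {"first-year": 0.30, "later-year": 0.70}  # from the compiled university lists
sample_share     = {"first-year": 0.22, "later-year": 0.78}  # observed among survey respondents

weights = {stratum: population_share[stratum] / sample_share[stratum]
           for stratum in population_share}
print(weights)  # e.g. {'first-year': 1.36..., 'later-year': 0.89...}
```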
  
	
  
Part 3: Online Survey Design and Implementation Considerations

Compared to traditional paper surveys, online surveys have "distinctive technological, demographic and response rate characteristics that affect how they should be designed, when they can be used and how they can be implemented" (Deggs, Grover & Kacirek 2010, p. 186). Given these significant differences, the third section of this report will discuss how Darren's survey design can be optimised for online implementation.
  
	
  
Online survey invitations develop a trusting relationship with respondents from the beginning of the survey experience (Deggs, Grover & Kacirek 2010). Since sending unsolicited email infringes on student privacy, it is suggested that emails sent from the universities to prospective participants include a link that enables students to opt out of receiving any further communication about Darren's survey (Survey Monkey 2008). This method of the university sending students the link to Darren's online survey will also increase the delivery rate and avoid emails being rejected by spam filters (Survey Monkey 2008).
  
	
  
Another technical consideration is the software that Darren uses to design the online survey, which ideally will:

1. Support multiple platforms and browsers, as variation in internet devices and browsing settings could create respondent errors (Andrews, Nonnecke & Preece 2003).

2. Prevent invited students from responding to Darren's survey more than once (Survey Monkey 2008); one common mechanism for this is sketched below.
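A common way online survey tools enforce a one-response-per-invitee rule is a single-use token embedded in each student's survey link. A minimal sketch of the idea, with an illustrative URL and data structures rather than any particular product's API:

```python
import secrets

tokens = {}        # single-use token -> de-identified student id
responded = set()  # tokens that have already been used

def issue_invitation(student_id):
    """Create a single-use survey link for one invited student."""
    token = secrets.token_urlsafe(16)
    tokens[token] = student_id
    return f"https://survey.example.edu/facebook-study?token={token}"

def accept_response(token):
    """Accept a submission only if its token is known and unused."""
    if token not in tokens or token in responded:
        return False
    responded.add(token)
    return True
```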
  
	
  
Unlike paper surveys, designers have less control of online survey presentation due to variations in devices used to view the survey (Deggs, Grover & Kacirek 2010). However, online survey design has a strong influence on attrition rates (Toepoel, Das & Van Soest 2009). It is therefore recommended that Darren adopts the following evidence-based online survey design principles:
  
	
  
• A white background, as this increased response rates by 31% compared to black backgrounds (Edwards et al. 2009).

• A limit of ten questions per screen, as this has been shown to reduce the burden of scrolling through online surveys, which creates the impression that the survey is too long (Toepoel, Das & Van Soest 2009).

• A description of the survey structure, a realistic time estimate to complete the survey and a calibrated indicator visually tracking the respondent's progress in the survey, as these features motivate respondents and reduce the drop-out rate (Deggs, Grover & Kacirek 2010).

• Simple headings to help respondents navigate through the online survey; questions in a bold typeface; slightly indented response options in regular typeface; and underlining to emphasise words in the question (Wiggins & Bowers n.d.).
  
	
  
Prior to implementing online surveys, Marsden and Wright (2010) advocate a formal pre-test evaluation, as this has been shown to improve both attrition and response rates (Fan & Yan 2010). According to Deggs, Grover and Kacirek (2010, p. 199), "survey piloting is the process of conceptualizing and reconceptualising the key aims of the study and making preparations for the fieldwork and analysis so that not too much will go wrong and nothing will have been left out".
  
	
  
It is proposed that Darren conducts a four-stage pilot testing process, as described by Deggs, Grover and Kacirek (2010, p. 200):

1. Initial review of the survey by peer experts to ensure Darren's survey questions are complete, efficient, relevant and formatted appropriately.

2. Cognitive "think aloud" pre-testing to ensure Darren's survey question wording is understandable and interpreted consistently, and the questions are sequenced logically.

3. Testing by a sample of university students to see if any of the procedures planned for the online survey can be optimised.

4. A final check by people who are not connected to Darren's survey, to identify errors that might have been introduced during the revision process.
  
Part 4: Strategies to Improve Response Rate

The fourth part of this report will discuss strategies that Darren can employ to increase the response rate for his online survey. Response rates are defined as the number of completed surveys divided by the number of eligible subjects in the sample (Fan & Yan 2010). A low response rate is undesirable because it compromises survey quality by introducing non-response bias (Shih & Fan 2013).
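Expressed as a simple calculation (the counts below simply reuse the report's own planning figures for illustration):

```python
completed_surveys = 1850   # completed responses received
eligible_subjects = 9250   # eligible students invited from the sample frame

response_rate = completed_surveys / eligible_subjects
print(f"{response_rate:.0%}")  # 20%
```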
  
	
  
Given the response rate for online surveys is estimated to be 11% lower than other survey modes (Fan & Yan 2010), it is recommended that Darren employs evidence-based strategies to improve the response rate to his online survey. For instance, offering to send respondents the survey results increased the response rate by 36%, and including a photograph of the researcher in the email invitation tripled the response rate (Edwards et al. 2009).
  
	
  
The response rate for online surveys is influenced by the respondent's knowledge of computers (Fan & Yan 2010). However, the university students targeted in Darren's online survey are accustomed to technology (Crews & Curtis 2011). Furthermore, a meta-analysis has shown university students are highly responsive to online surveys (Shih & Fan 2013).
  
	
  
The short length of Darren's online survey is likely to improve the response rate, given a systematic review indicated response rates were 73% higher in shorter online surveys (Edwards et al. 2009). Furthermore, university students are more likely to participate in online surveys when they can be completed quickly (Nulty 2008). Assurances of confidentiality have also been shown to increase response rates to online surveys by 33% (Edwards et al. 2009).
  
	
  
The email inviting students to participate in Darren's online survey can profoundly influence the response rate. The invitation email should also include an electronic link to the online survey, as this will improve response rates by making it easier for students to respond (Nulty 2008). It is recommended the email states the purpose of the survey and provides students with reasons to participate (Survey Monkey 2008). Although sending online survey invitations from reputable institutions can increase response rates by 35%, Darren should discourage universities from using the word 'survey' in the email subject line, as this has been shown to reduce response rates by 20% (Edwards et al. 2009).
  
	
  
Darren should develop a participant reminder plan to motivate students to participate (Deggs, Grover & Kacirek 2010), since Nulty (2008) demonstrated higher response rates when non-respondent university students received repeated email reminders. Meta-analyses have shown that reminder email messages are a low-cost, fast and effective way of improving online survey response rates, particularly when the first reminder is sent two days after the initial invitation (Fan & Yan 2010).
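A minimal sketch of how such a reminder schedule could be tracked, assuming Darren records invitation dates and completed responses (identifiers and dates are illustrative):

```python
from datetime import date, timedelta

invited = {"s-0001": date(2013, 9, 2),   # de-identified id -> invitation date
           "s-0002": date(2013, 9, 2)}
responded = {"s-0002"}                   # ids that have already completed the survey

def due_for_reminder(today, days_after_invitation=2):
    """Non-respondents whose first reminder is due (two days after the invitation)."""
    return [student for student, sent in invited.items()
            if student not in responded
            and today >= sent + timedelta(days=days_after_invitation)]

print(due_for_reminder(date(2013, 9, 4)))  # ['s-0001']
```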
  	
  
	
  
Non-monetary incentives in online surveys have been shown to increase response rates by 72% compared to not offering incentives (Edwards et al. 2009). Among university students, incentive prizes awarded through a lottery have also been shown to increase response rates to online surveys (Nulty 2008). However, incentives could introduce systematic bias into Darren's survey (Deggs, Grover & Kacirek 2010), and Darren will need to assess the financial feasibility of offering respondents incentives.
Part 5: Complementary Qualitative Research Design

The final section of this report will propose a qualitative research design to accompany Darren's quantitative online survey. It has been argued that "quantitative surveys alone provide limited information, and the findings therefore may not include all perspectives" (Weaver, Spratt & Nair 2008, p. 32).
  	
  
	
  
In contrast, pluralistic research that combines both quantitative and qualitative research methods offers significant advantages over using either of these research methods individually (Burns & Bush 2010). It is therefore proposed that Darren uses a qualitative research method after his quantitative survey to understand the survey findings.
	
  
Focus groups are a qualitative technique that gathers data through systematic questioning of several individuals simultaneously in a group setting (Deggs, Grover & Kacirek 2010). However, traditional focus groups may not be financially feasible for Darren given the geographical dispersion of Australian university students (Burns & Bush 2010). It is therefore proposed that Darren conducts synchronous online focus groups, which involve groups of individuals typing unstructured comments in private, electronic chat rooms (Zikmund & Babin 2013).
  
	
  
The following information has been adapted from an online focus group of university students in the United States (Deggs, Grover & Kacirek 2010) to assist Darren to select participants, develop instructions, send invitations, monitor focus group dialogue and analyse the results:
	
  
Online Focus Group Participant Selection

Darren will need to set initial criteria for selecting participants who are attending universities in Australia and have experience using Facebook. It can be implied that Facebook users are proficient using internet technologies and are therefore able to communicate in a text-based online environment.
  
	
  
Online Focus Group Planning

Darren will need to make decisions about the number of online focus groups, the number of participants per group, and the duration of each group. He will also need to select appropriate software for the online focus groups that will ensure access is restricted only to invited participants, and uphold anonymity and confidentiality among participants. When the methodology is finalised, Darren must also obtain approval from the relevant ethics committees at participating universities, as his study design will require administrative support from the university to send email invitations to students in the sample.
  
	
  
Online Focus Group Participant Invitations

Email invitations that universities send to students in Darren's sample should contain the following four features:

1. Advice that participation is voluntary and students will not be penalised for withdrawing at any time.
	
  
2. Assurances that the participant's responses will remain confidential.

3. Clearly articulated instructions about the steps students will need to follow to participate in the online focus group.

4. A link to an informed consent document which adequately explains the participant's rights and outlines the procedures that will be utilised, the duration of the online focus group and the methods of communication.
  
	
  
Online Focus Group Implementation

Darren will need to have a consistent presence in the semi-structured, synchronous online focus groups to gauge the level of interest among participants. For instance, Darren will need to carefully monitor the dialogue in the online focus groups but avoid redirecting the conversation unless the discussion gets off the topic of Facebook.
  	
  
	
  
Given that interaction between participants is the critical characteristic of focus groups, Darren will need to maximise any synergy and enthusiasm that forms among geographically dispersed participants. However, Darren should expect some participants to drop out of the online focus groups.
  	
  	
  
	
  
Online Focus Group Questions

Darren's questions should ask the online focus group participants about their Facebook experiences, without steering or controlling the content of the discussion. It is suggested that Darren begins with a very broad and unfocussed 'grand tour' question. For example, Darren could initially ask the online focus group participants "Please tell us about how you use Facebook". It is then recommended that Darren follows up with questions based on the participants' comments to elicit more insight and greater feedback from participants.
  
	
  
Online Focus Group Data Analysis

It is recommended that Darren analyses the verbatim transcripts created in the online focus groups with computerised interpretive software for qualitative research. This will enable Darren to identify themes and connections in the transcripts (Zikmund & Babin 2013).
  	
  
Conclusion

This report has provided Darren with guidance to conduct a high-quality online survey and online focus groups to support his research of the uses and gratifications of Facebook among a representative sample of Australian university students. The first section of the report logically deconstructed the wording and layout of questions in Darren's survey. Key suggestions included correcting an overlapping scale and re-wording double-barrelled questions and questions that contained extreme absolutes, ambiguity or grammatical errors.
  	
  
	
  
The second section of the report provided Darren with a detailed sampling plan. The target population was defined as students attending Australian universities. The sample frame was based on a de-identified listing of students obtained from the administrative records of Australian universities. The sample size of 9,250 respondents was calculated based on a ±3% allowable sample error at the 99% confidence level, maximum variability, and a target response rate of 20%. A comparison between the sample's demographic profile and the profile of students in the sample frame was suggested to validate the sample as representative of the Australian university student population.
  
	
  
The third section of this report discussed important design and implementation considerations when transforming Darren's paper survey into an online format. To overcome the technological issues identified, it was recommended that Darren re-design the survey with specialised online survey software, send survey invitations to students directly from the university and enable prospective students to opt out. Evidence-based suggestions to lower drop-out rates were provided, including a white background, using a maximum of ten questions per screen, and providing a realistic estimate of the time required to complete the survey. Prior to implementation, a validated four-stage pilot testing process was suggested to reduce drop-out rates and improve response rates.
  
	
  
In recognition of the effect of response rates on survey quality, the fourth section of this report discussed strategies to increase the response rate for Darren's online survey. Key strategies to increase online survey responses included a photo of Darren in the email invitation, not using 'survey' in the email subject line, and sending email reminders to non-respondents after two days.
  
	
  
Given the benefits of pluralistic research, the final section of this report proposed a qualitative online focus group to accompany Darren's online survey. This research method was shown to suit both the geographical dispersion and the technical abilities of Australian university students. A validated methodology based on online focus groups for university students in the United States was also provided to guide Darren's selection of participants, survey instructions and invitations, and to analyse the results of the focus group dialogue.
Reference List

Andrews, D, Nonnecke, B & Preece, J 2003, 'Conducting Research on the Internet: Online Survey Design, Development and Implementation Guidelines', International Journal of Human-Computer Interaction, vol. 16, no. 2, pp. 185-210.

Australian Council for Educational Research (ACER) 2009, Engaging Students for Success, viewed 2 August 2013, http://www.acer.edu.au/documents/aussereports/AUSSE_Australasian-Student-Engagement-Report-ASER-2008.pdf.

Australian Education Network 2013, Student Numbers at Australian Universities, viewed 2 August 2013, http://www.australianuniversities.com.au/directory/student-numbers.

Burns, AC & Bush, RF 2010, Marketing Research (Global), Pearson Education, Upper Saddle River.

Charles Darwin University 2011, University profile, viewed 2 August 2013, http://www.cdu.edu.au/about/university-profile.

Crews, TB & Curtis, DF 2011, 'Online Course Evaluations: Faculty Perspective and Strategies for Improved Response Rates', Assessment & Evaluation in Higher Education, vol. 36, no. 7, pp. 865-878.

Deggs, D, Grover, K & Kacirek, K 2010, 'Using Message Boards to Conduct Online Focus Groups', The Qualitative Report, vol. 15, no. 4, pp. 1027-1036.

Dennis, S 2010, How to design a questionnaire/survey, viewed 2 August 2013, http://www.slidefinder.net/h/how_design_questionnaire_survey_sarah/8questionnairedesign-ppt-sarahdennishbirden/5248140.

Edwards, PJ, Roberts, I, Clarke, MJ, DiGuiseppi, C, Wentz, R, Kwan, I, Cooper, R, Felix, LM & Pratap, S 2009, 'Methods to increase response to postal and electronic questionnaires', Cochrane Database of Systematic Reviews, no. 3, viewed 7 August 2013, The Cochrane Library, http://www.ncbi.nlm.nih.gov/pubmed/19588449.

Fan, W & Yan, Z 2010, 'Factors affecting response rates of the web survey: A systematic review', Computers in Human Behavior, vol. 26, no. 2, pp. 132-139.

Malhotra, MK & Grover, V 1998, 'An assessment of survey research in POM: from constructs to theory', Journal of Operations Management, vol. 16, no. 4, pp. 407-425.

Marsden, PV & Wright, JD (eds) 2010, Handbook of Survey Research, 2nd edn, Emerald Group Publishing, Bingley.

McMurray, AJ, Pace, RW & Scott, D 2004, Research: a common sense approach, Thomson Social Sciences Press, Southbank.

Nulty, DD 2008, 'The adequacy of response rates to online and paper surveys: what can be done?', Assessment & Evaluation in Higher Education, vol. 33, no. 3, pp. 301-314.

Ritchie, J & Lewis, J (eds) 2005, Qualitative Research Practice, Sage Publications, London.

Shih, T-H & Fan, X 2013, 'Comparing response rates from web and mail surveys: a meta-analysis', Field Methods, vol. 20, no. 3, pp. 249-271.

Survey Monkey 2008, Smart Survey Design, viewed 2 August 2013, http://s3.amazonaws.com/SurveyMonkeyFiles/SmartSurvey.pdf.

Thomson, G, Rosenthal, D & Russell, J 2006, 'Cultural Stress among International Students at an Australian University', paper presented at the Australian International Education Conference, viewed 2 August 2013, http://www.aiec.idp.com/pdf/thomson%20(paper)%20fri%201050%20mr5.pdf.

Toepoel, V, Das, M & Van Soest, A 2009, 'Design of Web Questionnaires: The Effects of the Number of Items per Screen', Field Methods, vol. 21, no. 2, pp. 200-213.

Weaver, D, Spratt, C & Nair, CS 2008, 'Academic and student use of a learning management system: Implications for quality', Australasian Journal of Educational Technology, vol. 24, no. 1, pp. 30-41.

Wiggins, BB & Bowers, A n.d., Designing Survey Questions, viewed 2 August 2013, http://faculty.washington.edu/janegf/DsgnSrvyQues.pdf.

Zikmund, WG & Babin, BJ 2013, Essentials of Marketing Research, 5th edn, Cengage Learning, Ohio.
