The growing pains of a controlled vocabulary
Presentation transcript

  • 1. The growing pains of a controlled vocabulary
  • 2. Introduction
    • Karen Loasby
    • Information architect
    • Worked for BBC for 4 years on search, navigation, metadata and content management projects
    • 2 years previously for the Guardian newspaper archiving the paper and arranging content on the website
    • MSc in Information Science from City University, London
  • 3. Agenda
    • Background
    • The problem
    • Formal classification vs. Folk tags
    • Our middle ground
    • What happened
    • Learning points
    • Questions
  • 4. Background
    • Content management project
    • Regional websites
    • Need for metadata
    • Authors around the UK
  • 5.  
  • 6. Problem
    • Faceted classification system
    • Authors to tag
    • Central control
    • But …
    • Journalists are the specialists – know the domain and the vocabulary.
  • 7. Formal classification
    • Pre-determined terms
    • Centralised control
    • Rich relationships
  • 8. Folk tags
    • What is it, then?
    • Folksonomy, ethnoclassification, social classification, social categorisation and so on
  • 9. Comparing approaches
    • Formal
    • High maintenance
    • Consistent/predictable
    • Rich relationships
    • Can be artificial
    • Folk
    • Low maintenance
    • Quirky/surprising
    • Less added value
    • Real user language
  • 10. A role for both
    • Where we are using folk tagging
    • And where we won’t
      • Trust & Authority
      • High value to business
      • Missing motivation from users
      • Broad domain/user base
      • To avoid tyranny of the minority
  • 11. An experimental middle ground
    • Centralised control of terms
    • But encouraging absorption of user language
    • Higher maintenance than folk tags
    • Cheaper than professional cataloguing
  • 12. BBC Experience
    • Semi-automatic classification: terms are suggested from the CVs
      • Terms are OK → done
      • The suggested terms do not describe the content → search or browse for terms
        • Terms are OK → done
        • Still no match → send suggestion to the CV team
      • CV team evaluates the suggestion
        • Say no to the term – change the classification on the content object
        • Add to CV as a variant term or preferred term
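The flow on slide 12 can be sketched in code. This is a minimal, hypothetical sketch: the class and function names (`ControlledVocabulary`, `evaluate_suggestion`) and the sample terms are illustrative, not the BBC system's actual API. It shows the CV team's three outcomes for a suggestion: add it as a preferred term, add it as a variant of an existing preferred term, or reject it (in which case the content object is re-classified instead).

```python
# Hypothetical sketch of the slide-12 workflow: author suggestions are
# routed to the CV team, who add them as preferred or variant terms,
# or reject them. Names and terms are illustrative only.

class ControlledVocabulary:
    def __init__(self):
        self.preferred = {"pregnancy"}               # preferred terms
        self.variants = {"expecting": "pregnancy"}   # variant -> preferred

    def lookup(self, term):
        """Resolve a term to its preferred form, or None if unknown."""
        term = term.lower()
        if term in self.preferred:
            return term
        return self.variants.get(term)

    def add_preferred(self, term):
        self.preferred.add(term.lower())

    def add_variant(self, variant, preferred):
        self.variants[variant.lower()] = preferred.lower()


def evaluate_suggestion(cv, suggestion, decision, preferred=None):
    """CV team evaluates an author's suggestion.

    decision: 'preferred' adds a new preferred term,
              'variant' maps the suggestion to an existing preferred term,
              'reject' leaves the CV unchanged (the content object is
              re-classified instead).
    Returns the preferred form the suggestion now resolves to, or None.
    """
    if decision == "preferred":
        cv.add_preferred(suggestion)
    elif decision == "variant":
        cv.add_variant(suggestion, preferred)
    return cv.lookup(suggestion)
```

The key design point the slide implies is that variant terms are kept alongside preferred terms, so colloquial user language still resolves to a single controlled form at retrieval time.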
  • 13. Operational system
    • 8,000 requests in 10 months
    • From 160 journalists
      • Average of 50 terms per user
      • However, this varied wildly: our top user suggested 476 terms
  • 14. Graph showing variation between teams
  • 15. Growth in the CVs
      • Up 15,000 terms in 10 months
      • Most growth in person/proper names
        • People, venues and organisations
        • Up by 50% to 35,000
  • 16. Growth of facets
  • 17. Types of terms
    • Mostly good
      • Only 200 terms actually rejected
    • Synonyms vs. entirely new terms
      • New for names (only 2% synonyms)
      • Synonyms for subject (15% synonyms)
      • Location – needed colloquial terms
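The synonym-versus-new-term split on this slide is the kind of per-facet measure that can be computed from a suggestion log. The sketch below is hypothetical (the function name and log format are my own, not the BBC system's); the test data simply reproduces the percentages the slide reports for names (2% synonyms) and subjects (15% synonyms).

```python
# Hypothetical sketch: per-facet share of suggestions that were synonyms
# of existing terms, as opposed to entirely new terms.
from collections import defaultdict

def synonym_share(log):
    """log: iterable of (facet, kind) pairs, kind in {'synonym', 'new'}.
    Returns {facet: synonym percentage, rounded to whole percent}."""
    totals, synonyms = defaultdict(int), defaultdict(int)
    for facet, kind in log:
        totals[facet] += 1
        if kind == "synonym":
            synonyms[facet] += 1
    return {f: round(100 * synonyms[f] / totals[f]) for f in totals}
```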
  • 18. Resourcing
    • Handling the requests from journalists
    • First 3 months – one IA
    • Subsequently 2 to 3 junior IAs
    • Too much – how to reduce?
  • 19. Lessons learned
    • Success with the journalists
      • They suggested terms!
      • Got the faceted classification
      • Began to suggest terms in “our” format
      • Some did engage at a detailed level
  • 20. Lessons learned
    • Difficulties for journalists
      • The system looks totally automatic, because it sits inside a content management system
      • “Journalists are people too”
      • Users struggled with tagging content objects rather than pages
  • 21. Example Subject: Pregnancy
  • 22. Lessons learned
    • Difficulties for journalists, cont.
      • They find it boring
      • Makes the aim of “finding and re-use” harder to achieve
      • Needed to do more pre-emptive work for them
  • 23. Lessons learned
    • Number of terms suggested depends on
      • Type of facet
      • Dynamism of content
      • Scope of the content
      • Enthusiasm of users
  • 24. Next?
    • High value facets still need control
      • Make use of the metadata(!)
      • Sell the message
      • Federated management
      • Earlier in production
    • And for folk tagging?
  • 25.
    • Thanks to the IA team for their analysis work:
      • Jon Carey
      • Adil Hussein
      • Christine Rimmer
  • 26. Thank you Questions or comments? Karen Loasby [email_address]