Practical web governance with search analytics and more

Presentation made on the 4th of November at the JBoye10 conference (Governance track) in Aarhus, Denmark.

  • Using search analytics to get a grasp of the good content your website or intranet contains is not new. But wouldn't it be great if search analytics could also be used to find the hidden gems on your intranet, and to spot information that is duplicated or outdated? "Lies, damned lies, and statistics" says it all: we find it hard to rely on search statistics and analytics alone. What about system-automated governance? Indeed, the systems can help with that, and Kristian's organisation has used it as well, but it is not sufficient either. Giving the users the ability to give feedback is the one simple thing that helps a lot. At Region Västra Götaland, implementations are influenced by and based on what they consider best practice from other organisations. They have some basic functionality that really helps them govern the content of their intranets and websites:
    Basic search statistics
    Identifying duplicate content
    Feedback forms
    Policies for document archiving and updating
    Notifying content editors
    Automatic archiving
    Kristian will share what they have learnt on their way to better governance, how they have done it, and also the good and the bad, the do's and don'ts.
  • Governance is hard work; you have to get your hands dirty.
    It takes a long time, many years, before we can say that we really have established governance.
    We demand responsibility, so whoever has published something is also responsible for keeping the information up-to-date.
    This helps the process of sorting out what information gets revised/updated, what gets archived and what gets deleted.
    Since a lot of decisions are made daily based on the information on our intranet, it is very important that the information is valid.
    Because of governance, the information is becoming more reliable.
    If a decision is taken based on false/outdated information, we will know where it was published and who the owner/responsible person was.
    This rule also helps keep information where it belongs: on the intranet or in the archives.
  • The general document policy is about 70 pages. Then add to that the local ones...
    Our Metadata policy is about 30 pages. This is for document management only.
    Then there is the document process documentation...
    Document plans are an inventory of the types of documents used.
    We have our master metadata stored on a “server” that provides the master metadata to other systems.
    This enables us to establish what information is related to other information.
    So to get individuals to follow these policies, the policies have to be “built in” to the systems where we store information and from which we distribute it.
  • 0-results are the queries that don’t return any search results.
    If you filter out all the misspelled words, interesting queries remain: synonyms, slang, abbreviations etc.
    If these are added to the content as metadata (keywords) then the content will be found.
    The top 200 queries amount to about 90 percent of all searches.
    If we can give good search results to these 200 queries, then we have accomplished a lot.
    In order to give good results for the top 200 queries we can:
    Help content editors to make their info more findable
    Adjust the search engine’s relevancy model
    Add key matches
    Add synonyms
    Add stop-words
    Add boost words
    Do’s and don’ts for key-matches = best bets:
    http://dennisdeacon.wordpress.com/2008/06/25/search-engine-best-bets/
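    As an aside, the share figures above can be reproduced from a raw query log. A minimal sketch in Python, assuming a plain-text log with one query per line (the file name is made up, not our actual setup):

      # Sketch only: count queries in a plain-text search log and report how large a
      # share of all searches the 200 most frequent queries account for.
      from collections import Counter

      def top_query_share(log_path, top_n=200):
          with open(log_path, encoding="utf-8") as f:
              queries = [line.strip().lower() for line in f if line.strip()]
          counts = Counter(queries)
          if not counts:
              return [], 0.0
          total = sum(counts.values())
          top = counts.most_common(top_n)
          top_total = sum(count for _, count in top)
          return top, top_total / total  # e.g. ~0.9 if the top 200 cover 90 percent

      if __name__ == "__main__":
          top, share = top_query_share("search-queries.log")
          print(f"Top {len(top)} queries cover {share:.0%} of all searches")
          for query, count in top[:10]:
              print(f"{count:6d}  {query}")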
  • Basic search stats, available to all users on the intranet. They cover both the intranet and the public web sites.
  • 0-result search keywords.
  • The top 200 search queries.
    Period 2010-01-01 to 2010-10-19
  • Key-match self service.
    Everyone can see the key matches on the intranet.
    All users can write a suggestion for a key match.
    The suggestion then goes to our search editor/administrator who approves or rejects the suggestion.
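    A minimal sketch of that self-service flow, with made-up names rather than the real intranet implementation: any user can suggest a key match (query mapped to a promoted URL), and the search editor approves or rejects it before it takes effect.

      # Illustrative data model for the suggest/approve key-match workflow.
      from dataclasses import dataclass, field

      @dataclass
      class KeyMatchSuggestion:
          query: str
          url: str
          suggested_by: str
          status: str = "pending"  # pending -> approved / rejected

      @dataclass
      class KeyMatchRegistry:
          active: dict = field(default_factory=dict)       # query -> promoted URL
          suggestions: list = field(default_factory=list)

          def suggest(self, query, url, user):
              # Any user can write a suggestion; it is stored until reviewed.
              suggestion = KeyMatchSuggestion(query.lower(), url, user)
              self.suggestions.append(suggestion)
              return suggestion

          def review(self, suggestion, approve):
              # The search editor/administrator approves or rejects the suggestion.
              suggestion.status = "approved" if approve else "rejected"
              if approve:
                  self.active[suggestion.query] = suggestion.url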
  • A checksum is a calculated identifier that uniquely represents a document's content, so identical documents get the same checksum.
    More about checksums:
    http://en.wikipedia.org/wiki/Checksum
    Read more about this on my blog: http://sys64738.se
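    To make the idea concrete, here is a minimal sketch (just an illustration assuming documents in a folder on disk, not the approach described on the blog) that uses SHA-256 checksums to group byte-identical documents:

      # Sketch: compute a checksum per file and group files with identical content,
      # i.e. duplicated documents.
      import hashlib
      from collections import defaultdict
      from pathlib import Path

      def checksum(path, chunk_size=65536):
          digest = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(chunk_size), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      def find_duplicates(root):
          groups = defaultdict(list)
          for path in Path(root).rglob("*"):
              if path.is_file():
                  groups[checksum(path)].append(path)
          return {h: paths for h, paths in groups.items() if len(paths) > 1}

      if __name__ == "__main__":
          for digest, paths in find_duplicates("documents").items():
              print(digest[:12], *paths, sep="\n  ")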
  • Rule #1: when feedback is sent this way, we always answer/return feedback.
  • This is not fancy at all, but very useful. The new version will really improve the UX.
    The feedback gives valuable information.
    Quite a few users give us feedback when they don’t find what they expect to find.
    Or when the search result doesn’t match their domain knowledge, meaning that they know there is relevant information but that it is not present on the (first) search result page.
    The actions we can take based on this information are:
    educate the content editors/owners in how to make their content more findable
    adjust the search engine's relevancy model
    add new information sources
    find parsing bugs
    get information about what we shouldn’t index = refine the crawlers
  • This governance is based upon the ideas and practices of Mark Morell @ BT and Gale Langseth @ Simcorp.
    When all this metadata is in place, we can automate the process of notifying content editors/owners that their content is due for revision.
    If the information is not revised within the set period it will automatically be archived (main rule)
    or the information will be deleted (secondary rule)
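    A hedged sketch of the main/secondary rules, not the actual CMS workflow: each item carries review-by metadata, the owner is notified once the date has passed, and items left unrevised after a grace period (the 30 days here is an assumed value) are archived or, as the secondary rule, deleted.

      # Sketch: apply the revision/archiving rules to one content item.
      from datetime import date, timedelta

      GRACE_PERIOD = timedelta(days=30)  # assumed value, for illustration only

      def notify(owner, item):
          print(f"Reminder to {owner}: '{item['title']}' is due for revision")

      def govern(item, today=None):
          """item is a dict with 'title', 'owner', 'review_by' (date) and 'delete_when_expired'."""
          today = today or date.today()
          if today <= item["review_by"]:
              return "ok"
          if today <= item["review_by"] + GRACE_PERIOD:
              notify(item["owner"], item)       # content is due for revision
              return "owner notified"
          if item.get("delete_when_expired"):   # secondary rule
              return "deleted"
          return "archived"                     # main rule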
  • This is implemented as a kind of workflow in our Web Content Management System or the Document (Enterprise Content) Management System
    There is a chain of responsibility here.
    If the work of updating etc. is not done in a timely fashion, the manager will be notified.
    If there’s still no action after the manager has been notified, the information will be removed.
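    And a similar sketch of that chain of responsibility, again illustrative rather than the real WCM/ECM workflow (the delays are assumptions): the editor is notified first, then the manager, and if nothing happens after that the content is removed.

      # Sketch: escalate notifications and finally remove unmaintained content.
      from datetime import date, timedelta

      ESCALATION_STEPS = [              # (who to notify, delay after the due date)
          ("editor", timedelta(days=0)),
          ("manager", timedelta(days=14)),
      ]
      REMOVE_AFTER = timedelta(days=28)

      def escalate(item, today=None):
          """item carries 'due' (date) plus 'editor' and 'manager' contacts."""
          today = today or date.today()
          if today < item["due"]:
              return "ok"
          if today >= item["due"] + REMOVE_AFTER:
              return "removed"
          overdue = today - item["due"]
          recipients = [role for role, delay in ESCALATION_STEPS if overdue >= delay]
          return "notified: " + ", ".join(item[role] for role in recipients)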