Making Inter-operability Visible

Presentation Transcript

  • Making Inter-operability Visible
    • Visualising Interoperability: ARH (Aggregation, Rationalisation and Harmonisation)
    • Michael Currie, Meigan Geileskey, Liddy Nevile, Richard Woodman
    • A Project in the Victorian Department of Premier & Cabinet
  • Summary
    • Introduction
      • disclaimer, context
      • purpose, process
    • The Case Study
    • Some after-thoughts...
  • Disclaimer
    • Just what it is …
    • Using information management to teach maths
    • Not a scientific paper ...
    • Information scientists, systems architects, administrators, DC worshippers, …..
  • Context
    • Dept of Premier & Cabinet developing an Intranet
    • Dept SD developing VOG (brochure-net)
    • Most departments developing document management systems
    • departmental agencies have (sometimes) implemented, more or less, the DC-based Australian Government Locator Service (AGLS) standard
    • the Government needs discovery across all resources, and information interoperability
  • Context (2)
    • a visualisation of interoperability to assist real-world deployment of metadata
    • a few metadata records, but the future of the Whole-of-Government Intranet and all departments of government
    • my role
  • Purpose of Project
    • With minimal interference
    • To accommodate work on the extranet
    • To influence new document management systems to prepare for the intranet
    • To maintain local specificity and global discovery
    • To increase interest and effort.
  • Previous Process
    • Departmental mandate to manage information
    • Monthly meetings of those interested in metadata from various departments
    • Only collaboration, cooperation, …
    • Weak government “policy”.
  • Case Study Process
    • Literature review
      • what is inter-operability?
    • Metadata review
      • Emergent phenomena
    • Bricolage development
      • Aggregation, rationalisation, harmonisation
    • Registry, repository, WOG search...
  • Interoperating
    • - an action when one tries to mix and match across domains
      • Agencies use many different strategies to make this happen….
  • Interoperability
    • - a state where there is interchange and exchange without difficulty
      • Government agencies need complete or absolute interoperability - every document must be discoverable
  • Mapping for interoperation
  • What is good enough?
    • Moderate inter-operability is good enough for information re-use
      • Departments like to maintain independence, operate in ‘silos’
    • Only perfect is good enough for Parliament
      • public accountability is forcing the issue
  • Aggregation
    • First list
      • who has metadata, and for what?
    • Second list
      • who uses which elements?
    • List all elements currently used (a sketch of this aggregation follows below)
      • is there a significant difference?
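
A minimal sketch of the aggregation step, assuming each agency's records have already been exported as simple element/value pairs; the agency names, record layout and values are made up for illustration.

    # Aggregate which metadata elements each agency uses, and how often each
    # element name appears overall. Agency names and records are made up.
    from collections import defaultdict

    records = {
        "DPC":  [[("DC.Title", "Annual Report"), ("DC.Creator", "DPC")]],
        "DNRE": [[("dc.title", "Land Use Map"), ("Custodian", "DNRE")]],
    }

    elements_by_agency = defaultdict(set)   # first/second list: who uses what
    usage_count = defaultdict(int)          # every element name seen, with a count

    for agency, agency_records in records.items():
        for record in agency_records:
            for element, _value in record:
                elements_by_agency[agency].add(element)
                usage_count[element] += 1

    for agency, elements in sorted(elements_by_agency.items()):
        print(agency, sorted(elements))
    for element, count in sorted(usage_count.items()):
        print(element, count)
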
  • Record review (live record)
  • List review
  • Spreadsheet review (live spreadsheet)
  • Aggregation - review
    • Looking at the list, are there problems?
      • It’s way too big
      • Too many things in the list
        • Variations in application profiles, errors, and variation in use of elements, formats, etc...
      • Not all metadata is useful for discovery
        • e.g. random use of ‘DPC’ vs ‘Department of Premier and Cabinet’
        • searchers may miss some documents (illustrated below).
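
A small illustration of the discovery problem, using made-up records: an exact-match search on the publisher misses any document whose metadata used the abbreviation.

    # Two records describing the same publisher in different ways (made up).
    records = [
        {"DC.Title": "Cabinet Handbook", "DC.Publisher": "Department of Premier and Cabinet"},
        {"DC.Title": "Media Guidelines", "DC.Publisher": "DPC"},
    ]

    query = "Department of Premier and Cabinet"
    hits = [r for r in records if r.get("DC.Publisher") == query]

    print(len(hits))  # 1 -- the record tagged 'DPC' is never found by this search
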
  • Emergent goals - 1
      • Illuminate limitations in inter-operability resulting from existing metadata practices
        • Articulate the cause of the problems
        • Develop a shared strategy for improving inter-operability
  • Emergent goals - 2
    • Encourage data managers to develop a single, comprehensive metadata application profile
      • derived from the current requirements and foci of all users
      • with high local specificity and deep interoperability.
  • Analysis
    • Look at the list of elements and decide which have material differences
    • **Remembering that each agency values all its metadata content
  • Material differences
    • Misuse of available elements, qualifiers, etc.
    • Different expression of the same type of information
    • Different granularity
    • Different element name for the same information, …
    • -> need to rationalise
  • Element Name Variants
    • Inconsistent case
      • e.g. DC.Title / TITLE / title; EDNA.Userlevel / UserLevel
    • Non-standard names, e.g. DC.Keywords
    • Non-standard qualifiers
      • e.g. DC.Description.Abstract
    • Non-standard abbreviations
      • e.g. DC.Lang
  • Field Selection
    • Standard and non-standard element names
      • e.g. 'description' and DC.Description
    • Locally created element names
      • e.g. Custodian (one way to canonicalise these names is sketched below)
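
One way to start rationalising the element names themselves is a small canonicalisation table. The target spellings and mappings below (DC.Keywords to DC.Subject, DC.Lang to DC.Language, treating Custodian as a local element) are illustrative assumptions, not an agreed profile.

    # Map observed element-name spellings onto one canonical form (illustrative).
    CANONICAL = {
        "title": "DC.Title",
        "description": "DC.Description",
        "keywords": "DC.Subject",     # non-standard name (assumed mapping)
        "lang": "DC.Language",        # non-standard abbreviation
    }
    LOCAL_ONLY = {"custodian"}        # locally created names: keep, but flag for review

    def canonical_name(raw: str) -> str:
        key = raw.lower()
        if key.startswith(("dc.", "edna.", "agls.")):
            key = key.split(".", 1)[1]            # drop the namespace prefix
        if key in LOCAL_ONLY:
            return raw + " [local element]"
        return CANONICAL.get(key, raw)

    for name in ["DC.Title", "TITLE", "title", "DC.Keywords", "DC.Lang", "description", "Custodian"]:
        print(name, "->", canonical_name(name))
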
  • Value String Variants
    • Despite DCMES recommendations …
    • DC.Identifier:
      • other id numbers without qualifiers
    • DC.Date:
      • also used yyyy, yyyy/m/d, yyyy-dd-mm
    • DC.Format:
      • Non-standard terms, e.g. VHS (PAL)
      • Incorrect case, e.g. text/HTML
    • DC.Language: also used en, en-au, en-AU
    • Qualifiers embedded in values:
      • DC.Publisher CONTENT="corporateName=State..."
    • Non-standard proper names
      • DPC for Department of Premier and Cabinet
    • -> Generally inconsistent use of capitalisation and punctuation (a normalisation sketch follows below)
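
A sketch of the corresponding value-string clean-up. The target conventions assumed here (ISO 8601 dates, lower-case MIME types, language tags of the form en-AU) follow the usual DCMES recommendations, but the exact rules would come out of the harmonisation work described later.

    from datetime import datetime

    def normalise_date(value: str) -> str:
        """Rewrite the observed date formats as ISO 8601 (yyyy-mm-dd)."""
        if len(value) == 4 and value.isdigit():
            return value                  # year-only dates stay as yyyy
        for fmt in ("%Y-%m-%d", "%Y/%m/%d", "%Y-%d-%m"):
            try:
                return datetime.strptime(value, fmt).strftime("%Y-%m-%d")
            except ValueError:
                continue
        return value                      # leave anything unrecognised for manual review

    def normalise_language(value: str) -> str:
        """'en', 'en-au', 'en-AU' -> 'en' / 'en-AU' (assumed house convention)."""
        parts = value.replace("_", "-").split("-")
        parts[0] = parts[0].lower()
        if len(parts) > 1:
            parts[1] = parts[1].upper()
        return "-".join(parts)

    def normalise_format(value: str) -> str:
        """MIME types are case-insensitive, so store them in lower case."""
        return value.lower() if "/" in value else value

    print(normalise_date("2002/1/7"), normalise_language("en-au"), normalise_format("text/HTML"))
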
  • Observation
    • Most element variants are due to non-standard use of capitals, punctuation and spelling
    • Users seem to act independently of Application Profiles
    • Little use of collection-specific qualifiers to enhance specificity
  • Rationalise!
    • Add qualifiers to each type of element?
      • no fewer elements, but significantly increased semantic interoperability
    • Dumb-down for interoperation? (sketched below)
      • loss of precision: searches return too many documents, yet not everything is found
    • Cross-map ontologies, one to another?
      • too cumbersome
    • Map everything to a new ontology?
      • Blanchi, ‘Harmony’, etc.
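
A rough sketch of the dumb-down strategy: qualified statements are collapsed onto the fifteen base Dublin Core elements for cross-domain search, while each agency keeps the qualified form locally. The sample record (including the AGLS.Function line, which a plain DC dumb-down leaves untouched) is made up.

    # Dumb-down: strip refinements so every statement falls back onto one of the
    # fifteen base Dublin Core elements; the richer local form is kept elsewhere.
    BASE_ELEMENTS = {
        "title", "creator", "subject", "description", "publisher", "contributor",
        "date", "type", "format", "identifier", "source", "language",
        "relation", "coverage", "rights",
    }

    def dumb_down(element: str) -> str:
        parts = element.split(".")
        if len(parts) >= 2 and parts[1].lower() in BASE_ELEMENTS:
            return ".".join(parts[:2])    # e.g. DC.Description.Abstract -> DC.Description
        return element                    # non-DC elements pass through unchanged

    record = {                            # made-up record
        "DC.Description.Abstract": "Summary of the rural land use policy.",
        "DC.Date.Created": "2002-01-15",
        "AGLS.Function": "Land administration",
    }

    for element, value in record.items():
        print(dumb_down(element), "=", value)
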
  • Rationalise (a process?)
    • Choose a strategy
    • Decide what is disposable and what is worth saving, given the chosen strategy
    • Delete the disposable elements
    • -> The list gets shorter and semantic inter-operability improves.
  • Harmonisation
    • Work together to choose appropriate formats and definitions for elements and qualifiers
    • **Remembering that, locally, departments will want more and less precision
  • Harmonisation (process?)
    • E.g.
    • in DC.Date - maybe just a different format, so it’s obvious what to do
    • in DC.Subject - some use AGIFT, some use SCIS, some use NRE’s geo-spatial thesaurus, etc. - consensus work to be done (one way to record the scheme is sketched below)
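
One way to keep the local thesauri while still harmonising, sketched below with a made-up record layout, is to record the encoding scheme beside each subject value, so a whole-of-government search can tell AGIFT terms from SCIS or geo-spatial ones.

    # Keep each department's vocabulary, but say which scheme every value is from.
    subjects = [
        {"value": "Land administration",  "scheme": "AGIFT"},
        {"value": "Rural land use",       "scheme": "SCIS"},
        {"value": "Victoria, section 43", "scheme": "NRE geo-spatial thesaurus"},
    ]

    # A WOG search can then filter or group by scheme instead of guessing.
    agift_terms = [s["value"] for s in subjects if s["scheme"] == "AGIFT"]
    print(agift_terms)
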
  • Results from ARH - HA?
    • The DPC Project is working towards an Intranet, harmonising the application profiles
  • Supporting granularity - technologies
    • Registries - lead to a well-defined, full WOG ontology supporting evolving granularity
    • Federated metadata repositories preserve local control while supporting remote interoperability (sketched below)
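
A minimal sketch of the federated idea, with hypothetical repository names, mappings and records: each department keeps its own repository and local profile, exposes a mapping to the whole-of-government profile, and the WOG search queries them all.

    # Federated search: each repository keeps local control, but can answer a
    # query phrased in the shared whole-of-government (WOG) profile.
    class Repository:
        def __init__(self, name, to_wog, records):
            self.name = name
            self.to_wog = to_wog          # local element name -> WOG element name
            self.records = records

        def search(self, wog_element, term):
            local_names = [loc for loc, wog in self.to_wog.items() if wog == wog_element]
            return [r for r in self.records
                    if any(term.lower() in r.get(loc, "").lower() for loc in local_names)]

    repos = [
        Repository("DPC",  {"DC.Title": "Title"},      [{"DC.Title": "Cabinet Handbook"}]),
        Repository("DNRE", {"Document Name": "Title"}, [{"Document Name": "Rural land handbook"}]),
    ]

    # One WOG query; every department's repository answers in its own local terms.
    for repo in repos:
        for record in repo.search("Title", "handbook"):
            print(repo.name, record)
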
  • Key service providers - VOG [diagram: VOG, DoJ, DoI, DNRE]
  • Key service providers - WOG [diagram: WOG AP, VOG AP]
  • The grammar of DC metadata (http://www.dlib.org/dlib/october00/baker/10baker.html)
    • Resource has property (subject), stated at increasing specificity (written out as statements below):
      • ‘about land’
      • ‘about rural land’
      • ‘about Victorian rural land’
      • ‘about Victorian rural land in section 43’
      • ‘about Victorian rural land in section 43 in January 2002’
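
In Baker's grammar each line above is a statement of the form resource - property - value, so the same resource can carry the same property at several levels of specificity. Written out directly (the resource identifier below is made up):

    # Metadata statements as (resource, property, value) triples; the subject
    # values differ only in how specific they are. The identifier is made up.
    resource = "http://example.vic.gov.au/docs/43"

    statements = [
        (resource, "DC.Subject", "land"),
        (resource, "DC.Subject", "rural land"),
        (resource, "DC.Subject", "Victorian rural land in section 43 in January 2002"),
    ]

    for res, prop, value in statements:
        print(res, "|", prop, "|", value)
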
  • Does your resource speak Dublin Core (AGLS)?
    • “… Pidgins are inherently limited in what they can express, but they are easy to learn and enormously useful. In real life, we talk one way to our professional colleagues and another way to visitors from other cultures. Our digital library applications need to do this as well. Simplicity and complexity are both appropriate, depending on context. If Dublin Core is too simple or generic to use as the native idiom of a particular application, pidgin statements may be extracted or translated from richer idioms that exist for specialized domains. This output should also be filtered to keep the fifteen buckets clear of encoding debris and semantic silt. One should treat digital tourists with courtesy and hide from them the complexities of a local application vocabulary or grammar. However sophisticated its local idiom may be, an application might also speak a pidgin that general users and generic search engines will understand. Simple, semantically clean, computationally obvious values will help us negotiate our way through a splendidly diverse and heterogeneous Internet.” (Tom Baker, http://www.dlib.org/dlib/october00/baker/10baker.html)
  • Pidgin vs Symbolic Languages
    • Pidgin languages serve for good enough communication
    • Symbolic languages serve for complete communication in abbreviated form
      • structure
      • dictionaries