California Ocean Science Trust, "Building a Sustainable Knowledge Base for the Marine Protected Areas Monitoring Enterprise," March 16, 2010


"Building a Sustainable Knowledge Base for the Marine Protected Areas Monitoring Enterprise" a presentation to the California Ocean Science Trust, Oakland, California March 16, 2010

"Building a Sustainable Knowledge Base for the Marine Protected Areas Monitoring Enterprise" a presentation to the California Ocean Science Trust, Oakland, California March 16, 2010

Transcript

  • 1. Building a Sustainable Knowledge Base for the Marine Protected Areas Monitoring Enterprise
  • 2. Mission? “The mission of the California Ocean Protection Council is to ensure that California maintains healthy, resilient, and productive ocean and coastal ecosystems for the benefit of current and future generations.”
  • 3. Mission? (2) “Our vision is that timely, useful information about the effects and performance of California’s marine protected areas network is provided to decision-makers, resource managers, stakeholders and the public to inform management decisions, improve stewardship, and build understanding of ocean ecosystems…” (MPA Monitoring Enterprise, Draft NCC MPA Monitoring Plan)
  • 4. “The ocean has been chronically under-sampled for as long as humans have been trying to characterize its innate complexity.” “…the current suite of computationally intensive numerical/theoretical models of ocean behavior has outstripped the requisite level of actual data necessary to ground those models in reality.” J. R. Delaney & R. S. Barga, “A 2020 Vision for Ocean Science,” in The Fourth Paradigm: Data-Intensive Scientific Discovery (T. Hey, S. Tansley, and K. Tolle, eds.), Microsoft Research, Redmond, WA, version 1.1, Oct. 2009. http://research.microsoft.com/en-us/collaboration/fourthparadigm/
  • 5. Vision for Knowledge Management • Basic Values • Basic Concepts
  • 6. Basic Values: • “Data as Evidence” – evidence-based policy and decision-making • Free, open, and effective access to and use of all knowledge resources • Scientific rigor and integrity • Transparency / accountability / effective access • Full-cycle collaboration / peer review • Sustained Contribution to Well-Informed Public Discourse • “K to Gray” Science Literacy / Civic Literacy • Support for Education • Economic Awareness • Respect for spiritual and emotional values • Includes “working with emotional intelligence” and “working with cultural sensitivity” • Public Domain – free use by all for all purposes • Open source / proprietary applications
  • 7. Basic Concepts • Epistemology: Knowledge Evolves and Develops • Recognition of Developmental Phases: R&D, Prototyping, Full Implementation • High Level Constraints on Development • “Knowledge Resources”: – Data – Experience – Information – Knowledge – (Wisdom / Belief ) • Full-life-cycle data curation • “Legacy” Resources / “Prospective” Resources – reconnaissance/survey/inventory • Standards • Canonical / “Reference” Data and Systemic Data • Ecology of Science – relative “maturity” of domains / data set? • Probability/ confidence levels / risk assessment? • Logical expression of multiple working hypotheses? • “Stakeholders” / Communities of practice / Communities of interest
  • 8. Lawrence Lessig, “The New Chicago School,” The Journal of Legal Studies, vol. 27, no. 2 (pt. 2), June 1998, p. 664. University of Chicago Law School / University of Chicago Press. www.lessig.org/content/articles/works/LessigNewchicschool.pdf Lessig: “Modalities of Constraint”
  • 9. “Repatriation of biodiversity information through the Clearing House Mechanism of the Convention on Biological Diversity and the Global Biodiversity Information Facility: Views and experiences of Peruvian and Bolivian non-governmental organizations.” Ulla Helimo, master’s thesis, University of Turku, Department of Biology, 6.10.2004, p. 11. http://enbi.utu.fi/Documents/Ulla%20Helimo%20PRO%20GRADU.pdf [06-06-05] KNOWLEDGE RESOURCES: Technology
  • 10. Research Information Network and British Library “Patterns of information use and exchange: case studies of researchers in the life sciences” http://www.rin.ac.uk/system/files/attachments/Patterns_information_use-REPORT_Nov09.pdf
  • 11. US NSF “DataNet” Program: “the full data preservation and access lifecycle” • “Acquisition” • “Documentation” • “Protection” • “Access” • “Analysis and Dissemination” • “Migration” • “Disposition” “Sustainable Digital Data Preservation and Access Network Partners (DataNet) Program Solicitation,” NSF 07-601, US National Science Foundation, Office of Cyberinfrastructure, Directorate for Computer & Information Science & Engineering
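The DataNet lifecycle stages listed above map naturally onto a small data structure. The following is a minimal Python sketch, not part of the original slides; the class names, field names, and example dataset identifier are illustrative assumptions rather than anything drawn from the NSF solicitation or the Monitoring Enterprise design.

```python
# Minimal sketch: the DataNet lifecycle stages as an enumeration, with a
# record that tracks when a dataset enters each stage. All names here are
# illustrative assumptions, not part of any cited system.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class LifecycleStage(Enum):
    ACQUISITION = "acquisition"
    DOCUMENTATION = "documentation"
    PROTECTION = "protection"
    ACCESS = "access"
    ANALYSIS_AND_DISSEMINATION = "analysis_and_dissemination"
    MIGRATION = "migration"
    DISPOSITION = "disposition"


@dataclass
class CurationHistory:
    dataset_id: str
    events: list = field(default_factory=list)  # (stage, date) pairs

    def advance(self, stage: LifecycleStage, when: date) -> None:
        """Record that the dataset entered a new lifecycle stage."""
        self.events.append((stage, when))


# Hypothetical usage with a made-up dataset identifier.
history = CurationHistory(dataset_id="ncc-mpa-kelp-survey-2010")
history.advance(LifecycleStage.ACQUISITION, date(2010, 3, 16))
history.advance(LifecycleStage.DOCUMENTATION, date(2010, 4, 1))
```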
  • 12. “Data”? (1) “Any information that can be stored in digital form and accessed electronically, including, but not limited to, numeric data, text, publications, sensor streams, video, audio, algorithms, software, models and simulations, images, etc.” Sustainable Digital Data Preservation and Access Network Partners (DataNet) Program Solicitation NSF 07-601, p.5 http://www.nsf.gov/pubs/2007/nsf07601/nsf07601.pdf
  • 13. “Data” (2) • “Precise, well-defined representations of observations, descriptions or measurements of a referent (object, phenomena or event) recorded in some standard, well-specified way”. Report of the AnthroDPA MetaData Working Group May, 2009 sponsored by the Wenner-Gren Foundation and the US NSF. [In press.]
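The two definitions of “data” above suggest a simple record structure: a value tied to a referent and recorded by a standard, well-specified method. The Python sketch below is illustrative only; the field names and example values are assumptions, not drawn from the NSF or AnthroDPA documents.

```python
# Sketch of a well-specified observation record: a measurement of a referent,
# recorded with explicit units and method. Field names are illustrative.
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class Observation:
    referent: str          # the object, phenomenon, or event observed
    variable: str          # what was measured or described
    value: float
    units: str
    method: str            # the standard, well-specified recording protocol
    observed_at: datetime
    observer: str


# Hypothetical example record; site, values, and protocol are invented.
obs = Observation(
    referent="MPA site NCC-014, transect 3",
    variable="kelp stipe density",
    value=4.2,
    units="stipes per m^2",
    method="SCUBA band transect, 30 m x 2 m",
    observed_at=datetime(2010, 3, 10, 9, 30),
    observer="survey team A",
)
```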
  • 14. KMS Specification • Goals • Features • Functions • Attributes
  • 15. CONTEXT • A bounded and delimited project • But with clear links to projects at differing scales and with different foci • Wherever possible, use methodologies and applications that are already proven • Maintain strong current awareness of ICT and domain developments • Use of common standards and tools with demonstrated or planned capacity for migration • For flexibility of adaptation, modularity is essential
  • 16. http://www.cec.org/Storage/57/4984_B2B_PCAs_en.pdf
  • 17. W. Michener, “Meta-information concepts for ecological data management”, Ecological Informatics 1 (2006), 6.
  • 18. Goals/ Objectives • Primary data types must be clearly specified • Data formats should be registered • Provenance/lineage/transformations of data must be specified • Scientific workflow must be specified • Users of primary data should be registered (as through authentication control application – Shibboleth-like?) • Support for descriptive metadata minimally necessary for discovery/ retrieval • Full descriptive metadata is a process not an event • Support for “qualified / expert tagging” • Support for iterative cycles of comment and evaluation • Public access to knowledge products and support for public comment
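Several of these objectives (clearly specified data types, registered formats, provenance/lineage, and the minimal descriptive metadata needed for discovery) can be illustrated with a small registry-entry structure. The Python sketch below is an assumption-laden illustration, not the Monitoring Enterprise's actual schema; every class, field, and example value is hypothetical.

```python
# Sketch of a dataset registry entry covering the slide's goals: declared
# data type, registered format, provenance lineage, discovery keywords, and
# a list of registered users. The schema is an illustrative assumption.
from dataclasses import dataclass, field


@dataclass
class ProvenanceStep:
    action: str        # e.g. "QA/QC filtering", "unit conversion"
    actor: str
    description: str


@dataclass
class RegistryEntry:
    dataset_id: str
    title: str
    data_type: str                 # primary data type, clearly specified
    data_format: str               # registered format, e.g. "CSV (RFC 4180)"
    keywords: list = field(default_factory=list)   # minimal discovery metadata
    lineage: list = field(default_factory=list)    # ordered ProvenanceStep items
    authorized_users: list = field(default_factory=list)  # registered users


# Hypothetical entry; the identifiers and survey details are invented.
entry = RegistryEntry(
    dataset_id="ncc-mpa-intertidal-2010-001",
    title="North Central Coast intertidal baseline survey",
    data_type="species counts by fixed plot",
    data_format="CSV (RFC 4180)",
    keywords=["MPA", "intertidal", "baseline", "North Central Coast"],
    lineage=[ProvenanceStep("QA/QC filtering", "data manager",
                            "removed plots with incomplete metadata")],
    authorized_users=["registered-user@example.org"],
)
```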
  • 19. Process for developing the KMS? • Who should be involved, when, and in what role or for what purpose? • What are the greatest challenges to the process and how would you overcome them? • What are the attributes of a good process? • What would be attributes of a bad process? • What potential pitfalls do you see and how would you avoid them?
  • 20. Who? • CalOST Staff • Designated Advisory Bodies • Regulatory Bodies • Legislative Bodies (?) • Academic and Research Community – Institutions – Organizations – Individuals – Library / i-Schools – Marine Labs – CS Departments / Data Centers • Conservation Community • General Stakeholder Groups • Allied Projects and Jurisdictions (at all scales) • Sponsors and potential sponsors (public/private sector – including corporate?) • Education community (formal/informal) • Journalism/ publishing community • Commons community
  • 21. When to involve…? • A broadly representative technical advisory body… (within 3 months) • Expert reviewers – domain specialists… (ongoing – drawing on existing links) • MPA Managers group (link to IUCN?/ WCPA?) (within 9 months) • Journalist / Communication Committee (soon) • Education Committee (soon)
  • 22. Challenges? Pitfalls? Failure to observe bounds/limits… Perception that the process is not objective or (worse) is biased… “Politically” “Jurisdictionally” “Intellectually” In any way…? Overly ambitious goals… Costs… Good process? Well specified, with realistic, well-budgeted goals/objectives… Sustained collaborations… Bad process? Failure to achieve goals or hit targets on time (or close!) and within budget…
  • 23. Development of the KMS: • Elements or steps of developing the system? • What milestones might be used to judge progress? • What timelines? • How will the success or effectiveness be assessed? • Type, number, and timing of additional resources your approach would require
  • 24. Elements or steps of developing the KMS? (All have an iterative, ongoing dimension…) • Survey / Research / Reconnaissance (6-9 months?) • Infrastructure specification / negotiation / implementation (6-12 months) – Full life-cycle curation / data repositories – Data registry • Initial applications development (9-18 months) – User Registration / Authentication – Data registry – Access/visualization applications • Collaborative models assessment / implementation (18-24 months)
  • 25. How will “success” or “effectiveness” be assessed? • Policy formation • Decisions consistent with knowledge and recommendations • Funding / sponsorship • Peer review mechanisms (“360 degree” reviews?) • Sustained engagement of partners and stakeholders. – Metrics of use – Publications / citation