Paul Edwards, a keynote speaker at the EarthCube All-Hands Meeting, shares what social scientists have learned about governance in cyberinfrastructure and how those lessons may apply to EarthCube.
Wells SAA 2014: Public Data for Public Archaeology (dinaa_proj)
Joshua J. Wells (Indiana University South Bend) presented “Public Data for Public Archaeology: Developing Linked Open Data, Open-Source GIS, and Sensitive Data Standards for the Digital Index of North American Archaeology” on behalf of his co-authors (Kansa, Kansa, Yerka, Noack Myers, DeMuth and Bissett) at the 79th annual meeting of the Society for American Archaeology in Austin, TX in April 2014. The presentation discusses the relationships between archaeological linked open data and the “Big Data” discussed by Anderson. Intersecting with law, research, education, and ethics, the combined perspectives of anthropology, informatics and cybernetics offer a unique look at the broad implications of this type of research and help prevent disuse, misuse and abuse as we navigate new human-versus-technological problems.
Scott Kirkpatrick (Hebrew University): OneLab: Federation and Testbeds (ServiceWave 2010)
The document discusses federation of internet testbeds to enable testing across different geographic locations, technologies, and networks. It proposes extending federation support through middleware across control planes and experimental planes to facilitate resource discovery, monitoring, and data sharing. Several existing federated testbeds and measurement tools are described that have been used for topics like internet topology mapping, capacity measurements in Europe, and testing non-IP autonomous networks.
Network of Excellence in Internet Science (JRA1, Towards a Theory of Internet...) (i_scienceEU)
The Network of Excellence in Internet Science aims to achieve a deeper multidisciplinary understanding of the Internet as a societal and technological artefact.
More information: http://internet-science.eu/
Twitter: @i_scienceEU
This document discusses the intersection of open source software and network control planes. It provides an overview of emerging open source SDN projects like OpenDaylight and RouteFlow, which is a software-defined IP routing project. RouteFlow uses an open source control plane with a Linux-based "glue" to program the data plane. The document argues that open source and SDN are accelerating the standardization of networking and that the frontier of networking is shifting from closed, vendor-led systems to ones based on APIs, open source, customer-led approaches, and network function virtualization.
EarthCube Stakeholder Alignment Survey - End-Users & Professional Societies W... (EarthCube)
Results of the Stakeholder Alignment Survey conducted by PI Joel Cutcher-Gershenfeld, University of Illinois at Urbana-Champaign, presented by Susan Winters, University of Maryland
AHM 2014: Enterprise Architecture for Transformative Research and Collaborati... (EarthCube)
Ilya Zaslavsky, David Valentine, Amarnath Gupta, Stephen Richard, Tanu Malik
Presentation given in the afternoon Architecture Forum Session on Day 1, June 24 at the EarthCube All-Hands Meeting
This document discusses the potential for developing a knowledge network by leveraging metadata from scientific endeavors. It begins by outlining some of the limitations of traditional metadata approaches. It then proposes that metadata could be structured as a graph using semantic triples to represent relationships between people, institutions, projects, and other elements. This liberalized metadata approach could help reduce complexity while providing a more comprehensive view of scientific activities and outputs. The document advocates for establishing common standards, developing tools to extract and aggregate metadata, and creating services and repositories to enable discovery, analysis, and visualization of the knowledge network. The goal is to facilitate research by providing integrated access to information on scientific data, publications, actors and their relationships.
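The triple-based metadata graph described above can be sketched in a few lines. This is a minimal illustration in plain Python (tuples standing in for a real RDF store; all entity and predicate names here are hypothetical) showing how subject-predicate-object triples link people, institutions, projects, and outputs, and how relationships can then be traversed:

```python
# Metadata as subject-predicate-object triples (hypothetical names).
triples = [
    ("alice", "affiliatedWith", "StateUniversity"),
    ("alice", "contributesTo", "OceanDataProject"),
    ("bob", "contributesTo", "OceanDataProject"),
    ("OceanDataProject", "produced", "dataset-42"),
    ("dataset-42", "citedBy", "paper-7"),
]

def objects(subject, predicate):
    """All objects reachable from `subject` via `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Traverse the graph: which datasets did Alice's projects produce?
datasets = [d
            for proj in objects("alice", "contributesTo")
            for d in objects(proj, "produced")]
print(datasets)  # → ['dataset-42']
```

The point of the graph structure is exactly this kind of traversal: questions spanning people, projects, and outputs become joins over a single uniform representation rather than lookups across separate metadata silos.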
“e-Science and Technology Infrastructure for Biodiversity Research” discusses e-science, which involves conducting science using vast computational resources and data over the internet, in areas like astronomy, biology, earth science, and health. Key aspects of e-infrastructure discussed are that it provides on-demand access to distributed resources, much like a power grid, and supports scientific discovery through computational tools. Challenges to e-infrastructure include organizational, financial, legal, and technical issues. LifeWatch is highlighted as a European e-science infrastructure providing advanced capabilities for research on biodiversity systems.
This document summarizes a lecture on network science given by Madhav Marathe at Lawrence Livermore National Laboratory in December 2010. It provides an overview of network science, including definitions of networks and their unique properties. It also discusses mathematical and computational approaches to modeling complex networks and applications to infrastructure planning, energy systems, and national security. The lecture acknowledges prior work that contributed to its material from various researchers and textbooks.
Describing Everything - Open Web standards and classification (Dan Brickley)
The document discusses the need for a hybrid approach to classification that combines traditional library classification systems with modern web technologies and standards. It proposes putting classification data on the open web so it can be more widely used and built upon. This will help drive innovation by making the data accessible to developers, designers and content creators.
Quo vadis, provenancer? Cui prodest? Our own trajectory: provenance of data... (Paolo Missier)
The document discusses provenance in the context of data science and artificial intelligence. It provides bibliometric data on publications related to data/workflow provenance from 2000 to the present. Recent trends include increased focus on applications in computing and engineering fields. Blockchain is discussed as a method for capturing fine-grained provenance. The document also outlines challenges around explainability, transparency and accountability for high-risk AI systems according to new EU regulations, and argues that provenance techniques may help address these challenges by providing traceability of system functioning and operation monitoring.
Daniel Lopresti, Bill Gropp, Mark D. Hill, Katie Schuman, and I put together a white paper on "Building a National Discovery Cloud" for the Computing Community Consortium (http://cra.org/ccc). I presented these slides at a Computing Research Association "Best Practices on using the Cloud for Computing Research Workshop" (https://cra.org/industry/events/cloudworkshop/).
Abstract from White Paper:
The nature of computation and its role in our lives have been transformed in the past two decades by three remarkable developments: the emergence of public cloud utilities as a new computing platform; the ability to extract information from enormous quantities of data via machine learning; and the emergence of computational simulation as a research method on par with experimental science. Each development has major implications for how societies function and compete; together, they represent a change in technological foundations of society as profound as the telegraph or electrification. Societies that embrace these changes will lead in the 21st Century; those that do not, will decline in prosperity and influence. Nowhere is this stark choice more evident than in research and education, the two sectors that produce the innovations that power the future and prepare a workforce able to exploit those innovations, respectively. In this article, we introduce these developments and suggest steps that the US government might take to prepare the research and education system for its implications.
Building COVID-19 Museum as Open Science Project (vty)
This document discusses building a COVID-19 Museum as an open science project. It describes the speaker's background working on various data management projects. It discusses moving towards open science and sharing data according to FAIR principles. It outlines the Time Machine project for digitizing historical documents and its approach to data management. The rest of the document discusses using the Dataverse platform to build repositories, linking metadata to ontologies, using tools like Weblate for translations, and exploring the use of artificial intelligence and machine learning to enhance metadata and facilitate human-in-the-loop review processes.
I presented this keynote talk at the WorldComp conference in Las Vegas, on July 13, 2009. In it, I summarize what grid is about (focusing in particular on the "integration" function, rather than the "outsourcing" function--what people call "cloud" today), using biomedical examples in particular.
Hypertext2007: Wendy Hall - "Whatever Happened to Hypertext?" (hypertext2007)
The document discusses the history of hypertext conferences from 1987 to 2007 and the transition to web and semantic web conferences. It notes that early hypertext conferences in the late 1980s saw significant participation from technical and non-technical authors. Conference attendance declined in the late 1990s as the web conference series grew rapidly in popularity. The semantic web represents a rebirth of hypertext ideas but does not attract the original hypertext community.
1. Linked data, the Internet of Things, and cloud computing are linking everything together through interconnected networks and universal data sharing.
2. These technologies link data on all layers, from embedded systems and sensors to intelligent systems, web services, and cyber-physical systems.
3. Linked data uses Resource Description Framework (RDF) graphs and shared vocabularies to semantically link diverse data sources into a global data space, creating opportunities for new insights and applications.
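The shared-vocabulary idea in point 3 can be made concrete with a small sketch. In plain Python (hypothetical URIs and property names; a real deployment would use an RDF library and published vocabularies), two independent sources describe the same resource using the same subject URI and vocabulary namespace, so their statements merge into a single linked graph:

```python
# Assumed shared vocabulary namespace (hypothetical).
VOCAB = "http://example.org/vocab/"

# Two independent sources describe the same sensor with the same URI.
source_a = [("http://a.example/sensor1", VOCAB + "locatedIn", "Reykjavik")]
source_b = [("http://a.example/sensor1", VOCAB + "measures", "temperature")]

# Because both sources agree on URIs and vocabulary, a simple union
# of their triples yields one global data space.
graph = set(source_a) | set(source_b)

def describe(subject):
    """Collect every property asserted about `subject`, regardless of source."""
    return {p.rsplit("/", 1)[-1]: o for s, p, o in graph if s == subject}

print(describe("http://a.example/sensor1"))
```

The design choice that makes this work is agreement on identifiers: once sources share URIs and vocabulary terms, integration reduces to set union rather than bespoke schema mapping.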
Research in Intelligent Systems and Data Science at the Knowledge Media Insti... (Enrico Motta)
The document discusses research directions in intelligent systems and data science. It describes work on making sense of scholarly data through techniques like data mining, semantic technologies, and machine learning. It also discusses mapping and classifying computer science research areas using an automatically generated ontology with over 14,000 topics. Other topics discussed include predicting emerging research areas, applications in smart cities like the MK:Smart project, and potential roles for robots in smart cities like an autonomous health and safety inspector.
Polar research is inherently interdisciplinary and is becoming more so. Correspondingly, polar data managers have been working to meet very diverse communities and needs, especially after the progress of the International Polar Year 2007-8 (IPY). But is it enough? Despite their best efforts, the polar data and research communities can be rather insular. The unique challenges of polar research and data management may sometimes blind us to relevant developments in other parts of the world. At the same time, global initiatives and research in the lower latitudes often underplay, or even ignore, data needs and solutions in the polar regions. This conference emphasizes the need to extend polar issues more globally, yet the polar voice is still not loud enough in global conversations about data infrastructure.
Infrastructure, by its nature, must work across all scales. It requires a “glocal” perspective that simultaneously embraces both universalizing and particularizing tendencies. In this presentation I will discuss how there needs to be a constant interplay between local implementation and global design of data infrastructure. I will describe where the polar regions have had success in this area and where key challenges remain. I will describe a path forward for the polar data community to be better represented on the global stage through initiatives like the Research Data Alliance while also amplifying their effectiveness at the regional and local level. A goal is to improve the global understanding of polar issues while also improving the practice of polar data practitioners.
Preprint-WCMRI,IFERP,Singapore,28 October 2022.pdf (Christo Ananth)
Call for Papers- Special Session: World Conference on Multidisciplinary Research and Innovation (WCMRI-22), (Session 1: Information and Communication Technology), Singapore
Christo Ananth
Professor, Samarkand State University, Uzbekistan
1) The document discusses network morphospace, which explores how network traits can define a theoretical space where each point represents a network with shared connectivity characteristics. This allows analysis of actual versus possible network designs and the rules shaping their evolution.
2) Innovation is discussed as analogous to biological evolution, with incremental changes through local searching and disruptive changes through recombination. Fostering innovation means enabling novelty, clear objectives, and better understanding of the landscape.
3) The section proposes a framework called the "Internet of Space" where sovereign users establish their own networks through movement in real space and time. The user's pin provides access to dynamic, layered knowledge according to their physical location and context.
The document discusses the evolution of e-Research, from early forms like supercomputing and grid computing to current approaches like big data. It argues that e-Research will become so integrated into normal research practices that it will effectively disappear as a separate field. The document also provides examples of how computational approaches are transforming different domains like the sciences, social sciences, humanities, and arts. It analyzes the digitization of cultural artifacts and large-scale text analysis as novel advances enabled by e-Research.
This document provides an introduction to data mining concepts and techniques. It discusses why data mining is needed due to the massive growth of data. It defines data mining as the extraction of interesting patterns from large datasets. The document outlines the key steps in the knowledge discovery process and how data mining fits within business intelligence applications. It also describes different types of data that can be mined and popular data mining algorithms.
Cyberenvironments integrate shared and custom cyberinfrastructure resources into a process-oriented framework to support scientific communities and allow researchers to focus on their work rather than managing infrastructure. They enable more complex multi-disciplinary challenges to be tackled through enhanced knowledge production and application. Key challenges include coordinating distributed resources and users without centralization and evolving systems rapidly to keep pace with advancing science.
A new software tool for large-scale analysis of citation networks (Nees Jan van Eck)
This document describes a new software tool called Citation Network Explorer that allows users to explore and visualize large-scale citation networks over time in a dynamic way. It summarizes the motivation for developing this tool, which is the limited availability of software that can handle the visualization of the evolution of science. The document then provides an overview of the tool's capabilities and demonstrates it on two sample citation network datasets, concluding with a list of references for related research.
EarthCube Community Webinar held Tuesday, Dec. 9th at 11:00 PST/2:00 EST for a virtual kick-off of the new 'Demonstration Phase' of EarthCube, including statements from your Leadership Council members and an update from NSF Program Officer, Eva Zanzerkia.
Engagement Team monthly meeting 10.10.2014 (EarthCube)
The document outlines the agenda and priorities for an EarthCube Demonstration Governance Engagement Team meeting in October 2014. The agenda includes an introduction, announcing a team representative to the Leadership Council, developing internal leadership, reviewing priorities and logistical functions, and discussing future meeting schedules. Key priorities and deliverables for the team are to develop an outreach and communications plan to engage the EarthCube community and stakeholders through compiling science use cases. Housekeeping, meeting leadership, point of contact roles, work management, and collaboration with other groups are listed as important logistical functions for the team.
Similar to AHM 2014: Governance and Cyberinfrastructure in the Earth System Sciences
This document discusses the potential for developing a knowledge network by leveraging metadata from scientific endeavors. It begins by outlining some of the limitations of traditional metadata approaches. It then proposes that metadata could be structured as a graph using semantic triples to represent relationships between people, institutions, projects, and other elements. This liberalized metadata approach could help reduce complexity while providing a more comprehensive view of scientific activities and outputs. The document advocates for establishing common standards, developing tools to extract and aggregate metadata, and creating services and repositories to enable discovery, analysis, and visualization of the knowledge network. The goal is to facilitate research by providing integrated access to information on scientific data, publications, actors and their relationships.
e-Science and Technology Infrastructure for Biodiversity Research discusses e-science, which involves conducting science using vast computational resources and data over the internet. It involves areas like astronomy, biology, earth science, health, and more. Key aspects of e-infrastructure discussed are that it provides on-demand access to distributed resources like a power grid, and supports scientific discovery through computational tools. Challenges to e-infrastructure include organizational, financial, legal, and technical issues. Lifewatch is highlighted as a European e-science infrastructure for biodiversity research providing advanced capabilities for research on biodiversity systems.
This document summarizes a lecture on network science given by Madhav Marathe at Lawrence Livermore National Laboratory in December 2010. It provides an overview of network science, including definitions of networks and their unique properties. It also discusses mathematical and computational approaches to modeling complex networks and applications to infrastructure planning, energy systems, and national security. The lecture acknowledges prior work that contributed to its material from various researchers and textbooks.
Describing Everything - Open Web standards and classificationDan Brickley
The document discusses the need for a hybrid approach to classification that combines traditional library classification systems with modern web technologies and standards. It proposes putting classification data on the open web so it can be more widely used and built upon. This will help drive innovation by making the data accessible to developers, designers and content creators.
Quo vadis, provenancer? Cui prodest? our own trajectory: provenance of data...Paolo Missier
The document discusses provenance in the context of data science and artificial intelligence. It provides bibliometric data on publications related to data/workflow provenance from 2000 to the present. Recent trends include increased focus on applications in computing and engineering fields. Blockchain is discussed as a method for capturing fine-grained provenance. The document also outlines challenges around explainability, transparency and accountability for high-risk AI systems according to new EU regulations, and argues that provenance techniques may help address these challenges by providing traceability of system functioning and operation monitoring.
Daniel Lopresti, Bill Gropp, Mark D. Hill, Katie Schuman, and I put together a white paper on "Building a National Discovery Cloud" for the Computing Community Consortium (http://cra.org/ccc). I presented these slides at a Computing Research Association "Best Practices on using the Cloud for Computing Research Workshop" (https://cra.org/industry/events/cloudworkshop/).
Abstract from White Paper:
The nature of computation and its role in our lives have been transformed in the past two decades by three remarkable developments: the emergence of public cloud utilities as a new computing platform; the ability to extract information from enormous quantities of data via machine learning; and the emergence of computational simulation as a research method on par with experimental science. Each development has major implications for how societies function and compete; together, they represent a change in technological foundations of society as profound as the telegraph or electrification. Societies that embrace these changes will lead in the 21st Century; those that do not, will decline in prosperity and influence. Nowhere is this stark choice more evident than in research and education, the two sectors that produce the innovations that power the future and prepare a workforce able to exploit those innovations, respectively. In this article, we introduce these developments and suggest steps that the US government might take to prepare the research and education system for its implications.
Building COVID-19 Museum as Open Science Projectvty
This document discusses building a COVID-19 Museum as an open science project. It describes the speaker's background working on various data management projects. It discusses moving towards open science and sharing data according to FAIR principles. It outlines the Time Machine project for digitizing historical documents and its approach to data management. The rest of the document discusses using the Dataverse platform to build repositories, linking metadata to ontologies, using tools like Weblate for translations, and exploring the use of artificial intelligence and machine learning to enhance metadata and facilitate human-in-the-loop review processes.
I presented this keynote talk at the WorldComp conference in Las Vegas, on July 13, 2009. In it, I summarize what grid is about (focusing in particular on the "integration" function, rather than the "outsourcing" function--what people call "cloud" today), using biomedical examples in particular.
Hypertext2007 Wendy Hall - "Whatever Happened to Hypertext?"hypertext2007
The document discusses the history of hypertext conferences from 1987 to 2007 and the transition to web and semantic web conferences. It notes that early hypertext conferences in the late 1980s saw significant participation from technical and non-technical authors. Conference attendance declined in the late 1990s as the web conference series grew rapidly in popularity. The semantic web represents a rebirth of hypertext ideas but does not attract the original hypertext community.
1. Linked data, the Internet of Things, and cloud computing are linking everything together through interconnected networks and universal data sharing.
2. These technologies link data on all layers, from embedded systems and sensors to intelligent systems, web services, and cyber-physical systems.
3. Linked data uses Resource Description Framework (RDF) graphs and shared vocabularies to semantically link diverse data sources into a global data space, creating opportunities for new insights and applications.
Research in Intelligent Systems and Data Science at the Knowledge Media Insti...Enrico Motta
The document discusses research directions in intelligent systems and data science. It describes work on making sense of scholarly data through techniques like data mining, semantic technologies, and machine learning. It also discusses mapping and classifying computer science research areas using an automatically generated ontology with over 14,000 topics. Other topics discussed include predicting emerging research areas, applications in smart cities like the MK:Smart project, and potential roles for robots in smart cities like an autonomous health and safety inspector.
Polar research is inherently interdisciplinary and is becoming more so. Correspondingly, polar data managers have been working to meet very diverse communities and needs, especially after the progress of the International Polar Year 2007-8 (IPY). But is it enough? Despite their best efforts, the polar data and research communities can be rather insular. The unique challenges of polar research and data management may sometimes blind us to relevant developments in other parts of the world. At the same time, global initiatives and research in the lower latitudes often underplay, or even ignore, data needs and solutions in the polar regions. This conference emphasizes the need to extend polar issues more globally, yet the polar voice is still not loud enough in global conversations about data infrastructure.
Infrastructure, by its nature, must work across all scales. It requires a “glocal” perspective that simultaneously embraces both universalizing and particularizing tendencies. In this presentation I will discuss how there needs to be a constant interplay between local implementation and global design of data infrastructure. I will describe where the polar regions have had success in this area and where key challenges remain. I will describe a path forward for the polar data community to be better represented on the global stage through initiatives like the Research Data Alliance while also amplifying their effectiveness at the regional and local level. A goal is to improve the global understanding of polar issues while also improving the practice of polar data practitioners.
Preprint-WCMRI,IFERP,Singapore,28 October 2022.pdfChristo Ananth
Call for Papers- Special Session: World Conference on Multidisciplinary Research and Innovation (WCMRI-22), (Session 1: Information and Communication Technology), Singapore
Christo Ananth
Professor, Samarkand State University, Uzbekistan
1) The documents discuss network morphospace, which explores how network traits can define a theoretical space where each point represents a network with shared connectivity characteristics. This allows analysis of actual versus possible network designs and the rules shaping their evolution.
2. EarthCube goal
“…to design, build, and maintain an easy-to-use system based on existing resources that embraces open-source culture and methods to align technology development with scientific needs.”
Richard et al., “Community-developed Geoscience Cyberinfrastructure,” Eos 95, no. 20 (2014): 165-166.
4. The Tower of Babel…
- Heritage of multiple disciplines, sensors, data analysis methods
- Cacophony of formats, metadata, software
- EarthCube survey of ~175 scientists (2011): need…
  - Common data formats
  - Better metadata and metadata standards
  - Better ways to find data
  - Coupled web-based services, such as visualization tools
5. Cyberinfrastructure and climate change informatics (Rood & Edwards 2014)
R. B. Rood & P. N. Edwards, “Climate Informatics: Human Experts and the End-to-End System,” Earthzine, May 2014.
6. The loading dock model of cyberinfrastructure
(Diagram: the Loading Dock Model — Data, Models, Services.)
7. Access is not the main problem
- Beyond the loading dock model
- Need for translational information for (many) particular users and uses
- Human communication — often informal — remains the most basic process for effective data sharing
- Metadata as product vs. metadata as process
- Always provide for communication with data creators
8. This morning
- A little history of infrastructure
- … and of governance in meteorology
- What is governance?
- Governance and software in Earth system science
9. This morning
- A little history of infrastructure
- … and of governance in meteorology
- What is governance?
- Governance and software in Earth system science
10. Infrastructure: a historical model
Paul N. Edwards
Stages along a timeline (Edwards et al. 2007):
- System building: designed, coherent, centrally organized
- Proliferation of systems; variation
- Networks: dedicated gateways link heterogeneous systems
- Internetworks: generic gateways link heterogeneous networks
- Decentralization, fragmentation
- Abandonment, substitution
11. Dedicated or improvised gateways (Egyedi 2001)
- Whose responsibility?
- Who sets standards?
- Who pays?
14. Internetworks link networks
- Routers are gateways: they connect computers to each other (a network)…
- … and connect the local network to other networks
- “The” Internet connects millions of networks
15. This morning
- A little history of infrastructure
- … and of governance in meteorology
- What is governance?
- Governance and software in Earth system science
18. Surface station coverage: evolution
(Maps of surface station coverage in 1870, 1900, 1930, and 1960; diameter of circles drawn around each station.)
Source: J. Hansen and S. Lebedeff, “Global Trends of Measured Surface Air Temperature,” Journal of Geophysical Research 92, no. D11 (1987): 13,346-13,347.
19. (image-only slide)
20. Stages in the history of weather forecasting
- Systems: national weather services
  - Set own standards
- Networks: national and international
  - The Réseau Mondial
- Internetworks: integrating heterogeneous data sources
  - Surface stations
  - Air bases and airports
  - Marine data
  - Satellites
- Governance:
  - International Meteorological Organization (1873-1949)
  - World Meteorological Organization (WMO, founded 1950)
  - Set standards, assisted coordination — but lightweight relative to national services
22. This morning
- A little history of infrastructure
- … and of governance in meteorology
- What is governance?
- Governance and software in Earth system science
23. What is governance?
- Aligning an organization’s practices and procedures with its goals, purposes, and values
- Oversight, steering, and articulating organizational norms and processes
- vs. management: detailed planning, supervision of work, allocation of effort
24. Modes of governance

                       Hierarchy             Network (of firms)         Market or firm            Bazaar
Contractual framework  Employment contract   Neoclassical contract      Property contract         Open source license
Incentives intensity   Low                   Medium                     High                      Low
Control intensity      High                  Medium                     Low                       Low
Social relations       Strong ties           Strong ties                Anonymous                 Mostly anonymous or weak ties
Membership             Employees selected    Members select each other  Buyer selected by seller  Open; many free riders
Timeframe              Long-term commitment  Long-term commitment       Transaction or contract   Variable; no commitment

Source: adapted from B. Demil and X. Lecocq, “Neither Market Nor Hierarchy Nor Network: The Emergence of Bazaar Governance,” Organization Studies 27, no. 10 (2006): 1447-1466.
25. Open source culture: bazaar governance
- E. Raymond, “The Cathedral and the Bazaar”: Linux is ‘a great babbling bazaar of differing agendas and approaches’
- Characteristic: chaotic market, huge variations in quality
- “Low levels of control and weak incentives intensity are distinctive features of bazaar [governance], lending a high uncertainty to governed transactions.”
Source: B. Demil and X. Lecocq, “Neither Market Nor Hierarchy Nor Network: The Emergence of Bazaar Governance,” Organization Studies 27, no. 10 (2006): 1447-1466.
26. …but how does governance really work?
Highly competent groups can get a lot done without much management from above — but there are limits to leaderless teams, especially when work is time-sensitive and requires coordinating complex, interdependent activity.
27. This morning
- A little history of infrastructure
- … and of governance in meteorology
- What is governance?
- Governance and software in Earth system science
28. Organizations in science…
- Organizations provide space, equipment, money, and support
- Stable, long-lasting (decades)
- Well-defined roles and routines
- Have boundaries, hierarchies, and entrenched cultures
- Research (NCAR, GFDL, universities) vs. operational (NOAA, NASA, DOE)
- National laboratories and military research
- Funding agencies (NSF, NIH) and foundations
- They strongly structure work incentives and disincentives
29. … vs. projects
- …but most scientific work takes place in projects, teams, and working groups
- Varying sizes
- Lifespans vary, but mostly short (1-5 years)
- Depend heavily on funding cycles
- Often cross organizational boundaries
- Many scientists are involved in several projects at once
- Overlapping membership
- Funding is an ongoing concern
34. Operational norms and rules
- Expectations that govern everyday interaction among project members
- Largely informal and tacit (unarticulated)
- May be embedded in organizational routines or tools
- Usually surface only during crisis or conflict
- Difficult to change without a forcing factor
- Tools can embody operational norms — but usually can’t force changes
35. Cyberinfrastructure pitfalls
- Software makes it seem easy to build gateways between systems and networks… (“You just…”)
- … but social, institutional, and security gateways are even more important
- Multiple institutional cultures
- Complex projects with many working groups
- Multiple security and legal standards can block interchange
36. Conclusions: some lessons from history
- Centralized design and control is not the primary path to working infrastructure
- Instead, build gateways (couplers): standards, technologies, institutions
  - Must be lightweight, readily understood, easily transferred across regions and cultures (including disciplinary cultures)
- International governance of data standardization and exchange in meteorology was achieved by the 1960s, in the face of enormous technical obstacles (communication channels) and social obstacles (Cold War, decolonization)
37. EarthCube goal
“…to design, build, and maintain an easy-to-use system based on existing resources that embraces open-source culture and methods to align technology development with scientific needs.”
Richard et al., “Community-developed Geoscience Cyberinfrastructure,” Eos 95, no. 20 (2014): 165-166.
38. Conclusions: some lessons from history
- The tensions between hierarchy, network, and bazaar modes of governance will be difficult to resolve
- Cyberinfrastructure can help, but it can also hinder
- Social and organizational issues must be addressed along with technology
- The EarthCube experiment is enormously important, and worth doing!
39. Paul N. Edwards, University of Michigan School of Information
25 July 2014
Edwards et al., Knowledge Infrastructures: Intellectual Frameworks and Research Challenges (2013)
knowledgeinfrastructures.org