A presentation given at the workshop "Potential of satellite images and hyper/multi-spectral recording in archaeology"
By Anthony Beck
Poznan – June 2012
OpenStack - What is it and why you should know about it! – OpenStack
A presentation I gave at the inaugural CompCon at ANU in Canberra, 29/09/2013. In a phenomenally short time, OpenStack has risen to become the dominant platform for building private and public clouds of any scale. With thousands of contributors and hundreds of companies backing the project, Tristan will demonstrate why you need to know about OpenStack and get involved now.
- What is OpenStack
- History of the project
- Phenomenal growth of the project
- Relevance in Australia and internationally, presenting opportunities to build greenfield clouds the world over.
- Massive job demand
Utilising Cloud Computing for Research through Infrastructure, Software and D... – David Wallom
This document discusses using cloud computing for research through Infrastructure as a Service (IaaS), Software as a Service (SaaS), and Desktop as a Service (DaaS). For IaaS, it describes the EGI Federated Cloud which provides cloud services from multiple public and private sector providers. For SaaS, it discusses Hub for managing the research lifecycle and data, and Chipster for bioinformatics analysis. For DaaS, it covers EOSCloud which provides virtual desktops for bioinformatics research through the JASMIN cloud. Overall it promotes cloud computing for enabling flexible infrastructure, services, and environments to support diverse research needs.
The document discusses the open source space software community, including hurdles to adoption such as export control issues. It provides two case studies: FlightLinux, a customized Linux distribution for spacecraft that was unable to be released due to export controls, and CosmosCode, a NASA open source project launched in 2004. The document also discusses challenges around open source development at NASA from legal, policy, cultural, and procedural perspectives.
Using multi-temporal benchmarking to determine optimal sensor deployment: adv... – DART Project
A presentation given by Anthony Beck at EARSeL Gent on 20/09/12 describing some of the multi-temporal issues associated with archaeological detection. This presentation is primarily based on the research of David Stott.
Unleashing the potential of collaboration – archaeological detection in the 2... – DART Project
Speakers – Anthony Beck/David Stott
Computers, the internet and mobile phones have changed how archaeologists work. More importantly, they have changed how everybody can access, use and contribute to archaeology.
This has altered public expectations about modes of engagement and access to resources, resulting in increased demand for access to this data. This phenomenon is not confined to archaeology and heritage but is reflected in many areas of society. Some governments have recognised that taxpayers, as funders of data, should be allowed to access and utilise this data more easily. This has underpinned the Open Data movement.
At the same time, companies and institutions like Google and NASA started making large datasets available on the internet. Some of these organisations provided Application Programming Interfaces (APIs) and other services so that software applications could be built around their data. Such services made it easier for people to use this data to make new things (derive content) and in turn share these things with their communities. This produced the crowd-sourcing and citizen-science movements. Crowdsourcing is where products, ideas or content are created by soliciting contributions from a large group of people online. The community mapping system OpenStreetMap is a good example of crowdsourcing.
Other people want to be more active. Projects like Galaxy Zoo, Ancient Lives and Old Weather have helped free data trapped in books or helped scientists collect and analyse data. National Geographic has sponsored a project to help detect archaeological sites in Mongolia using high-spatial-resolution satellite images (exploration.nationalgeographic.com/mongolia/home). With lots of people working together, a big problem can turn into a small problem. These people are 'citizen scientists'.
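As a purely illustrative sketch of the API-based data access described above (the Overpass API endpoint and tag names are real OpenStreetMap conventions, but this minimal example and its bounding-box coordinates are not taken from the presentation), a few lines of Python are enough to build a request for crowd-sourced map data that an application could then reuse:

```python
# Sketch: building a query against the OpenStreetMap Overpass API,
# the crowd-sourced mapping data discussed above.
from urllib.parse import urlencode

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

def monument_query(south, west, north, east):
    """Return an Overpass QL query for nodes tagged historic=* in a bounding box."""
    return (
        "[out:json];"
        f"node[historic]({south},{west},{north},{east});"
        "out body;"
    )

# Illustrative bounding box (roughly around Poznan).
query = monument_query(52.35, 16.80, 52.45, 17.00)
request_url = f"{OVERPASS_URL}?{urlencode({'data': query})}"
# Fetching request_url (e.g. with urllib.request.urlopen) would return
# community-contributed map features as JSON for an application to build on.
print(request_url)
```

This is the pattern that made crowdsourced datasets reusable: a documented query language plus an HTTP endpoint, so derived content can be built without access to the provider's internal systems.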
This presentation will describe these movements in more detail and provide examples of their implications for the heritage sector. A vision will then be set out for a future collaborative framework for heritage management, framed in terms of its implications for practice, engagement, research, curation and policy. Public participation is welcomed!
Using technologies to promote projects – DART Project
A presentation given by Anthony Beck to the Cambridge Archaeologists Forum. The forum mindmap is here: http://dl.dropbox.com/u/393477/MindMaps/InTray/CambridgeArchaeologistsForum290911.html
A presentation given by Anthony Beck at the workshop "Potential of satellite images and hyper/multi-spectral recording in archaeology"
Poznan – June 2012
The document discusses the vision and goals of creating an open "Method Store" repository. The key points are:
1) The Method Store would be a repository that facilitates collaborative development of methodologies in archaeology to avoid redundant work and allow all sectors to participate.
2) It would allow methods, algorithms, and other research outputs to be deposited, shared, tagged, linked, and developed to draw connections between methodologies.
3) The long term goal is for the Method Store to only contain objects released under an open license like Creative Commons to make the research fully transparent and reusable.
Panel: Open Infrastructure for an Open Society: OSG, Commercial Clouds, and B... – Larry Smarr
The document discusses open infrastructure for an open society and the role of commercial clouds. It describes how the National Research Platform (NRP), Open Science Grid (OSG), and Open Science Data Federation (OSDF) provide open infrastructure through open source components that anyone can contribute to and use. It then discusses how Southwestern Oklahoma State University leveraged NRP resources on their campus and engaged students and local teachers. Finally, it outlines the pros and cons of commercial clouds, noting they provide huge capacity and variety but are very expensive for regular use. Facilitating science users on clouds requires services like CloudBank and Kubernetes federation.
Science Services and Science Platforms: Using the Cloud to Accelerate and Dem... – Ian Foster
Ever more data- and compute-intensive science makes computing increasingly important for research. But for advanced computing infrastructure to benefit more than the scientific 1%, we need new delivery methods that slash access costs, new sustainability models beyond direct research funding, and new platform capabilities to accelerate the development of new, interoperable tools and services.
The Globus team has been working towards these goals since 2010. We have developed software-as-a-service methods that move complex and time-consuming research IT tasks out of the lab and into the cloud, thus greatly reducing the expertise and resources required to use them. We have demonstrated a subscription-based funding model that engages research institutions in supporting service operations. And we are now also showing how the platform services that underpin Globus applications can accelerate the development and use of an integrated ecosystem of advanced science applications, such as NCAR’s Research Data Archive and OSG Connect, thus enabling access to powerful data and compute resources by many more people than is possible today.
In this talk, I introduce Globus services and the underlying Globus platform. I present representative applications and discuss opportunities that this platform presents for both small science and large facilities.
Introduction to the Open Grid Forum community and the document production process, as well as several primary application arenas for OGF specifications, given at the co-located International Conference on Cloud and Autonomic Computing (CAC 2014), IEEE International Conference on Self-Adaptive and Self-Organizing Systems (SASO 2014) and the IEEE International Conference on Peer-to-Peer Computing (P2P’14) conferences, September 8-12, 2014 at Imperial College in London, UK.
This talk was given at a workshop entitled "Cybersecurity Engagement in a Research Environment" at Rady School of Management at UCSD. The workshop was organized by Michael Corn, the UCSD CISO. It tries to provoke discussion around the cybersecurity features and requirements of international science collaborations, as well as more generally, federated cyberinfrastructure systems.
Frank Würthwein - NRP and the Path forward – Larry Smarr
The NRP (National Research Platform) will replace the PRP (Pacific Research Platform) and aims to democratize access to national research cyberinfrastructure. The long-term vision is to create an open national cyberinfrastructure by federating resources across research institutions. Key innovations include a new network fabric, application libraries for FPGAs, a "bring your own resource" model, and novel scheduling and data infrastructure. The NSF has funded the Prototype National Research Platform project to support NRP for the next five years. NRP aims to grow resources, introduce new capabilities, and be driven by the research community.
Data-intensive bioinformatics on HPC and Cloud – Ola Spjuth
The document discusses data-intensive bioinformatics and challenges with analyzing large genomic datasets on high performance computing (HPC) resources. It summarizes that storage is the biggest challenge as sequencing projects generate very large amounts of data and users do not clean up data. The strategies discussed to address this include assessing costs of storage and analysis upfront, limiting project lifetimes, moving to tiered storage, and improving efficiency. It also discusses using cloud computing resources through virtual clusters and containers to enable flexible, on-demand access and pay-per-use pricing models. Scientific workflows and microservices approaches are presented as ways to automate and orchestrate large-scale genomic analyses on distributed computing resources.
IESL Talk Series: Apache System Projects in the Real World – Srinath Perera
- LEAD was a large-scale e-science project funded by the NSF that used Apache technologies like Axis2, ODE, and others to build a dynamic weather analysis system across multiple universities in the US.
- It faced challenges at large scale including distributed resources, long-running jobs, large data, and usage spikes from many parallel users.
- Key subsystems included workflows, data, and messaging which also presented challenges around resource utilization, scalability, fault-tolerance, and security at that scale.
- Over time, LEAD transitioned major components to Apache projects and has now joined the Apache incubator as the Apache Airavata project.
This document summarizes a kick-off meeting for the UR3 project, which aims to implement a cloud computing infrastructure for sharing data, algorithms, and high performance computing resources among different teams and communities. It outlines the objectives, tasks, timeline, and involved partners of the UR3 project. It also discusses concepts for the cloud architecture, including virtualization, horizontal and vertical scalability, and the benefits of a cloud model for optimizing resource usage and reducing costs.
The document discusses Grid computing and the Globus Toolkit. It provides an overview of Grid computing, describing it as the sharing of computer resources and coordinated problem solving across multiple institutions. It then summarizes the Globus Toolkit, describing it as open source software that provides basic components for Grid functionality, including security, execution management, data management, and monitoring. The Globus Toolkit aims to make it easier to build collaborative distributed applications that can exploit shared Grid infrastructure.
Research Data (and Software) Management at Imperial: (Everything you need to ... – Sarah Anna Stewart
A presentation on research data management tools, workflows and best practices at Imperial College London with a focus on software management. Presented at the 2017 session of the HPC Summer School (Dept. of Computing).
- The Broad Institute is a non-profit biomedical research institute founded in 2004 with 50 core faculty members from Harvard and MIT and over 1000 research personnel.
- It focuses on specific disease areas through various programs and initiatives and technological innovation through several platforms.
- In order to take advantage of cloud technologies, organizations need to fundamentally change how they engage with technology and technologists to collaborate effectively in this new environment.
Slides from Spectra Logic’s inaugural BlackPearl Developer Summit, a virtual conference for current and potential Spectra Logic developers. You’ll get product updates from our CEO and BlackPearl product manager, and you will learn how these new features will help customers and developers. You will learn how to build a Spectra S3 client for BlackPearl, our private cloud gateway to our tape and disk storage systems. You will see how one of our partners developed a client and watch it in action. And you will get to ask questions to our BlackPearl Engineering team. Watch the Summit recording at https://www.youtube.com/watch?v=GYoSwrvhVM0
Presentation given at the Consorcio Madrono conference on Data Management Plans in Horizon 2020 http://www.consorciomadrono.es/info/web/blogs/formacion/217.php
Adoption of Cloud Computing in Scientific Research – Yehia El-khatib
Some might say the scientific research community is somewhat behind the curve of adopting the cloud. In this talk, I present a few examples of adopting the cloud from the wider research community. I also highlight some of the aspects by which cloud computing could affect scientific research in the near future and the associated challenges.
Data Con LA 2022-Open Source or Open Core in Your Data Layer? What Needs to B... – Data Con LA
Open source software has progressed from being developed primarily by individuals and small teams to large projects overseen by foundations or commercial entities. There are debates around fully open source projects versus open core models, where core functionality is open but additional features are proprietary. Key considerations for organizations evaluating options include licensing terms, governance structures, impact on branding, available business models, and the overall ecosystem of users and contributors. While open core can provide traditional software vendor advantages, fully open source alternatives aim to avoid vendor lock-in and keep intellectual property communal.
Cloud Standards in the Real World: Cloud Standards Testing for Developers – Alan Sill
Learn about standards studied in the US National Science Foundation Cloud and Autonomic Computing Industry/University Cooperative Research Center Cloud Standards Testing Lab and how you can get involved to extend the successes from these results in your own cloud software settings. Presented at the O'Reilly OSCON 2014 Open Cloud Day.
Video available at https://www.youtube.com/watch?v=eD2h0SqC7tY
Are cloud and NDT a good mix? NDT has its own specificities. Clouds can truly simplify file management, but is every cloud solution adapted to NDT? For example, Dropbox may not work right out of the box for our market. This presentation highlights different cloud avenues (IaaS, PaaS and SaaS) and NDT's critical requirements (constraints and needs). A list of different levels of cloud services (component, option, security, ...) will be defined. It is important to remember that private and public servers are two possible avenues; NDT was an early user of private servers even before they were called a cloud. Overall, the main idea is to optimize the operation process to reduce OPEX and to increase the availability and accuracy of data.
See: www.amotus-solutions.com or www.nubitus.com
This presentation discusses using airborne remote sensing to detect archaeological features through vegetation marks. Spectro-radiometry shows good contrast in foliar pigmentation over time, while crop structure remains similar. Full-waveform LiDAR correlates well with hyperspectral data and detects archaeological features through vegetation height more than through other metrics such as intensity. Different sensors and analysis techniques are needed depending on each field's variability and context, and on the small archaeological signals within large remote sensing datasets.
Open source software has progressed from being developed primarily by individuals and small teams to large projects overseen by foundations or commercial entities. There are debates around fully open source projects versus open core models, where core functionality is open but additional features are proprietary. Key considerations for organizations evaluating options include licensing terms, governance structures, impact on branding, available business models, and the overall ecosystem of users and contributors. While open core can provide traditional software vendor advantages, fully open source alternatives aim to avoid vendor lock-in and keep intellectual property communal.
Cloud Standards in the Real World: Cloud Standards Testing for DevelopersAlan Sill
Learn about standards studied in the US National Science Foundation Cloud and Autonomic Computing Industry/University Cooperative Research Center Cloud Standards Testing Lab and how you can get involved to extend the successes from these results in your own cloud software settings. Presented at the O'Reilly OSCON 2014 Open Cloud Day.
Video available at https://www.youtube.com/watch?v=eD2h0SqC7tY
Is cloud and NDT a good mix? NDT has its own specificity. Clouds can truly simplify the file management, but is any cloud solution adapted for the NDT? For example, Dropbox may not work right out of the box for our market. This presentation highlights different avenues about clouds (IaaS, PaaS, and SaaS); and highlights NDT critical requirements (constraints and needs). A list of different levels of cloud services (component, option, security, ...) will be defined. It is important to remember that private and public servers are 2 possible avenues. NDT was an early user of private servers even before it was called a cloud. Overall the main idea is to optimize the operation process to reduce OPEX and to increase availability and accuracy of data.
See: www.amotus-solutions.com or www.nubitus.com
This presentation discusses using airborne remote sensing to detect archaeological features through vegetation marks. It summarizes that spectro-radiometry shows good contrast in foliar pigmentation over time, while crop structure remains similar. Full waveform LiDAR correlates well with hyperspectral data and detects archaeological features through vegetation height more than other metrics like intensity. Different sensors and analysis techniques are needed depending on each field's variability, context and small archaeological signals within large remote sensing datasets.
Time-lapse analysis with earth resistance and electrical resistivity imagingDART Project
- The document discusses using time-lapse earth resistance analysis and electrical resistivity imaging to better understand how archaeological features respond over time and with changing soil moisture levels.
- A new methodology was introduced to quantify contrast factors between features and backgrounds based on detection tests and magnitude comparisons.
- Analysis of different study sites showed their response correlated differently with weather data, with some features most detectable during dry periods and others during wet periods.
- Extracting resistivity data from electrical resistivity profiles helped explain the causes of anomalies at different sites. At one site, the ditch anomaly was caused by resistivity differences between geological layers, while at another it was caused by moisture differences above a field drain.
- Understanding each site's
This document summarizes a workshop on using electromagnetic radiation to detect archaeological sites. It discusses how different soil properties like water content, organic matter, and temperature can affect the permittivity and conductivity measured by ground penetrating radar and other electromagnetic techniques. Case studies from two fields in Diddington show how these measurements vary over time with rainfall, infiltration, and temperature. The document also compares measurements from IMKO probes to a Campbell Scientific TDR100, finding the probes less accurate but easier to install long-term. The overall aim is to better understand how soil characteristics influence electromagnetic readings and how these techniques can be used for long-term monitoring of archaeological sites.
Archaeology, Informatics and Knowledge RepresentationDART Project
This document discusses using logic programming and ontologies to model stratigraphic relationships in archaeology. It presents an example stratigraphic sequence and shows how it can be represented and reasoned about using Prolog rules and predicates. Different states of the stratigraphic model are output as the data and rules are updated, demonstrating how logical reasoning can infer additional relationships and handle inconsistencies in the archaeological record. Ontologies like CIDOC-CRM are discussed as a way to formally represent archaeological concepts and relationships to support modeling landscape stratigraphy.
Archaeological detection using satellite sensorsDART Project
A presentation given by Anthony Beck at the workshop "Potential of satellite images and hyper/multi-spectral recording in archaeology"
Poznan – 31st June 2012
Dr. Anthony Beck proposes creating an open methodology store to facilitate collaborative development of research methods. The store would be a repository where users can deposit, share, tag, link, and develop methods in a transparent and open process. By making methods openly accessible, it aims to prevent duplicate work and allow all sectors to participate while capturing discussions around method development. The vision is for a system that links related methods and allows rich data like workflows to be submitted and reused across scientific communities.
An update on the progress of the DART project. Presented by Anthony Beck at the Consultant meeting on the 16th April 2012. The original prezi is available here: http://prezi.com/o2k18vxhpow7/dart_16042012_wherearewenow/
The effects of seasonal variation on archaeological detection using earth res...DART Project
The document summarizes an ongoing study investigating the effects of seasonal variation on archaeological detection using earth resistance surveys. Preliminary results from monthly surveys at two test sites show characteristic seasonal responses in soil resistivity. Resistivity generally increases in summer and decreases in winter. The large decrease from summer to winter appears related more to changes in temperature than rainfall. Further analysis of weather data and continued monthly surveys are proposed to better understand how seasonal effects influence archaeological detection capabilities using different techniques.
Using Time Domain Reflectometry (TDR) to Monitor the Geophysical Properties o...DART Project
This document discusses using Time Domain Reflectometry (TDR) to monitor the geophysical properties of archaeological residues over time. TDR devices were installed at multiple depths and locations, including within and outside of archaeological features, to collect hourly readings on permittivity, conductivity, and temperature. The data collected can help understand contrasts in electromagnetic properties between residues and surrounding soils. Challenges included equipment issues and animal damage. Future work involves further analyzing the data and linking permittivity to soil characteristics measured in a lab. The long-term monitoring provides insights to help detect archaeological sites using geophysical techniques.
DART - improving the science. Bradford 21022012DART Project
This document provides an overview of the DART project, which aims to improve the scientific understanding of archaeological detection. DART studies archaeological sites to better understand how their constituents generate observable contrasts and how sensors can detect these contrasts. The project conducts intensive ground observations and measurements at sites to analyze periodic changes in the sites. DART shares its data openly to maximize its impact and further innovation in archaeological detection.
A presentation given by Anthony Beck at the Royal Agricultural College, Cirencester on 14th February 2012. This presentation describes the data collected by the DART project and encourages members of the local communities to exploit this data.
It covers data, formats, licences, software, applications. This introductory presentation was followed up with an afternoon hands-on workshop.
An update on the progress of the DART project. Presented by Anthony Beck at the Consultant/Stakeholder meeting on the 11th January 2012. The original prezi is available here: http://prezi.com/wsvu366ftd9k/dart_11012012_wherearewenow/
The document discusses the DART (Detecting and Recording Archaeological Traces) project, which aims to improve archaeological detection techniques by taking a scientific approach. It involves intensive ground observation and data collection at sites to better understand how archaeological remains generate detectable contrasts and how those contrasts are influenced by environmental factors over time. The data collected includes spectro-radiometry, soil moisture and temperature probes, weather data, and aerial imagery. Preliminary analysis of temperature, moisture, and resistance data show changes seasonally that could help predict optimal times for detection. The open science approach seeks to further archaeological prospection methods.
A presentation given by Anthony Beck at the Archpro workshop1 in Vienna. The workshop was instigated by the Ludwig Boltzmann Institute.
This presentation covers the applications of satellite platforms for archaeological prospection and heritage management.
British Science Festival Presentation 12 September 2011DART Project
Archaeologists are using aerial photography and satellite imagery to discover new sites of archaeological and historical significance across Britain. By analyzing high resolution images taken from aircraft and satellites, archaeologists can identify subtle features on the ground surface that indicate the presence of ancient settlements, field systems and other structures that were previously unknown. This non-invasive technique is revealing our heritage in new ways and transforming our understanding of the past.
Software, Licences etc
1. Software, licences and other usage issues
Anthony (Ant) Beck
Twitter: AntArch
Potential of satellite images and hyper/multi-spectral recording in archaeology
Poznan – 31st June 2012
School of Computing
Faculty of Engineering
6. Software - FOSS
Pros
• Well supported by the community
• Allows development on any machine
  • You can develop a portfolio of work that stays with you
  • Generally cross-platform
• Transparent processing
  • White box (VERY important)
• Open processes are more in line with cloud processing
  • Workflow orchestration
• Robust
• Extendable by individuals
7. Software - Proprietary
Cons
• Some interface issues
• There is a perception that FOSS is not as good as proprietary software
• Bleeding-edge 3D does not always compete with the commercial offerings
8. Software – Open Source Geospatial Foundation
OSGeo: http://www.osgeo.org/
• To provide resources for foundation projects
• To promote freely available geodata
• To promote the use of open source software in the geospatial industry (not just foundation software) – e.g. PR, training, outreach
• To encourage the implementation of open standards and standards-based interoperability in foundation projects
• To provide support for the use of OSGeo software in education via curriculum development, outreach, and support
• To support use of, and contribution to, foundation projects from the worldwide community through internationalization of software and community outreach
• To operate an annual OSGeo conference
10. Software – Open Source Geospatial Foundation
OSGeo: http://www.osgeo.org/
• Live DVD: http://live.osgeo.org/
  • A pre-configured Xubuntu OS with a range of applications installed and pre-configured (this is excellent)
• OSGeo4W: https://trac.osgeo.org/osgeo4w/
  • A pre-compiled binary install of OSGeo-approved packages for Windows
• Both of the above deal with some of the more complex bindings between applications
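Both distributions put the core OSGeo command-line tools on the PATH. As a quick sanity check after installing, a short Python sketch (the tool names are real GDAL/PROJ utilities; whether each is found depends on your local install):

```python
import shutil

# Command-line tools shipped with OSGeo Live and OSGeo4W.
# Availability depends on your local install.
TOOLS = ["gdalinfo", "ogr2ogr", "gdal_translate", "proj"]

def check_tools(tools):
    """Map each tool name to its resolved path, or None if absent."""
    return {t: shutil.which(t) for t in tools}

for name, path in check_tools(TOOLS).items():
    print(f"{name}: {path or 'not installed'}")
```

On Windows, run this inside the OSGeo4W shell so the bundled tools are on the PATH.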
12. Licences - Why are they an issue?
Data is rarely in the 'public domain'
• It is normally available under a licence
Licences dictate:
• Who owns the data
• How you can use the data
• How you reference the data
• How you can share/redistribute the data (and any derivatives)
LICENCES ARE REALLY IMPORTANT
13. Licences – examples: Commercial high-resolution satellite
Under a strict licence that dictates:
• Who can use the data (normally a single organisation)
• Sometimes for how long
• What happens with the derivatives
The licence protects their data, which protects their income stream
A user does not 'own' this data
• They use it under licence
14. Licences – examples: Government/Research satellite data
The licences vary:
• Military
• Research grade
• Archive
Much of this data is released to the academic community
• Community science (?) initiatives
• The collecting body does not have the resources to analyse the data
• The collecting body captures data on behalf of a broad community
• The data is no longer sensitive or relevant
Examples: Landsat, ASTER, Corona
15. Licences – examples: Landsat 8
Australia will publish images captured by the soon-to-launch satellite Landsat 8 online, in close to real time, for free under the Creative Commons Attribution 3.0 Australia licence.
"We want to make as much data freely available as possible," says Jeff Kingwell, the Section Leader of GA's National Earth Observation Group. "We will move towards a system where we are taking Landsat data in, in near real time." Data will be corrected to make it usable, then published, all in as close to real time as is practical.
How COOL is that?
17. Licences – Take Home Points
Prior to using data, find out the licensing constraints
• Licence holders ARE litigious – they have every right to sue you if you infringe their licence
If you buy data, ensure you licence it for as broad a re-use base as possible (NEVER licence to an individual)
If you own data, always provide it to others with a CLEAR re-use licence
• If you want credit, include an attribution clause
DO NOT derive maps etc. from Google Earth data – it is illegal
18. Licences – DART (www.dartproject.info)
DART does the following:
• For data under licence:
  • Ensures broad access
  • Opens data where possible (NERC ARSF)
  • Encourages re-use
• For data it owns: gives it away
  • Data: Open Data Commons Attribution licence (ODC-By: http://opendefinition.org/licenses/odc-by/)
  • Everything else: Creative Commons Attribution licence (CC-BY: http://opendefinition.org/licenses/cc-by/)
• Why?
  • Open Science
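Licence choices like these can also travel with the data as machine-readable metadata, so re-users see the terms alongside the files. A minimal sketch in Python; the record fields and dataset title are illustrative, not DART's actual schema:

```python
import json

# Hypothetical metadata record for a DART-style open dataset.
# Field names are illustrative; real deployments often follow
# DCAT or schema.org/Dataset conventions instead.
record = {
    "title": "Diddington earth resistance survey, monthly time-lapse",
    "licence": {
        "id": "ODC-BY-1.0",
        "url": "http://opendefinition.org/licenses/odc-by/",
    },
    "attribution": "DART Project (www.dartproject.info)",
}

def attribution_line(rec):
    """Build the attribution string a re-user should reproduce."""
    return "{} / {} ({})".format(
        rec["attribution"], rec["licence"]["id"], rec["licence"]["url"]
    )

print(attribution_line(record))
print(json.dumps(record, indent=2))  # ship this alongside the data files
```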
19. Open Data: Server (in the near future)
The full project archive will be available from the server:
• Raw data
• Processed data
• Web services
It will also include:
• TDR data
• Weather data
• Subsurface temperature data
• Soil analyses
• Spectro-radiometry transects
• Crop analyses
• Excavation data
• In-situ photos, etc.
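Data exposed as web services can then be pulled programmatically. A hedged sketch of composing an OGC WFS GetFeature request with the Python standard library; the endpoint and layer name are hypothetical placeholders, not DART's actual service:

```python
from urllib.parse import urlencode

# Hypothetical service endpoint and layer name -- placeholders only.
BASE_URL = "http://example.org/dart/wfs"

def getfeature_url(layer, max_features=100):
    """Compose a WFS 1.1.0 GetFeature request URL for the given layer."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": layer,
        "maxFeatures": max_features,
    }
    return BASE_URL + "?" + urlencode(params)

print(getfeature_url("dart:tdr_readings"))
```

The same URL can be fetched with any HTTP client; serving standard OGC requests like this is what lets other tools (QGIS, GDAL/OGR) consume the archive directly.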
21. Why are we doing this – it's the right thing to do
DART is a publicly funded project
Publicly funded data should provide benefit to the public
22. Why are we doing this – IMPACT/unlocking potential
If more people use the data, there is improved impact
Better financial and intellectual return for the investors
23. Why are we doing this – innovation
Reducing barriers to data and knowledge can improve innovation
24. Why are we doing this – education
To provide baseline exemplar data for teaching and learning
25. Why are we doing this – building our network
Find new ways to exploit our data
Develop contacts
Write more grant applications