DEEP-Hybrid-DataCloud is a Horizon 2020 project that aims to promote intensive computing services for analyzing large datasets through a hybrid cloud approach. It received funding from the European Union to develop specialized computing infrastructure and integrate intensive computing services. The project involves nine academic partners and one industrial partner across six European countries. It will define a "DEEP as a Service" solution and evolve existing INDIGO components to better support intensive computing workloads and specialized hardware.
Bob Jones, CERN & HNSciCloud Coordinator, gives an update on the HNSciCloud Pre-Commercial Procurement, which is now in its Solution Prototyping phase. The presentation also includes an overview of the prototypes under development.
This presentation, given by Bob Jones, CERN & HNSciCloud Coordinator, at the ESA-ESPI Workshop on “Space Data & Cloud Computing Infrastructures: Policies and Regulations”, describes the challenges and needs of cloud users and explains how a hybrid cloud model can support them.
Hajira Jabeen introduces the Big Data Europe Integrator Platform. The deck also includes the slides used to summarise the other presentations in the launch webinar.
Presentation by Philippe O.A. Navaux, professor at the Universidade Federal do Rio Grande do Sul and Computer Science Area Director of CAPES, at Cloudscape Brazil 2017 & WCN 2017.
This is a RECAP project overview slide deck prepared by Thang Le Duc (UMU), P-O Östberg (UMU) and Tomas Brännström (Tieto). It starts with an introduction and continues with a section on challenges for a self-orchestrated, self-remediated cloud system. It then presents the RECAP vision and use cases and finishes with a conclusion.
A helper for data scientists
The EDP (Environmental Data Platform), managed by the Center for Sensing Solutions of Eurac Research, is a collection of several open-source software components for data and metadata management. In particular, it provides tools to (i) discover the types of data available, (ii) process big datasets and (iii) visualize them. The EDP encompasses openEO, JupyterHub, Maps (based on GeoNode) and GeoNetwork, among other software.
The EDP portal is a web application that helps the viewer access and explore the environmental data platform, acting as the unique access point to the EDP.
In more detail, the EDP portal has three main features: an intuitive data discovery interface, a dedicated data processing environment (for Python and R scripting) and a comprehensive documentation repository.
The Brenner Base Tunnel (BBT) migrated its whole Geospatial infrastructure to an open source solution: implications and results
One of the largest infrastructure projects in Europe decided, after having worked for 15 years on a proprietary architecture, to migrate its whole geospatial infrastructure to open-source components. The starting situation, the migration process and the final result will be described, along with the advantages of the new architecture. In particular, the new multilingual WebGIS interface based on QGIS Server will be presented.
Presentation by Bruno Schulze, Senior Researcher / Professor at Laboratório Nacional de Computação Científica (LNCC) at Cloudscape Brazil 2017 & WCN 2017
HDF5 (with Nexus) is becoming the de facto standard in most X-ray facilities. However, it is not always easy to navigate such files to get quick feedback on the data, due to the peculiar structure of Nexus files. HDF5 file viewers are one way to solve this issue. They allow for the browsing and inspecting of the hierarchical structure of HDF5 files, as well as visualising the datasets they contain as basic plots (1D, 2D, 3D).
This presentation will focus on h5web, the open-source web-based viewer being developed at the European Synchrotron Radiation Facility. The intent is to provide synchrotron users with an easy-to-use application and to make open-source components available for other similar web applications. `h5web` is built with React, a front-end web development library. It supports the exploration of HDF5 files, requested from a separate back-end (e.g. HSDS) for modularity, and the visualisation of datasets using performant WebGL-based visualisations.
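The hierarchical browsing these viewers perform can be sketched as a simple tree walk. This is not h5web's code (h5web delegates file access to a back-end such as HSDS); in this illustrative sketch a plain nested dict stands in for the HDF5 group/dataset hierarchy, and shape tuples stand in for datasets:

```python
# Sketch of the tree walk an HDF5 viewer performs. A real viewer would use an
# HDF5 library or a back-end service; here a nested dict stands in for groups,
# and shape tuples stand in for datasets.
def walk(node, prefix=""):
    """Yield (path, kind, shape) for every group and dataset in the tree."""
    for name, child in node.items():
        path = f"{prefix}/{name}"
        if isinstance(child, dict):            # a group: recurse into it
            yield (path, "group", None)
            yield from walk(child, path)
        else:                                  # a dataset: report its shape
            yield (path, "dataset", child)

# A toy file layout, loosely following NeXus conventions.
toy_file = {
    "entry": {
        "detector": {
            "image": (512, 512),               # 2D dataset -> heatmap plot
            "spectrum": (4096,),               # 1D dataset -> line plot
        }
    }
}

for path, kind, shape in walk(toy_file):
    print(path, kind, shape)
```

The dataset shapes recovered by such a walk are what lets a viewer choose between 1D, 2D and 3D visualisations.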
RECAP at ETSI Experiential Network Intelligence (ENI) Meeting (RECAP Project)
This presentation was delivered by Johan Forsman (Tieto), Jörg Domaschka (UULM) and Paolo Casari (IMDEA Networks) at the ETSI Experiential Network Intelligence (ENI) Meeting in Warsaw, Poland, on April 12th, 2019. The ETSI Experiential Networked Intelligence Industry Specification Group (ENI ISG) works on defining a Cognitive Network Management architecture that uses Artificial Intelligence (AI) techniques and context-aware policies to adjust offered services based on changes in user needs, environmental conditions and business goals. The intention is that the use of Artificial Intelligence techniques in the network management system should solve some of the problems of future network deployment and operations. For more information, see https://www.etsi.org/technologies/experiential-networked-intelligence.
The benefits of Linked Data are well known, but the supporting software ecosystem is still somewhat lacking. During this presentation we will look into the approach taken by Joinup: how we start from a formalized ontology and map it to the Joinup website. We’ll give an overview of the open-source components that we created for building Linked Data-based CMS applications.
Optimising Service Deployment and Infrastructure Resource Configuration (RECAP Project)
This is a presentation delivered by Alec Leckey (Intel) at the 2nd Data Centre Symposium held in conjunction with the National Conference on Cloud Computing and Commerce (http://2018.nc4.ie/) on April 10, 2018 in Dublin, Ireland.
Learn more about the RECAP project: https://recap-project.eu/
Install the Intel Landscaper: https://github.com/IntelLabsEurope/landscaper
Towards the Intelligent Internet of Everything (RECAP Project)
In this presentation, Prof. Theo Lynn (DCU) shared observations on multi-disciplinary challenges in intelligent systems research at the RECAP consortium meeting in Dublin, Ireland, on 6 November 2018.
Worried about the learning curve to introduce Deep Learning in your organization? Don’t be. The DEEP-HybridDataCloud project offers a framework for all users, including non-experts, enabling the transparent training, sharing and serving of Deep Learning models both locally and on hybrid cloud systems. In this webinar we will show a set of use cases, from different research areas, integrated within the DEEP infrastructure.
The DEEP solution is based on Docker containers that already package all the tools needed to deploy and run Deep Learning models in the most transparent way. There is no need to worry about compatibility problems: everything has already been tested and encapsulated, so the user has a fully working model in just a few minutes. To make things even easier, we have developed an API that allows the user to interact with the model directly from the web browser.
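A rough illustration of this kind of browser-facing model API follows. The endpoint path and JSON shapes here are hypothetical, not the actual DEEP API; a tiny stdlib mock server stands in for a deployed model so the sketch is self-contained:

```python
# Minimal sketch of calling a model-serving HTTP API of the kind described
# above. The /v2/models/demo/predict path and the JSON payload are invented
# for illustration; a stdlib mock server stands in for the deployed model.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class MockModel(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        # Pretend the "model" just counts the input features.
        reply = json.dumps({"prediction": len(body["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), MockModel)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

def predict(host, port, features):
    """POST features to the serving endpoint and return the parsed response."""
    req = urllib.request.Request(
        f"http://{host}:{port}/v2/models/demo/predict",  # hypothetical path
        data=json.dumps({"features": features}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

result = predict("127.0.0.1", server.server_port, [0.1, 0.2, 0.3])
print(result)  # {'prediction': 3}
server.shutdown()
```

Because the contract is plain HTTP plus JSON, the same call works equally well from a browser's `fetch` as from a script.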
Congresso Sociedade Brasileira de Computação CSBC2016 Porto Alegre (Brazil)
Workshop on Cloud Networks & Cloudscape Brazil
José Luiz Ribeiro Filho, Director of Services and Solutions of the Brazilian National Education and Research Network (RNP), Brazil
Cloud Federation & Open Science Cloud at cross-regional level
Gergely Sipos (EGI): Exploiting scientific data in the international context ...
Keynote presentation given at "The Emerging Technology Forum – Data Creates Universe - Scientific Data Innovation Conference" of the "Pujiang Innovation Forum 2021" event.
The EOSC Compute Platform with the EGI-ACE project (EGI Federation)
EGI-ACE’s main goal is to implement the compute platform of the European Open Science Cloud and contribute to the EOSC Data Commons by delivering integrated computing platforms, data spaces and tools as an integrated solution that is aligned with major European cloud federation projects and HPC initiatives.
This presentation introduces you to the architecture and composition of the EOSC Compute Platform, which delivers capabilities at the IaaS, PaaS and SaaS level.
OCRE Workshop: Shaping the Earth Observation Services Market for Research. Session 3: Presentations from DIAS and eoMALL.
This workshop aims to bring the EO service providers closer to the research community, capture their needs and develop fit-for-purpose EO services.
The event will be the 4th OCRE Requirements Gathering Workshop. Researchers and Earth Observation Service Providers will be asked to provide inputs to help us shape OCRE's tender.
The OCRE project aims to provide the first end-to-end instance of organised, large-scale market pull for EO services in Europe. These services will be provided for free to EU researchers through the European Open Science Cloud. To ensure that the services meet the actual needs of the research community we invite both the demand and the supply side, to share their views and engage in a productive dialogue. Our aim is to capture the needs of EU researchers and inform the EO service providers so that they make available services that effectively address them. We will also explain how the OCRE process will work, how the different stakeholders should be involved and how to make the most of the foreseen benefits.
Cloud Computing Needs for Earth Observation Data Analysis: EGI and EOSC-hub (Björn Backeberg)
This presentation was given during the Japan Geosciences Union 2019. Session details can be found at http://www.jpgu.org/meeting_e2019/SessionList_en/detail/M-GI31.htm
Phoenix Data Conference - Big Data Analytics for IoT, 11/4/17 (Mark Goldstein)
“Big Data for IoT: Analytics from Descriptive to Predictive to Prescriptive” was presented to the Phoenix Data Conference on 11/4/17 at Grand Canyon University.
As the Internet of Things (IoT) floods data lakes and fills data oceans with sensor and real-world data, analytic tools and real-time responsiveness will require improved platforms and applications to deal with the data flow and move from descriptive to predictive to prescriptive analysis and outcomes.
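The descriptive-to-predictive-to-prescriptive progression can be made concrete with a toy, hand-rolled example (not from the talk): summarise past sensor readings, extrapolate the next one, then derive an action from the forecast.

```python
# Toy illustration of the descriptive -> predictive -> prescriptive
# progression on a single sensor temperature series (threshold is arbitrary).
def descriptive(readings):
    """Summarise what happened: mean of the observed readings."""
    return sum(readings) / len(readings)

def predictive(readings):
    """Estimate what happens next: naive linear extrapolation from the
    average step between consecutive readings."""
    step = (readings[-1] - readings[0]) / (len(readings) - 1)
    return readings[-1] + step

def prescriptive(readings, threshold):
    """Recommend an action based on the prediction, not just the history."""
    return "throttle device" if predictive(readings) > threshold else "no action"

temps = [20.0, 21.0, 22.0, 23.0]
print(descriptive(temps))         # 21.5
print(predictive(temps))          # 24.0
print(prescriptive(temps, 23.5))  # throttle device
```

Real IoT platforms replace the naive extrapolation with learned models, but the layering — describe, forecast, act — is the same.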
Bridging the gap to facilitate selection and image analysis activities for la... (PHIDIAS)
PHIDIAS organised its third and final webinar of the series, this time dedicated to Use Case 2: Big Data Earth Observations (EO). It took place on 18 February 2021 at 15:00 CET, showcasing how PHIDIAS is taking advantage of HPC architecture to facilitate selection and image analysis activities for land surface monitoring.
EOSC support to scientific computing needs in Earth Observation with the EGI Federated Cloud
The European Open Science Cloud (EOSC) supports multi-disciplinary science, and Earth Observation is one of the major use cases.
EOSC will provide capacity and capabilities for fostering the exploitation of EO data; this can be achieved by federating the cloud providers of EGI and DIAS with data analytics tools. In this presentation, we show how EOSC can rely on a public-private cloud federation for delivering its compute platform for EO.
Neuro-symbolic is not enough, we need neuro-*semantic* (Frank van Harmelen)
Neuro-symbolic (NeSy) AI is on the rise. However, simply doing machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be realised when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
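The link-prediction setting mentioned above can be illustrated with a toy TransE-style scorer (this is illustrative only, not the speaker's code; the 2-d embeddings are hand-picked rather than learned). TransE treats a relation as a translation in embedding space, scoring a triple (h, r, t) by how close h + r lands to t:

```python
# Toy TransE-style link prediction over a tiny knowledge graph.
def score(emb, h, r, t):
    """Negative L1 distance between h + r and t; higher is more plausible."""
    return -sum(abs(hv + rv - tv)
                for hv, rv, tv in zip(emb[h], emb[r], emb[t]))

def predict_tail(emb, h, r, candidates):
    """Answer the query (h, r, ?) by ranking candidate tail entities."""
    return max(candidates, key=lambda t: score(emb, h, r, t))

# Hand-crafted 2-d embeddings where capital_of acts as the translation (1, 0).
emb = {
    "paris":      (0.0, 1.0),
    "france":     (1.0, 1.0),
    "berlin":     (0.0, 2.0),
    "germany":    (1.0, 2.0),
    "capital_of": (1.0, 0.0),
}

print(predict_tail(emb, "paris", "capital_of", ["france", "germany"]))   # france
print(predict_tail(emb, "berlin", "capital_of", ["france", "germany"]))  # germany
```

The talk's point can be read off this sketch: the scorer works over any vectors, and it is only the semantics attached to the symbols that makes its inferences predictable and meaningful.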
Key Trends Shaping the Future of Infrastructure (Cheryl Hung)
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... (Ramesh Iyer)
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes a great deal of work: it takes vision, leadership and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
PHP Frameworks: I want to break free (IPC Berlin 2024) (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
- State of global ICS asset and network exposure
- Sectoral targets and attacks as well as the cost of ransom
- Global APT activity, AI usage, actor and tactic profiles, and implications
- Rise in volumes of AI-powered cyberattacks
- Major cyber events in 2024
- Malware and malicious payload trends
- Cyberattack types and targets
- Vulnerability exploit attempts on CVEs
- Attacks on counties – USA
- Expansion of bot farms – how, where, and why
- In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
- Why are attacks on smart factories rising?
- Cyber risk predictions
- Axis of attacks – Europe
- Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Epistemic Interaction - tuning interfaces to provide information for AI support (Alan Dix)
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers, without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that lead to closing the deal.
Essentials of Automations: Optimizing FME Workflows with Parameters (Safe Software)
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Search and Society: Reimagining Information Access for Radical Futures (Bhaskar Mitra)
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
Mission to Decommission: Importance of Decommissioning Products to Increase E...
Deep Hybrid DataCloud
1. DEEP-Hybrid-DataCloud has received funding from the European Union’s Horizon 2020
research and innovation programme under grant agreement No 777435.
DEEP general presentation
Brief project overview
Jesús Marco de Lucas, Álvaro López García
{marco,aloga}@ifca.unican.es
Spanish National Research Council
EOSC-HUB Week
Malaga
16-20 April 2018
2. DEEP-HybridDataCloud: context
H2020 project, EINFRA-21 call
Topic: Platform-driven e-Infrastructure towards the European Open Science Cloud
Scope: Computing e-infrastructure with extreme large datasets
DEEP-HybridDataCloud: Designing and Enabling E-Infrastructures for intensive data Processing in
a Hybrid DataCloud
Started as a spin-off project (together with XDC) from INDIGO-DataCloud technologies
Submitted in March 2017
Started November 1st 2017
Grant agreement number 777435
Global objective: Promote the use of intensive computing services by different research
communities and areas, and their support by the corresponding e-Infrastructure providers and
open source projects.
3. DEEP consortium
Balanced set of partners
Strong technological background on development, implementation,
deployment and operation of federated e-Infrastructures
9 academic partners
CSIC, LIP, INFN, PSNC, KIT, UPV, CESNET, IISAS, HMGU
1 industrial partner
Atos
6 countries
Spain, Italy, Poland, Germany, Czech Republic, Slovakia
4. DEEP project objectives
Focus on intensive computing techniques for the analysis of very large datasets
considering demanding use cases
Evolve up to production level intensive computing services exploiting specialized
hardware
Integrate intensive computing services under a hybrid cloud approach
Define a “DEEP as a Service” solution to offer an adequate integration path to
developers of final applications
Analyse the complementarity with other ongoing projects targeting added value
services for the cloud
5. DEEP pilot use cases
Deep learning
Pilot cases: stem cells, biodiversity applications, medical image
Provide a general, distributed architecture and pipeline to train deep learning (and other)
models
Post-processing
Pilot cases: post-processing of HPC simulations
Flexible pipeline for the analysis of simulation data generated at HPC resources
On-line analysis of data streams
Pilot case: intrusion detection systems
Provide an architecture able to analyze massive on-line data streams, also with historical
records
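A minimal sketch of the kind of on-line stream analysis this pilot describes (illustrative only, not the project's implementation): keep a bounded historical window and flag readings that deviate strongly from it, much as an intrusion detection system might flag a traffic spike. The window size and threshold below are arbitrary choices.

```python
# Minimal sliding-window anomaly detector for an on-line data stream: a
# bounded deque acts as the "historical records", and a z-score against that
# window decides whether the newest reading is anomalous.
from collections import deque
from statistics import mean, stdev

class StreamDetector:
    def __init__(self, window=20, threshold=3.0):
        self.history = deque(maxlen=window)   # bounded historical record
        self.threshold = threshold

    def observe(self, value):
        """Return True if `value` is anomalous w.r.t. the current window."""
        anomalous = False
        if len(self.history) >= 5:            # need some history first
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.history.append(value)
        return anomalous

detector = StreamDetector()
stream = [10, 11, 10, 12, 11, 10, 11, 500, 10]   # 500 is an injected spike
flags = [detector.observe(v) for v in stream]
print(flags)
```

Because the window is bounded, memory stays constant no matter how long the stream runs, which is what makes this pattern viable for massive on-line streams.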
6. INDIGO Components and evolution (I)
INDIGO Orchestrator
Hybrid support on multiple sites
Support for specialized computing hardware
Infrastructure Manager
Hybrid cloud support involving specialized computing hardware
uDocker
Support for GPUs and specialized hardware to be further developed
Cloud Information System
Missing information about accelerators or specialized hardware at a provider
React faster to changes in the infrastructure (faster publication and propagation of
information)
7. INDIGO Components and evolution (II)
OpenStack/OpenNebula: extensions needed to properly support accelerators:
improving scheduling strategies, easier configuration and improved documentation.
PaaS layer: support for specialized computing hardware
Docker: container technology for applications
LXC: alternative hypervisor
Ansible: contextualization and configuration tool, further development of modules
INDIGO Virtual Router: improvements to reach production level
8. DEEP work programme
Plan and requirements (Nov 2017 – Jan 2018)
Initial design (Feb – Apr 2018)
First prototype (May – Oct 2018)
in-situ integration meeting to take place: conclude the integration of the first testbed prototype, supporting at least two
initial pilot applications
Second prototype
Improvement of design and proposed solutions (first quarter of 2019)
Integration towards a “second prototype” (mid 2019)
Full Pilot testbed
Integration of all the Pilot applications and their tuning for high performance
Promotion and exploitation (2020)
Improve the support and final quality of the solutions
Promote the exploitation in the EOSC framework, following the integration activities
10. https://deep-hybrid-datacloud.eu
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 777435.
Thank you
Any Questions?