Science Demonstrator Session: Physics and Astrophysics (EOSCpilot.eu)
The main focus of Science Demonstrator sessions is to provide feedback to the EOSC community on the first experience of science demonstrators in the practical use of the emerging EOSC ecosystem.
Each panel will consist of a representative of a Science Demonstrator who will provide an overview of their experiences in the use of emerging EOSC services.
These sessions will help members of the scientific communities understand the current state of maturity of the EOSC ecosystem and what can be achieved in a given field of scientific research. They are also valuable to prospective Service Providers who wish to discover the challenges and opportunities that user communities may face as a result of adopting their services.
This session will focus on Physics and Astrophysics.
Science Demonstrator Session: Social and Earth Sciences (EOSCpilot.eu)
This session will focus on Social and Earth Sciences.
The health crisis caused by COVID-19 is shaping a new reality in which secure exchange of and access to health data will be increasingly necessary. This complex challenge must reconcile respect for individual rights, the interests of patients, and the need to promote research in pursuit of the public interest, and approaches to it differ across Europe. In this webinar, we will present the experiences of three EU-funded projects (BigMedilytics, BodyPass, and DeepHealth), together with an overview of the legal framework and recommendations for complying with both national regulations and the GDPR, given by an expert in data privacy and security.
At the heart of this DataBench webinar is the goal of sharing a benchmarking process that helps European organisations developing Big Data technologies to strive for excellence and constantly improve their performance, by measuring their technology development activity against parameters of high business relevance.
The webinar aims to provide the audience with a framework and tools to assess the performance and impact of Big Data and AI technologies, based on real insights coming from DataBench. In addition, representatives from other BDV PPP projects, such as DeepHealth and They-Buy-for-You, will share the challenges and opportunities they have identified in the use of Big Data, analytics and AI. The perspective of other projects that have also looked into benchmarking, such as Track & Know and I-BiDaaS, will be introduced.
Optalysys: Disruptive Optical Processing Technology for HPC (insideBigData.com)
In this video from the Disruptive Technologies Session at the 2015 HPC User Forum, Nick New from Optalysys describes the company's optical processing technology.
"Optalysys technology uses light, rather than electricity, to perform processor intensive mathematical functions (such as Fourier Transforms) in parallel at incredibly high-speeds and resolutions. It has the potential to provide multi-exascale levels of processing, powered from a standard mains supply. The mission is to deliver a solution that requires several orders of magnitude less power than traditional High Performance Computing (HPC) architectures."
Watch the video presentation: http://wp.me/p3RLHQ-ewz
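Why speeding up Fourier transforms matters for HPC can be illustrated digitally: by the convolution theorem, an expensive convolution becomes a cheap element-wise product in the frequency domain. A minimal NumPy sketch of that equivalence (purely illustrative; it has nothing to do with the Optalysys hardware itself):

```python
import numpy as np

def conv_direct(a, b):
    """Direct O(n*m) convolution of two 1-D signals."""
    out = np.zeros(len(a) + len(b) - 1)
    for i, x in enumerate(a):
        out[i:i + len(b)] += x * b
    return out

def conv_fft(a, b):
    """Same convolution via the convolution theorem: FFT, multiply, inverse FFT."""
    n = len(a) + len(b) - 1
    return np.real(np.fft.ifft(np.fft.fft(a, n) * np.fft.fft(b, n)))

a = np.array([1.0, 2.0, 3.0])
b = np.array([0.5, -1.0])
print(np.allclose(conv_direct(a, b), conv_fft(a, b)))  # True
```

The direct loop costs O(n·m), while the FFT route costs O(n log n); an optical transform would attack exactly that transform step.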
Gergely Sipos (EGI): Exploiting scientific data in the international context ...
Keynote presentation given at "The Emerging Technology Forum – Data Creates Universe - Scientific Data Innovation Conference" of the "Pujiang Innovation Forum 2021" event.
An Experimental Workflow Development Platform for Historical Document Digitisation and Analysis (cneudecker)
International Workshop on Historical Document Imaging and Processing (HIP).
ICDAR 2011, 16-17 September 2011, Beijing, China.
Designing RISC-V-based Accelerators for next generation Computers (DRAC) is a 3-year project (2019-2022) funded by the ERDF Operational Program of Catalonia 2014-2020. DRAC will design, verify, implement and fabricate a high-performance general-purpose processor incorporating different accelerators based on RISC-V technology, with specific applications in the fields of post-quantum security, genomics and autonomous navigation. In this talk, we will provide an overview of the main achievements of the DRAC project, including the fabrication of Lagarto, the first RISC-V processor developed in Spain.
On 29 January 2020 ARCHIVER launched its Request for Tender with the purpose to award several Framework Agreements and work orders for the provision of R&D for hybrid end-to-end archival and preservation services that meet the innovation challenges of European Research communities, in the context of the European Open Science Cloud.
The tender was closed on 28 April 2020 and 15 R&D bids were submitted, with consortia that included 43 companies and organisations. The best bids have been selected and will start the first phase of the ARCHIVER R&D (Solution Design) in June 2020.
On Monday 8 June, the selected consortia for the ARCHIVER design phase were announced during a Public Award Ceremony starting at 14:00 CEST.
In light of the COVID-19 outbreak and the consequent movement restrictions imposed in several countries, the event was organised as a webinar, virtually hosted by Port d’Informació Científica (PIC), a member of the Buyers Group of the ARCHIVER consortium.
The Kick-off marks the beginning of the Solution Design Phase.
Reducing Infrastructure and Service Fragmentation (EOSCpilot.eu)
This presentation was held at the 1st EOSC Stakeholder Forum 28-29/11/2017 in Brussels.
The presentation gives a detailed overview of five Science Demonstrators, including FusionHPC, Photon-Neutron, TextCrowd and PanCancer.
For more information on the EOSCpilot Science Demonstrators visit: https://eoscpilot.eu/science-demonstrators
For more information on the 1st EOSC Stakeholder Forum visit: https://eoscpilot.eu/eosc-stakeholder-forum-shaping-future-eosc
Follow EOSCpilot on Twitter: https://twitter.com/eoscpilot
and LinkedIn: https://uk.linkedin.com/in/eoscpiloteu
EOSC support to scientific computing needs in Earth Observation with the EGI Federated Cloud
The European Open Science Cloud (EOSC) supports multi-disciplinary science, and Earth Observation is one of the major use cases.
EOSC will provide capacity and capabilities to foster the exploitation of EO data; this can be achieved by federating the cloud providers of EGI and DIAS together with data analytics tools. In this presentation, we show how EOSC can rely on a public-private cloud federation to deliver its compute platform for EO.
The EOSC Compute Platform with the EGI-ACE project (EGI Federation)
EGI-ACE’s main goal is to implement the compute platform of the European Open Science Cloud and contribute to the EOSC Data Commons by delivering integrated computing platforms, data spaces and tools as an integrated solution that is aligned with major European cloud federation projects and HPC initiatives.
This presentation introduces you to the architecture and composition of the EOSC Compute Platform, which delivers capabilities at the IaaS, PaaS and SaaS level.
Big Data lies at the core of the strong data economy that is emerging in Europe. Although both large enterprises and SMEs acknowledge the potential of Big Data to disrupt markets and business models, this is not reflected in the growth of the data economy. The lack of trusted, secure, ethics-driven personal data platforms and privacy-aware analytics hinders the growth of the data economy and creates concerns. The main considerations relate to the secure sharing of personal and proprietary/industrial data, and to the definition of a fair remuneration mechanism able to capture, produce, release and cash out the value of data, always for the benefit of all the stakeholders involved.
This webinar will focus on how such concerns pertaining to privacy, ethics and intellectual property rights can be tackled by allowing individuals to take ownership and control of their data and share it at will, through flexible data sharing and fair compensation schemes with other entities (companies or otherwise), as researched by the DataVaults project.
Similar to Heterogeneous HPC Computing in the DeepHealth Project
Intro - Three pillars for building a Smart Data Ecosystem: Trust, Security and Privacy (Big Data Value Association)
Today’s data marketplaces are large, closed ecosystems in the hands of a few established players or consortia that decide on the rules, policies, etc.
Yet the main barrier to the European data economy is the fact that current data spaces and marketplaces are “silos”, without support for data exchange across their boundaries.
This webinar reveals how these boundaries can be overcome through the i3-MARKET “backplane”, an infrastructure that connects all the stakeholders while providing a suitable level of trust (consensus-based self-governance, auditability, reliability, verifiable credentials), security (P2P encryption, cryptographic proofs) and privacy (self-sovereign identity, zero-knowledge proofs, explicit user consent).
Three pillars for building a Smart Data Ecosystem: Trust, Security and Privacy (Big Data Value Association)
Market into context - Three pillars for building a Smart Data Ecosystem: Trust, Security and Privacy (Big Data Value Association)
BDV Skills Accreditation - Future of digital skills in Europe reskilling and ... (Big Data Value Association)
The objective of the workshop is to highlight the need for a pan European level skill recognition for Big Data that stimulates mobility and fulfils the definition of overarching Learning Objectives & Overarching Learning Impacts. It is also meant to get feedback on the formats that are being prepared namely, usage of Badges, Label and EIT Label for professionals.
BDV Skills Accreditation - Recognizing Data Science Skills with BDV Data Scie... (Big Data Value Association)
EIT Label intro by Roberto Prieto
Muluneh Oli (EIT Digital)
BDV Skills Accreditation - Definition and ensuring of digital roles and compe... (Big Data Value Association)
BigDataPilotDemoDays - I-BiDaaS Application to the Manufacturing Sector Webinar (Big Data Value Association)
The new data-driven industrial revolution highlights the need for big data technologies to unlock the potential in various application domains. To this end, BDV PPP projects I-BiDaaS, BigDataStack, Track & Know and Policy Cloud deliver innovative technologies to address the emerging needs of data operations and applications. To fully exploit the sustainability and take full advantage of the developed technologies, the projects onboarded pilots that exhibit their applicability in a wide variety of sectors. In the Big Data Pilot Demo Days, the projects will showcase the developed and implemented technologies to interested end-users from the industry as well as technology providers, for further adoption.
One of the main goals of the I-BiDaaS project is to provide a Big Data as a self-service solution that empowers the actual employees of European companies in the targeted sectors (banking, manufacturing, telecom), i.e., the true decision-makers, with the insights and tools they need to make the right decisions in an agile way. In this big data pilot webinar, we will demonstrate, step by step, the I-BiDaaS self-service solution and its application to the banking sector. In more detail, we will present an overview of the I-BiDaaS project focusing on the requirements of the CaixaBank pilot study, the I-BiDaaS architecture with its core technologies, and a step-by-step demo of the I-BiDaaS solution. Last but not least, we will show through CaixaBank's success story how I-BiDaaS can resolve data availability, data sharing, and silo-breaking challenges in the banking domain.
Virtual BenchLearning - I-BiDaaS - Industrial-Driven Big Data as a Self-Servi... (Big Data Value Association)
The problem of radicalisation is very high on the European agenda, as increasing numbers of young European radicals return from Syria and use the internet to disseminate propaganda. To enable policy makers to design policies that address radicalisation effectively, the Policy Cloud consortium will collect data from social media and other sources, including the open-source Global Terrorism Database (GTD), the Onion City search engine, which accesses data on TOR dark web sites, and Twitter (through Firehose). The data will be analysed using sentiment analysis and opinion mining software.
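The abstract does not name the sentiment analysis software to be used. As a hedged illustration of the simplest form such analysis can take, here is a toy lexicon-based scorer; the word lists and function name are invented for this sketch, and real pipelines would use trained models with far larger lexicons:

```python
# Toy lexicon-based sentiment scorer (illustrative only).
POSITIVE = {"peace", "support", "hope", "dialogue"}
NEGATIVE = {"hate", "violence", "threat", "attack"}

def sentiment_score(text):
    """Return a score in [-1, 1]: net fraction of polarity words that are positive."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    hits = pos + neg
    return (pos - neg) / hits if hits else 0.0

print(sentiment_score("calls for peace and dialogue"))  # 1.0
print(sentiment_score("threat of violence"))            # -1.0
```

Opinion mining over millions of posts is then an aggregation of such per-document scores, typically broken down by source, topic and time.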
Policy Cloud Data Driven Policies against Radicalisation - Participatory poli... (Big Data Value Association)
Adjusting primitives for graph : SHORT REPORT / NOTES (Subhajit Sahu)
Graph algorithms, like PageRank, operate on a sparse graph representation such as Compressed Sparse Row (CSR), an adjacency-list-based format that stores all adjacency lists contiguously and is therefore compact and cache-friendly.
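As an illustration of the idea (a minimal sketch in plain Python, not the report's actual code; `to_csr` is a name invented for this example), a CSR structure can be built from an edge list:

```python
def to_csr(num_vertices, edges):
    """Build a Compressed Sparse Row (CSR) representation from an edge list.

    CSR stores all adjacency lists back-to-back in one array (`targets`),
    with `offsets[v]..offsets[v+1]` delimiting the out-neighbours of v.
    """
    degree = [0] * num_vertices
    for u, _ in edges:
        degree[u] += 1
    offsets = [0] * (num_vertices + 1)
    for v in range(num_vertices):
        offsets[v + 1] = offsets[v] + degree[v]
    targets = [0] * len(edges)
    fill = offsets[:-1].copy()          # next free slot per source vertex
    for u, v in edges:
        targets[fill[u]] = v
        fill[u] += 1
    return offsets, targets

offsets, targets = to_csr(4, [(0, 1), (0, 2), (1, 2), (2, 3)])
print(targets[offsets[0]:offsets[1]])   # out-neighbours of vertex 0: [1, 2]
```

The two flat arrays are what make CSR friendly to OpenMP and CUDA kernels like the map/reduce experiments listed below.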
Multiply with different modes (map)
1. Performance of sequential vs OpenMP-based vector multiply.
2. Comparing various launch configs for CUDA-based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential vs OpenMP-based vector element sum.
2. Performance of memcpy vs in-place CUDA-based vector element sum.
3. Comparing various launch configs for CUDA-based vector element sum (memcpy).
4. Comparing various launch configs for CUDA-based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA-based vector element sum (in-place).
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ... (Subhajit Sahu)
Abstract: Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components and processes them in topological order, one level at a time. This allows ranks to be calculated in a distributed fashion without per-iteration communication, unlike the standard method, where all vertices are processed in every iteration. It comes, however, with the precondition that the input graph contain no dead ends. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by the submission of a large number of small workloads, and is expected to be a non-issue when the computation is performed on massive graphs.
Techniques to optimize the PageRank algorithm usually fall into two categories: reducing the work per iteration, and reducing the number of iterations. These goals are often at odds with one another. Skipping computation on vertices which have already converged has the potential to save iteration time. Skipping in-identical vertices (those with the same in-links) helps avoid duplicate computations and thus could also reduce iteration time. Road networks often have chains which can be short-circuited before PageRank computation to improve performance, since the final ranks of chain nodes can be easily calculated; this could reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order, which could reduce the iteration time and the number of iterations, and also enable multi-iteration concurrency in the computation. The combination of all of the above methods is the STICD algorithm [sticd]. For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
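The first of these techniques, skipping already-converged vertices, can be sketched in plain Python (a simplified illustration, not the STICD implementation; the function name and dict-based graph are invented for this example, and the graph is assumed to have no dead ends):

```python
def pagerank_skip_converged(graph, damping=0.85, tol=1e-10, max_iter=100):
    """Power-iteration PageRank that stops updating vertices whose rank
    has already settled, saving per-iteration work.

    `graph` maps each vertex to its list of out-neighbours.
    """
    n = len(graph)
    rank = {v: 1.0 / n for v in graph}
    converged = set()
    # Precompute in-neighbours, since a vertex's rank is a sum over them.
    in_nbrs = {v: [] for v in graph}
    for u, outs in graph.items():
        for v in outs:
            in_nbrs[v].append(u)
    for _ in range(max_iter):
        if len(converged) == n:
            break
        new_rank = dict(rank)
        for v in graph:
            if v in converged:
                continue            # skip work for settled vertices
            r = (1 - damping) / n + damping * sum(
                rank[u] / len(graph[u]) for u in in_nbrs[v])
            if abs(r - rank[v]) < tol:
                converged.add(v)
            new_rank[v] = r
        rank = new_rank
    return rank
```

Note the trade-off mentioned above: marking a vertex converged is a heuristic, since its in-neighbours may still move its rank later, which is why real implementations combine this with the other techniques.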
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... (John Andrews)
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Explore our comprehensive data-analysis project presentation on predicting product ad-campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. For more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
StarCompliance is a leading firm specializing in the recovery of stolen cryptocurrency. Our comprehensive services are designed to assist individuals and organizations in navigating the complex process of fraud reporting, investigation, and fund recovery. We combine cutting-edge technology with expert legal support to provide a robust solution for victims of crypto theft.
Our Services Include:
Reporting to Tracking Authorities:
We promptly notify the relevant centralized exchanges (CEX), decentralized exchanges (DEX), and wallet providers about the stolen cryptocurrency, so that the stolen assets can be flagged as scam transactions, making it much harder for the thief to move or cash them out.
Assistance with Filing Police Reports:
We guide you through the process of filing a valid police report. Our support team provides detailed instructions on which police department to contact and helps you complete the necessary paperwork within the critical 72-hour window.
Launching the Refund Process:
Our team of experienced lawyers can initiate lawsuits on your behalf and represent you in various jurisdictions around the world. They work diligently to recover your stolen funds and ensure that justice is served.
At StarCompliance, we understand the urgency and stress involved in dealing with cryptocurrency theft. Our dedicated team works quickly and efficiently to provide you with the support and expertise needed to recover your assets. Trust us to be your partner in navigating the complexities of the crypto world and safeguarding your investments.
Heterogeneous HPC Computing in the DeepHealth Project
The project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 825111.
Heterogeneous HPC Computing
in the DeepHealth Project
José Flich (UPV)
Monica Caballero (everis)
European Big Data Value Forum (EBDVF) 2019
15 October 2019, Helsinki
About DeepHealth
Aim & Goals
§ Facilitate the daily work and increase the productivity of medical personnel and IT professionals in terms of image processing and the use and training of predictive models, without the need of combining numerous tools.
§ Offer a unified framework adapted to exploit underlying heterogeneous HPC and Big Data architectures, supporting state-of-the-art and next-generation Deep Learning (AI) and Computer Vision algorithms to enhance European-based medical software platforms.
§ Put HPC computing power at the service of biomedical applications with DL needs and, through an interdisciplinary approach, apply DL techniques on large and complex biomedical image datasets to support new and more efficient ways of diagnosis, monitoring and treatment of diseases.
Duration: 36 months
Starting date: Jan 2019
Budget: 14.642.366 €
EU funding: 12.774.824 €
21 partners from 9 countries: research centers, health organizations, large industries and SMEs
About DeepHealth
• The DeepHealth toolkit: free and open-source software with two core technology libraries and a dedicated front-end:
  • EDDLL: the European Distributed Deep Learning Library
  • ECVL: the European Computer Vision Library
• Ready to run algorithms on hybrid HPC + Big Data architectures with heterogeneous hardware.
• Seven biomedical and AI software platforms will integrate the DeepHealth libraries to improve their potential.
Use-cases
• 14 pilot test-beds in 3 areas:
  • Neurological diseases
  • Tumor detection and early cancer prediction
  • Digital pathology and automated image annotation
• Pilots will allow models to be trained and the performance of the proposed solutions to be evaluated in terms of time and accuracy.
Expected results
DeepHealth HPC Goals
DeepHealth Goals
• Develop a European Distributed Deep-Learning Library (EDDL)
• Develop a European Computer Vision Library (ECVL)
• Adapt EDDL/ECVL to HPC infrastructure (heterogeneous architectures)
• Apply the EDDL/ECVL to 7 European platforms for medical applications
• Apply the DeepHealth solution to 14 use cases (pilots) for medical diagnosis
HPC Goals and Related Challenges
• Adapt EDDL and ECVL libraries to HPC infrastructure
  • Computation: CPUs, GPUs, FPGAs
  • Communication: distribution of the training process
• KPI: 4X performance improvement and 7X better power efficiency for the target DeepHealth infrastructure with advanced HPC technologies (combining manycores with vectorial units, GPUs, FPGAs, and low-latency interconnects) compared to standard HPC infrastructure
Challenges at different levels
[Diagram: challenges at the different levels of the stack — use cases, platforms, the EDDL and ECVL libraries, and the heterogeneous HPC layer (CPUs, GPUs and FPGAs over an interconnect)]
• Develop EDDL/ECVL
• Adapt platforms
• Adapt use cases
• Adapt HPC: computation, runtime, distribution, interconnect
Implementation challenge: adapting the new libraries (for performance) as they are being implemented and tested.
Types of Systems
Heterogeneity support
[Diagram: example target system topologies — multi-node clusters combining CPUs, GPUs and FPGAs in different mixes, connected through an interconnect]
DeepHealth HPC Goals
• Reinvest in FET-HPC projects (MANGO)
• Large FPGA cluster for heterogeneous HPC exploration
Target HPC Systems
MareNostrum 4
Total peak performance: 13.7 Pflops
• General Purpose Cluster: 11.15 Pflops (1.07.2017)
• CTE1-P9+Volta: 1.57 Pflops (1.03.2018)
• CTE2-Arm V8: 0.5 Pflops (????)
• CTE3-KNH?: 0.5 Pflops (????)
History:
• MareNostrum 1 (2004): 42.3 Tflops, 1st in Europe / 4th in the world, new technologies
• MareNostrum 2 (2006): 94.2 Tflops, 1st in Europe / 5th in the world, new technologies
• MareNostrum 3 (2012): 1.1 Pflops, 12th in Europe / 36th in the world
• MareNostrum 4 (2017): 11.1 Pflops, 2nd in Europe / 13th in the world, new technologies
BSC HPC Infrastructures
• General Purpose Cluster (in production)
  • 48 racks with 3456 nodes, each with 2 Intel Xeon Platinum processors
  • Total of 11.15 PFLOPs in double precision
  • 165888 processors and 390 TB of main memory in total
  • 29th fastest supercomputer in the top500, 7th fastest in Europe
• CTE1-P9+VOLTA (in production)
  • 54 nodes, each with 2 POWER9 processors, 4 Volta GPUs, 6.4 TB NVMe
  • Total of 1.57 PFLOPs in double precision
  • Same node as the Sierra supercomputer at LLNL (2nd fastest in the top500)
  • Suitable for HPC and machine-learning workloads
BSC HPC Infrastructures
• CTE2-Arm v8 (to be deployed in 2020)
  • Same processor as in the future post-K supercomputer in Japan
  • Targets Exascale workloads: 2.7 TFLOPS in double precision, 5.4 TFLOPS in single precision, 10.8 TFLOPS in half precision (16 bits)
  • HPC and AI convergence: up to 21.6 TOPS in 8-bit integer precision
  • 7 nm technology; 48 cores; 4 stacks of 8 GB HBM2 (32 GB in total)
  • Novel 512-bit SVE extension with specific instructions for machine learning
  • Might be interesting as a cutting-edge system by the end of DeepHealth
• Mont-Blanc 3 prototype (in production)
  • 48 nodes, 2 processors/node (96 processors in total)
  • Cavium ThunderX2 processor: 32-core Arm v8, 4-way SMT, up to 2.5 GHz
  • Targets HPC workloads in datacenters
  • Up to 3K cores and 12K threads in total
  • Liquid cooling
MANGO prototype
From FET-HPC MANGO project
• 16 (interconnected) clusters, each with:
  • One server node
  • 12 FPGAs (lego system): Xilinx 7-series, Zynq-7000, Kintex UltraScale+, Intel Stratix-10
  • DDR3/DDR4 pluggable memory modules
  • Connections: PCIe Gen 2/3 lanes, 40 Gbps QSFP
PROD: Development of a customized FPGA-based PCIe board
• Based on latest Intel or Xilinx FPGA technology (TBD)
• High-bandwidth, low-latency PCIe interface for data exchange with the host
• Modular peripherals (memories, interfaces), TBD
The DeepHealth Computing Infrastructure
Overview
[Architecture diagram: the DeepHealth SW architecture exposes an API to ECVL and EDDLL developers (WP2/WP3), built on COMPSs with a distributed programming model (e.g., M/R, task-based), non-functional requirements description, a Slurm-based global resource manager, container-based parallel programming models (e.g., CUDA, OpenCL, OpenMP), parallel run-times (including the Mango run-time, with netlist partitioning via Vivado tools and the N2D2 framework), and a cloud API on an OpenStack platform. Scheduling covers multiple-workload and single-workload scheduling of EDDLL workloads (e.g., training and inference). DeepHealth HPC HW resources: the Mango cluster, MareNostrum 4 (Intel), an Arm ThunderX2 cluster, a POWER9+Volta cluster, a 1200-core x86 cluster, and a tailored FPGA PCIe card. DeepHealth Cloud HW resources: a private cloud (x86 + NVIDIA T4) plus public cloud. Partners: BSC, UNITO, PROD, UPV, TREE. Captions: "Programming models and access methods for EDDLL and ECVL development"; "The DeepHealth computing infrastructure including HPC and big-data cloud-based resources".]
COMPSs
• Framework (programming model + runtime system) to develop parallel applications for distributed infrastructures
• Abstract model: exposes parallelism while hiding the infrastructure
• Agnostic of the computing platform
• Task-based programming model built on top of general-purpose sequential programming languages (Python, C, C++, Java)

Sequential version:

    def display(c):
        ...

    def add(a, b, c):
        c = a + b

    for i in range(MSIZE):
        add(A[i], B[i], C[i])
    display(C)

PyCOMPSs version (tasks annotated with data directions):

    @task(c=INOUT)
    def display(c):
        ...

    @task(a=IN, b=IN, c=OUT)
    def add(a, b, c):
        c = a + b

    for i in range(MSIZE):
        add(A[i], B[i], C[i])
    display(C)

[Diagram: the resulting task graph, with MSIZE independent "add" tasks feeding a final "display" task]
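COMPSs itself schedules the decorated tasks across a distributed infrastructure; as a rough local analogy only (standard-library Python, not the COMPSs runtime), the same submit-now/collect-later pattern for independent tasks can be mimicked with a thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

# COMPSs turns each decorated call into an asynchronous task and builds a
# dependency graph; here a thread pool merely mimics the "submit now,
# collect later" pattern for the MSIZE independent add() tasks.
def add(a, b):
    return a + b

MSIZE = 8
A = list(range(MSIZE))
B = list(range(MSIZE))

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(add, A[i], B[i]) for i in range(MSIZE)]
    C = [f.result() for f in futures]

print(C)  # [0, 2, 4, 6, 8, 10, 12, 14]
```

The key difference is that COMPSs infers the dependency graph from the IN/OUT/INOUT annotations and can place tasks on remote nodes, which a local executor cannot do.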
EPFL: Multi-objective RM policies
• Power-, performance- and accuracy-aware runtime resource management policies
• Automatic selection of the most efficient resources
• Adding one new axis: accuracy!
• Heuristic, ML-based and hyper-heuristic RM policies (algorithms)
  • Single node: selection of accelerators (allocation), DVFS settings
  • Multiple nodes (global RM of MANGO)
• Integrated with the DeepHealth SW stack: MANGO API + COMPSs + Slurm
Data Parallelism
• Training batch distribution
• Gradient collection and weight distribution
• AllReduce and Broadcast support to be exploited
• Different strategies will be implemented and evaluated
• Synchronization primitives (relaxed consistency models)
[Diagram: a heterogeneous node combining CPUs, GPUs and FPGAs over an interconnect; gradient exchange puts high pressure on the interconnect]
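A minimal sketch of the synchronous variant of this scheme (illustrative only, not EDDL code; the `allreduce_mean` and `sgd_step` names and the toy model are invented for the example): each worker computes gradients on its own batch shard, an AllReduce averages them, and every worker applies the same update.

```python
def allreduce_mean(grads_per_worker):
    """Average gradients element-wise across workers (the AllReduce step)."""
    n_workers = len(grads_per_worker)
    return [sum(g[i] for g in grads_per_worker) / n_workers
            for i in range(len(grads_per_worker[0]))]

def sgd_step(weights, shards, grad_fn, lr=0.1):
    grads = [grad_fn(weights, shard) for shard in shards]  # parallel in reality
    avg = allreduce_mean(grads)                            # communication phase
    return [w - lr * g for w, g in zip(weights, avg)]

# Toy example: gradient of mean squared error for y = w*x on each shard.
def grad_fn(weights, shard):
    w = weights[0]
    return [sum(2 * (w * x - y) * x for x, y in shard) / len(shard)]

shards = [[(1.0, 2.0)], [(2.0, 4.0)]]   # both shards consistent with w = 2
w = [0.0]
for _ in range(50):
    w = sgd_step(w, shards, grad_fn)
print(w)   # approaches [2.0]
```

The AllReduce line is exactly where the "high pressure on the interconnect" arises: its traffic grows with model size and worker count, independent of the per-worker batch size.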
Netlist partitioning (CEA)
• Use a multi-FPGA platform as a single virtual large FPGA, for very large inference networks that do not fit into a single FPGA
• Direct IO-to-IO connection between FPGAs
• Optimized partitioning of the netlist into several netlists
  • Combinatorial optimization model taking into account critical paths and resource quantities in each FPGA
  • Several state-of-the-art optimization methods, from Kernighan-Lin to simulated annealing
• Execution of the design on the multi-FPGA platform
  • Multiplexing of signals to deal with the limited interconnection
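As a toy illustration of cut-minimizing partitioning (a much-simplified greedy stand-in for the Kernighan-Lin and simulated-annealing methods mentioned above, not CEA's tool; all names here are invented for the sketch):

```python
import itertools

def cut_size(edges, part):
    """Number of edges crossing the partition (signals needing FPGA-to-FPGA IO)."""
    return sum(1 for u, v in edges if part[u] != part[v])

def greedy_bisect(num_nodes, edges, rounds=10):
    """Toy balanced bisection: start from an even split and greedily accept
    pairwise cross-partition swaps that reduce the cut.
    """
    part = [i % 2 for i in range(num_nodes)]   # alternating initial split
    for _ in range(rounds):
        improved = False
        for u, v in itertools.combinations(range(num_nodes), 2):
            if part[u] == part[v]:
                continue                       # swap only across the cut
            before = cut_size(edges, part)
            part[u], part[v] = part[v], part[u]
            if cut_size(edges, part) < before:
                improved = True                # keep the swap
            else:
                part[u], part[v] = part[v], part[u]  # undo
        if not improved:
            break
    return part

# Two triangles joined by one edge: the single joining edge is the best cut.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
part = greedy_bisect(6, edges)
print(cut_size(edges, part))  # 1
```

Real netlist partitioners additionally weight the objective by critical-path timing and per-FPGA resource capacities, as noted above.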
Heterogeneous Computing
• Deep Learning and Computer Vision kernels to be deployed for:
  • CPU: math processing routines (MKL, Eigen)
  • GPU: CUDA vs OpenCL programming
  • FPGA: OpenCL vs HLS vs RTL programming; Intel/Altera vs Xilinx platforms
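A hypothetical sketch of per-device kernel dispatch (the names, table and doubling "kernel" are invented for illustration and are not the ECVL/EDDL API): each kernel gets one implementation per back-end, selected at run time.

```python
# Stand-ins for the real per-device implementations: an MKL/Eigen routine
# on CPU, a CUDA/OpenCL kernel on GPU, an HLS/RTL bitstream on FPGA.
def conv_cpu(data):
    return [x * 2 for x in data]

def conv_gpu(data):
    return [x * 2 for x in data]

BACKENDS = {"cpu": conv_cpu, "gpu": conv_gpu}

def run_kernel(name, data, device="cpu"):
    """Dispatch the named kernel to the implementation for `device`."""
    try:
        impl = BACKENDS[device]
    except KeyError:
        raise ValueError(f"no {name} implementation for device {device!r}")
    return impl(data)

print(run_kernel("conv", [1, 2, 3], device="gpu"))  # [2, 4, 6]
```

Keeping every implementation behind one dispatch point is what lets the libraries stay "agnostic of the computing platform" while still exploiting each device's native programming model.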
HPC Things to Explore in DeepHealth
• Communication impact: will the network become the bottleneck?
• Use-case sizes
• Accuracy vs performance trade-off
• FPGA suitability for training (floating-point precision requirement)
  • Will it be energy-efficient for such a large challenge?
  • Which FPGA devices will perform better (accuracy vs energy trade-off)?
• Scalability of the solution (EDDL/ECVL): will it perform well on any end-user HPC-like platform?
• … so a challenging future lies ahead for the DeepHealth HPC teams!
José Flich (jflich@disca.upv.es)
Mónica Caballero (monica.caballero.galeote@everis.com)
Thank you!