The document discusses Dell Technologies' artificial intelligence (AI) and data analytics solutions portfolio. It provides an overview of Dell's solutions for AI/machine learning, IoT/streaming data, augmented analytics/data warehousing, data lakes, and high-performance computing (HPC). The solutions leverage Dell infrastructure along with partner technologies and are designed to address various analytical use cases such as digital manufacturing, life sciences research, and retail loss prevention.
Clarisse Hedglin from IBM presented this as part of a three-day International Summit. She shared the scenarios AI can solve today using IBM AI infrastructure.
Next Platform of Manufacturing (Wizcore_Nexpom eng v3.0) by Wizcore
Nexpom is a platform for the manufacturing industry. Factory process lines generate a great deal of data to manage, and most quality data can be viewed in Nexpom.
IoT for Manufacturing Supply Chain Logistics, Predictive Maintenance and More by putrimeisya
Industry 4.0 is marked by internet-based digital technology innovations, including the Internet of Things (IoT), artificial intelligence (AI), robotics and big data. This presentation shows examples of IoT implementations in the manufacturing industry drawn from leading companies around the world, covering supply chain management, predictive maintenance, process control and more, compiled from several sources. It also fulfills an assignment for the Capita Selecta of Operations and Supply Chain Management course, Master of Management, Trisakti University.
Thank you.
Presentation from 6th June 2017, covering the IBM Systems point of view, worldwide executive ownership of IBM Power Systems, challenges in our industry, cloud, hybrid cloud, Intel's strategic direction, IBM Power Systems' strategic direction, the OpenPOWER Foundation, OpenCAPI, Nutanix and POWER9.
An older presentation I recently reused in a meeting with a customer's Head of IT, showing how IBM i can be used in conjunction with open source solutions, either on the same server or as part of a cloud solution.
The purpose of this webinar is to discuss project management of cloud computing. It demystifies terms and uncovers concepts associated with the cloud-based services. It discusses the evolution of the cloud by tracing its conceptualization back to earlier pioneers.
Building a hybrid, dynamic cloud on an open architecture by Daniel Krook
Daniel Krook's version of the IBM open cloud overview, focusing on the business and technological imperatives driving the IBM strategy for customers.
Presented 9/30 and 10/1 at Boston TechFest, Cambridge, MA.
IBM Power Systems at the heart of Cognitive Solutions by David Spurway
A presentation I gave on 16th May 2018 to a customer's new CIO, showing how IBM i remains a key strategic part of the IBM Power Systems portfolio. As it builds on 30 years of innovation and integrates with AI and cloud solutions, IBM i deserves to remain a key part of customers' strategies going forward.
Open Source Solutions: Managing, Analyzing and Delivering Business Information by Mark Madsen
These slides on the use of open source solutions within the business intelligence and data warehousing market accompany a webcast and research report. The webcast is archived at http://ow.ly/KLz0 along with a PDF of the report. This presentation describes what open source software is being deployed and presents the benefits, challenges and practices for organizations adopting open source technologies.
By 2021, 98% of organizations plan to adopt multicloud architectures, but only 41% have a multicloud management strategy and just 38% have procedures and tools to operate a multicloud environment. As an IT leader you don't want to stifle forays into multicloud, as it is an engine to efficiently support growth, innovation and transformation; however, it can be one of the most challenging changes that organizations face.
Join IBM for the keynote presentation where we’ll discuss:
• Opportunities and inherent challenges for organizations as applications across categories migrate to multicloud
• Stages of multicloud transformation
• Best practices from organizations that are succeeding with multicloud environments
Data warehouse modernization programme by Toby Woolfe at Big Data Spain 2014
General Motors (GM) is in the process of constructing a single global information warehouse that will become the foundation for all business analytics and decision support across the enterprise.
Capitalizing on core business applications while realizing the agility that new business models require: these are major challenges for new microservices architectures.
Mr. Poos explains how application programming interfaces (APIs) can open up business applications. He presents how to classify workloads by type, along with associated use cases (IoT, analytics, development, …), describes how to design a roadmap to agility (lift and shift, optimized, native) and unveils some integration techniques.
Last but not least, he presents a hands-on case that included the integration of a front end with Docker and a transactional IBM Z.
Vertex Perspectives | AI-optimized Chipsets | Part I by Vertex Holdings
Businesses are increasingly adopting AI to create new applications and transform existing operations, driving big data growth alongside IoT and 5G networks and increasing future process complexity for human operators. In this new environment, AI will be needed to write algorithms dynamically and automate the programming process itself. Fortunately, deep learning algorithms achieve improved performance as data grows, unlike most other machine learning algorithms. To date, deep learning technology has primarily been a software play; existing processors were not originally designed for these new applications, hence the need to develop AI-optimized hardware.
Industry pundits are predicting up to 50 billion connected devices by 2020, generating more data than in all of human history to date and connected via ubiquitous connectivity such as 5G, Sigfox and NB-IoT. With this comes the promise of business opportunities for deploying your Internet of Things solution. Ganga will walk you through the computing trends you need to be aware of, how you can get started, and how working with Intel can accelerate your development and time to market.
Speaker: Ganga Varatharajan, IoT & New Technologies Manager, Intel
Artificial Intelligence: Breaking the Paradigm of Enterprise Amnesia, by Marcos Quezada
Deep Learning (DL) is the fastest-growing and most rapidly evolving subcategory of machine learning in the field of artificial intelligence (AI).
Deep learning uses software-based neural networks to develop analytical patterns that provide predictive capability. In short, deep learning is a platform that learns how to learn; it is the way to get the most out of your data. Today, companies must take advantage of their data at the same speed at which they produce it. Using deep learning we can develop new analytical capabilities, including:
- Computer vision
- Object detection
- Natural language processing
- Anomaly and fraud detection
At the heart of these use cases are sophisticated pattern recognition and classification capabilities, which enable revolutionary applications and open a window into the future.
In this presentation I explain how we are taking deep learning beyond its open source framework roots, and how the PowerAI platform can help your company put these powerful tools into production right now.
AI for Manufacturing (Machine Vision, Edge AI, Federated Learning) by byteLAKE
This is the extended presentation about byteLAKE's and Lenovo's Artificial Intelligence solutions for Manufacturing.
Topics covered: AI strategy for manufacturing, Edge AI, Federated Learning and Machine Vision.
It's the first publication in the upcoming series: AI for Manufacturing. Highlights: AI-assisted quality monitoring automation, AI-assisted production line monitoring and issues detection, AI-assisted measurements, Intelligent Cameras and many more. Reach out to us to learn more: welcome@byteLAKE.com.
Presented during the world's first Federated Learning conference (Jun'20). Recording: https://youtu.be/IMqRIi45dDA
Related articles:
- Revolution in factories: Industry 4.0.
https://medium.com/@marcrojek/revolution-in-factories-industry-4-0-conference-made-in-wroclaw-2020-translation-ae96e5e14d55
- Cognitive Automation helps where RPAs fall short.
https://medium.com/@marcrojek/cognitive-automation-helps-where-rpas-fall-short-a1c5a01a66f8
- Machine Vision, how AI brings value to industries.
https://medium.com/@marcrojek/machine-vision-how-ai-brings-value-to-industries-e6a4f8e56f42
Learn more:
- https://www.bytelake.com/en/cognitive-services/
- https://www.lenovo.com/ai
- https://federatedlearningconference.com/
Speaker: Andreas Tsagkaris, VP & Chief Technology Officer, Performance Technologies
Presentation title: "Big Data on Linux on Power Systems"
Defining a Practical Path to Artificial Intelligence by Roman Chanclor
With the evolution of purpose-built AI infrastructures and the advancement of graphics processing units (GPUs) that enable massively parallel, deep analysis in real time, cognitive computing may become the norm in data centers in record time. But how?
As the adoption of AI technologies increases and matures, the focus will shift from exploration to time to market, productivity and integration with existing workflows. Governing Enterprise data, scaling AI model development, selecting a complete, collaborative hybrid platform and tools for rapid solution deployments are key focus areas for growing data scientist teams tasked to respond to business challenges. This talk will cover the challenges and innovations for AI at scale for the Enterprise focusing on the modernization of data analytics, the AI ladder and AI life cycle and infrastructure architecture considerations. We will conclude by viewing the benefits and innovation of running your modern AI and Data Analytics applications such as SAS Viya and SAP HANA on IBM Power Systems and IBM Storage in hybrid cloud environments.
Open source Apache Hadoop is a great framework for distributed processing of large data sets. But there’s a difference between “playing” with big data versus solving real problems. The reality is that Hadoop alone is not enough. In fact, almost every organization that plans to use Hadoop for production use quickly discovers that it lacks the required features for enterprise use. And, fewer still have the Hadoop specialists on hand to navigate through the complexity to build reliable, robust applications. As a result, many Hadoop projects never make it to production as executives say, “we just don’t have the skills.” In this session, we will discuss these enterprise capabilities and why they’re important: analytics, visualization, security, enterprise integration, developer/admin tools, and more. Additionally, we will share several real-world client examples who have found it necessary to use an enterprise-grade Hadoop platform to tackle some of the most interesting and challenging business problems.
"Toward Cognitive-IoT Applications -- Integrating AI with Fog Computing" by Dr. Frank C. D. Tsai, Workshop of Mobile IoT with Edge Computing and Artificial Intelligence, sponsored by Ministry of Education, Taiwan
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... by John Andrews
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2... by pchutichetpong
M Capital Group ("MCG") expects demand and the evolution of supply to be shaped by institutional investment rotating out of offices and into work from home ("WFH"), while the need for data storage keeps expanding as global internet usage grows, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as advancing cloud services and edge sites, allowing the industry to see strong expected annual growth of 13% over the next four years.
Whilst competitive headwinds remain, represented through the recent second bankruptcy filing of Sungard, which blames “COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services”, the industry has seen key adjustments, where MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that the more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment will be driving market momentum forward. The continuous injection of capital by alternative investment firms, as well as the growing infrastructural investment from cloud service providers and social media companies, whose revenues are expected to grow over 3.6x larger by value in 2026, will likely help propel center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
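The automated-validation idea in point 4 can be sketched in a few lines of Python. The rules, field names, and records below are hypothetical examples for illustration, not a reference to any specific tool:

```python
# Minimal data-validation sketch: check each record against simple rules
# at the point of ingestion, so errors are caught at the source.
# Field names and rules are illustrative assumptions.

def validate(record, rules):
    """Return the names of the rules that the record violates."""
    return [name for name, check in rules.items() if not check(record)]

rules = {
    "age_in_range": lambda r: 0 <= r.get("age", -1) <= 120,
    "email_present": lambda r: "@" in r.get("email", ""),
}

records = [
    {"age": 34, "email": "ana@example.com"},
    {"age": 230, "email": "bob@example.com"},   # bad age
    {"age": 28, "email": "no-email"},           # bad email
]

# Map each record index to its violations, then keep only clean records.
failures = {i: validate(r, rules) for i, r in enumerate(records)}
clean = [r for i, r in enumerate(records) if not failures[i]]
print(failures)  # {0: [], 1: ['age_in_range'], 2: ['email_present']}
```

Real pipelines would route the failing records to a quarantine table and record the rule name as lineage metadata for root cause analysis.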
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
Adjusting primitives for graph: SHORT REPORT / NOTES by Subhajit Sahu
These notes cover primitives used by graph algorithms such as PageRank. Compressed Sparse Row (CSR) is an adjacency-list-based graph representation.
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
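As a concrete reference for the CSR representation mentioned above, here is a minimal Python sketch. This is illustrative only; the reports themselves benchmark C++/OpenMP/CUDA implementations, and `to_csr` is a name invented for this example:

```python
# Compressed Sparse Row (CSR) sketch: store a graph's adjacency lists in
# two flat arrays -- 'offsets' (where each vertex's edge list begins) and
# 'targets' (all destination vertices, concatenated).

def to_csr(num_vertices, edges):
    # Count out-degree of each vertex.
    degree = [0] * num_vertices
    for u, _ in edges:
        degree[u] += 1
    # Prefix-sum degrees into offsets (length num_vertices + 1).
    offsets = [0] * (num_vertices + 1)
    for v in range(num_vertices):
        offsets[v + 1] = offsets[v] + degree[v]
    # Scatter edge targets into place using a moving cursor per vertex.
    targets = [0] * len(edges)
    cursor = offsets[:-1].copy()
    for u, v in edges:
        targets[cursor[u]] = v
        cursor[u] += 1
    return offsets, targets

# 4-vertex example: edges 0->1, 0->2, 1->2, 3->0
offsets, targets = to_csr(4, [(0, 1), (0, 2), (1, 2), (3, 0)])
print(offsets)  # [0, 2, 3, 3, 4]
print(targets)  # [1, 2, 2, 0]
```

Vertex `u`'s neighbors are `targets[offsets[u]:offsets[u+1]]`, which is why CSR gives the cache-friendly sequential access these benchmarks rely on.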
Levelwise PageRank with Loop-Based Dead End Handling Strategy: SHORT REPORT ... by Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation for ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. Slowdown on the GPU is likely caused by a large submission of small workloads, and expected to be non-issue when the computation is performed on massive graphs.
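For readers unfamiliar with the baseline, the standard ("monolithic") PageRank the abstract compares against can be sketched as a simple power iteration. This is a minimal illustration, not the report's implementation; the uniform redistribution of dead-end rank shown here is one common strategy and an assumption of this sketch:

```python
# Monolithic PageRank by power iteration: every vertex is updated in
# every iteration, unlike the levelwise variant described above.

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=100):
    n = len(adj)
    rank = [1.0 / n] * n
    for _ in range(max_iter):
        nxt = [(1.0 - damping) / n] * n
        # Dead ends (no out-edges): spread their rank uniformly.
        dead = sum(rank[u] for u in range(n) if not adj[u])
        for v in range(n):
            nxt[v] += damping * dead / n
        # Each vertex shares its rank equally among its out-neighbors.
        for u in range(n):
            for v in adj[u]:
                nxt[v] += damping * rank[u] / len(adj[u])
        if sum(abs(a - b) for a, b in zip(nxt, rank)) < tol:
            return nxt
        rank = nxt
    return rank

# Tiny 3-vertex cycle: 0 -> 1 -> 2 -> 0 (no dead ends, fully symmetric)
ranks = pagerank([[1], [2], [0]])
print([round(r, 3) for r in ranks])  # all ~0.333 by symmetry
```

The levelwise method gets its advantage by processing strongly connected components in topological order, so earlier components never need revisiting; the precondition on dead ends exists because a dead end would leak rank across component levels.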
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers present on related topics such as vector databases, LLMs, and managing data at scale. The intended audience includes machine learning engineers, data scientists, data engineers, software engineers, and PMs. This meetup was formerly the Milvus Meetup and is sponsored by Zilliz, maintainers of Milvus.
1. 1
Industry Trends in Analytics
Business Intelligence, Data Warehousing, Big Data, Artificial Intelligence
AI and HPC University Roadshow
Bill Wong – Artificial Intelligence and Data Analytics Practice Leader
Bill Kiatipis - High Performance Computing Strategist
Dell Technologies
2. 2
Agenda
Key Data Trends and Challenges
Digital Transformation Through Analytics
HPC Platform Computing
Summary
3. 3
Higher Education Advanced Analytics Drivers
• Enhance student experience to attract and retain students, and improve post-education outcomes
• Personalized learning to improve student outcomes, student performance and graduation rates
• Drive and support research, partnerships and entrepreneurial initiatives in key industries
• Smarter campus to enhance the student experience and campus facilities by transforming the economic, social, and technology foundation
4. 4
Data and Analytics Investments Lead New Digital Transformation Initiatives (Again) for CIOs
Analytics investments continue to increase
Plans to Increase Investment for Digital Transformation
Q. What are the technology areas where your organization will be spending the largest amount of new or additional funding in 2019? n = 3,086.
Q. What are the technology areas where your organization will be reducing funding by the highest amount in 2019 compared to 2018? n = 2,819. Multiple responses allowed; excludes "don't know."
Source: Gartner, "The 2019 CIO Agenda: Securing a New Foundation for Digital Business" (G00366991)
5. 5
The AI Technology Enablers
[Chart: 1950-2020 timeline plotting cost of compute and amount of data across the Artificial Intelligence, Machine Learning and Deep Learning eras]
- Artificial Intelligence (AI) is human intelligence mimicked by machine algorithms; examples: chess, Go, facial recognition
- Machine Learning (ML) is a subset of AI algorithms that parse data, learn from the data, and then make a determination or prediction; examples: spam detection, preventative maintenance
- Deep Learning (DL) is a subset of machine learning algorithms that leverage artificial neural networks to develop relationships among the data; examples: driverless cars, cyber-security
Enablers: accelerators, algorithms, big data
[Diagram: traditional programming vs. machine learning]
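The slide's contrast between traditional programming (rules written by hand) and machine learning (rules derived from data) can be illustrated with a toy spam detector. Everything below, including the word list, training messages, and `learn_threshold`, is a hypothetical sketch and not from the presentation:

```python
# Toy contrast: a hand-coded rule vs. a threshold *learned* from labeled
# data. "Spam score" = number of flagged words in a message (assumption).

FLAG_WORDS = {"free", "winner", "prize"}

def score(message):
    return sum(word in FLAG_WORDS for word in message.lower().split())

# Traditional programming: a human fixes the decision rule up front.
def is_spam_rule(message):
    return score(message) >= 2

# Machine learning, in miniature: pick the score threshold that best
# classifies a labeled training set, instead of hard-coding it.
def learn_threshold(labeled):
    best_t, best_acc = 0, -1.0
    for t in range(0, 4):
        acc = sum((score(m) >= t) == spam for m, spam in labeled) / len(labeled)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

train = [("free prize inside", True), ("winner winner", True),
         ("lunch at noon?", False), ("free lunch tomorrow", False)]
t = learn_threshold(train)
print(t, score("you are a winner of a free prize") >= t)  # 2 True
```

Real ML systems learn far richer parameters than one threshold, but the division of labor is the same: the programmer supplies the model family and the data supplies the rule.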
6. 6
Hype Cycle for Artificial Intelligence (as of July 2019)
[Chart: Gartner Hype Cycle; x-axis Time, y-axis Expectations; phases: Innovation Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment, Plateau of Productivity. Legend (time to plateau): less than 2 years; 2 to 5 years; 5 to 10 years; more than 10 years. Technologies plotted: Deep Learning, Infrastructure Transformation, Autonomous Vehicles, AI PaaS, Artificial General Intelligence, Machine Learning, NLP, FPGA Accelerators, GPU Accelerators, DNN ASICs, Quantum Computing, Neuromorphic Hardware, Computer Vision, Speech Recognition]
"'Narrow' AI is becoming better than humans at defined tasks. 'General' AI is still a long way off."
"AI, one of the most disruptive classes of technologies, will become more widely available due to cloud computing, open source and the 'maker' (developers, data scientists and AI architects) community. While early adopters will benefit from continued evolution of the technology, the notable change will be its availability to the masses."
7. 7
Top 10 Types of Hardware for AI Delivery*
1. Processors (CPU, GPU, FPGA, ASIC)
2. HPC / Supercomputer Infrastructure
3. Communication Network
4. Personal Devices
5. Connected Home Devices
6. AR / VR Head-Mounted Displays (HMD)
7. Drones
8. Robotics
9. Automotive
10. Sensors and Application Components (audio, camera, LiDAR, etc.)
*The Business Impact and Use Cases for Artificial Intelligence, Gartner, 2017
Categories: accelerate computational performance; AI-enabled endpoints; AI-enabled autonomous endpoints
8. 8
AI Accelerators
[Chart: accelerator options arranged along a flexibility vs. efficiency spectrum]
"Starting today (November 13, 2019), Microsoft is providing Azure customers with access to chips made by the British startup Graphcore."
Microsoft Sends a New Kind of AI Processor Into the Cloud: https://www.wired.com/story/microsoft-sends-a-new-kind-of-ai-processor-into-the-cloud/
9. 9
Dell EMC DSS 8440: A Dynamic Machine Learning and HPC Platform
Differentiated Features | Benefits
• Up to 10 accelerator cards, including industry-leading NVIDIA Tesla V100 GPUs and Graphcore Colossus C2 IPU cards (post-RTS) | Exceptional horsepower in a 4U chassis
• Supports up to 205W CPUs with accelerators in 35°C environments | Thermally unconstrained, providing flexibility for a variety of configurations/environments
• Up to 10 drives of local storage (NVMe and SAS/SATA) | Accelerated access to training data
• 8 x PCIe Gen3 | Extensive I/O options for network/IO traffic
Configurations: DSS 8440 with NVIDIA GPUs; DSS 8440 with Graphcore IPUs
10. 10
Worldwide Artificial Intelligence 2018 Share Snapshot
Dell Technologies' 2018 AI revenue grew 72.6% to $1.89 billion, from $1.10 billion in 2017. Dell revenue in the AI market is primarily from infrastructure (server and storage).
*IBM's AI revenue grew 19.0% to $2.58 billion, from $2.17 billion in 2017. Revenue is divided across software, hardware, and services, with services and hardware significantly larger than software ($349.6M).
13. 13
Dell EMC HPC Market Leadership
10 generations of servers and storage in HPC clusters
[Timeline, 1999-2019: 1st HPC cluster; HPC solutions program officially launches; industry's 1st HPC Solution Bundle; Tungsten (#4); Thunderbird (#6); DCS formed; C-series joins PowerEdge; Stampede (#7); Zenith system launched at Dell EMC HPC Innovation Lab; Dell EMC merger, Isilon joins HPC portfolio; Dell EMC AI solutions announced (2017); first systems with DCLC (U. Michigan) and HDR (OSC) (2018); Frontera (2019). Milestone years shown: 1999, 2001, 2004, 2005, 2008, 2012, 2015, 2016]
14. 14
World-class infrastructure in the Innovation Lab
13K ft.² lab, 1,300+ servers, ~10PB storage dedicated to HPC in collaboration with the community
Zenith
• TOP500-class system based on Intel Scalable Systems Framework (OPA, KNL, Xeon, OpenHPC)
• 424 nodes with dual Intel Xeon Gold processors, Omni-Path
• 160+ Intel Xeon Phi (KNL) servers
• Over 1 PF combined performance; #396 on Top500, 1.86 PF theoretical peak
• Lustre, Isilon H600, Isilon F800 and NSS storage
• Liquid cooled and air cooled
Rattler
• Research/development system with Mellanox, NVIDIA and Bright Computing
• 88 nodes with EDR InfiniBand and Intel Xeon Gold processors
• 32x PowerEdge C4140 nodes with 4x NVIDIA GPUs
Other systems
• 32-node AMD cluster, storage solutions, etc.
15.
Machine Learning & Deep Learning ecosystem – solving real-world problems
Verticals/Use Cases: recommendation engines, image classification, smart chatbots, disease identification, predictive marketing, fraud detection, smart core-to-edge IoT, traffic predictions, threat detection, inventory management
Software & Frameworks: open source frameworks, enterprise ISV ML software, BigDL, math libraries
Compute/Storage/Networking (by workload): in-memory analytics – R840; training – C4140, R740, C6320p; inference – C6420, R740, T640, R940XA; hyperconverged appliance
Processor/Accelerator: Xeon, Xeon Phi, FPGA adapter, Crest family ("Big Accelerator" system)
Management/Orchestration: big data & HPC orchestration, container orchestration (Bright Computing)
Virtualization: accelerator virtualization & pooling (Bitfusion)
Consumption models (build vs. buy): services, systems integrators, service providers
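The stack above separates three workloads: in-memory analytics, training, and inference. The training/inference split can be sketched with a toy, stdlib-only model (the data, the threshold model, and the function names are illustrative, not tied to any framework on the slide):

```python
import statistics

# --- Training phase: fit a one-feature threshold classifier on labeled data ---
def train(samples):
    """samples: list of (feature, label) pairs with label in {0, 1}.
    Returns a decision threshold midway between the two class means."""
    pos = [x for x, y in samples if y == 1]
    neg = [x for x, y in samples if y == 0]
    return (statistics.mean(pos) + statistics.mean(neg)) / 2

# --- Inference phase: apply the trained artifact to new inputs ---
def predict(threshold, feature):
    return 1 if feature >= threshold else 0

model = train([(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)])
print(round(model, 2), predict(model, 0.75))  # threshold ≈ 0.5; 0.75 classifies as 1
```

In production the two phases typically run on different hardware tiers, which is what the training/inference columns on the slide reflect.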
16.
Dell Value-Add for HPC Customers
Ready Solutions and Reference Architectures
• Dell EMC Ready Solution for HPC Life Sciences
• Dell EMC Ready Solution for HPC Research
• Dell EMC Ready Solution for HPC Digital Mfg.
• Dell EMC Ready Solution for HPC AI
• Dell EMC Ready Solution for NFS Storage
• Dell EMC Ready Solution for Lustre®
HPC Resources & Tools
• HPC sales specialists
• HPC-specific executive briefings
• HPC estimation tool
• HPC power/cooling and cable calculators
• HPC/AI Innovation Lab
• Dell EMC annual HPC community meeting & DellXL
• Dell Financial Services
• Access to future roadmaps
• Partnerships with leading suppliers and channel partners
HPC Services
• ProSupport add-on for HPC
• Remote cluster management and support packs
• ProDeploy for HPC
• Cloud services
• Consulting services for HPC
Dedicated HPC organization
• HPC solution architects
• HPC vertical SMEs
• HPC TSRs & OSEs
• Office of the CTO HPC visionaries
17.
Business needs
Simon Fraser University
needed increased scale and
capacity to compete and excel
globally using big data and big
compute tools.
Solutions at a glance
• Heterogeneous cluster with Dell EMC PowerEdge C4130/C6320 servers, Intel® Xeon® E5-2650 v4/E5-2683 v4 processors, Intel Omni-Path, and NVIDIA® Tesla® P100 & V100 GPU accelerators
Business results
• Expanded compute, storage and cloud resources for
over 11,000 researchers across Canada
• Ability to run multiple simultaneous jobs of up to 2,600
CPU cores each
• Enables researchers to analyze the DNA of microbes much more quickly
• Ability to identify infectious disease outbreaks faster
• More rapid tracking and understanding of origins and
spread of infectious disease outbreaks
“With greater computational power than all of Compute Canada’s legacy systems combined, Cedar is built for big data. The system can support researchers collecting, analyzing or sharing immense volumes of data.”
https://www.youtube.com/watch?v=3RqF8m65r8g
Simon Fraser University
Reduce the impact of infectious disease outbreaks
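Jobs of the scale described above (up to 2,600 CPU cores) are submitted through a batch scheduler; Compute Canada clusters such as Cedar use Slurm. A hedged sketch of such a submission script (the job name, account, memory, walltime, and binary name are all placeholders):

```shell
#!/bin/bash
#SBATCH --job-name=dna-analysis    # placeholder job name
#SBATCH --ntasks=2600              # up to 2,600 CPU cores per job, as cited above
#SBATCH --mem-per-cpu=4G           # placeholder per-core memory request
#SBATCH --time=24:00:00            # placeholder walltime
#SBATCH --account=def-example      # placeholder allocation account

# Launch one MPI rank per allocated core (binary name is illustrative)
srun ./microbe_genome_pipeline
```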
19. The Digital Future Demands a New Perspective
Cloud First (infrastructure-centric), driven by:
• Lower infrastructure CapEx
• Offloaded infrastructure maintenance
• Improved time to market (deployment time for infrastructure)
Data First (business-centric), taking into consideration:
• Data gravity
• Data velocity
• Data control
• Data privacy and compliance
20.
Analytics Development Challenges and Best Practices
Data Quality
• Challenge: data is inconsistently formatted across the organization, often contains errors, and can introduce bias.
• Best practices: minimize duplication; implement a data governance strategy and tools to cleanse, validate and enforce data quality.
Data Silos
• Challenge: inability to leverage data across the enterprise for analytics.
• Best practices: develop/engineer a common data repository to access for advanced analytics.
New Sources of Data
• Challenge: machine/deep learning often requires large amounts of data, from various sources and formats (unstructured).
• Best practices: develop a data integration strategy and use tools to process unstructured data that requires additional processing, cleansing and/or normalization.
Supporting Near Real-time Requests
• Challenge: data repositories struggle to scale transactional and query-based workloads together.
• Best practices: consider translytical databases to achieve near real-time response times for advanced analytical applications.
Privacy, Security, Ethics, Auditability
• Challenge: often not considered at design time, leading to governance exposures.
• Best practices: adopt design principles that provide data transparency for key stakeholders and accountability to governance/regulatory bodies.
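The "minimize duplication; cleanse and validate" practices in the Data Quality row can be sketched in a few lines of stdlib Python; the record schema (id, email, amount) and the validation rules here are hypothetical, for illustration only:

```python
# Minimal data-cleansing sketch: deduplicate, then validate records before analytics.
# The schema (id, email, amount) and rules are hypothetical placeholders.
def cleanse(records):
    seen, clean, rejects = set(), [], []
    for rec in records:
        key = rec.get("id")
        if key in seen:                 # minimize duplication: drop repeated ids
            continue
        seen.add(key)
        if "@" not in rec.get("email", "") or rec.get("amount", -1) < 0:
            rejects.append(rec)         # quarantine invalid rows for review
        else:
            clean.append(rec)
    return clean, rejects

rows = [
    {"id": 1, "email": "a@x.com", "amount": 10.0},
    {"id": 1, "email": "a@x.com", "amount": 10.0},      # duplicate id
    {"id": 2, "email": "not-an-email", "amount": 5.0},  # fails validation
]
clean, rejects = cleanse(rows)
print(len(clean), len(rejects))  # 1 clean record, 1 rejected
```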
21.
The Value of Dell for AI Infrastructure
- Comprehensive and Scalable AI/Analytics Platform Portfolio
- Workstations, Servers, Clusters, Storage, Networking
- Infrastructure, Data Science and Analytics Expertise
- HPC and AI Innovation Lab
- IoT / Intelligent Video Analytics Lab
- Solution-based Offerings
- Pre-configured AI Ready Offerings
- IoT / Safety and Security and Thermal Vision Solutions
- GPU Virtualization
- ML Platforms
Key themes: infrastructure scalability, reduced complexity, addressing demand, partner ecosystem, cost effectiveness