zData Inc. Overview
Enterprise Big Data Solutions
– Preferred Channel Partner for Pivotal, EMC and Cisco
– Single source for Software, Hardware and Services Procurement across multiple vendors
Experience
& Capacity
– Proven Leadership Team
– Over 100 Years of collective Pivotal architecture knowledge
– Retail, Utilities, Banking, Government (DoD)
– Global Reach: US, APJ, EMEA
Training
– Online Training Portal
– Onsite Training: 1-, 3-, and 5-day tracks
Services
– BI and Advanced Analytics Platform and Pilot Programs
– Commercial Big Data Turnkey Solutions and Migration Path
– Enterprise Data Lake Adoption
Solutions
– Fully Hosted and Managed Environments (Cloud and On-Premise)
– Migration Expertise
– Data Lake Design and Methodology
– Services Enablement Tools
About zData
Consulting
– Hadoop: Pivotal HD, Hortonworks, MapR
– Greenplum
– SQL on Hadoop, HAWQ
– Spark and Shark
BI & Advanced Analytics Platform
Data Lake Solutions
Custom Toolsets
Solutions
BI & Analytics Platform
Store
Gather, integrate, load and manage your data in the cloud or on premise
Collaborate
Validate and dimensionalize data and share for contextual services
Prediction and Advanced Analytics
Execute machine learning and predictive analysis through an extensive library of
functions and features
Analyze and Visualize
Collaborate and uncover insights through a comprehensive, diverse set of visualizations
The ZD Platform offers all the technical support and resources your company needs to get
started with Big Data.
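The "Prediction and Advanced Analytics" step above can be illustrated in miniature. This is a hedged, standard-library-only sketch of the simplest kind of predictive analysis (a least-squares trend forecast); a real deployment would use the platform's in-database function library, and the sample series is invented for illustration.

```python
# Minimal illustration of predictive analysis: fit a least-squares
# trend line to a historical series and project the next period.
# Pure stdlib; the "sales" data below is a made-up example.

def fit_trend(ys):
    """Fit y = slope * x + intercept over x = 0..n-1 by least squares."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def forecast_next(ys):
    """Project the value for the period after the observed series."""
    slope, intercept = fit_trend(ys)
    return slope * len(ys) + intercept

sales = [100, 110, 120, 130]   # perfectly linear history
print(forecast_next(sales))    # -> 140.0
```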
ZD Pilot Programs
Pilot Overview
- 8-week program
- Collaboration Tool – ZD Chorus
- Up to 1 TB of Data
- Fault Tolerant – Mirroring
- Security – SSL and VPN – Private Clusters
- Platform, Software, and Services All Included
- Free online HAWQ & Greenplum Training course with Pilot Program subscription
Storage + BI and Visualization + Advanced Analytics
zData’s Big Data Pilot Program is a cloud-based, end-to-end BI and analytics solution that leverages the latest MPP
performance and SQL-on-Hadoop technologies. This eight-week pilot provides a testing platform that delivers all
the components you will need for your next-generation Big Data solution. zData takes the guesswork out by
providing a fully integrated stack and key services to help your team get started quickly.
STORAGE
BI & VISUALIZATION
ADVANCED ANALYTICS
Data Lake
– 2 Days Tools Training
– 2-Day Architecture
– 1-Day Analytics Demo
– 2 Days Collaboration
– 2 Days Exploration
– 3 Days Tools Training
– 2-Day Architecture
– 3-Week Mini-Project
– 2-Week Discovery Review
– 5 Days Training
– Analytics Roadmap
– Test Model Deployment
– Multiple Iterations
Data Lake Solutions Enterprise Adoption
ZD Chorus
Agile Analytics for a Collaborative Environment
MetaELT
Metadata Driven GPDB ETL
zMonitor
DCA and GPDB Cluster Health Alerting and Monitoring via Nagios
zData Migration Toolkit
Migrate databases to or from Greenplum and HAWQ
zData Backup, Replication and DR.
zData Custom Toolsets
Big Data Consulting
Managed Services
Services
General Services
Greenplum
Hadoop
SQL on Hadoop (HAWQ)
Platform as a Service
Greenplum and HAWQ
Hadoop
Big Data Consulting General Services
Pivotal Exclusive Services
Infrastructure
– Pivotal GPDB Installs
– DCA Install and Upgrade
Implementation
– Data Migrations
– Analytics Labs
– Audits/Health Checks
Infrastructure
– Software-only hardware support
– Cisco, HP and Dell Cluster Install
– Amazon (AWS) GPDB, Hadoop & BI hosted cluster support and services
Implementation
– Managed Services & Cluster Co-Lo
– VCE Configuration & Certification
– Benchmarking
– China and India Offshore Pivotal DB & Hadoop Services
Training
– Onsite – Fundamentals and Advanced
– Online Training Portal
Platforms
- Greenplum
- Hadoop
- SQL on Hadoop (HAWQ)
- Platform as a Service
Big Data Consulting
Greenplum
zData Inc. Greenplum Database Consulting offers a unique focus on Pivotal-based application and
technology initiatives by combining leading expertise, broad coverage, global scale, and flexible delivery.
Hadoop
zData is versed in all applications of Hadoop, a rapidly evolving open source framework scalable for
processing huge datasets in distributed systems.
SQL on Hadoop (HAWQ)
From Pivotal HAWQ to Cloudera Impala, zData has the expertise to configure, implement, and train in any
environment.
Spark Consulting
zData can guide you on your journey with Apache Spark, a large-scale processing engine that can replace
Hadoop MapReduce, offering improved performance and real-time stream processing.
Platform as a Service
zData recognizes the need for cross-platform software, application, and framework offerings, and can
advise you on all of your Cloud Foundry and PaaS needs.
zData’s focus is on reducing MTD and MTM
Mean Time to Deployment (MTD)
- Use and Development of PaaS toolsets
- Automated Build Scripts
- Standardized Offerings
- Leverage the Cloud
- Best Practice Guides + over 1000
Mean Time to Migrate (MTM)
- Standardized Methodology
- ETL and DDL Conversion Scripts
- SQL Conversion Automation Scripts
– Oracle, Netezza, Teradata, and DB2 to GPDB and HAWQ
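The SQL conversion scripts above can be pictured with a small sketch. This is a hypothetical, regex-based pass that rewrites a few common Oracle column types into Greenplum/HAWQ equivalents; a production toolkit would cover far more types plus syntax differences, and the mapping table here is illustrative only.

```python
import re

# Hypothetical sketch of one pass in a DDL conversion script:
# rewrite Oracle column types into Greenplum/HAWQ equivalents.
# The mapping covers a few common cases only.
ORACLE_TO_GPDB = {
    r"\bVARCHAR2\((\d+)\)": r"varchar(\1)",
    r"\bNUMBER\((\d+),\s*(\d+)\)": r"numeric(\1,\2)",
    r"\bNUMBER\b": r"numeric",
    r"\bDATE\b": r"timestamp",
    r"\bCLOB\b": r"text",
}

def convert_ddl(oracle_ddl: str) -> str:
    """Apply each type rewrite in order to an Oracle DDL statement."""
    out = oracle_ddl
    for pattern, replacement in ORACLE_TO_GPDB.items():
        out = re.sub(pattern, replacement, out, flags=re.IGNORECASE)
    return out

ddl = "CREATE TABLE orders (id NUMBER(10,0), name VARCHAR2(100), created DATE)"
print(convert_ddl(ddl))
# -> CREATE TABLE orders (id numeric(10,0), name varchar(100), created timestamp)
```

Note the ordering: the parameterized `NUMBER(p,s)` pattern must run before the bare `NUMBER` pattern so precision and scale are preserved.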
Reduction in Human Capital and Operational Costs
- Support Automation and Monitoring
- Auto Detection and Break/Fix
- Centralized Center of Excellence
- Remote Engineering and Managed Environments
Services Enablement: Efficiency & Automation
Key Responsibilities
Infrastructure
– Security
– Workload Management
– Schema Creation
– Role/User/Group Creation
– Low-Cost Continuous Support
– Quarterly Health Checks
– Active Investigation & Resolution
Maintenance
– Explain Plan Review
– Catalog Bloat
– Vacuum
– Partition Pruning
Development
– SQL
– Data Model Design
– DDL Creation
– Schema Design
– Stored Procedures
Tuning
– Column vs. Row
– SQL
– PL/pgSQL & SQL Functions
– Distribution Strategies
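The distribution-strategy tuning above usually starts with a skew check. This is a hedged sketch: on a live Greenplum cluster the per-segment row counts would come from a query like `SELECT gp_segment_id, count(*) FROM my_table GROUP BY 1;`, but here we take those counts as a plain list and compute a skew ratio. The sample numbers are invented.

```python
# Sketch of a distribution-skew check. A ratio near 1.0 means rows
# are spread evenly across segments; much above 1.0 suggests a poor
# distribution key (e.g. a low-cardinality or hash-clustered column).

def skew_ratio(segment_counts):
    """Max segment row count divided by the average segment row count."""
    avg = sum(segment_counts) / len(segment_counts)
    return max(segment_counts) / avg

even = [250_000, 251_000, 249_000, 250_000]  # good key: ratio ~1.0
bad  = [980_000, 5_000, 10_000, 5_000]       # skewed key

print(round(skew_ratio(even), 2))  # -> 1.0
print(round(skew_ratio(bad), 2))   # -> 3.92
```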
Resident & Remote DBA
*zData also offers full life cycle migration assessments and services
Quickstart – 6 Week Migrations
– Sample Data Set
– Tool Connectivity
– Workload Management/Tuning
– Schema Creation
– Explain Plan Review
– Catalog Bloat
– Vacuum
– Partition Pruning
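The maintenance items in the checklist above (catalog bloat, vacuum) boil down to routine statements run per table. This is a minimal sketch under stated assumptions: the table names are illustrative, and a real script would read them from `pg_catalog` and execute them over a database connection rather than just generating text.

```python
# Hedged sketch of routine maintenance: generate the VACUUM/ANALYZE
# statements a DBA would run to control bloat and keep planner
# statistics fresh. Generates SQL text only; no database connection.

def maintenance_sql(tables):
    stmts = []
    for t in tables:
        stmts.append(f"VACUUM {t};")   # reclaim dead rows (bloat control)
        stmts.append(f"ANALYZE {t};")  # refresh optimizer statistics
    return stmts

for stmt in maintenance_sql(["sales.orders", "sales.line_items"]):
    print(stmt)
```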
Quickstart – 8 Week Migrations & ETL
– Sample Data Set
– Pig and Hive Setup
– Users and Groups
– MapReduce
– Hive SQL/UDF
– Cascading
– Custom ETL and BI Tools
– HBase Migration Setup
Quickstart – 8 Week Setup
– Sqoop Data Load
– GP Schema Setup
– Table Migration
– User/Group/Roles
– Demo
– HDFS to HAWQ Data Movement
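The "HDFS to HAWQ Data Movement" step above typically uses an external table so HAWQ can read HDFS files in place, followed by a plain `INSERT ... SELECT` into a native table. This sketch generates a PXF external-table DDL string; the host, port, path, and table names are placeholders, not real endpoints.

```python
# Sketch of HDFS-to-HAWQ movement via a PXF external table.
# Generates DDL text only; all endpoints below are hypothetical.

def pxf_external_table(table, columns, hdfs_path,
                       host="namenode", port=51200):
    """Build a CREATE EXTERNAL TABLE statement over an HDFS path."""
    cols = ", ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE EXTERNAL TABLE {table} ({cols}) "
        f"LOCATION ('pxf://{host}:{port}{hdfs_path}?profile=HdfsTextSimple') "
        f"FORMAT 'TEXT' (DELIMITER ',');"
    )

ddl = pxf_external_table(
    "ext_events",
    [("event_id", "bigint"), ("payload", "text")],
    "/data/events/2014/",
)
print(ddl)
```

After creating the external table, a single `INSERT INTO events SELECT * FROM ext_events;` would land the rows in HAWQ-native storage.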
Services Standard Offerings
Kickstart – 4-Week Big Data Infrastructure
– Rack and Stack
– Custom UCS Platform Config
– Pre-built Pivotal Cisco appliance
– UCS Manager Setup
– UCS GPDB Monitoring
– Environment Design
– Custom Configurations
– Reference Configurations
*zData also offers full life cycle migration assessments and services
Kickstart – 1 Week GP to GP Migrations
– Greenplum Software Only installations
– DCA Rack/Stack Install
– Amazon Web Services Setup/Install
– Greenplum to Greenplum environment migrations
– Backup infrastructure setup
– Timing and price dependent on volume
Kickstart – 1 Week Migrations and ETL
– O/S Install setup
– Hadoop Install/Setup/Config
– N=3 Environment Setup
– Hadoop environment replication setup
Services Software + Infrastructure
– Patch Management
– Capacity
– Memory Usage
– Backups
– Security
– Start/Stop
– Break/Fix
– Segment Recovery
– Backup Restore
– Dedicated Support
– Remote Support
– Onsite Escalation
– Problem Isolation and Resolution
– Health Checks
– Capacity Planning
– System Integration
Managed Services
Managed Services Greenplum and HAWQ
System Administration
- Scheduled System Jobs
- Disaster Recovery Planning
- Patching and Upgrades
- LDAP
- MoreVRP Integration
Database Maintenance
- Batch Load Monitoring
- Capacity Planning and Monitoring
- Memory Management and Settings
- 24x7x365 Monitoring
Backup / Recovery
- Backup Scripts
- Monitor Backups
- Segment Recovery
Tool Connectivity
- DIA Module 3rd-Party Software Installation and Configuration
- ODBC/JDBC Setup
Greenplum Managed Services
Key Support Responsibilities
– Help Desk
– Quarterly Health Checks
– Issue Tracking & Resolution
– Backup & ETL Management
– Vendor Management
Managed Services Hadoop
System Administration
- Disaster Recovery Planning
- Patching and Upgrades
- Quarterly Health checks
- LDAP
Cluster Maintenance
- HDFS Tuning
- Capacity Planning
- Power Management
- Memory Management and Settings
- 24x7x365 Monitoring
Monitoring
- Cluster node monitoring
- NameNode monitoring
- MapReduce job completion
Tool Connectivity
- Hive/HBase
- ODBC/JDBC Setup
Hadoop Managed Services
Key Responsibilities
– Help Desk
– Quarterly Health Checks
– Issue Tracking & Resolution
– Data Integration
– Vendor Management
Training
Pivotal Fundamentals 4.2
Hadoop Administrator
GPDB Advanced Training
Online Training Portal
Onsite Training
Online Training Portal – zData University
Unique Online Training Portal
- Greenplum Developer Training
- Greenplum Database Administrator Training
- Greenplum & HAWQ Fundamentals Overview
- HAWQ Fundamentals
- GemFire XD Fundamentals
- Data Lake 101
For Developers, Software Engineers and Power Users
zData’s new Online Training Platform offers top industry courses at the click of a button. Receive best-in-class
training for Greenplum, HAWQ, GemFire XD, Hadoop, Data Lake concepts, and other top technologies within the Big
Data ecosystem. With zData University, you learn from the real-world experience of our senior field engineers.
*zData also offers full life cycle migration assessments and services
Fundamentals & Advanced
– 5 Days
– Installation and Configuration
– PostgreSQL
– MPP Architecture
– Explain Plan Review
– Catalog Bloat
– Vacuum
– Partition Pruning
Administration & Developer
– 3 Days
– Installation and Configuration
– Pig and Hive Setup
– Users and Groups
– MapReduce
– Hive SQL/UDF
– Cascading
– Custom ETL and BI Tools
– HBase Migration Setup
Fundamentals
– Sqoop Data Load
– GP Schema Setup
– External HDFS Calls
Onsite Training
Contact Us | sales@zdatainc.com

zData Inc. Big Data Consulting and Services - Overview and Summary


Editor's Notes

  • #7 The use of open source technology presents a very attractive entry point for new commercial customers: standardized software with the easiest migration path from open source to enterprise class.
  • #13 zData’s differentiator in the services market is that we are not working to build the largest consulting practice; we are looking to build the most efficient one.