The data-driven culture
still eludes many
Source: New Vantage Partners, 2018
92% failing to scale analytics
Source: McKinsey Analytics, 2018
What do the 8% know that the rest don’t?
TRUSTED & GOVERNED
Community
Agility Proficiency
+ Define roles, controls
and processes to govern
data and content
+ Establish governance
models – centralized, delegated, and self-governing
+ Evolve over time
Desktop Browser Mobile Embedded
Data Access
Deployment ON-PREMISES | CLOUD | HOSTED WINDOWS | LINUX | MAC MULTI-TENANT
Security
&
Compliance
Extensibility
&
APIs
Data Prep
Governance
Content Discovery
Analytics
Collaboration
The Tableau Platform
Live | In-memory | Hybrid Connectivity
Data Blending | Query Federation | Visual Data Prep | Auto Data Modeling
Centralized Data Sources | Certification | Usage Analysis | Permissions
Projects | Recommendations | Versioning | Search
Visual | Ad-hoc | Advanced | Spatial | Calculations | Statistics
Alerting | Subscriptions | Storytelling | Sharing | Discussions
Tableau Analytics Platform
Modern Analytics Workflow
The Benefits of Governed Self-Service Analytics
• Secured and governed access — Provide the right data to the right
audience in a secure, governed operational model.
• Data quality and reliability — Fact-based decisions are made with good
data.
• Build trust and confidence — Establish confidence when using trusted
data to drive business value.
• Consistent user experience — Document processes in an easily referenceable
format – transparent and understandable by all.
• Performance and scalability — Reduce the proliferation of data and
content through governance processes.
Tableau’s Governance Framework
Data Content
TRUSTED & GOVERNED
Roles
Tableau Explorer
User Roles
Tableau Creator Tableau Viewer
Tableau Server Structure
Server
Default
Default
Site 1
Default
Site 2...
Default
Sites
Projects
Tableau Server Structure
Server
Default
Default
Site 1
Default
Site 2...
Default
Sites
(Multi-Tenant)
Projects
(Content)
Administrative Roles
Site Admin Project Leader
Server Admin
TRUSTED & GOVERNED
Data Governance
Tableau’s Governance Framework
Data Content
Data Source Management
• Provide business users with access to data
• Data selection driven by business questions
• Operationalized: fit, fresh, and relevant
• Comply with organizational policies and procedures
Data Source Management
Connect to Data → Build metadata model → Live or Extract? (extract the data if
needed) → Shared? If yes, publish the metadata model to Tableau Server; if no,
begin analysis.
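The flow above can be sketched as a small decision helper. This is a conceptual sketch only — the function name, parameters, and returned step labels are illustrative, not a Tableau API:

```python
def plan_publication(fast_database: bool, needs_offline: bool, shared: bool) -> list[str]:
    """Sketch of the data source management flow: connect, model,
    choose live vs. extract, then decide whether to publish the
    metadata model to Tableau Server before beginning analysis."""
    steps = ["Connect to data", "Build metadata model"]
    # Extract when the database is slow or offline access is required;
    # otherwise keep a live connection (see Hybrid Data Architecture).
    if needs_offline or not fast_database:
        steps.append("Extract data")
    else:
        steps.append("Connect live")
    # Shared data sources get published as governed, reusable models.
    if shared:
        steps.append("Publish metadata model to Tableau Server")
    steps.append("Begin analysis")
    return steps

print(plan_publication(fast_database=False, needs_offline=False, shared=True))
```

A slow database that feeds a shared model thus produces an extract that is published before analysis begins.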
Metadata Management
Shared
Databases
Apps
Cloud
Big Data
Files
Sources of Data
Data Source
VizQL Model
Connections
Connection Attributes
Tables
Joins
Live/Extract
Calculations
Aliases
Default Formatting
Comments
Filters
Aggregations
Roles
Table Calculations
Data Model
Key Considerations for Data Source Management
• What are the key sources of data for a department or team?
• Who is the Data Steward or owner of the data?
• Will you connect live or extract the data?
• Should the data source be embedded or published?
• Do variants of a dataset exist? If so, can they be consolidated as an
authoritative source?
• If multiple data sources are consolidated, does the single data source
performance or utility suffer by attempting to fulfill too many use cases at
once?
• What business questions need to be answered by the data source?
• What naming conventions are used for Published Data Sources?
Metadata Management
Filtered and sized to the analysis
Use business-friendly naming conventions
Set data types
Create hierarchies
Apply formatting (dates, numbers)
Set FY start date, if applicable
Add new calculations
Remove duplicate/test calculations
Enter comments
Aggregate to highest level
Hide unused fields
Publish to Tableau Server
Key Considerations for Metadata Management
• What is the process for curating data sources?
• Has the data source been sized to the analysis at hand?
• What is your organizational standard for naming conventions and field
formatting?
• Does the metadata model meet all criteria for curation, including
user-friendly naming conventions?
• Has the metadata checklist been defined, published, and integrated into
the validation, promotion, and certification process?
Data Quality
Key Considerations for Data Quality
• What processes exist for ensuring accuracy, completeness, reliability, and
relevance?
• Have you developed a checklist to operationalize the process?
• Who needs to review data prior to it becoming shared and trusted?
• Is this process adaptable to business users and are they able to partner with
data owners to report issues?
Data Enrichment
Tableau Prep Flow
Key Considerations for Data Enrichment
• Will data enrichment and preparation be centralized or self-service?
• What organizational roles perform data enrichment and preparation?
• What ETL tools and processes should be used to automate enrichment and/or
preparation?
• What sources of data provide valuable context when combined with each other?
• How complex are the data sources to be combined?
• Will users be able to use Tableau Prep Builder?
• How will you enable self-service data preparation?
Direct Access Workflow
• Data Steward/Analyst (Creator): Identify sources of data → Connect to
sources of data → Create/Augment prototype Data Model → Save/Publish
Workbook to Sandbox Project → Publish Data Source to Production Data
Sources Project
• Site Admin or Project Leader: Review prototype Data Model → Approve?
If yes, Certify Published Data Source; if no, return the prototype for revision
• Analyst (Creator or Explorer): Connect to Certified Data
Restricted Access Workflow
• DBA/Data Steward (Creator): Identify sources of data → Connect to
sources of data → Create/Augment prototype Data Model → Save/Publish
Workbook to Sandbox Project → Publish Data Source to Production Data
Sources Project
• Analyst (Creator/Explorer): Review prototype Data Source → Has needed
data? If no, Augment prototype Data Model → Save/Publish Workbook to
Sandbox Project; if yes, proceed
• Site Admin or Project Leader: Review prototype Data Model → Approve?
If yes, Certify Published Data Source; if no, return the prototype for revision
• Analyst (Creator or Explorer): Connect to Certified Data
Describing Data Sources
Search by Relevance
Recommended Data Sources
Monitoring and Management
• Monitor Data Server & popular data sources
• Monitor data source usage
• Monitor individual usage
• Monitor server health
• Create custom views
Monitor Usage
Monitor Status
Key Considerations for Monitoring & Management
• How long do extracts run on server?
• Did the refresh succeed or fail?
• Are subscriptions delivered on time?
• Are there duplicate sources or data?
• Are data sources being used? By whom?
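The refresh-monitoring questions above can be answered operationally from job records. A minimal sketch — the `jobs` records, field names, and one-hour runtime threshold are illustrative assumptions, not the schema of Tableau's repository:

```python
from datetime import datetime, timedelta

# Hypothetical extract-refresh job records (illustrative field names).
jobs = [
    {"datasource": "Sales", "started": datetime(2024, 1, 1, 2, 0),
     "finished": datetime(2024, 1, 1, 2, 45), "succeeded": True},
    {"datasource": "Sales", "started": datetime(2024, 1, 2, 2, 0),
     "finished": datetime(2024, 1, 2, 3, 30), "succeeded": False},
]

def refresh_report(jobs, max_runtime=timedelta(hours=1)):
    """Summarize refresh success rate and flag long-running extracts."""
    succeeded = sum(j["succeeded"] for j in jobs)
    slow = [j["datasource"] for j in jobs
            if j["finished"] - j["started"] > max_runtime]
    return {"success_rate": succeeded / len(jobs), "slow": slow}

print(refresh_report(jobs))
```

Here one of two refreshes failed (success rate 0.5), and the 90-minute run exceeds the threshold, so "Sales" is flagged for review.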
TRUSTED & GOVERNED
Content Governance
Tableau’s Governance Framework
Data Content
Key Considerations for Content Management
• Define standard naming conventions – projects, workbooks, data sources
• Organize by sites or projects, then by department, team, etc.
• Support for ad-hoc and certified content
• Separate areas for data sources and workbooks
• Establish visual standards for trusted content
• Use descriptions, tags, comments
• Keep content fresh and relevant
Content Management - Project Structure
Marketing
Events
Campaigns
Marketing
Sandbox
Events
Campaigns
Marketing
Data Sources
Events
Campaigns
Parent
Nested
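The parent/nested layout above — a production parent, a sandbox, and a data sources project per department, each holding the same nested topic projects — can be expressed as a small generator. A conceptual sketch; the naming pattern is the one shown on the slide, and the function itself is illustrative:

```python
def project_plan(department: str, topics: list[str]) -> dict[str, list[str]]:
    """Plan the three parent projects for a department and the
    nested topic projects each one contains."""
    parents = [department,                       # certified content
               f"{department} Sandbox",          # ad-hoc content
               f"{department} Data Sources"]     # published data sources
    return {parent: list(topics) for parent in parents}

plan = project_plan("Marketing", ["Events", "Campaigns"])
```

For "Marketing" this yields the three parents from the slide, each nesting "Events" and "Campaigns".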
Security & Permissions
Authentication LOCAL | ACTIVE DIRECTORY | SAML/KERBEROS/OPENID | TRUSTED TICKETS
Authorization SITE ROLE | DEFAULT & CUSTOM PERMISSIONS | INHERITANCE & OVERRIDE
Data Security DATABASE USER & SERVICE ACCOUNT | CONTENT PERMISSIONS
Network Security CLIENT-SERVER SSL | DATABASE DRIVERS | STRINGENT TRUST MODEL
Building a Permissions Model
Goal: Provide continuity of user experience across audiences
• Define audiences (how is content organized)
• Define user types (how are users organized)
• Define workflow (how will users interact with content)
• Identify restrictions
Permissions
• Projects: View | Save | Project Leader
• Workbooks: View | Download Image/PDF | Download Summary Data |
View Comments | Add Comments | Filter | Download Full Data |
Share Customized | Web Edit | Save | Download/Save As | Move |
Delete | Set Permissions
• Data Sources: View | Connect | Save | Download Data Source |
Delete | Set Permissions
Tableau Server Permissions
Shared
User Capabilities in Tableau
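When a user belongs to several groups, Tableau resolves each capability by letting Deny take precedence over Allow, and treating a capability that is Unspecified in every applicable rule as effectively denied. A simplified sketch of that evaluation (it deliberately ignores the owner and administrator shortcuts, which always have access):

```python
def effective_capability(rules: list[str]) -> bool:
    """Resolve one capability from all Allow/Deny/Unspecified rules
    that apply to a user via their groups and user-level rules.
    Deny wins over Allow; all-Unspecified means no access."""
    if "Deny" in rules:
        return False          # an explicit Deny anywhere blocks access
    return "Allow" in rules   # otherwise at least one Allow is required
```

This is why the considerations below recommend setting the All Users group to None (Unspecified) rather than Deny — a Deny on All Users would override Allows granted through other groups.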
Key Considerations for Security & Permissions
• What is the minimum site role for Active Directory/LDAP group synchronization?
• Have you set all permissions for the All Users group in the Default project to None?
• Are any explicit restrictions (Deny permissions) needed on the All Users group to
propagate to every user account?
• Have you created groups that correspond to a set of authoring and viewing capabilities
for each project?
• Have you reviewed effective permissions on select users to test your permissions model?
• Have you locked permissions at the parent project to maintain security throughout the
project hierarchy?
• Have service account usernames/passwords been established for Published Data
Sources?
Key Considerations for Content Validation
• Who is involved in the validation process?
• Is the workbook accurate, complete, reliable, relevant, and recent?
• Does the new content replace existing content?
• Are the underlying data and calculations correct?
• Does the workbook reflect corporate branding?
• Does the workbook have a logical layout?
• Are all axes and numbers formatted correctly?
• Do dashboards load within the acceptable performance time?
• Do filters and dashboard actions behave on the targeted views?
• Does the dashboard remain useful in edge case behaviors (filtered to all, none, one value, etc.)?
Content Promotion
Validate → Promote → Certify
• Audience: Limited → Many
• Location: Sandbox Project → Certified Project
• Trust: Questionable → High
Key Considerations for Content Promotion
• Who is involved in the promotion process?
• Do content-promoting roles have a checklist of criteria to evaluate?
• Have you clearly delineated between certified content and ad-hoc content
by projects?
• Is the process agile to support iterations and innovation?
• Do you have workflows to address both direct and restricted sources of
data and workbooks?
Content Certification
Certification
Notes
Certification
Badge
The certification badge and
notes indicate a Certified Data
Source so that users will
know the data can be trusted.
Key Considerations for Content Certification
• Who is responsible for designating certified content?
• Have all criteria for achieving certification status been met?
• Are all fields completed: about, certification notes, tags?
Content Utilization
Key Considerations for Content Utilization
• How much traffic goes to each view?
• What is the definition of stale content? How often is stale content purged?
• How much indirect utilization (alerts & subscriptions) occurs?
• Are subscriptions delivered on time?
• Does the actual audience size match with expectations?
• Does content follow a weekly, monthly, quarterly trend?
• What is the frequency of login or days since last login by user cohort?
• What is the distribution of workbook and data source size?
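One of the questions above — what counts as stale content — needs an operational definition before it can be acted on. A minimal sketch, assuming "not viewed in 90 days" as the threshold (that cutoff, and the function itself, are assumptions for illustration):

```python
from datetime import date, timedelta

def stale_content(last_viewed: dict[str, date], today: date,
                  threshold: timedelta = timedelta(days=90)) -> list[str]:
    """Flag content not viewed within the threshold, sorted by name.
    The 90-day default is one possible definition of 'stale'."""
    return sorted(name for name, seen in last_viewed.items()
                  if today - seen > threshold)

flagged = stale_content(
    {"Q1 Review": date(2023, 1, 15), "Daily Sales": date(2024, 3, 1)},
    today=date(2024, 3, 10))
```

With these dates only "Q1 Review" falls outside the 90-day window and would be a candidate for archiving or purging.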
TRUSTED & GOVERNED
Governance Models
Modern Analytics Workflow
Governance Models
Centralized
• Data access restricted to centralized group
• Content authoring restricted to centralized group
• Many view & interact with content
Delegated
• Data access restricted to trained group
• Published data sources available
• Content based on published data sources, modifiable by some
Self-Governing
• Open data access, with certification process
• Content created by anyone, with certification process
• Content modifiable, with certification process
Governance Models
Centralized
Analytics
CoE Business Lead
Business
Analyst
Business
Analyst
Business Lead
Organization
Centralized
Organization
Analytics COE Business Lead
Business
Analyst
Business
Analyst
Business Lead
Steering
Comm
COE Resource COE Resource
IT
Hybrid
Hybrid
Decentralized
Organization
Business Lead
Business
Analyst
Business
Analyst
Business Lead
Analytics
CoE
Analytics
CoE
De-centralized
Tableau Data Server’s Impact
Shared
Publishing Data Sources
Monitoring Usage & Status
Curating & Sharing Metadata Models
Establishing Processes for Publishing & Certifying
Managing Permissions for Data Access
Designating Certified Data Sources
Establishing Data Standards
How do you connect to Data Server?
In Tableau Server In Tableau Desktop
Published Data Sources
Shared
Hybrid Data Architecture
In-Memory Extract
• Database is too slow for
interactive analytics
• Take load off a transactional
database
• When you need to be offline
Live Query
• Using a fast database
• Require up-to-the-minute data
• Query-dependent scenarios,
such as Initial SQL
Subscriptions
Data-Driven Alerts
Insights-driven
organizations will
grow 7X faster than
global GDP
7X
Source: Insights-Driven Businesses Set The Pace For Global Growth –
Forrester Research, 2018
How we support you
Customer Success
+ Orchestrate the
resources needed to
accelerate adoption
and maximize the value
of your investment
throughout your
journey.
Premium Support
+ Technical advisor
working with your
Center of Operations to
ensure optimal
performance of your
environment.
Enterprise Adoption
Program
+ Team of resources
(Tableau and/or
partners) setting up
Governance,
Mentoring,
Architecture,
Automation, and Data
Strategy.
Training
+ Establish a scalable
enablement program
for your users.
Month 1 | Month 4 | Month 8 | Month 12
Scheduled Touchpoints / Document Value
Kick Off Call
Executive Business Review Executive Business Review
Review priorities and deliver Success Plan
Tableau Day
Data Gathering (Capability Assessment / Elite Deployment Review / Usage stats etc.)
Align Roadmaps
Customer Success
Summit
Sample Engagement Schedule
Strategy for Tableau Journey
Lead Architect
25 days over 9 months
Foundation / CoE & Community
Solution Architect
40 days
Foundation - Tableau Platform
Solution Architect
25 days
Enablement - Onsite Training
Tableau Trainer
15 days
Business Projects
Analytics Consultant
120 days
Enterprise Adoption Consulting Program | 225 Days Onsite
Sample Breakdown of Activities
• Overall ownership & responsibility
• Overall discovery
• Define Visual Analytics strategy
• Develop roadmap to success / adoption at scale
• Plan and coordinate resources
• Ensure customer success
• Ensure know-how transfer
• Primary contact for customer
• Steering committee meetings
• Escalation instance
• Quality assurance & best practices
• Execute on Visual Analytics strategy
• Working on processes and guidelines
• Define governance and compliance processes
• Working on templates
• Best practices assessment and best practices platform
• Define and set up enablement plan and certification program
• Sharing and enablement on processes, guidelines, and best practices
• Documentation
• Support on client's overall data strategy
• Define and work on Tableau data strategy
• Empower the Data Steward role
• Deployment processes and change management
• Define support processes
• Plan and execute on community activities
• Operation and support concept
• Define and configure High Availability architecture
• Set up monitoring processes
• Set up disaster & recovery processes
• Support / implement automation
• Security concepts / implementing security and governance processes
• Enablement and know-how transfer to customer's internal staff
• Tableau Web Authoring
• Tableau Desktop I & II: Accelerated
• Tableau Desktop I: Fundamentals
• Tableau Desktop III: Advanced
• Visual Analytics
• Tableau Server Administration
• Tableau Server Architecture
• Working on business requirements
• Data source analysis
• Alignment according to the central data strategy, processes, and guidelines
• Enablement and know-how transfer to the business users
• Quality assurance and best practices for Visual Analytics & dashboards
• Creation / implementation of dashboards
• Contribution to community activities
• Conduct specific workshops with the business
• Agile and iterative Tableau solution development
• Tableau Doctor activities
Premium Support
Key Benefits:
• 24/7 P1 and P2 coverage
• Support coverage for ALL Tableau deployments
regardless of where they are hosted
• Dedicated Technical Account Manager (TAM) assigned
to your account, with a documented deployment on file
with Tableau’s support team to more quickly diagnose
issues should they occur
• Weekly status calls with the Tableau TAM and your
admin team
• Dedicated Senior Support Engineer team that
exclusively handles Premium Support tickets
• On-site escalation management – if an issue cannot be
resolved through normal support channels, Tableau will
send an on-site resource to resolve the issue at
Tableau’s expense
• Twice annual deployment reviews (at a minimum)
identifying potential issues and opportunities to
improve performance
• Upgrade planning and assistance
• Strategic planning for scale/growth
Next Steps
Define your analytics strategy, build your
team and establish a Success Plan.
10.10.7 Tableau Blueprint Deep Dive Governance (Presentation Deck).pptx
  • 2.
    The data-driven culture stilleludes many Source: New Vantage Partners, 2018
  • 3.
    92% failing toscale analytics Source: McKinsey Analytics, 2018
  • 4.
    What do the8% know that the rest don’t?
  • 5.
  • 7.
    + Define roles,controls and processes to govern data and content + Establish governance models – centralized, delegated and self governing + Evolve over time
  • 8.
    Desktop Browser MobileEmbedded Data Access Deployment ON-PREMISES | CLOUD | HOSTED WINDOWS | LINUX | MAC MULTI-TENANT Security & Compliance Extensibility & APIs Data Prep Governance Content Discovery Analytics Collaboration The Tableau Platform Live | In-memory | Hybrid Connectivity Data Blending| Query Federation | Visual Data Prep | Auto Data Modeling Centralized Data Sources | Certification | Usage Analysis | Permissions Projects | Recommendations | Versioning | Search Visual | Ad-hoc | Advanced | Spatial | Calculations | Statistics Alerting | Subscriptions | Storytelling | Sharing | Discussions Tableau Analytics Platform
  • 9.
  • 10.
    The Benefits ofa Governed Self Service Analytics • Secured and governed access — Provide the right data to the right audience in a secure, governed operational model. • Data quality and reliability — Fact-based decisions are made with good data. • Build trust and confidence — Establish confidence when using trusted data to drive business value. • Consistent user experience — Document processes in an easily- referenceable format – transparent and understandable by all. • Performance and scalability — Reduce the proliferation of data and content through governance processes.
  • 11.
  • 12.
  • 13.
  • 14.
    Tableau Server Structure Server Default Default Site1 Default Site 2... Default Sites Projects
  • 15.
    Tableau Server Structure Server Default Default Site1 Default Site 2... Default Sites (Multi-Tenant) Projects (Content)
  • 16.
    Administrative Roles Site AdminProject Leader Server Admin
  • 17.
  • 18.
  • 19.
    Data Source Management •Provide business users with access to data • Data selection driven by business questions • Operationalized: fit, fresh, and relevant • Comply with organizational policies and procedures
  • 20.
    Data Source Management Connectto Data Yes Build metadata Model Live or Extract? Extract Data Shared? Publish metadata model to Tableau Server No Begin Analysis
  • 21.
    Metadata Management Shared Databases Apps Cloud Big Data Files Sourcesof Data Data Source VizQL Model Connections Connection Attributes Tables Joins Live/Extract Calculations Aliases Default Formatting Comments Filters Aggregations Roles Table Calculations Data Model
  • 22.
    Key Considerations forData Source Management • What are the key sources of data for a department or team? • Who is the Data Steward or owner of the data? • Will you connect live or extract the data? • Should the data source be embedded or published? • Do variants of a dataset exist? If so, can they be consolidated as an authoritative source? • If multiple data sources are consolidated, does the single data source performance or utility suffer by attempting to fulfill too many use cases at once? • What business questions need to be answered by the data source? • What naming conventions are used for Published Data Sources?
  • 23.
    Metadata Management Filtered andsized to the analysis Use business-friendly naming conventions Set data types Create hierarchies Apply formatting (dates, numbers) Set FY start date, if applicable Add new calculations Remove duplicate/test calculations Enter comments Aggregate to highest level Hide unused fields Publish to Tableau Server
  • 24.
    Key Considerations forMetadata Management • What is the process for curating data sources? • Has the data source been sized to the analysis at hand? • What is your organizational standard for naming conventions and field formatting? • Does the metadata model meet all criteria for curation, including user- friendly naming conventions? • Has the metadata checklist been defined, published, and integrated into the validation, promotion, and certification process?
  • 25.
  • 26.
    Key Considerations forData Quality • What processes exist for ensuring accuracy, completeness, reliability, and relevance? • Have you develop a checklist to operationalize the process? • Who needs to review data prior to it becoming shared and trusted? • Is this process adaptable to business users and are they able to partner with data owners to report issues?
  • 27.
  • 28.
    Key Considerations forData Enrichment • Will data enrichment and preparation be centralized or self-service? • What organizational roles perform data enrichment and preparation? • What ETL tools and processes should be used to automate enrichment and/or preparation? • What sources of data provide valuable context when combined with each other? • How complex are the data sources to be combined? • Will users be able to use Tableau Prep Builder? • How will you enable self-service data preparation?
  • 29.
    Direct Access Workflow Data Steward/Analyst (Creator) Site Admin or Project Leader Analyst (Creator or Explorer) Connectto Certified Data Identify sources of data Connect to sources of data Create/Augment prototype Data Model Save/Publish Workbook to Sandbox Project Publish Data Source to Production Data Sources Project Review prototype Data Model Approve? Certify Published Data Source No Yes
  • 30.
    Analyst (Creator /Explorer) Restricted Access Workflow DBA/Data Steward (Creator) Site Admin or Project Leader Connectto Certified Data Identify sources of data Connect to sources of data Create/Augment prototype Data Model Save/Publish Workbook to Sandbox Project Publish Data Source to Production Data Sources Project Review prototype Data Model Approve? Certify Published Data Source No Yes Analyst (Creator or Explorer) Review prototype Data Source Has needed data? Augment prototype Data Model Save/Publish Workbook to Sandbox Project No Yes
  • 31.
  • 32.
  • 33.
  • 34.
    Monitoring and Management •Monitor Data Server & popular data sources • Monitor data source usage • Monitor individual usage • Monitor server health • Create custom views
  • 35.
  • 36.
  • 37.
    Key Considerations forMonitoring & Management • How long do extracts run on server? • Did the refresh succeed or fail? • Are subscriptions delivered on time? • Are there duplicate sources or data? • Are data sources being used? By whom?
  • 38.
  • 39.
  • 40.
    Key Considerations forContent Management • Define standard naming conventions – projects, workbooks, data sources • Organize by sites or projects, then by department, team, etc. • Support for ad-hoc and certified content • Separate areas for data sources and workbooks • Establish visual standards for trusted content • Use descriptions, tags, comments • Keep content fresh and relevant
  • 41.
    Content Management -Project Structure Marketing Events Campaigns Marketing Sandbox Events Campaigns Marketing Data Sources Events Campaigns Parent Nested
  • 42.
    Security & Permissions AuthenticationLOCAL | ACTIVE DIRECTORY | SAML/KERBEROS/OPENID | TRUSTED TICKETS Authorization SITE ROLE | DEFAULT & CUSTOM PERMISSIONS | INHERTITANCE & OVERRIDE Data Security DATABASE USER & SERVICE ACCOUNT | CONTENT PERMISSIONS Network Security CLIENT-SERVER SSL | DATABASE DRIVERS | STRINGENT TRUST MODEL
  • 43.
    Building a PermissionsModel Goal: Provide continuity of user experience across audiences • Define audiences (how is content organized) • Define user types (how are users organized) • Define workflow (how will users interact with content) • Identify restrictions
  • 44.
    Permissions Projects Workbooks DataSources View Save Project Leader View Download Image/PDF Download Summary Data View Comments Add Comments Filter Download Full Data Share Customized Web Edit Save Download/ Save As Move Delete Set Permissions View Connect Save Download Data Source Delete Set Permissions
  • 45.
  • 46.
  • 47.
    Key Considerations forSecurity& Permissions • What is the minimum site role for Active Directory/LDAP group synchronization? • Have you set all permissions for the All Users group in the Default project to None? • Are any explicit restrictions (Deny permissions) needed on the All Users group to propagate to every user account? • Have you created groups that correspond to a set of authoring and viewing capabilities for each project? • Have you reviewed effective permissions on select users to test your permissions model? • Have you locked permissions at the parent project to maintain security throughout the project hierarchy? • Have service account usernames/passwords been established for Published Data Sources?
  • 48.
    Key Considerations forContent Validation • Who is involved in the validation process? • Is the workbook accurate, complete, reliable, relevant, and recent? • Does the new content replace existing content? • Are the underlying data and calculations correct? • Does the workbook reflect corporate branding? • Does the workbook have a logical layout? • Are all axes and numbers formatted correctly? • Do dashboards load within the acceptable performance time? • Do filters and dashboard actions behave on the targeted views? • Does the dashboard remain useful in edge case behaviors (filtered to all, none, one value, etc.)?
  • 49.
    Content Promotion Validate PromoteCertify Audience Location Limited Sandbox Project Many Certified Project Trust Questionable High
  • 50.
    Key Considerations forContent Promotion • Who is involved in the promotion process? • Do content-promoting roles have a checklist of criteria to evaluate? • Have you clearly delineated between certified content and ad-hoc content by projects? • Is the process agile to support iterations and innovation? • Do you have workflows to address both direct and restricted sources of data and workbooks?
  • 51.
    Content Certification Certification Notes Certification Badge The certificationbadge and notes indicate a Certified Data Source so that users will know the data can be trusted.
  • 52.
    Key Considerations forContent Certification • Who is responsible for designating certified content? • Have all criteria for achieving certification status been met? • Are all fields completed: about, certification notes, tags?
  • 53.
  • 54.
    Key Considerations forContent Utilization • How much traffic goes to each view? • What is the definition of stale content? How often is stale content purged? • How much indirect utilization (alerts & subscriptions) occurs? • Are subscriptions delivered on time? • Does the actual audience size match with expectations? • Does content follow a weekly, monthly, quarterly trend? • What is the frequency of login or days since last login by user cohort? • What is the distribution of workbook and data source size?
  • 55.
  • 56.
  • 57.
    Governance Models Centralized Data accessrestricted to centralized group Content authoring restricted to centralized group Many view & interact with content Delegated Data access restricted to trained group Published data sources available Content based on published data sources, modifiable by some Self-Governing Open data access, with certification process Content created by anyone, with certification process Content modifiable, with certification process
  • 58.
    Governance Models Centralized Analytics CoE BusinessLead Business Analyst Business Analyst Business Lead Organization Centralized Organization Analytics COE Business Lead Business Analyst Business Analyst Business Lead Steering Comm COE Resource COE Resource IT Hybrid Hybrid Decentralized Organization Business Lead Business Analyst Business Analyst Business Lead Analytics CoE Analytics CoE De-centralized
  • 59.
    Tableau Data Server’sImpact Shared Publishing Data Sources Monitoring Usage & Status Curating & Sharing Metadata Models Establishing Processes for Publishing & Certifying Managing Permissions for Data Access Designating Certified Data Sources Establishing Data Standards
  • 60.
    How do youconnect to Data Server? In Tableau Server In Tableau Desktop
  • 61.
  • 62.
    Hybrid Data Architecture In-MemoryExtract • Database is too slow for interactive analytics • Take load off a transactional database • When you need to be offline Live Query • Using a fast database • Require up-to-the-minute data • Query-dependent scenarios, such as Initial SQL
  • 63.
  • 64.
  • 67.
    Insights-driven organizations will grow 7Xfaster than global GDP 7X Source: Insights - Driven Businesses Set The Pace For Global Growth – Forrester Research, 2018
  • 69.
    How we supportyou Customer Success + Orchestrate the resources needed to accelerate adoption and maximize the value of your investment throughout your journey. Premium Support + Technical advisor working with your Center of Operations to ensure optimal performance of your environment. Enterprise Adoption Program + Team of resources (Tableau and/or partners) setting up Governance, Mentoring, Architecture, Automation, and Data Strategy. Training + Establish a scalable enablement program for your users.
  • 70.
    Month 1 Month4 Month 8 Month 12 Scheduled Touchpoints / Document Value Kick Off Call Executive Business Review Executive Business Review Review priorities and deliver Success Plan Tableau Day Data Gathering (Capability Assessment / Elite Deployment Review / Usage stats etc.) Align Roadmaps Customer Success Summit Sample Engagement Schedule
  • 71.
    Enterprise Adoption Consulting Program | 225 Days Onsite
    - Strategy for Tableau: Journey Lead Architect, 25 days over 9 months
    - Foundation / CoE & Community: Solution Architect, 40 days
    - Foundation - Tableau Platform: Solution Architect, 25 days
    - Enablement - Onsite Training: Tableau Trainer, 15 days
    - Business Projects: Analytics Consultant, 120 days
    Sample breakdown of activities:
    - Overall ownership & responsibility
    - Overall discovery
    - Define Visual Analytics strategy
    - Develop roadmap to success / adoption at scale
    - Plan and coordinate resources
    - Ensure customer success
    - Ensure know-how transfer
    - Primary contact for customer
    - Steering committee meetings
    - Escalation instance
    - Quality assurance & best practices
    - Execute on Visual Analytics strategy
    - Work on processes and guidelines
    - Define governance and compliance processes
    - Work on templates
    - Best-practices assessment and best-practices platform
    - Define and set up enablement plan and certification program
    - Sharing and enablement on processes, guidelines, and best practices
    - Documentation
    - Support the client's overall data strategy
    - Define and work on Tableau data strategy
    - Empower the Data Steward role
    - Deployment processes and change management
    - Define support processes
    - Plan and execute community activities
    - Operation and support concept
    - Define and configure High Availability architecture
    - Set up monitoring processes
    - Set up disaster & recovery processes
    - Support / implement automation
    - Security concepts / implement security and governance processes
    - Enablement and know-how transfer to the customer's internal staff: Tableau Web Authoring; Tableau Desktop I & II: Accelerated; Tableau Desktop I: Fundamentals; Tableau Desktop III: Advanced; Visual Analytics; Tableau Server Administration; Tableau Server Architecture
    - Work on business requirements
    - Data source analysis
    - Alignment with the central data strategy, processes, and guidelines
    - Enablement and know-how transfer to the business users
    - Quality assurance and best practices for Visual Analytics & dashboards
    - Creation / implementation of dashboards
    - Contribution to community activities
    - Conduct specific workshops with the business
    - Agile and iterative Tableau solution development
    - Tableau Doctor activities
  • 72.
    Premium Support Key Benefits:
    - 24/7 P1 and P2 coverage
    - Support coverage for ALL Tableau deployments regardless of where they are hosted
    - Dedicated Technical Account Manager (TAM) assigned to your account, with a documented deployment on file with Tableau's support team to more quickly diagnose issues should they occur
    - Weekly status calls with the Tableau TAM and your admin team
    - Dedicated Senior Support Engineer team that exclusively handles Premium Support tickets
    - On-site escalation management: if an issue cannot be resolved through normal support channels, Tableau will send an on-site resource to resolve the issue at Tableau's expense
    - Twice-annual deployment reviews (at a minimum) identifying potential issues and opportunities to improve performance
    - Upgrade planning and assistance
    - Strategic planning for scale/growth
  • 73.
    Next Steps: Define your analytics strategy, build your team, and establish a Success Plan.

Editor's Notes

  • #2 The importance of data to the modern enterprise is no longer a topic of debate. The sheer volume of data that organizations capture, store, and organize continues to grow at a staggering pace. Suddenly, every company is a data company. But while execs were nearly unanimous that they were working towards developing a data-driven culture, very few have been successful in building one.
  • #3 A study was recently run by McKinsey. They analyzed 1,000 companies with more than $1 billion in revenue, spanning 13 sectors and 12 geographies. Of these, 92% were failing to achieve the elusive goal of analytics at scale. Source: https://www.mckinsey.com/business-functions/mckinsey-analytics/our-insights/breaking-away-the-secrets-to-scaling-analytics
  • #4 So that means only about 8% of organizations are doing this right and have succeeded in building a data culture. What is it that they know that the rest of us don't? First, they know that realizing the full value of your data means empowering everyone to make better decisions with it, and this cannot be done simply by choosing the right technology. It's important to remember that you are not just deploying software; you are driving organizational transformation by prioritizing facts over intuition, with data at the center of every conversation. Your decisions should not only give people the right tools, but must also enable the development of new skills, create new behaviors, encourage participation, and recognize achievements to alter how the organization uses data every day. So how do you do that?
  • #5 At the heart of every successful data culture we have found 3 core capabilities. A capability around agility – specifically when it comes to the agility of the environment itself. A capability around data and analytics proficiency within the user base. And a capability of community that is at the heart of building and driving that culture change. These three capabilities must be built on trusted and governed data that is the lifeblood of any successful data company.
  • #6 Tableau Blueprint is a step by step guide that will help you build these capabilities. It provides concrete plans, recommendations, and guidelines across critical foundational work and three primary workstreams that will turn repeatable processes into core capabilities. Let’s take a closer look at the different pieces of Blueprint.
  • #7 In order to do this, you need to define the controls, roles, and repeatable processes to make the appropriate data and content available to the right people across the organization. It's also important to think about your data and your content, and put some guidelines in place to determine what you want centralized and locked down, and what can be more open and self-governing. Understanding how these governance principles can be applied to different data or content will help you determine what your users need to know, and what access and capabilities they have at their disposal. And what's important to remember about governance is that it needs to be revisited often. Your business needs are changing, the data and content in your environment are changing, and your users are changing; your governance model needs to evolve with those changes. Some of our most successful customers have monthly meetings with cross-functional governance teams designed specifically to ensure that the governance in place is continuing to drive adoption and safe usage of data.
  • #8 Starting with our platform: we understand that data is a mission-critical asset to your organization. That is why we have built the Tableau platform to ensure you have everything you need to empower everyone in your organization with data, and to do so without putting your data or your organization at risk. A combination of content and data governance features work together to keep your data in the right hands, while also ensuring that users can find the right data and make decisions based on accurate, up-to-date, and trusted data. And we surround these capabilities with a combination of security protocols and reliability and scalability features to meet the needs of your business. As for the Modern Analytics Workflow...
  • #10 From soft guidelines to firm boundaries for the usage of Tableau, organizations need to design their own governance processes, complete with defined roles and responsibilities that everyone understands. When IT becomes the business enabler, informed users will have confidence in the data and will reap the benefits of governed data access.
  • #11 At a high level, these are the two pillars of Governance I mentioned in our Agenda: Data Governance (data security, data quality, metadata management) and Content Governance (content promotion and certification, permissions, and content utilization), just to name a few areas. The Tableau Governance Framework encompasses the people who understand and comply with the established controls, the data and content management processes, and Tableau's supporting capabilities. Combining people, process, and technology, it creates a foundation for accountability and enables, rather than restricts, access to secure and trusted data for all skill levels in your organization. Acknowledging that varying degrees of governance are required, these principles can be right-sized and applied to any kind of data regardless of where it falls in the governance spectrum, which we'll examine in more detail later. Before we dive in, I want to discuss some of the structure and roles that will help support the framework.
  • #12 Let's start with the Tableau Server Structure.
  • #13 First we have Creators. Creators are the power users in your organization who understand how to leverage Tableau to create and publish data sources and to author dashboards intended for use by others. They typically have rights to access your company's data sources directly so they can create the metadata layer for consumption by Explorers and Viewers. These users get access to all Tableau products, including Tableau Prep. Creators are likely to be involved with all steps of the Modern Analytics Workflow.
    Explorers are likely a larger percentage of your overall user base. These are often the curious people in your organization who want to be more informed about their own business: folks with a lot of questions, often changing by the minute, who need answers. Explorers are empowered to access published content (dashboards, views, data sources); they can interact with views, or they can edit and explore further to answer any questions they still have. If you're in IT, these are the folks who email you a new request every day. Explorers are typically involved with accessing, interacting, exploring, and potentially sharing as part of the Modern Analytics Workflow.
    Viewers are your more casual consumers of data. They may need access to a handful of reports but can typically get the data they need through filtering and/or drilling and be on their way. This is one group that I see customers initially overestimate and often wind up recalculating as they realize the users have a greater thirst for insights than initially thought. Viewers are likely limited to accessing and interacting with existing content.
    Now that you're familiar with the framework and how your users fit into it, let's dive into the two pillars, Data and Content Governance, starting with Data.
  • #14 When you install Tableau Server, you will find a Default site and a Default project, which is highlighted here. A site is a collection of users, groups, and content (workbooks, data sources) that's walled off from any other groups and content on the same instance of Tableau Server. A project is a container used for organizing content (workbooks, views, and data sources); specific permissions define the capabilities a user or group has within it. And yes, you can have nested projects within a project. The Default project serves as the permissions template for all new projects on the site.
  • #15 Tableau Server supports multi-tenancy by allowing administrators to create sites on the server for collections of users and content. If you are not part of a site's users, you will not even know of its existence. This hierarchical structure allows for scoped or delegated administration at the server, site, or project level, as well as granular control of permissions. Now that you have a visual reference for how Tableau Server can be structured, let's look at the administrative and user roles.
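    The Server > Sites > Projects hierarchy above can be sketched as a small data model. This is a hedged, purely illustrative sketch (plain Python objects, not Tableau's API); the "Finance" site, project names, and the permissions dictionary are hypothetical.

    ```python
    # Illustrative model of Tableau Server's hierarchy: a Server holds Sites,
    # each Site walls off its own Projects, and every new site starts with a
    # Default project whose permissions act as the template for new projects.
    # Names and the permissions shape here are hypothetical, for illustration.

    class Project:
        def __init__(self, name, permissions=None, parent=None):
            self.name = name
            self.parent = parent
            self.permissions = dict(permissions or {})  # copied, not shared
            self.children = []

        def add_project(self, name):
            # Nested projects inherit this project's permissions as a template.
            child = Project(name, self.permissions, parent=self)
            self.children.append(child)
            return child

    class Site:
        """A site's users, groups, and content are invisible to other sites."""
        def __init__(self, name):
            self.name = name
            # Default project = permissions template for new top-level projects.
            self.default = Project("Default", {"Explorers": "View"})
            self.projects = [self.default]

        def add_project(self, name):
            p = Project(name, self.default.permissions)
            self.projects.append(p)
            return p

    class Server:
        def __init__(self):
            self.sites = [Site("Default")]  # fresh install ships one site

        def add_site(self, name):
            s = Site(name)
            self.sites.append(s)
            return s

    server = Server()
    finance = server.add_site("Finance")         # multi-tenant: a second site
    prod = finance.add_project("Production")     # picks up the Default template
    nested = prod.add_project("Certified Data")  # nested project, same template
    ```

    The key design point mirrored here is that permissions are copied from the template rather than shared, so changing one project later does not silently change its siblings.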
  • #16 *Administrative not Administrator* The Server Admin should be responsible for the health of the Server itself: ensuring high availability of access and scaling hardware for performance. They will handle things like upgrades and tweaking server parameters (e.g. max query times), and will oversee things like custom schedule creation for tasks like extract refreshes and subscriptions. The Server Admin has permissions to all content on all sites.
    The Site Admin is responsible for the daily operational admin tasks required for a particular department or geography. This takes a lot off the plate of IT and helps with agility. These individuals are typically more in-the-know than IT about who should be able to see what, so they are often better equipped to manage users, groups, and their associated data and content permissions. Site Admins have permission to all content on their site(s).
    The Project Leader is responsible for handling some of the governance and permissions for a particular project and potentially the projects nested within it. They are able to assist with things like granting permissions but do not have some of the rights the admins have (e.g. adding new users).
    By delegating site and project governance to trusted members of your organization, your Server Admins can focus on high-priority tasks like performance improvements and product upgrades, rather than getting tied up with operational tasks that the business is more than equipped to handle. This usually just requires some additional level of trust in your business to help self-govern. Let's now look at some user roles, which align to our new role-based subscriptions.
  • #18 For most IT organizations, ensuring that the right data is available to the right people in the organization, at the time they need it, is critical. Often, if the business isn't empowered with data, or if there are too many restrictions and hurdles required to access it, users go the route of locally saving sensitive data to perform a quick analysis, putting your company at risk and blinding IT to how data is being used. In a self-service environment, the role of Data Governance is to enable access rather than restrict it, and to let users get the answers they need in a controlled environment, giving IT visibility and ensuring data compliance. Let's break this down, starting with Data Source Management.
  • #19 Data Source Management (DSM) is all about providing easy access to sources of data in compliance with your own organizational data strategy, policies, and procedures. DSM enables your users to choose from certified data sources, data sources that have been vetted for accuracy and kept fresh with your most recent data, picking the most relevant source based on the questions they need to answer. The questions change like the weather, so unless you have a crystal ball, canned reporting won't be enough. What might this look like?
  • #20 Here's a very simplified flow: first you connect to your data (Tableau has over 65 native connectors), which gives you lots of flexibility, a common theme with Tableau and a major reason why we've had so much success in scaling to the enterprise. Next, connect live or extract your data and leverage our Hyper data engine. Up to you; make the most use of your existing investments. You can use Tableau Desktop and/or Tableau Prep to help build your metadata model (often referred to as the semantic layer by our predecessors, but that makes it sound overcomplicated, which it's not). Once you have the metadata model, it's time to analyze in Desktop or share to your server for yourself and others to analyze. That's it.
  • #21 Tableau created a simple, elegant, and powerful metadata system that gives users flexibility while allowing for enterprise metadata management. A metadata model can be embedded in a workbook or centrally managed as a Published Data Source with Data Server. Data Source — The Data Source has one or more live and/or extract connections and attributes for the database, the tables, views and columns to use, and joins or custom SQL used to access the data. Data Model — Upon connection, Tableau automatically characterizes fields as Dimensions or Measures. In addition, the Data Model stores calculations, aliases, and formatting. VizQL Model — The VizQL Model enables users to adjust the role and aggregation of the fields at run time. This enables one user to define the base Data Source and Data Model as a collection of fields without needing to know, plan, or otherwise account for all the variations of analysis to be performed with the Data Source by other users.
  • #23 ***be prescriptive on slides like this to suggest this is what we see working*** The Tableau Platform helps you autonomously define, update, and expose metadata. 1. Filtered and sized to the analysis: if your dataset has 20 years of global data and you're only interested in US data for the last 5 years, our extracts can help you reduce your data greatly. *roll through them* And provide commentary so end users can just hover over a field for an explanation. And finally, Management and Monitoring.
  • #25 Data quality is an assessment of accuracy, completeness, reliability, and relevance for decision making, which covers processes you already employ today. Let's face it, data is rarely perfect in its raw state. The visualizations generated in Tableau, even in the data ingestion and prep stage now with Tableau Prep, help you identify, clean up, and ultimately enrich your data. On your screen now, you should see Tableau Prep comparing the number of records for Orders and Returns from each table, joined by Order ID. I can instantly see, based on the red font, that there are a lot of Orders that do not have a corresponding Return. Probably a good indicator for this company. As you check your data quality, you may find a need to enrich your data to drive better decision making.
  • #27 This might include steps like cleaning your data (maybe you need to remove the dashes from one data source) or marrying in additional data sets (maybe you have some targets in an Access database). Tableau Prep now makes it easy to visually create ETL processes and workflows from multiple sources of data to make data ready for analysis, reducing the amount of time you spend prepping your data and allowing you more time to actually explore and analyze it. But we mustn't forget about security. Our data is precious and we need to treat it as such.
  • #29 Assumption: the Data Steward/Analyst has direct access to sources of data.
    1. Appropriate sources of data are identified to answer business questions.
    2. The Data Steward/Analyst connects to the sources of data, creates a prototype data model, and saves or publishes the workbook to Tableau Server.
    3. The Site Admin or Project Leader reviews the data model to ensure it complies with data standards and approves it.
    4. The Data Steward/Analyst publishes the data source to the Production Data Sources project.
    5. The Site Admin or Project Leader certifies the data source.
    6. Analysts connect to the Certified Data Source to create new content based on trusted data.
  • #30 Assumption: the Analyst does not have direct access to sources of data and relies on the DBA or Data Steward for connecting to them. The DBA or Data Steward is a Project Leader of the Data Sources - [Department] project.
    1. Appropriate sources of data are identified to answer business questions.
    2. The DBA/Data Steward connects to the sources of data, creates a prototype data model, and saves or publishes the workbook to Tableau Server.
    3. The Analyst reviews the prototype data source and augments the prototype data model with new calculations, if necessary.
    4. The Site Admin or Project Leader reviews the data model to ensure it complies with data standards and approves it.
    5. The Data Steward/Analyst publishes the data source to the Production Data Sources project and certifies it.
    6. Analysts connect to the Certified Data Source to create new content based on trusted data.
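    The two workflows above are really a promotion pipeline with role checks at each step. Here is a minimal sketch of that idea as a state machine; the state names, transitions, and role sets are hypothetical simplifications for illustration, not a Tableau feature.

    ```python
    # Sketch of the content promotion workflow: a data source moves
    # prototype -> approved -> published -> certified, and each transition
    # is gated on who is allowed to perform it. States/roles are illustrative.

    ALLOWED = {
        # (current state, action): (next state, roles allowed to act)
        ("prototype", "approve"): ("approved",  {"Site Admin", "Project Leader"}),
        ("approved",  "publish"): ("published", {"Data Steward", "Analyst"}),
        ("published", "certify"): ("certified", {"Site Admin", "Project Leader"}),
    }

    def advance(state, action, role):
        """Advance a data source through the governed promotion pipeline."""
        nxt, roles = ALLOWED.get((state, action), (None, set()))
        if nxt is None:
            raise ValueError(f"cannot {action!r} from state {state!r}")
        if role not in roles:
            raise PermissionError(f"{role} may not {action}")
        return nxt

    state = "prototype"
    state = advance(state, "approve", "Project Leader")
    state = advance(state, "publish", "Data Steward")
    state = advance(state, "certify", "Site Admin")
    ```

    Encoding the workflow as data (the `ALLOWED` table) rather than branching logic makes it easy to adjust when your governance model evolves, which, as noted earlier, it should.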
  • #31 In a self-service environment with multiple publishers, it’s common for a project on Tableau Server to contain a variety of content. Published Data Sources should contain notes in the details tab to help users understand key information such as description, intended audience, business use case, connection type, database, owner, and more info. When this is the case, business users can be confident in knowing which data source is the right one to connect to.
  • #32 Tableau Server and Tableau Online let you search content in a variety of ways. The quick search field at the top of the page searches the entire site for matching resources of any kind. Filtered search finds matches using search criteria that are specific to each resource type, such as modified date for workbooks, and connection type for data sources. You can also find workbooks or views that you've marked as favorites. Shown here, quick search looks for matching text in resource names, owners, tags, captions, comments, and other information across the site. As you type, the top items that match your search text appear, ordered by relevance. Relevance is based on number of page views, recent activity, and your favorites. For data sources, the number of connected workbooks and views is also considered. In the search results, you'll see the total number of page views and favorites for a resource. Hover over an item to see a tooltip that tells you more about it.
  • #33 Tableau Server has a recommendations engine that provides a list of recommended data sources in Tableau Desktop. It improves discoverability of relevant data and accelerates users through data preparation so that they can focus on their analysis. With recommended data sources, users are presented with relevant published data sources in the flow of their data preparation, saving time and encouraging proper data source reuse when signed into Tableau Server. The benefits: leverage existing expertise in your organization, guided query building, and less need for constant data model changes.
  • #34 Data is critical to operating Tableau at scale. In our last session, we saw how Monitoring & Management covers the end-to-end lifecycle management from raw to curated data. From Tableau’s repository data, you can: Monitor Data Server & popular data sources Monitor data source usage Monitor individual usage Monitor server health Create custom views
  • #35 With Tableau Server’s administrative views, you can monitor usage and status of Published Data Sources. The Traffic to Data Sources view gives you the ability to see usage of data sources on your Tableau Server installation. This can help you determine which data sources are most heavily used and those that are less often used. You can filter the information you see by selecting the data source, the action taken on that data source, and the time range. Server administrators can specify the site. Site administrators view only their site’s usage data.
  • #36 You can also monitor data extracts and their success or failure in Administrative Views. In the Background Tasks for Extracts administrative view, a table lists the extracts that ran in the time period specified on the timeline. You will see a status of success or error, letting you investigate further.
  • #37 Monitor broad usage patterns across organizational business units. Monitor and audit usage of published content and track usage of untrusted content. All of this is made visible to you through our Postgres repository database. We provide some pre-authored default admin views, and if you have Server (not Online) you can create your own custom views; I've yet to meet an admin who hasn't. This is your looking glass into how your business uses your data: what data do they use, what data do they not use? Unbelievably valuable data. As we wrap up the Data Governance pillar, here's a look at what those canned views may look like.
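    The admin views above are built from Tableau's repository data. As a hedged local sketch of the same two questions (what is heavily used, what has gone stale), here is a pure-Python simulation over a hypothetical list of access events; the event shape and data source names are invented for illustration and are not the repository schema.

    ```python
    # Local sketch of usage monitoring: rank data sources by traffic and flag
    # stale ones that are candidates for sunsetting. The events list stands in
    # for repository data you would normally query from Tableau's Postgres DB.

    from collections import Counter
    from datetime import date, timedelta

    events = [  # (data source name, access date) -- hypothetical sample
        ("Sales DS",   date(2019, 5, 1)),
        ("Sales DS",   date(2019, 5, 2)),
        ("Finance DS", date(2019, 1, 15)),
    ]

    def traffic(events):
        """Most-accessed data sources first, with their access counts."""
        return Counter(ds for ds, _ in events).most_common()

    def stale(events, today, max_age_days=90):
        """Data sources not accessed within max_age_days of `today`."""
        last_seen = {}
        for ds, d in events:
            last_seen[ds] = max(last_seen.get(ds, d), d)
        return [ds for ds, d in last_seen.items()
                if (today - d) > timedelta(days=max_age_days)]

    ranked = traffic(events)
    dormant = stale(events, today=date(2019, 5, 10))
    ```

    In practice you would feed this kind of logic from the repository (or from the downloadable admin view data) rather than a hard-coded list, but the analysis itself is this simple.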
  • #38 As the use of analytics increases, a growing number of mission-critical business decisions will become data driven. The net effect is an increase not only in content volume but also in the varying skill levels among the users who will be collaborating and uncovering valuable insights. With more and more people using data daily, it is critical that Tableau content can be secured, governed, and trusted, as well as organized so that people can discover, consume, and create content with confidence.
  • #39 If you recall, this was introduced in our last session, “Evaluating a Modern BI Platform”. Be sure to catch it on-demand if you are joining this webinar series for the first time today. At a high level, these are the two pillars of Governance I mentioned in our Agenda: Data Governance (data security, data quality, metadata management) and Content Governance (content promotion and certification, permissions, and content utilization), just to name a few areas. The Tableau Governance Framework encompasses the people who understand and comply with the established controls, the data and content management processes, and Tableau's supporting capabilities. Combining people, process, and technology, it creates a foundation for accountability and enables, rather than restricts, access to secure and trusted data for all skill levels in your organization. Acknowledging that varying degrees of governance are required, these principles can be right-sized and applied to any kind of data regardless of where it falls in the governance spectrum, which we'll examine in more detail later. Before we dive in, I want to discuss some of the structure and roles that will help support the framework.
  • #40 Content management starts with things like standardizing naming conventions and organizing your content using Tableau's sites and projects. This might be done by department or possibly by geography. A good content management strategy should support both certified and ad-hoc content, which can be easily distinguished. You will likely want to keep your data sources separate from your dashboards to keep things neatly organized. For your certified, or trusted, or blessed content, visual best practices should be established; maybe this means leveraging corporate colors, avoiding red/green color combinations, or limiting dashboards to X number of views. Descriptions and tags can be leveraged to help users find the right content. And content must remain fresh and relevant; this includes keeping data refreshed and sunsetting stale content. How does Tableau help with content management?
  • #41 Our project structure can easily be set up to maintain separate hierarchies for sandbox (experimental, test) projects and your production (certified, promoted) projects. This simplifies inherent content permissions for Project Leaders or Site Administrators managing the content, and also reduces the workload for your Server/IT admins. You don't need three separate environments to promote content; that model is costly and does not fit the modern analytics workflow. Tableau Server is engineered to be quite resilient: there are checks and balances designed to ensure that, quote/unquote, "one query can't bring down the whole house". Regarding security and permissions...
  • #42 It's all about two questions: 1. Authentication, should you even be here? And 2. Authorization, if you should be here, what are you allowed to see and do? I spoke to the former during the Data Governance section, since there's some overlap, so hopefully this slide will serve as a refresher. Regarding authorization, our governance framework on Tableau Server offers permissioning at the site level, the project level, and the view level, for individual users and/or groups. There are default permission templates, but you also have the freedom to explicitly allow or deny users many granular capabilities, like the ability to download, edit, or save content. And permissions can be locked down for certain content (e.g. certified content), while sandbox content may have more flexibility to override permissions. The permissions model will likely look different for each company; however, there are some questions you can ask yourselves to help define the model for your organization.
  • #43 How do you want to organize your content? This all starts with defining your audience, and may be best answered by asking how you organize your users and user types. Once you understand your audience, you'll then want to explore how you expect users to interact with the content. Will a static view be enough? Is filtering enough? Or will your users want to explore the WHY rather than simply relying on the WHAT? My guess is the latter. Then you can start to identify restrictions: e.g. this group of power users can be trusted with publishing content, whereas this group of field reps simply needs the ability to see the what and ask the why to run their own part of the business.
  • #44 In Tableau Server, you set content permissions to specify who is allowed to work with which content resources on a particular site and what they are able to do with that content.  For example, you can tightly restrict who has access to your company’s financial information, but widely share organizational development content. These permissions can be applied to Projects, Workbooks, Data Sources, each of which having a different subset of permissions to grant or deny.
  • #45 Tableau Server permissions for data sources, which are detailed on the right, allow for granular control of what capabilities are allowed for groups or users. In the example, Finance - Data Source Publishers group can view, connect, save, download, delete, and set permissions on data sources. This group has a Publisher permissions rule on the project so they can publish or save to the project. Members of the group include DBAs and/or Data Stewards. Finance - Workbook Publishers group and Finance - Workbook Users + Web Edit groups can view and connect to data sources. Members of the groups include authors and consumers. The groups have Viewer permissions rule on the project so they can view its contents. Permissions are locked to the project for workbooks and data sources, which means they cannot be modified.
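    A user can belong to several groups with different rules, so the effective capability has to be resolved. A key property of Tableau's model is that an explicit Deny takes precedence over Allow. Here is a hedged sketch of that resolution; the merge logic is a simplification (real evaluation also involves site roles, ownership, and content hierarchy), and the group names mirror the hypothetical Finance groups above.

    ```python
    # Sketch of allow/deny resolution across group permission rules.
    # Simplification: an explicit "Deny" always beats "Allow", regardless of
    # which group rule is processed first; capabilities with no rule are absent.

    def effective(rules):
        """rules: one dict per group, e.g. {"Download": "Allow"}.
        Returns {capability: bool} for the capabilities that have any rule."""
        merged = {}
        for rule in rules:
            for capability, decision in rule.items():
                if merged.get(capability) == "Deny":
                    continue  # Deny already recorded; nothing can override it
                merged[capability] = decision
        return {c: d == "Allow" for c, d in merged.items()}

    # Hypothetical groups echoing the Finance example on the slide:
    publishers = {"View": "Allow", "Connect": "Allow", "Download": "Allow"}
    auditors   = {"Download": "Deny"}

    perms = effective([publishers, auditors])
    # A user in both groups can view and connect, but Download resolves to
    # denied because the explicit Deny wins over the Allow.
    ```

    Resolving Deny-over-Allow regardless of rule order is what makes "lock down certified content" safe: adding a user to a broader group can never quietly re-grant a capability that governance explicitly denied.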
  • #46 Along with the granular controls, Tableau provides capabilities to control content through site roles, as well as Tableau's new role-based licenses, which we talked about early on. Next, you can create permission templates as well as rules, like locking down permissions for certain projects, and rules applied to the owners of sites, projects, or specific content. These multiple levels of content authorization help empower your users while ensuring the right controls are in place. Now let's take a look at Content Promotion.
  • #49 IT: define a process for validating that content is correct. IT & Users: use platform capabilities to assist with validation and accuracy verification of user-generated analytic content. IT: define a process for promoting content. Users: promote validated analytic content to the centralized, trusted environment as determined by the governance process. IT: define a process for certifying content; certify content as trusted and delineate it from untrusted content in the same environment. Then, how do we monitor utilization and ensure that content remains relevant?
  • #51 In Tableau Server 10.4 and later, certifications and recommendations will help you make data sources discoverable and improve your ability to govern enterprise analytics effectively in Tableau. Both features help reduce the proliferation of redundant data models and save analysts time when trying to find good data that they can trust. After publishing a data source to Tableau Server, define a certification process so users will know the data can be trusted. Certified Data Sources get preferential treatment in Tableau Server search results and in the smart data source recommendations algorithm so that they are discoverable and easily reusable. Certification notes allow you to describe why a particular data source can be trusted. These notes are accessible throughout Tableau when viewing this data source.
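Certification can also be applied programmatically as part of the promotion process. A minimal sketch, assuming the Tableau REST API "Update Data Source" call, which accepts `isCertified` and `certificationNote` attributes on the datasource element; the note text and how you obtain the data source ID are hypothetical.

```python
import xml.etree.ElementTree as ET


def certify_payload(note):
    """Build the tsRequest body that marks a Published Data Source
    as certified, with a note explaining why it can be trusted."""
    ts = ET.Element("tsRequest")
    ET.SubElement(ts, "datasource", isCertified="true", certificationNote=note)
    return ET.tostring(ts, encoding="unicode")


# Example note, surfaced wherever the data source appears in Tableau.
body = certify_payload("Validated against the finance warehouse by the BI team.")
```

You would send this body in a PUT to the site's datasources endpoint for the specific data source ID; the certification badge and note then show up in search results and the connection dialogs.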
• #53 So we've talked a lot about what the Tableau Governance Framework is and how it fits into a Modern Analytics Workflow. We've talked about the Tableau Server structure and a few of the key roles. We dove into the two pillars of our framework, data and content governance. And I've shown you some of the Tableau features that make this all possible on a single platform. Now let's look at 3 common <click> governance models.
• #54 Again, the Postgres metadata holds these answers, as it did for data governance monitoring. Here, the questions focus on content. Some of these are covered by the admin views which ship with your Server, while others can be answered through custom queries of your repository (again, not currently available for Online customers). Here's a quick peek at what that might look like.
• #55 Three common governance models, starting with a quick refresher on the Modern Analytics Workflow.
• #56 Reminding them of the Modern Analytics Workflow. When designing your governance model, it's critical to keep your vision for Modern Analytics in mind. IT enables the modern analytics workflow, but it is primarily driven by business users and analysts throughout the organization. It's important to remember it is not all or nothing, nor is it ever done. And as we discussed earlier, no one merely flips the switch and says, "as of today, we are a self-service org." So what are the models we see?
• #57 Let's look at the 3 most common models: Centralized, Delegated, and Self-Governing.
<click> Centralized: a team produces data sources and dashboards for business consumption, due to the complexity or sensitivity of the data. All data access is restricted to the centralized group; content authoring is restricted to the centralized group; content is consumed by many. This is the "traditional approach," and in most cases it cannot keep up with the demand of the business.
<click> Delegated: increasing IT-business collaboration; introduction of Data Steward and Site Admin roles as intermediaries between IT and the business (taking on increasing responsibility from IT). Direct database access is restricted to a trained group of users; published data sources are available; content is based on published data sources and modifiable by some.
<click> Self-Governing: strong IT-business trust and collaboration; well-defined roles; use of certified content; a well-defined process for validation, promotion, and certification. Open data access, content created by anyone, and content modifiable by all, each with a certification process.
There are various approaches to governance, and every organization will fall at a different point on the spectrum, ranging from an IT-led, highly governed and controlled environment to one with much lighter controls, if any (full transparency), with many organizations landing somewhere in between. Oftentimes, even within one organization, the governance requirements may vary depending on the user needs within each area as well as the data itself. Just 2 more slides.
• #58 Centralized: a centralized COE exists as another functional area, responsible for setting the analytics strategy. It is easier to stand up as a direct extension of corporate strategy, but the COE may not have adequate representation from functional areas and the field, and it is harder to embed analytics into individual business units or functional groups. Care needs to be taken if one centralized team is responsible for all areas of the COE: resources shouldn't all focus on one area at the expense of the others (e.g., technical issues only vs. enabling users).
Hybrid: the COE acts as a steward of best practices and governance of the analytics vision (the best of both). It is the most effective way to extend corporate strategy, but it requires strong processes and governance, and business unit participation and champions are critical to success. It allows for self-sufficiency across business lines with the ability to tap into the COE for guidance and support. Accept and plan for some groups needing flexibility against best practices or standards due to business or cultural needs.
De-centralized: each business unit effectively maintains its own analytics strategy. Business units develop siloed reporting processes, leading to a lack of standardization and of enterprise-wide initiatives. There is no standard process or governance of analytics, and there is potential for duplication of effort (and cost). This can lead to confusion, especially if a dashboard consumer views dashboards from different business groups.
  • #59 Tableau Server has many built-in features to promote security, governance, data exploration, and collaboration. Data Server, which is part of Tableau Server, is arguably the most powerful of these tools. Data Server enables business users to have trust and confidence that they are using the right data so they can explore it the way they want and discover new insights that drive business value.
  • #60 By installing Tableau Server, you immediately have the ability to share curated metadata models with Data Server, including standard calculations. If a connection requires a database driver, you need to install and maintain the driver only on the server, instead of on each user’s computer. If you use Tableau Online, all supported drivers are available to data sources published to your site. In Tableau Desktop, users with the appropriate permissions can: Connect to Data Server like connecting to any other database: Choose Tableau Server from the connection list. Authenticate. Select the data source. Manage permissions, manually refresh, or delete a Published Data Source by accessing Data Sources on Tableau Server.
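The connect-and-browse steps above can be sketched with the community-supported `tableauserverclient` Python library (an assumption: install it separately via pip). The server URL and token names are placeholders.

```python
def list_published_datasources(server_url, token_name, token_value, site=""):
    """Sign in to Tableau Server and list Published Data Sources
    (i.e., what Data Server exposes), returning (name, project, certified)."""
    # Assumption: tableauserverclient is installed (pip install tableauserverclient).
    import tableauserverclient as TSC
    auth = TSC.PersonalAccessTokenAuth(token_name, token_value, site_id=site)
    server = TSC.Server(server_url, use_server_version=True)
    with server.auth.sign_in(auth):
        all_ds, _pagination = server.datasources.get()
        return [(ds.name, ds.project_name, ds.certified) for ds in all_ds]


def certified_only(datasources):
    """Filter the tuples above down to certified (trusted) sources."""
    return [name for (name, _project, certified) in datasources if certified]
```

A viewer-facing picker, for example, might call `certified_only(...)` so analysts only see sources that have passed the certification process.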
• #61 When you publish a metadata model to Tableau Server, you are using Data Server. <click> In preparation for publishing, your metadata model should have friendly names, hierarchies, and calculations where applicable. <click> Publishing your metadata model is as simple as choosing Publish to Server... from the context menu of the data source you are connected to, entering your credentials, and specifying data source permissions. Data Server supports a centralized metadata model and data management for your organization using Published Data Sources. With the appropriate permissions, users can share live connections or extracted data sets they've defined by publishing data sources to Tableau Server. When a data source is published, other users can connect to it from their workbooks, just as they do with other types of data. When the data in a Published Data Source is updated, all workbooks that connect to it pick up the changes. In addition to helping your users achieve data consistency and reliability, using Data Server offers advantages to you as the administrator. Because multiple workbooks can connect to one data source, you can minimize the proliferation of embedded data sources and save on storage space and processing time. When someone downloads a workbook that connects to a Published Data Source that in turn has an extract connection, the extract stays on the server, reducing network traffic.
• #62 Tableau’s hybrid data architecture provides two modes for interacting with data: a live connection or an in-memory extract. Switching between the two is as easy as selecting the right option for your use case. Tableau’s native data connectors leverage your existing data infrastructure by sending dynamic, optimized SQL or MDX statements directly to the source database rather than importing all the data. If you have a data architecture built on transactional databases, or want to reduce the workload on your core data infrastructure, Tableau’s Data Engine provides an in-memory data store that is optimized for analytics. You can connect and extract your data to bring it in-memory and perform queries in Tableau with one click.
  • #63 Tableau lets you subscribe yourself and other users to your workbooks and views. You can make sure your entire team sees your weekly KPI reports, or your executives receive your quarterly sales progress in their inbox whenever you want. To subscribe another user, click the subscribe button for any workbook or view you own or created.
  • #65 Tableau Blueprint is a step by step guide that will help you build these capabilities. It provides concrete plans, recommendations, and guidelines across critical foundational work and three primary workstreams that will turn repeatable processes into core capabilities. Let’s take a closer look at the different pieces of Blueprint.
  • #67 Forrester forecasts that insight-driven public companies will grow at least 7x faster than the global GDP --- or 27% annually.   In our new, data-driven world, companies that put data at the center of their operations are growing faster than the rate of ordinary companies, while companies that struggle to harness the power of their data are seeing their market presence disappear.   -------------------------------------- Internal link to full report: https://salesinsight.gosavo.com/Document/Document.aspx?id=48384263&view=&srlid=63659924&srisprm=False&sritidx=0&srpgidx=0&srpgsz=25 External link to gated report: https://www.forrester.com/report/InsightsDriven+Businesses+Set+The+Pace+For+Global+Growth/-/E-RES130848
• #68 A great example of a company that is making analytics widely available in their organization is Charles Schwab. The Charles Schwab Corporation is one of the largest publicly-traded financial services firms based on client assets. At Schwab, data is fundamental to enhancing the customer experience, driving operational leverage, and reducing risk. At the beginning of 2016, Charles Schwab had over 6,000 people using Tableau, moving to 12,000 employees 18 months later, and now has 16,000 users. Today, about half of the company uses Tableau on a daily basis. With this access to visual analytics, Schwab has seen an increase in collaboration and openness. Schwab's IT Center of Excellence helps plan for capacity, curates data sources, provides usage tracking, offers expertise, and helps manage the internal community. Schwab is also a wonderful example of a thriving internal community; community, by the way, was big for them. Jumpword Tableau – a SharePoint site covering product releases, support forums, getting-started guidance, the training and maintenance (uptime) calendar, design templates and TPS file, connection to experts, and user group notes. Charles Schwab TUG – 1-hour meetings covering 5 things: viz contests, The Real World (15-20 minutes of different people at Schwab talking about what they are doing with Tableau), training, survey/raffle, and COE announcements. Tableau Doctor – same as at TC, weekly, with experts.
• #70 The line depicts an annual contract period, not necessarily a calendar year, in an ideal state. Key dependencies such as customer and internal stakeholder availability can cause deviation from the schedule. Here's an example of how a customer engagement breaks down tactically for an inside CSM throughout an annual contract: a series of calls, emails, etc. that the inside CSMs drive. Internal call: necessary for knowledge transfer from AM to CSM and vice versa; also a great opportunity to get on the same page and partner on the account planning process. Kick-off call: explain the program/process to the customer and get buy-in; validate the goals/intended value from the Tableau purchase and assess the current state of the Tableau journey. Success plan: the living post-sale roadmap to success. EBRs: a chance to review and track success against the goals, share use cases and adoption metrics, and adjust priorities/goals for the next period. Monthly newsletter: keeps customers up to date on new releases, upcoming trainings, community-building activities, beta programs, etc. Monthly touchpoints: keep a pulse on progress. Renewal: safeguards proper contacts/handoffs/etc. to ensure the customer renews, and renews on time.