EXCLUSIVELY FOR
TDWI PREMIUM MEMBERS
Volume 17 • Number 4 • 4th Quarter 2012
The leading publication for business intelligence and data warehousing professionals
BI Training Solutions:
As Close as Your Conference Room
We know you can’t always send people to training, especially
in today’s economy. So TDWI Onsite Education brings the
training to you. The same great instructors, the same great
BI/DW education as a TDWI event—brought to your own
conference room at an affordable rate.
It’s just that easy. Your location, our instructors, your team.
Contact Yvonne Baho at 978.582.7105 or
ybaho@tdwi.org for more information.
tdwi.org/onsite
volume 17 • number 4
From the Editor

The Necessary Skills for Advanced Analytics
Hugh J. Watson

BI Dashboards the Agile Way
Paul DeSarra

Best Practices for Turning Big Data into Big Insights
Jorge A. Lopez

Implementing Dashboards for a Large Business Community
Doug Calhoun and Ramesh Srinivasan

Data “Government” Models for Healthcare
Jason Oliveira

BI Q&A: Gaming Companies on the Bleeding Edge of Analytics
Linda L. Briggs

Offloading Analytics: Creating a Performance-Based Data Solution
John Santaferraro

BI Experts’ Perspective: Mobile Apps
Timothy Leonard, William McKnight, John O’Brien, and Lyndsay Wise

Instructions for Authors

BI StatShots
EDITORIAL BOARD
Editorial Director
James E. Powell, TDWI
Managing Editor
Jennifer Agee, TDWI
Senior Editor
Hugh J. Watson, TDWI Fellow, University of Georgia
Director, TDWI Research
Philip Russom, TDWI
Director, TDWI Research
David Stodder, TDWI
Associate Editors
Barry Devlin, 9sight Consulting
Mark Frolick, Xavier University
Troy Hiltbrand, Idaho National Laboratory
Claudia Imhoff, TDWI Fellow, Intelligent Solutions, Inc.
Barbara Haley Wixom, TDWI Fellow, University of Virginia
Advertising Sales: Scott Geissler, sgeissler@tdwi.org, 248.658.6365.
List Rentals: 1105 Media, Inc., offers numerous e-mail, postal, and telemarketing
lists targeting business intelligence and data warehousing professionals, as well
as other high-tech markets. For more information, please contact our list manager,
Merit Direct, at 914.368.1000 or www.meritdirect.com.
Reprints: For single article reprints (in minimum quantities of 250–500),
e-prints, plaques, and posters contact: PARS International, phone: 212.221.9595,
e-mail: 1105reprints@parsintl.com, www.magreprints.com/QuickQuote.asp
© Copyright 2012 by 1105 Media, Inc. All rights reserved. Reproductions in
whole or in part are prohibited except by written permission. Mail requests to
“Permissions Editor,” c/o Business Intelligence Journal, 1201 Monster Road SW,
Suite 250, Renton, WA 98057. The information in this journal has not undergone
any formal testing by 1105 Media, Inc., and is distributed without any warranty
expressed or implied. Implementation or use of any information contained herein
is the reader’s sole responsibility. While the information has been reviewed for
accuracy, there is no guarantee that the same or similar results may be achieved
in all environments. Technical inaccuracies may result from printing errors,
new developments in the industry, and/or changes or enhancements to either
hardware or software components. Printed in the USA. [ISSN 1547-2825]
Product and company names mentioned herein may be trademarks and/or
registered trademarks of their respective companies.
President: Rich Zbylut
Director, Online Products & Marketing: Melissa Parrish
Senior Graphic Designer: Bill Grimmer

President & Chief Executive Officer: Neal Vitale
Senior Vice President & Chief Financial Officer: Richard Vitale
Executive Vice President: Michael J. Valenti
Vice President, Finance & Administration: Christopher M. Coates
Vice President, Information Technology & Application Development: Erik A. Lindgren
Vice President, Event Operations: David F. Myers
Chairman of the Board: Jeffrey S. Klein
Reaching the Staff
Staff may be reached via e-mail, telephone, fax, or mail.
E-mail: To e-mail any member of the staff, please use the
following form: FirstinitialLastname@1105media.com
Renton office (weekdays, 8:30 a.m.–5:00 p.m. PT)
Telephone 425.277.9126; Fax 425.687.2842
1201 Monster Road SW, Suite 250, Renton, WA 98057
Corporate office (weekdays, 8:30 a.m.–5:30 p.m. PT)
Telephone 818.814.5200; Fax 818.734.1522
9201 Oakdale Avenue, Suite 101, Chatsworth, CA 91311
Business Intelligence Journal
(article submission inquiries)
Jennifer Agee
E-mail: journal@tdwi.org
tdwi.org/journalsubmissions
TDWI Premium Membership
(inquiries & changes of address)
E-mail: membership@tdwi.org
tdwi.org/PremiumMembership
425.226.3053
Fax: 425.687.2842
tdwi.org
From the Editor
Speed is on everyone’s mind these days. From real-time data to on-demand reporting, BI
professionals want up-to-the-minute information and they want it now. The authors in this
issue of the Business Intelligence Journal understand.
Agile development methodologies have long promised speedier delivery of new applications
or features thanks to shorter development cycles and increased user collaboration.
Paul DeSarra explains how an agile approach can be leveraged to meet the highly dynamic
needs of business; he uses an agile dashboard project to illustrate his ideas.
Dashboards are a quick and easy way to communicate key performance indicators, and
Doug Calhoun and Ramesh Srinivasan provide tips and best practices for creating a
successful dashboard design.
An agile approach may also be what’s needed for mobile development at a maternity
clothes maker, the subject of our Experts’ Perspective. Timothy Leonard, William
McKnight, John O’Brien, and Lyndsay Wise offer their advice for getting mobile BI up
and running quickly.
Of the three leading characteristics of big data (the so-called 3 Vs: volume, variety, and
velocity), it's the velocity component that most often trips organizations up. How can you
process so much data without becoming bogged down? Jorge A. Lopez describes one approach.
John Santaferraro discusses how analytics must be offloaded to separate analytics databases
if big data is to provide accelerated queries, faster batch processing, and immediate access
to a robust analytics environment.
Senior editor Hugh J. Watson notes that studies suggest enterprises will soon face a
shortage of data scientists. He explains that we will have to give business analysts and data
scientists wider and more in-depth permissions and provide more training for existing staff
if we’re to keep up with current trends.
Healthcare organizations face a variety of tough governance challenges. Jason Oliveira
explores what can be learned from other governance and services organizations that have
adopted business intelligence competency centers (BICCs) and how to apply that
knowledge to help improve healthcare's BI disciplines.
Speed can present challenges, which is why our Q&A explores how gaming companies are
on the bleeding edge of analytics, using real-time information to improve gameplay (as
well as up-sell or cross-sell products or services to players).
How are you keeping up? We welcome your feedback and comments; please send them to
jpowell@tdwi.org.
The Necessary Skills for Advanced Analytics
Hugh J. Watson
Analytics work requires business domain knowledge, the
ability to work with data, and modeling skills. Figure 1
identifies some of the skills in each area. The importance
of particular skills and the exact forms they take depend
on the user and the kind of analytics involved. Let’s take
a closer look.
It is useful to distinguish among business users, business
analysts, and data scientists. Business users access
analytics-related information and use descriptive analytics
tools and applications in their work—reports, OLAP,
dashboards/scorecards, and data visualization. They have
extensive business domain knowledge and are probably
familiar with the data they are accessing and using but
have a limited need for and understanding of modeling.
Hugh J. Watson is a C. Herman and
Mary Virginia Terry Chair of Business
Administration in the Terry College of
Business at the University of Georgia. He
is a Fellow of TDWI and senior editor of
the Business Intelligence Journal.
hwatson@uga.edu
Figure 1. Skills needed for analytics:
■ Business domain: goals, strategies, processes, decisions, communication of results
■ Data: access, integration, transformation, preparation
■ Modeling: methods, techniques, and algorithms; tools and products; methodologies
Business analysts use analytical tools and applications
to understand business conditions and drive business
processes. Their job is to access and analyze data and to
provide information to others in the organization. Most
business analysts are located in the functional areas of
a business (such as marketing) and perform analytical
work (such as designing marketing campaigns), or they
may work in a centralized analytics team that provides
analytics organizationwide. Depending on their
positions, business analysts work with some combination of
descriptive, predictive, and prescriptive analytics. They tend
to have a good balance of business domain knowledge as
well as data and modeling skills.
The data scientist title is taking hold even though it
sounds elitist (I’ve also heard the term data ninja). Data
scientists typically have advanced training in multivariate
statistics, artificial intelligence, machine learning,
mathematical programming, and simulation. They
perform predictive and prescriptive analytics and often
hold advanced degrees, including Ph.D.s, in fields such as
econometrics, statistics, mathematics, and management
science. Companies don’t need many data scientists, but
they come in handy for some advanced work.
Data scientists often have limited business domain
knowledge but bring the ability to handle data related
to performing analytics (e.g., data transformations) and
strong modeling skills. They often move from project to
project and are
paired with business users and business analysts so that
necessary domain knowledge is included on the team.
Most companies have moved along the BI/analytics
maturity curve and now have business users who
understand and can employ descriptive analytics and
business analysts who can deliver descriptive and some
predictive analytics. Interest is now focusing on the
organizational capability to perform predictive and
prescriptive (that is, advanced) analytics to answer why
things happen and propose changes that will optimize
performance. This explains why enterprises are employing
more data scientists.
Successful advanced analytics requires a high level of
business domain, data, and modeling skills, and a team
of people is often required to ensure that all of the skills
and perspectives are in place. As an example, consider the
following experience.
Southwire: Bringing the Skills Together	
Several years ago, I received a call from a manager
at Southwire, a leading producer of building, utility,
industrial power, and telecommunications cable products
and copper and aluminum rods. He wanted help solving
an impending problem associated with the production
of copper, a key component of many of his company’s
products. My experience on that project (in particular,
how the problem was approached and solved) provides a
good example of the skills required to be successful with
advanced analytics.
I learned that there is no set “formula” for manufacturing
copper. A variety of ores and other ingredients
are used depending on what is available. The current
approach involved an expert who would look at what
materials were on hand and decide what and how much
of each ingredient should be used. It was critical that the
ingredients produced copper and that the copper would
be viscous enough to flow out of the smelter and refining
furnace. The problem was that the expert was retiring
soon and his expertise was going to be lost. A new
solution approach was needed.
Southwire assembled a team of business people, chemical
engineers, IT, and me. We had individuals with business
knowledge, subject area experts, people who were familiar
with available data and systems, and members with
modeling expertise. The team contained all the skills
needed for advanced analytics.
My role was to provide the modeling (data scientist)
skills. I saw two possible modeling approaches. The first
option was to create an expert/rules-based system based
on the knowledge of the retiring expert. We would
capture in an application the heuristics that the expert
used in deciding what to put into the smelter each day.
The model would be descriptive in that it would describe
the expert’s thinking.
The other alternative, and the one chosen, was to use
linear programming. If you are familiar with linear
programming, Southwire’s problem was the classic
production blending application. You create an objective
function (that is, an equation) that you want to minimize
with the sum of the cost of the various ingredients
multiplied by the quantity of each ingredient. You also
write constraint equations that reflect the conditions that
the solution must satisfy. The output of the analysis is
the quantity of each ingredient that will minimize costs
while satisfying all of the constraints.	
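To make the formulation concrete, here is a minimal sketch of the kind of blending model described above, written against SciPy's linprog solver. All of the numbers are hypothetical, as are the ingredient names and the viscosity proxy; Southwire's actual equations are not public.

```python
# A toy blending LP: minimize ingredient cost subject to a minimum
# copper-content constraint and a crude viscosity constraint.
# Every coefficient below is made up for illustration.
import numpy as np
from scipy.optimize import linprog

costs = np.array([400.0, 650.0, 300.0])  # $/ton of ore_a, ore_b, flux

# linprog expects constraints as A_ub @ x <= b_ub, so the requirement
# "at least 50 tons of contained copper" (0.30a + 0.45b >= 50) is
# negated into -0.30a - 0.45b <= -50.
A_ub = np.array([
    [-0.30, -0.45, 0.00],  # minimum copper content
    [0.02, 0.05, -0.10],   # impurities must be offset by flux (viscosity)
])
b_ub = np.array([-50.0, 0.0])

# At most 80 tons of each ingredient is on hand today.
result = linprog(costs, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 80)] * 3)
print(result.x)    # tons of each ingredient in the least-cost blend
print(result.fun)  # total cost of that blend
```

The structure mirrors the description in the text: a cost-minimizing objective function plus constraint equations that encode what must come out of the smelter and what is available to go in.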
The writing of the constraint equations was fascinating to
me. Remember that the solution had to produce copper
and it had to be sufficiently viscous. These requirements
were handled through the constraint equations and
reflected what ingredients were available and the chemical
reactions involved. The chemical engineers’ input was
critical for developing these equations. Remember when
you took chemistry in high school or college and studied
valences (the number of bonds a given atom has formed,
or can form, with one or more other atoms)? This and
other factors (such as what ingredients were available each
day) were incorporated into the constraint equations.
Data scientists are not a “one-trick pony” when it comes
to modeling. They are familiar with multiple modeling
approaches and algorithms. They are able to identify and
experiment with different models until they find the
one that seems most appropriate. At Southwire, a linear
programming modeling approach was selected over an
expert/rules-based system.
Once the objective function and constraint equations
were developed, it was necessary for IT and me to select
an appropriate linear programming package, enter the
objective function and constraint equations, test the
solution, develop a user interface that operational people
could easily use for entering data (such as ingredients)
and interpreting the output, implement the system, and
train people to use it.
Assembling the Skills
Enterprises have the business domain knowledge for
advanced analytics. However, as illustrated at Southwire,
a key to success is to make sure that people with business
domain knowledge are on the analytics team.
Enterprises also have the required data skills, but a few
changes may be necessary to accommodate their need for
advanced analytics. Data scientists (and some business
analysts) may need to have fewer restrictions on the data
they can access and what they can do with it. They may
need access to underlying data structures as well as the
ability to join, transform, and aggregate data in ways
necessary for their work. They may also need the ability
to enter new data into the warehouse, such as from
third-party demographic data sources. A possible solution
to the potential conflict over control versus flexibility
is an analytical sandbox, whether it is internal to the
warehouse or hosted on a separate platform.
Finding the required modeling skills is a trickier proposition.
You can hire consultants, as Southwire did, or use
a third-party analytics provider, but these options can
become costly over time if your plans include extensive
advanced analytics. You can probably coach some of your
current business analysts. There are many conferences
(such as those offered by TDWI), short courses, and
university offerings that teach advanced analytics. As
advanced analytics becomes better integrated into
application software (for example, campaign management)
and easier to use, it is likely that trained business
analysts can take on tasks that have skill requirements
typically associated with data scientists.
You can also hire data scientists. This isn’t a new
approach; many companies have already done so and
have data scientists scattered throughout their business
units or in specialized groups such as analytics
competency centers.
Studies suggest that companies are planning to hire more
data scientists and will face a shortage of such resources.
For example, the McKinsey Global Institute predicts a
shortfall of between 140,000 and 190,000 data scientists
by 2018 (Manyika et al., 2011). Many universities are
gearing up to meet the need through degree programs,
concentrations, and certificates. These offerings are
usually in business, engineering, or statistics and the
instructional delivery varies from on campus to online.
One of the first and best-known programs is the
Master of Science in Analytics at North Carolina State
University. SAS has been an important contributor to
the program, which is offered through the Institute for
Advanced Analytics and has its own facility on campus.
Deloitte Consulting has partnered with the Kelley School
of Business at Indiana University to offer a certificate
in business analytics for Deloitte’s professionals. Just
this year, Northwestern University rolled out an online
Master of Science in Predictive Analytics offered through
its School of Continuing Studies.
Will students take advantage of these programs in large
enough numbers? Advanced analytics is a tough study,
and many students may not have the necessary aptitude,
inclination, and drive to complete the programs, even
though the career opportunities are great.
Summary
You have already been performing analytics under the
BI umbrella. BI includes descriptive analytics, and you
have probably also been performing predictive analytics.
For more advanced analytics, however, you will need
to “ramp up your game” a little. You have the business
domain knowledge covered. For the data component, you
will need to grant business analysts and data scientists
wider or more in-depth permissions and you will likely
need to extend and enhance your analytical platforms
(such as appliances and sandboxes). For the modeling
skills, you will probably need to provide training for
existing staff and bring in new people with specialized
analytical skills. ■
Reference
Manyika, James, Michael Chui, Brad Brown, Jacques
Bughin, Richard Dobbs, Charles Roxburgh, and
Angela Hung Byers [2011]. Big Data: The Next
Frontier for Innovation, Competition, and Productivity,
McKinsey Global Institute, May.
http://www.mckinsey.com/insights/mgi/research/technology_and_innovation/big_data_the_next_frontier_for_innovation
Agile Dashboard Development
BI Dashboards the Agile Way
Paul DeSarra
Abstract
Although the concept of agile software development has been
around for more than 10 years, organizations only recently
began to think about how this methodology can be applied
to business intelligence (BI) and analytics. BI teams are
under constant pressure to rapidly deliver additional value
through reporting, analytics, and dashboard solutions. These
teams must also discover what types of BI solutions can
reinvigorate a BI deployment and produce meaningful results.
One way to reinvigorate BI deployments is to take the concept
of agile software development and apply it to BI initiatives
such as BI dashboard solutions, which can both re-engage
the business and drive actionable intelligence and confident
decision making. Agile BI replaces traditional BI project
methods (heavy in documentation and process) with a
user-driven approach. This article discusses an approach to
building BI solutions and dashboards using an agile software
development methodology.
Introduction
Although the concept of agile software development has
been around for more than a decade, it’s only been in the
last few years that organizations have started to examine
how this methodology can be applied to business
intelligence and analytics. The constantly changing, highly
dynamic needs of business today have increased the
demands on BI environments and teams. The pressure to
be more organized, turn projects around faster, and ensure
user adoption at all levels is increasing. Teams need to be
able to react to demands from the business and proactively
develop ideas and solutions that give the business more
creative ways to think about how to use data.
Leveraging an agile software methodology as it applies
to business intelligence is a great way to meet these
constantly changing business needs.
Paul DeSarra is Inergex practice director for
business intelligence and data warehousing. He
has 15 years of BI strategy, development, and
management experience working with enterprises.
pdesarra@inergex.com
In a nutshell, using an agile software development
methodology (“agile”) instead of a traditional development
methodology allows end users to experience a version
of the software product sooner. Instead of adhering to a
strict and intensive requirements and design phase before
development begins, agile employs a series of shorter
development cycles to increase user collaboration. The
agile approach welcomes changes during the development
process to provide a better product that delivers
measurable value quickly and efficiently.
There are four guiding principles for agile software
development (according to the Manifesto for Agile Software
Development, www.agilemanifesto.org). These can also be
applied to business intelligence development efforts.
Principle #1: Value individuals and interactions
over processes and tools
Traditional BI development focuses on strong processes
and tools to solve development challenges. As a result,
many organizations end up creating silos among the
business and IT teams. Each team silo focuses on individual
responsibilities and objectives and, in effect, each
team loses sight of the overall project goal of providing
cohesive and comprehensive data-driven solutions that
improve performance levels.
When using an agile BI approach, all those involved in
the BI initiative work together as one team with one goal
and set of objectives. To accomplish this, many
organizations create hybrid teams and a business intelligence
competency center (BICC) composed of individuals
with the necessary skills to define, architect, and deliver
analytic solutions. In some cases, many of these teams
are organized under business units outside of IT and the
program management office.
Principle #2: Value working software over
comprehensive documentation
Traditional BI development in a big-bang approach
focuses on developing detailed documentation about
common metrics, terminologies, processes, governance,
support, business cases, and data warehouse architectures.
Organizations may create a standardized enterprise data
warehouse and then fail because they focus on the
documentation and lose touch with the business and the
problems they are trying to solve.
This doesn’t mean we should stop creating detailed
documentation. BI teams can and should continue to focus on
creating documentation that emphasizes the vision and
scope as well as the architecture for future support. With
agile BI, the focus is not on solving every BI problem at
once but rather on delivering pieces of BI functionality in
manageable chunks via shorter development cycles and
documenting each cycle as it happens.
Principle #3: Value customer collaboration over
contract negotiation
Using an agile BI approach does not mean giving users an
unlimited budget or tolerance for changes. Instead, users
can review changes discussed in the last development
cycle to ensure expectations and objectives are being met
throughout the project.
Traditional BI development teams use functional
documentation to discuss what the solution will look like and
how it will operate. Such an “imagine this” method often
leaves users to try to visualize what they believe the
solution will become. The resulting subjective expectations
can quickly derail a BI project. In contrast, an agile
methodology reviews progress during each development
cycle using prototypes so that stakeholders and business
users can see how the BI solution is expected to look and
function. Prototypes put a visual “face” to the project by
showing what data is available, how it will be used, and
how it will be delivered.
Principle #4: Value responding to change over
following a plan
With an agile methodology, traditional BI projects that
focus on huge project and resource plans are replaced
by shorter development cycles designed to better
incorporate changes and keep the project team focused and
informed. For BI projects, changes should be expected
and welcomed. When users see prototypes and gain a
better understanding of what analytic capabilities and
information are available, they are better able to
communicate how they could use that information to make
improved business decisions. Such revelations and ideas
only strengthen the final product.
An agile BI project still uses a plan, but its plan is short,
manageable, and coupled with a prototype users can
see and experience. Changes are jointly reviewed with
business sponsors, users, and IT professionals at every
project stage.
Example: An Agile Dashboard
To better understand how this methodology can be used,
let’s look at a real-world example of incorporating agile BI
into a BI dashboard project for an executive sales team.
The vice president of sales of a large manufacturing
organization asked us to help his company gain better
insight into its orders, shipments, and pipeline in order
to hold the sales teams more accountable. Specifically, he
wanted a dashboard that he and his executive team could
use to meet accountability objectives. His vision for the
dashboard was solid, and our role was to take that vision
and boil it down to key metrics that would drive actions.
After a few meetings with the vice president of sales
and the IT sponsor (in this case, the IT director), we
concluded that an agile BI dashboard project was the best
approach. We ensured we had the needed sponsorship
from both the business and IT teams. In addition, we
confirmed the organization was using a BI tool that was
capable of delivering the desired solution.
We advanced the project using a hybrid approach to
agile development, breaking the project into three
phases to quickly and efficiently develop the scope, build
prototypes, conduct reviews, develop the solution, and
implement it quickly.
Phase 1
This was the foundational phase for our project and
focused on the third agile principle (“customer collaboration
over contract negotiation”). Phase 1 should last no
more than one week and involves identifying, at a high
level, the scope of the BI dashboard to ensure that the
executive sponsors are engaged and the internal teams
are assembled. Phase 1 is essential because it is used to
narrow the scope and prioritize what can be delivered in
the set time frame.
Figure 1. The agile process: scope → prototype → stakeholder reviews → build → release 1 ... N.
In the first week, we worked with
IT and the vice president of sales
to ensure that the team had the
right people with the right skills
who understood the project goals.
We outlined roles and responsibilities,
opportunity and vision, and
the high-level scope—all standard
practices for an agile BI project.
We worked with the vice president
of sales along with several key
business users to identify the
metrics of greatest value. We
worked diligently to understand
what metrics were needed and
how they influenced business
decisions.
A dashboard metric isn’t enough;
we strived to enable users to
respond to each metric to achieve
the best business results. For
example, we examined what
happened after the dashboard
highlighted a large gap between
what the customer relationship
management (CRM) application
identified as a sales opportunity
and the revenue actually gener-
ated. We asked questions about
the process of capturing these
opportunities in the CRM to
better understand leading factors
that could influence revenue.
Delving into these questions
ensured that we understood the
full sales engagement process.
We didn’t stop there. We
identified about 10 metrics for
invoicing, orders, shipments, and
budgets across four different
dimensions—business area, product, customer, and date
attributes.

Figure 2. Dashboard prototype examples.

In our vision,
the dashboard would allow the sales teams to focus
more effectively on specific sales opportunities, better
track budgets, and confidently predict and forecast sales
throughout the year (and know where and how to make
necessary adjustments). We held two meetings with the
IT team to better understand the ability of the source
systems to provide the data elements required.
The Phase 1 deliverables included a high-level vision and
scope document that clearly set the stage for the rest of
the project. By quickly defining the vision and scope as
well as establishing a short time frame, we removed one
barrier (long contract negotiations and timelines) so we
could focus on having the right people involved and the
right team defined.
Phase 1 was completed in one week.
Phase 2
Phase 2 is where collaboration, rapid prototyping,
whiteboard sessions, and interactive brainstorming take
place. This phase applies three of the agile principles
(“individuals and interactions,” “customer collaboration,”
and “responding to change”). Phase 2 focuses on using
prototyping methods in brainstorming sessions to quickly
build and show business users how their ideas and needs
are reflected in the proposed solution—sometimes
iteratively and on the fly. The prototyping tool may
be separate from your BI tool, but it must be able to
demonstrate visual elements as well as drill-up,
drill-down, and interactivity.
This phase requires collaboration between the sponsors,
key business users, and IT. A key benefit of this phase is
that users “see” the data in action and will know whether
the data is being presented in a way that effectively
delivers the information they need. In fact, the process
often gives users new ideas for using the information to
make business decisions (see Figure 2).
In Phase 1 we created our vision and scope, outlined the
business problem, and understood the set of metrics and
dimensions necessary to reach the desired outcome. We
approached Phase 2 with two goals in mind:
■ Collaborate with the vice president of sales and the
sales teams to define the “look” of the BI dashboard
and the data interactions required to populate it.
■ Work with the IT team to determine the data
components and further understand what could be
accomplished and delivered by the project deadline.
(The overall project length was seven weeks, so we had
only six weeks left.)
The collaboration sessions were held with the vice
president of sales, several key business users, and
individuals from the IT team. The meetings started as
whiteboarding sessions. Once we completed the initial
design, we built a prototype with phony (but
business-sensible) data and set up daily meetings to review and
refine our development cycles.
In each session, we identified how and why metrics were
to be used and outlined the decisions that would be made
using the data. We evaluated different ways to display
information so it would be most useful to users. We also
mocked up the drill-through detail analysis and
reporting that would be available via the easy-to-understand
dashboard and made sure only a single path led to the
detail at each level. The resulting dashboard prototype
had four quadrants, each of which was meant to answer a
specific question:
■ How are we performing today?
■ Are we on plan and what is our updated forecast?
■ Where are we winning and losing?
■ Who and what is not profitable?
The mockup took the form of charts, regional maps, and
dynamic and color-coded lists. It also included detailed
drill-through paths and report examples to help guide
users in making decisions. For example, a user could
click on a troubled region on the map that identified a
large revenue gap based on forecasting and get details
on current activity within that region as well as open
opportunities and win/loss details.
All in all, we held about 10 different business sessions
and kept coming back the next day with a refined
prototype to generate ideas. As a result, throughout the
entire process, users were engaged, excited, and willing
to participate in the sessions. They also felt confident
that their needs were being addressed and their ideas and
feedback were incorporated.
We simultaneously worked on the data components to
map the vision to the actual data sources. To do so, we
had to remove several roadblocks and make some tough
decisions as a team (IT and business) in order to meet
our deadline. As the team forged ahead, we uncovered
several items that needed to be worked through as
quickly as possible.
■ A few financial metrics were not in the current ERP
but would be implemented in an upgraded version,
which was set to go live the following year. We worked
with the business to outline the metrics and ultimately
decided to put them on hold so that we could continue
building the rest of the dashboard.
■ There was a need to tie in a certain product category
captured in a separate data source outside the ERP.
The product category was required to ensure we were
capturing the full picture. This product was set to be
coded in the new ERP. We decided to pull in and map
this information from the separate data source and
also put in place a process to map it into the new ERP
when the time was right.
We uncovered more than 15 potential roadblocks to the
initiative, and we worked through them all with the team.
We kept everyone informed and made joint IT/business
decisions to move forward—accomplishing this with
daily status meetings with IT and the business subject
matter experts to address issues quickly and outline
resolutions. Sponsors and stakeholders were also part of
weekly checkpoint meetings.
After we removed all our technical and business
roadblocks, we completed Phase 2 and delivered the prototype
dashboard, drill-through mockups, and a “Lean
Requirements” document that captured the requirements and
outlined the assumptions and decisions we had made. We
also built a “Lean Design” document that described the
database design, data mapping, reporting designs, and
ETL construct.
Phase 2 was completed in four weeks.
Phase 3
Phase 3 is the “build” phase and applies the second agile
principle (“working software”). The foundation has been
set, the scope has been refined to ensure rapid delivery,
IT and business are fully engaged, and now the time
has come to take the prototype and construct it within
the BI environment. Phase 3 should take no more than
a few weeks and involves building integration and ETL
procedures, security, and the BI solution itself.
At this point, with two weeks left, we began to build
the required dashboard, drill-through reports, and
supporting data layers. Building everything in dynamic
prototypes made it much easier to ensure expectations
were in line as development progressed. During this
phase, we continued to show the results of actual develop-
ment of the dashboard every two days to the business
sponsor and key users.
Throughout this process, changes were still submitted.
We reviewed all changes and put them into one of two
buckets—implement or put on hold—and made notes
in our change control log. Some of the change requests
that flowed through in this phase revolved around
adding different relative time-period buckets for revenue
and margin analysis, some minor layout changes, and
three changes, put on hold for future phases, that involved
customizing alternate drill paths from the
dashboard based on a user’s business unit and region.
Phase 3 was completed in two weeks.
Tips for Agile BI Success
In the end, the initial phase of the dashboard was released
in seven weeks. The project was a success because of the
agile BI processes applied to every aspect of the project.
One of the core success factors was the use of prototypes
and interactive sessions. Using prototypes enabled us
to keep all players involved from the beginning and
provided a forum to exchange ideas, discuss issues, and
actually “see” the solution as its development progressed.
After reading the case study, perhaps you are now
thinking, “Can organizations really implement these types of
BI solutions in seven weeks?” You may be asking, “What
about data governance, load procedures, ETL, business
rules, capacity planning, and security maintenance?” The
reality is that you must strike a balance when using the
agile software development methodology for your BI
initiatives: build a solid foundation that has the longevity,
speed, and strength to weather the dynamic and demanding
needs of the business. The following ideas and concepts
can help you implement an agile BI process.
Tip #1: Start small, think big
When you begin to build an agile BI solution, it doesn’t
matter if you have an enterprise data warehouse coupled
with a large-scale, mega-vendor BI software stack or a
small data mart managed with a niche tool. The key is to
focus on the immediate business need and pain, then map
that to the ultimate vision. Get the stakeholders to help
define the vision and work with you to build out what the
solution will look like.
Once you have the vision, determine the best approach
that completes the work quickly and keeps the long-term
picture in mind. If you need to take shortcuts to get the
work done, that’s fine, as long as everyone approves the
shortcuts and you have a process in place to close the
loop at a future point. For example, if you have an ERP
application and you have to group some of your sales data
into a customized dimension (instead of modifying the
ERP source of records) in order to deliver the BI solution,
then do so, but ensure that you get approval and that
everyone understands the costs and benefits.
Although you are building a specific solution, you can
still take steps to ensure it is repeatable, scalable, and
fits into your overall data architecture. For example, in
our case study, we were building a specific BI dashboard
solution that was focused on shipments, orders, and
pipeline processes specific to the sales functional area.
However, in creating the solution, we built a data
design that could scale outside of sales by building
conformed dimensions and process-driven fact tables. If,
for performance reasons, we had to create summary or
aggregated tables to support specific business areas, we
made sure these mapped back to lower-grain fact tables
for data consistency and detail analysis.
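As a rough sketch of that reconciliation idea, the example below derives an aggregate from a lower-grain fact table through a conformed dimension and checks that the two agree. The table and column names are hypothetical, and pandas merely stands in for whatever database or ETL tool hosts the real design.

```python
# Hypothetical star-schema fragment: a conformed product dimension
# shared across processes, plus an order-line-grain fact table.
import pandas as pd

product_dim = pd.DataFrame({
    "product_key": [1, 2],
    "category": ["wire", "cable"],
})

order_fact = pd.DataFrame({          # one row per order line (low grain)
    "product_key": [1, 1, 2],
    "order_amount": [100.0, 250.0, 75.0],
})

# Aggregate table built from the detail grain, not maintained separately,
# so the summary always reconciles back to the fact table.
category_summary = (
    order_fact.merge(product_dim, on="product_key")
              .groupby("category", as_index=False)["order_amount"]
              .sum()
)
assert category_summary["order_amount"].sum() == order_fact["order_amount"].sum()
```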
Tip #2: Remove the roadblocks
Whether you face IT challenges or other obstacles, work
systematically to overcome them. Typical roadblocks you
may encounter in an agile BI project include:
Narrowing the scope. In some cases, it can be challenging to
narrow the scope of a BI project so that a portion of the
solution can be delivered in a shorter time frame. This is
a slippery slope and requires the ability to prioritize and
find common ground with business users and/or sponsors.
If you can get the business sponsor to commit to a shorter
time frame up front, it will be easier to narrow the
scope. In addition, separate out the “must-haves” and the
“would-like-to-haves” right away.
Data gaps. In any BI project, data gaps are typical as users
may not fully understand how information is collected
or data anomalies are discovered. Agile BI is no
different, and data profiling is a necessary step. In our case
study example, we encountered data gaps that we had
to eliminate or overcome by accepting risk, leaving out
components, or implementing a temporary fix.
Business commitment and time. Agile BI requires interaction
with business stakeholders, sponsors, and users through-
out the project’s life. Secure commitment up front with
everyone and clearly outline the project’s benefits in terms
of effective decision making.
Managing expectations. There is often a gap between what
users expect it will take to deliver a BI solution and the
time it actually takes. Users may believe that much more
can be done in a short amount of time, which can cause
extreme tension between IT and the business. Managing
these expectations requires strong communication skills
and an individual on the team who can effectively bridge
IT and business users. This individual should understand
data modeling and architecture, reporting and analytics,
and dimensional concepts and be able to articulate
the challenges to business managers and sponsors in a
language they understand.
Rogue development. Agile BI still follows a process and
a method. There is still documentation and a plan;
success metrics are still defined at the beginning of the
project. Project management is still a core component
in this process. We recommend that you still use the
following tools, documentation, and processes to help
guide the project:
■ A vision and scope document is used to define initial,
critical success factors and get project approval.
■ A requirements document outlines the core business
problems and key data elements, metrics, and
dimensions that are needed for the BI solution. The
difference from traditional BI development is that
this document focuses on the smaller and shorter
deliverables and keeps it “lean.”
■ A design document describes the database design, data
mapping, reporting designs, and ETL constructs.
Again, different from a traditional BI project, this
design should focus on bringing the technical team
together on the architecture and for future support
without getting lost in too many details.
■ A project baseline plan for delivering a piece of
functionality quickly, with the longer-term plan
represented at a higher level.
■ A change control log to track which changes are
implemented and which are put on hold.
■ An enhancement log to track enhancements that the
team is unable to fit into the first release.
If you have obtained the right sponsorship at the start
and ensured everyone has the same vision and under-
stands the project, your ability to remove roadblocks will
be improved. Inevitably, however, challenges will arise, so
always keep one eye on the vision and one on the scope.
Tip #3: Engage the business
BI professionals sometimes get so focused on the
technology that even after the initial meeting with
business users they may flip back to thinking mostly
about the tools and technology rather than the business’s
pains, needs, and objectives.
In our case study, we used rapid prototyping and
whiteboarding sessions to gather requirements and keep
the right people involved and working in unison. We had
daily brainstorming sessions to promote collaboration on
the design, discuss the metrics and information needed
to make business decisions, and show the BI dashboard
prototype progress.
From this, we built a requirements document that was
focused on the key metrics and data elements, and we
incorporated visuals from our prototype to ensure we
had everything captured. Once we completed this phase
and moved to the full development stage, keeping the
key users involved continued to be highly important.
During development, we still met at least twice a week
to review progress and update our change control logs as
we showed progress on the BI dashboard solution. The
prototype became our guide to ensuring the development
was on course and meeting all expectations.
Summary
As business becomes more dynamic and social in nature,
BI environments need to be prepared to move fast and
deliver value in creative ways. Intertwining BI best
practices with the agile software methodology is one way
to infuse speed, creativity, commitment, and value into
any BI project. ■
Big Data Insights
Best Practices for Turning Big Data into Big Insights
Jorge A. Lopez
Abstract
Big data is surfacing from a variety of sources—from
transaction growth and increases in machine-generated
information to social media input and organic business
growth. What does an enterprise need to do to maximize
the benefits of this big data?
In this article, we examine several best practices that can help
big data make a difference. We discuss the role that extract,
transform, and load (ETL) plays in transforming big data into
useful data. We also discuss how it can help address the
scalability and ease-of-use challenges of Hadoop environments.
Introduction
Growing data volumes are not a new problem. In fact,
big data has always been an issue. Fifty years ago, “big
data” was someone with a ledger recording inventory;
more recently, it was a bank’s mainframe processing
customer transactional data. Today, new technologies
enable the creation of both machine- and user-generated
data at unprecedented speeds. With the growing use of
smartphones and social networks, among other technolo-
gies, IDC estimates that digital data will grow to 35
zettabytes by 2020 (IDC, 2011). These new technologies
have turned big data into a mainstream problem. In fact,
it’s not uncommon to see small and midsize organizations
with just a few hundred employees struggling to keep up
with growing data volumes and shrinking batch windows,
just as large enterprises do.
The viability of many businesses will depend on their
ability to transform all this data into competitive
insights. According to McKinsey (Manyika et al., 2011),
big data presents opportunities to drive innovation,
improve productivity, enhance customer satisfaction,
and increase profit margins. Although many CIOs and
CEOs recognize the value of big data, they have struggled
(through no fault of their own) to handle this new influx
of data. The problem isn’t information overload; it’s the
failure to harness, prioritize, and understand the data
flowing in. This is why data integration is a critical—yet
often overlooked—step in the big data analytics strategy.

Jorge A. Lopez is senior manager, data integration,
for Syncsort Incorporated.
jlopez@syncsort.com
Traditional IT approaches will not generate the results
businesses expect in this era of big data. Therefore, IT
organizations should look at the hype around big data
as an opportunity to set a new strategy for harnessing
their data to improve business outcomes. As a first step,
organizations must examine their existing data strategies
and ask: Are these data strategies helping us achieve the
objectives of the business? Can our environment eco-
nomically scale to support the requirements of big data?
Can our infrastructure quickly adapt to new demands
for information?
To fully take advantage of new sources of information,
organizations must cut through the buzz that big data
creates. There are many definitions of big data, but most
experts agree on three fundamental characteristics:
volume, velocity, and variety. Another key aspect, often
overlooked, is cost. Forrester, for instance, defines big
data as “techniques and technologies that make handling
data at extreme scale affordable” (Hopkins and Evelson,
2011). This touches on two critical areas that must be
addressed to have a successful data management strategy:
scalability and cost effectiveness.
To scale data volumes 10, 50, or 100 times requires new
architectural approaches to the data integration process.
Doing so in a cost-effective way has been the biggest
challenge to date for organizations. No matter what kind
of IT environment you have or how you label your data
(big or small), there are steps you can take to rearchitect
and optimize your approach to data management, such as
returning your attention to the data integration process
in your quest for improved business insights.
Not All Big Data Is Important Data
Sometimes it’s easy to get caught up in the hype about
big data. However, trying to process larger data volumes
can significantly increase the amount of noise, hindering
your ability to uncover valuable insights. It’s important to
remember that not all data is created equal. Any big data
strategy must include ways to efficiently and effectively
process the required data while filtering out the noise.
Data integration tools play a key role in filtering out the
unnecessary data early in the process to make data sets
more manageable and, ultimately, load only relevant data
into the appropriate environment for analysis (whether
that is a data warehouse, Hadoop, or an appliance).
Organizations can take three approaches:
1.	 Define a clear data strategy that identifies the users’
data requirements. (Why do I need this data? How
will it help me accomplish my business objectives?)
2.	 Build an efficient data model that is adequate to the
demands of the business.
3.	 Have the right data integration tools to do the job.
Ultimately, the data integration tool is the critical
component; it can help materialize the strategy and
execute on it to build an efficient data model. The
tool must have the right capabilities as well as the
scalability and performance required to work
effectively. A key component is the ease of use that allows
developers to focus on business requirements instead
of worrying about performance, scalability, and cost.
Bringing Data Transformations Back to the ETL Layer
Data integration and ETL tools have historically focused
on expanding functionality. For instance, ETL was
originally conceived as a means to extract data from multiple
sources, transform it to make it consumable (by sorting,
joining, and aggregating the data), and ultimately load
and store it within a data warehouse. However, in today’s
era of big data, this strategy neglects two critical success
factors: ease of use and high performance at scale.
As IT organizations confront the accelerating volume,
variety, and velocity of data by applying analytics,
they have been forced to turn to costly and inefficient
workarounds, such as hand-coded solutions or pushing
transformations into the database, to overcome their
performance challenges. The costs of such scaling
approaches can outweigh their benefits. The best example
is staging data when joining heterogeneous data sources.
This practice alone increases the complexity of data
integration environments and adds millions of dollars a
year in database costs just to keep the lights on. As such,
an Enterprise Strategy Group survey (Gahm, et al, 2011)
found “data integration complexity” cited as the number
one data analytics challenge. There are new approaches
that don’t require big budgets, however.
To rectify this situation, we recommend bringing all
data transformations back into a high-performance,
in-memory ETL engine (a brief sketch of the pattern
follows the list below). This approach rests on four
main points:
1.	 Think about performance in strategic, rather
than tactical, terms. This requires a proactive, not
reactive, approach. Performance and scalability
should be at the core of any decision throughout
the entire development cycle, from inception and
evaluation to development and ongoing maintenance.
Organizations must attack the root of the problem
with approaches that are specifically designed for
performance.
2.	 Organizations must improve the efficiency of the
data integration architecture by optimizing hardware
resource utilization to minimize infrastructure costs
and complexity.
3.	Productivity gains can be achieved through
self-optimization techniques, which means that little, if
any, manual tuning of data transformations should
be required. The constant tuning of databases can
consume so many hours and resources that it actually
hinders the business.
4.	 Cost savings are realized by eliminating the data
staging environment, resulting in server and database
maintenance cost savings; deferring large
infrastructure investments with the efficient use of system
resources; and gaining improved developer
productivity because a considerable amount of time need not be
spent tuning for growing data volumes, thus providing
more time for strategic projects.
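The sketch below is a toy illustration of the pattern, not any vendor's engine: records are filtered, joined, and aggregated in memory in the ETL layer, so nothing is staged in the database along the way. The file and column names are hypothetical, and pandas stands in for a high-performance ETL engine.

```python
# Filter, join, and aggregate in the ETL layer; load only the result.
import pandas as pd

orders = pd.read_csv("orders.csv")        # e.g., an ERP extract
customers = pd.read_csv("customers.csv")  # e.g., a CRM extract

# Filter out the noise early so downstream steps handle less data.
shipped = orders[orders["status"] == "shipped"]

# Join heterogeneous sources in memory instead of staging them in
# database tables purely for the sake of the join.
joined = shipped.merge(customers, on="customer_id", how="inner")
revenue_by_region = joined.groupby("region", as_index=False)["amount"].sum()

# Only the small, pre-aggregated result reaches the warehouse.
revenue_by_region.to_csv("load/revenue_by_region.csv", index=False)
```

Only the relevant, pre-aggregated result ever reaches the warehouse, which is the cost and performance point the four items above make.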
The high-performance ETL approach should accelerate
existing data integration environments where
organizations have already made significant investments and
enhance emerging big data frameworks such as Hadoop.
IT departments within several companies have initiated
high-performance ETL projects to achieve a long-term,
sustainable solution to their data integration challenges:
■ A storage industry pioneer and leading producer of
high-performance hard drives and solid-state drives
needed to improve its assurance process and
inventory management with faster data processing of one
million data records from its manufacturing plants.
Using a high-performance ETL strategy, the company
has reduced data processing times from 5.5 hours
to 12 minutes and has released 70 percent of its data
warehousing capacity to devote to analytics.
■ An independent provider of risk assessment and
decision analytics expertise to the global healthcare
industry needed to process and analyze 40–50 TB
of claims data per month to uncover risk-mitigation
opportunities. Through a similar approach, the
healthcare analytics organization reduced processing
from 2 hours to 2.5 minutes. Further business growth
could also be supported by reducing turnaround time
for new customers being entered into the system from
5 days to 24 hours.
■■
A mobile advertising platform company needed to
quickly analyze large volumes of online activity data
(such as views, clicks, and conversion rates) that was
doubling every year in order to make important
decisions (such as what ad space to bid on and when
and where to place ads for customers). The wait for
the information needed to adjust advertising campaigns
dropped from one hour to 10 minutes. In addition, its two
developers, who previously spent most of their time just
maintaining the infrastructure, can now work on more
valuable projects.
A proper ETL process, one that makes data integration
fast, efficient, simple, and cost effective, delivers
operational, financial, and business gains across the
entire organization. The ultimate benefit is quicker access
to cleaner, more relevant data to drive big data insights
and optimize decision making.
Optimize Your Hadoop Environment
In today’s world of mobile devices, social networks, and
online data, organizations must massively scale data
integration and analytics differently than before. Accord-
ing to Forrester (2011), despite the opportunity new data
presents, organizations use only a small fraction of the
data available to them. A new architecture is needed to
change both the performance and the cost equation, and
that need is driving adoption of Hadoop, the open source
framework for big data.
Hadoop is designed to manage and process large data vol-
umes. It presents several opportunities but also introduces
challenges—including scalability and ease of use—that
lead to siloed deployments with limited functionality,
which is why Hadoop doesn’t provide significant value
by itself. Organizations should not expect to rely solely
on Hadoop for all their needs; other tools and platforms
need to complement Hadoop to optimize the data
management environment for these data sets.
Hadoop gets its scalability by deploying a significant
number of commodity servers. This way, the Hadoop
framework can distribute the work among servers
for increased performance at scale. Of course, adding
commodity hardware running open source software
looks like a more cost-effective proposition than adding
nodes to a high-end, proprietary database appliance.
However, the hardware required to cope with growing
data volumes and performance service-level agreements
can grow significantly. Therefore, it is not uncommon to
find Hadoop deployments with a significant number of
nodes. This elevates capital and operational costs because
of hardware maintenance, cooling, power, and data
center expenses. In addition, the required tuning involves
hundreds of configurable parameters, making it difficult
to achieve optimum performance.
Such increased complexity is tied to ease of use, which
is one of the major challenges facing nearly every
organization working with Hadoop. Hadoop is not easy
to develop for. For instance, adding new capabilities (such as
reverse sorting) and coding MapReduce jobs is typically
performed manually, which requires specific skills that
are expensive and difficult to find. For many organiza-
tions, finding the skill set needed to manage Hadoop is
the most significant barrier to Hadoop adoption.
Organizations can overcome these challenges and
extend Hadoop’s capabilities, maximize its value, and
simplify the overall Hadoop experience by integrating the
high-performance ETL approach. This approach allows
for sorting and organizing the data before it is pushed
into the Hadoop environment by leveraging Hadoop
Distributed File System (HDFS) connectivity and by
creating MapReduce jobs in a separate graphical interface
rather than writing Java or Pig scripts. Data integration
comes into play after analysis as well; the results of the
analyzed data need to be reintegrated into other informa-
tion systems.
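As a simple illustration of organizing data before it lands in Hadoop,
the following Python sketch pre-sorts and de-duplicates a raw extract,
then pushes the result into HDFS with the standard hdfs dfs -put
command. The paths and column names are assumptions for illustration;
a high-performance ETL engine would perform the same preparation at
far larger scale, and through the graphical tooling described above
rather than hand-written code.

import subprocess

import pandas as pd

# Organize the extract in memory before it ever reaches Hadoop, so
# downstream MapReduce jobs receive clean, well-ordered input.
df = pd.read_csv("clicks_raw.csv")
df = df.sort_values(["user_id", "event_time"]).drop_duplicates()
df.to_csv("clicks_sorted.csv", index=False)

# Load the prepared file into HDFS using the standard Hadoop CLI.
subprocess.run(
    ["hdfs", "dfs", "-put", "-f", "clicks_sorted.csv", "/data/clicks/"],
    check=True,
)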
For example, comScore, a global digital information
provider of online consumer behavior insights, saw its
daily data volume increase 72-fold within two years
and deployed a Hadoop cluster to better manage the
data processing.
However, it is challenging to bring Hadoop into an
enterprise with heterogeneous operating systems.
Moreover, Hadoop lacks critical features such as real-time
integration and robust high availability. Therefore, com-
Score deployed a data integration strategy that groups
and splits data into larger files that fit more naturally into
Hadoop, which provides a higher rate of parallelism on
compressed files and reduces disk costs for the Hadoop
cluster. This saved 75 TB of data storage per month and
slashed processing time from 48 hours to just 6 hours, so
comScore can now process twice the data each month
(compared to a year before), allowing the company to
provide its customers data insights faster.
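The grouping-and-splitting tactic itself can be sketched in a few lines
of Python: line-oriented data is repacked into gzip files of roughly
one HDFS block each, so the cluster can process many compressed files
in parallel instead of choking on a single unsplittable archive. The
128 MB block size and the file names are assumptions for illustration.

import gzip

BLOCK_BYTES = 128 * 1024 * 1024  # assumed HDFS block size

def repack(source_path, prefix):
    """Split one large line-oriented file into roughly block-sized gzip parts."""
    part, written = 0, 0
    out = gzip.open(f"{prefix}-{part:04d}.gz", "wt")
    with open(source_path, "rt") as src:
        for line in src:
            if written >= BLOCK_BYTES:  # start a new part at the block boundary
                out.close()
                part, written = part + 1, 0
                out = gzip.open(f"{prefix}-{part:04d}.gz", "wt")
            out.write(line)
            written += len(line)  # uncompressed characters, a rough proxy for bytes
    out.close()

repack("activity_log.txt", "activity")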
Summary
Today’s enterprises must make sense of the increasing
volume, velocity, and variety of data while maintaining
cost and operational efficiencies. Your business
intelligence strategy must focus on optimizing the data
integration process so it is fast, efficient, simple, and
cost effective. This means ensuring you have all the
right data at your fingertips by managing the volume
and new sources of data, and by scaling as data
requirements evolve. Quicker access to cleaner,
more relevant data is what drives big data insights and
what will truly lead your enterprise to faster and more
profitable decisions. ■
References
Gahm, Jennifer, Bill Lundell, and John McKnight
[2011]. “The Impact of Big Data on Data Analytics,”
Enterprise Strategy Group, research report, September.
http://www.esg-global.com/research-reports/research-
report-the-impact-of-big-data-on-data-analytics/
Hopkins, Brian, and Boris Evelson [2011]. “Expand Your
Digital Horizon With Big Data,” Forrester, research
report, September. http://www.forrester.com/Expand
+Your+Digital+Horizon+With+Big+Data/fulltext/-/E-
RES60751?objectid=RES60751
IDC [2011]. “The 2011 Digital Universe Study:
Extracting Value from Chaos,” digital iView report,
June. http://www.emc.com/collateral/demos/
microsites/emc-digital-universe-2011/index.htm
Manyika, James, Michael Chui, Brad Brown, Jacques
Bughin, Richard Dobbs, Charles Roxburgh, and
Angela Hung Byers [2011]. Big Data: The Next
Frontier for Innovation, Competition, and Productivity,
McKinsey Global Institute, May.
http://www.mckinsey.com/insights/mgi/research/
technology_and_innovation/big_data_the_next_
frontier_for_innovation
Implementing
Dashboards for
a Large Business
Community
A Practical Guide for the Dashboard
Development Process
Doug Calhoun and Ramesh Srinivasan
Abstract
Dashboards are becoming more prevalent as business intelli-
gence tools, and the reason is obvious: well-designed, accurate
dashboards can quickly communicate important business
indicators and trends and provide actionable information.
However, creating and implementing a successful dashboard
involves a great amount of work. It often requires implementing
tight controls while allowing the flexibility needed to test and
learn with the business.
This article outlines tips for how to integrate these seemingly
divergent processes as well as how to ensure the data
accuracy, ease of use, and optimal performance that make
a truly successful dashboard.
Introduction
The use of dashboards as a primary business intelligence
tool is expanding quickly. When supporting a business
unit fueled by data, how does an application team build
dashboards that will provide great business value and
be sustainable?
There are many methods for doing this, as we will explain
in this article. However, there are also certain fundamen-
tal principles that may seem obvious, but can be difficult
to implement:
■■
Engage business users, not just at the beginning and
end of a project, but throughout the entire process.
Make business users your partners.
Doug Calhoun is systems analyst, claims
technology—data and information delivery
at Allstate Insurance Company.
dcal9@allstate.com
Ramesh Srinivasan is manager, claims
technology—data and information delivery
at Allstate Insurance Company.
rsri2@allstate.com
23BUSINESS INTELLIGENCE Journal • vol. 17, No. 4
practical dashboard development
■■
Involve the entire application team throughout your
project’s life. A factory-like approach of handing off
tasks from phase to phase will not work well.
■■
Although design updates may require an iterative
approach with business users, the number of compo-
nents needed should drive the team to define phases and
key deliverables early in your project to keep it on track.
■■
Sophisticated user interfaces are great, but in the
end, it’s really about the data. Ensure that everyone
is in agreement about how to define the data from a
business point of view, and create a plan for how to
validate it.
■■
Ease of use is critical. Make sure your business part-
ners get hands-on opportunities as often as possible.
■■
Design your technology based on the number and types
of users. Performance and capacity should be considered
when designing and building dashboards, much as they
are with more traditional transactional systems.
This article is not intended to serve as a guide to visual
design. That topic has already been covered extensively,
and well, elsewhere. Instead, we will discuss best practices
for the process of creating a successful design.
In addition, the word dashboard is used here as a general
term for data visualization tools showing at-a-glance
trends and other indicators. It is not meant to signify
the timing or refresh rate of the data, and could be
used interchangeably with “scorecard” depending on
how a business unit chooses to define it. In the business
intelligence world, “dashboard” has become the most
common term, so it will be used here with assumed
broader connotations.
Another concern is process methodology. Many compa-
nies primarily employ a waterfall life cycle, which can be
a difficult fit for a business intelligence implementation.
However, a purely agile methodology for dashboards
can also lead to trouble, as there are complexities with
development and testing that require a certain level of
more traditional phase-gating.
Essentially, the dashboard needs to be treated both as
an application (with all the functional testing required)
as well as a mechanism for providing data, including
iterative testing and prototype updates. A certain level of
flexibility in your development process may be required to
achieve a happy medium and ensure a successful rollout.
Depending on the size of your company, you may also
need to leverage the assistance of other technology groups
to implement. Where appropriate, involve groups such
as your business intelligence community of practice or
center of expertise; data, solution, and/or BI architecture;
database administrators; all associated infrastructure/
server administrators; change and release coordinators;
and any other applicable groups you believe should be
enlisted. Do this early.
All of this may require “innovating your process,” which
might sound like a contradiction in terms to process
methodologists but may have practical application to
your work. The best practices below will guide you in the
direction that best fits your project’s needs.
Starting the Project
If you are embarking on a dashboard project for the first
time, there are several rules of thumb you should follow
at the project’s outset.
First, as with any project, you will need to define team
roles and lay the groundwork for how the project life
cycle will work. At the same time, you will need to
identify and engage all stakeholders and ensure both
groups agree on expected outcomes.
It is unlikely that you will be working on a dashboard
project without a business case behind it, but getting a
request from the business and truly engaging users as
partners are two very different things. Although it can
be easy to take orders and make assumptions, keeping
the key business partners involved throughout the entire
project life cycle, and beyond, is absolutely essential.
Finally, you will need some vital information before you
begin. Some questions are obvious from a technical point
of view. What are the data sources? Will data be stored
separately? What tools will be used? What environment(s)
must be built? Other questions are just as vital, but may
not be so obvious. For example, is the project feasible,
especially as an initial effort?
We recommend you limit the scope of an initial
dashboard to a simple, straightforward first effort that
has high business value. This way, a quick win is more
possible, success can be attained early, and business trust
will be earned as you “learn the ropes.”
You will also need to be sure that the project is appropri-
ate for a dashboard or other visualizations. For example,
if the business primarily wants to track how hundreds
of their individual workers are performing, a dashboard
is likely not the right vehicle. However, if they want to
track how their offices are performing over a period of
time, using standard, well-known measures within the
company, then a dashboard may be the best option. (You
can still consider getting to the individuals’ detail, which
we’ll discuss shortly.)
The main lesson here, and throughout the early phases of
your project, is to ask questions and keep on asking them!
If something does not make sense or seems impossible,
work with business users until you reach a mutually
satisfactory agreement.
Once the project looks possible, list all your assump-
tions—whether business related, technical, or process/
project based. You’ll need this list to build an order-of-
magnitude estimate, define the technical space you will
be working within, and help business users understand
their role during the project (and how crucial it is).
Having everything in order even before detailed require-
ments are determined will give both you and business
users confidence. After all, before you start involving
them in detailed requirements meetings, they’re going to
want some idea about when to expect a finished product.
Finally, as you devise this plan, treat the dashboard as
a full-blown application. Although the dashboard is
built in the business intelligence space, it has both the
complexity of a dynamic user interface (with the myriad
possibilities of errors on click events), as well as the need
for absolutely exact, gold-standard data. Both the data
and the functionality will need to be tested thoroughly,
as if you were developing a transactional application. If
you release the slickest, most attractive dashboard your
business has ever seen but the data is wrong or a button
doesn’t work, user confidence will quickly erode. Your
dashboard may be pretty—pretty meaningless.
Consider the metrics and aggregations needed and what
types of structures will be required to support your
project. Depending on your company’s standards, you
might be using denormalized tables, dimensional tables
in a warehouse (or a combination of these), an integra-
tion of detailed and aggregated data, OLAP cubes, or
many other possible sources and targets. As with any BI
solution, you need to choose the appropriate data model.
The point here is that performance is paramount for
user adoption.
Document and agree upon functional requirements and
data definitions while offering the flexibility of iterative
testing and tweaking that a business intelligence solution
should provide. It is critical to lock down the logic
behind the displayed metrics early in the project. If that
logic keeps changing, or remains vague to everyone, there
is little chance you'll deliver a successful dashboard.
Gathering Requirements
Involving business users in your work is crucial—and
most clearly needed—early in a project, especially during
requirements gathering and scoping. You may need to
remind yourself to keep your business partners actively
involved, because it’s vital to your project’s success!
Multiple meetings will certainly be necessary, but make
sure to keep users actively engaged via various methods,
including whiteboarding at first, dashboard prototyping
later, and sharing early data results. This will not only
help hone the requirements, but also allow business users
to feel they are truly partnering on the project. This will
ensure that they know and trust what they will be getting.
In addition, the entire development team should be
involved from inception through implementation to
ensure nothing gets lost in translation through the work.
See Figure 1 for a gauge of both business and technical
involvement through a general project life cycle (regard-
less of the specific methodology used).
The following best practices can help you avoid pitfalls
during requirements gathering, even when the relation-
ship with the business is good.
Know your user. It is possible that your business partner
may represent only one part of the larger group using
the dashboard, or may be assigned to a project and may
not be an ultimate end user at all. Some users may have
different business needs from your primary business
partner. Make sure that you define all the groups of users
who will have access to the dashboard, and ensure all of
their voices are heard. This is not as easy as it sounds, but
is worth the effort.
Figure 1. Dashboard Implementation Effort by Role and Phase: both
business users and technical staff should be involved throughout a
project's life cycle. (Roles charted: business sponsor; business
champion/SME; analysts (technical); developers; testing team;
architects/BI CoE/DBA. Phases charted: scope; data requirements;
data and dashboard design; build and UI update; user testing;
implement and change navigation.)
Define a use case for every component you build. There is
no point in creating a dashboard component unless
there is a direct use for it that can be easily defined and
documented. Documenting the requirements is crucial
to ensure business users get what they have asked for and
so developers and testers have a clear guide about what
they must build. You want to ensure that the use cases,
and the data shown, will stay meaningful over time for
each component; it is not a good idea to introduce new
or rarely used metrics with a dashboard solution. Finally,
require sign-off for all use cases, business requirements,
and scope documentation you create. The scope should
be limited to the business metrics and granularity of
the data at this stage; visualization requirements can be
developed later.
Know your data sources and plan your approach. You must
understand both where the data initially resides and,
if you use an extract-transform-load (ETL) or similar
process, where it will eventually reside. If storing the
data, you will need to know how it should be stored, how
long the data will need to be available for access, and how
often it needs to be refreshed. Especially if using ETL,
three-quarters of your work will be spent on the analysis,
load build and testing, and validation of the data. Even
without ETL, our experience is that the majority of the
time should be spent working with the data rather than
building the front end. Given the visual nature of dash-
boards, it is easy to assume that the bulk of your work is
spent building attractive, user-friendly interfaces. This
is simply not the case with successful implementations,
especially when so many easy-to-implement dashboard
tool suites are available.
Include only trusted, known metrics whenever possible.
Exceptions may arise, but if metrics are well known, the
exceptions will be much easier to validate. The sources of
the data must also be trusted, and business users should
be included in selecting data sources.
Know your refresh rate. Will the dashboard be loaded
monthly, weekly, daily, hourly, or a combination of
these frequencies? The fundamental dashboard design
approach will depend on your answer to this question.
Use cases will drive your design. Make sure you have
thorough discussions about what is really needed versus
what would be nice to have, because the more often the
dashboard will be refreshed, the more support (and cost)
it will require after rollout.
Identify all filters and selections. The earlier in the project’s
life you can do this, the better. This information has a
major influence on your dashboard design and will affect
decisions about performance and capacity. If a large
combination of multi-select filters can be selected for one
component, there will be a multitude of data combina-
tions to validate and possibly many thousands of rows
to be stored. Technologists can be tempted to impress
their business partners, but be careful not to promise
something that is not scalable or sustainable.
Understand what levels of aggregation and detail are required.
An early requirements exercise should involve the filters
and dimensions that will be used as well as how they
should be aggregated. Time periods are a common
dimension as are office or geographical territories. On
the flip side, sophisticated business users will inevitably
want to know the details behind what is driving their
trend or that one outlier metric. Not having a method of
either drilling down to (or otherwise easily accessing) the
detail behind the aggregation will frustrate users after
the post-implementation honeymoon period has ended.
Determining aggregation/detail needs should be part
of the discussions during requirements gathering, but
remember to balance your requirements with develop-
ment difficulty and desired timelines. If detailed data is
provided, it should be accessed directly via the dashboard,
whether through sub-reports or drill-down capabilities in
the components themselves, depending on your tool set.
Identify how much history you need. Some graphical trends
will require year-over-year comparisons. Beyond that, it
may be worth considering how long any data that no
longer appears on the dashboard should be retained.
If it does need to be retained for compliance or other
purposes, an archival strategy should be considered, or
possibly a view built on top of the base data. The more
the dashboard can be limited to querying only the data it
needs to display, the better it will perform.
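A database view is often the lightest way to enforce that limit. Here
is a minimal sketch using Python and SQLite (the table and column
names are assumptions): the dashboard queries only the view, which
exposes the trailing 24 months, while older rows remain in the base
table for retention without ever being scanned for display.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Base table retains full history; month_start is stored as 'YYYY-MM-01'.
cur.execute("CREATE TABLE claims_monthly "
            "(month_start TEXT, office_id INTEGER, claim_count INTEGER)")

# The dashboard reads only this view, so its queries never scan old history.
cur.execute("""
    CREATE VIEW v_claims_dashboard AS
    SELECT month_start, office_id, claim_count
    FROM claims_monthly
    WHERE month_start >= date('now', 'start of month', '-24 months')
""")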
Define the data testing and validation process. It is never
too early to address how you will ensure data quality
through a validation process. Defining specific responsi-
bilities and expectations, and what methods will be used
for validation, should happen even before design. This
will also ensure that business users will be ready when
they are asked to begin testing. The validity of the data
is the most critical factor in the dashboard’s success and
continued use.
Integrate business users. There are several ways to involve
business users in requirements gathering and refinement
besides letting them dictate while you take notes. These
options include:
■■
Prototype early and often. Prototyping can start with
simple whiteboard exercises, and many dashboard
tools now lend themselves to quick prototyping
so business users can see and play with something
similar to the final product deliverable. This hands-on
method is excellent for rooting out requirements gaps,
although it should not replace formal documentation.
■■
Use real data wherever possible when prototyping to
give business users a better context. It also helps you to
identify and correct data issues early.
■■
Integrate developers. Requirements gathering should
not be done solely by analysts. If there are separate
individuals responsible for coding, they must be
involved at this stage so they truly understand the
value and meaning of what they will build.
■■
Set expectations for production support. Agree upon
a process for communication of user questions or any
defects users discover. Depending on the user, this can
be done many ways, although users at the executive
level will likely prefer a direct communication path
with the team’s manager(s). Additional suggestions
appear in the post-implementation section later in this
article.
■■
Define milestone deliverables. Regardless of the
software development methodology you use, defining
milestone deliverables is critical for instilling and
retaining business confidence in the project. It is also
necessary to ensure the development team is progressing
as expected.
	
Milestone due dates should be communicated early
and deadlines met. If a deadline is at risk of being
missed, share this information (as well as the reasons
for the problem and the recommended course of
action) with business users so new dates and dead-
lines can be agreed upon or so the team can remove
items from the project scope or adjust resource levels
and assignments.
Required deliverables from the business requirements-
gathering phase may include:
■■
Scope lockdown, with documentation of what is in
scope and out of scope.
■■
Final prototype with business sign-off. (Note: This
remains a working prototype, and all team members
must understand and agree that the design may
change later in the project if practical.) The highest-
level sponsor of the project should be part of this
sign-off, as well as further sign-offs of the actual
product prior to rollout.
■■
Detailed requirements definitions, including images
from the prototype. Such documents can tie the
business definitions of the metrics to the way they will
be displayed. Such a connection will bring clarity both
to the business client and to the developers/analysts
building the solution.
■■
Technical conceptual design. This high-level docu-
ment defines all data sources and targets, what delivery
mechanisms are being used, and the general business
use case(s) for the dashboard.
Designing and Building the Dashboard:
Soup to Nuts
When dashboard design has begun, all layers should
be considered in relation to one another. For example, if
the dashboard will be connected to aggregated tables
designed for performance, those tables, the way they are
loaded (or otherwise populated), and any performance
and capacity concerns should be considered. This is just
as important as designing the dashboard functionality.
In general, the dashboard design should:
■■
Ensure a single, consistent view of the data. This can
apply to the visual look and feel as well as how often
the components on a screen are refreshed. The user
should not have to think about how to interpret the
dashboard; the data presentation should be clear
and intuitive.
■■
Keep everything in one place. If detailed data or supple-
mental reports are needed, use the dashboard like a
portal or ensure a centralized interface keeps the data
logically consolidated. Also, make sure the same data
source is used for both detailed and aggregated data
on the dashboard.
	
Keep in mind, however, that business users may expect
that a snapshot of the dashboard will not change. For
example, a monthly metric could possibly vary slightly
in the source data, but re-querying every time for the
dashboard view with different results could erode
confidence and even skew expected trends. Have a
conversation with business users early on to discuss
such scenarios and determine whether storing point-
in-time dashboard snapshots will be required.
■■
Understand the usage scenario. Knowing the size of the
user base, as well as the types of users and when they
will be accessing the dashboard, can drive design. You
should understand the usage volumetrics early in your
project and plan accordingly. You must also ensure
that any maintenance windows do not conflict with
peak-time use. Environment sizing, capacity, and
performance will all be critical to ensure a stable tool.
■■
Address multiple environments for development. If your
environment has the necessary capacity, build develop-
ment, test, and production environments. It’s worth it.
■■
Plan to validate data accuracy as early as possible, and
ensure your design and project plan allow this. To avoid
rework, it is crucial to make every effort to get the
data perfect and acquire sign-off in a lower database
environment during user testing. This will ensure
that the data acquisition process is free of bugs. At the
same time, ensure that you validate using data from
the production source system(s), because the data
will be well defined and likely have an independent
method of validation.
■■
Roll out with historical data available. Plan on migrating
all validated data to production tables along with
the rest of the code. Implementing a dashboard with
predefined history and trends will ensure a great first
impression and enhance user confidence.
In addition to these areas of focus, consider several
design best practices for both database/data quality and
dashboard interfaces.
Database-Level Best Practices
Ideally, your dashboard will be running in a stable
database environment. This environment may be man-
aged by your team or may be the responsibility of another
area of your company. Either way, your dashboard is
meant to provide data for quick and meaningful analysis,
so how you treat the data, and the tables in which it
resides, is critical. Some best practices include:
■■
Use ETL or other data acquisition methods to
regularly write to a highly aggregated, denormalized
table. This will ensure optimal performance, as
dashboard click events need to be fast. A good goal
is to ensure that no click event takes more than three
seconds to return data to the dashboard.
■■
Use predefined and well-maintained dimensional
tables wherever possible. This ensures consistency and
eliminates redundant data structures.
■■
Store the data using IDs, and reference static code or
dimensional tables wherever possible. This way, if a
business rule changes, only one table must be modified,
and no data has been persisted to a table that is now
outdated. (A brief sketch of this pattern follows
this list.)
■■
Design and model the data so the front end can
dynamically handle any business changes at the
source level. This eliminates the need to update the
code every time business users make a change, and
maintenance costs will be much lower. The develop-
ment team will then be able to work on exciting new
projects rather than just keeping the lights on.
■■
Detailed data should be kept separate and not reloaded
anywhere, if possible. However, it should be available
in the same database so the aggregate and related
detail can easily coexist.
■■
Unless absolutely necessary, do not store calculated
values or any data that is prone to business rule
changes. If persisted data becomes incorrect, it can
be a huge effort to re-state it. Calculated fields can be
done quickly using front-end queries or formulas (if
designed properly).
■■
Create a data archival strategy based on business
needs for data retention and how much history the
dashboard needs to show.
■■
Ensure that any queries from the dashboard to the
tables are well-tuned and that they will continue to
run quickly over time.
■■
Likewise, ensure that any middle-tier environment
used for running the dashboard queries is highly stable
and can take advantage of any caching opportunities
to enhance performance.
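Several of these practices can be seen working together in one small
sketch. The following Python/SQLite example (all table and column
names are illustrative assumptions) stores only IDs and pre-aggregated
measures in the aggregate table and resolves descriptive attributes
through a dimension table at query time, so a business rule change
means updating one dimension row rather than restating persisted
history.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes live here, keyed by ID.
cur.execute("CREATE TABLE dim_office "
            "(office_id INTEGER PRIMARY KEY, office_name TEXT, region TEXT)")
cur.executemany("INSERT INTO dim_office VALUES (?, ?, ?)",
                [(1, "Chicago", "Midwest"), (2, "Austin", "South")])

# Aggregate table: only IDs and presummarized measures, kept small and
# denormalized so dashboard click events return in seconds.
cur.execute("CREATE TABLE agg_claims "
            "(office_id INTEGER, month TEXT, claim_count INTEGER)")
cur.executemany("INSERT INTO agg_claims VALUES (?, ?, ?)",
                [(1, "2012-09", 420), (2, "2012-09", 315)])

# The dashboard query joins to the dimension at read time; renaming an
# office (a business rule change) touches one dim_office row, not the facts.
for row in cur.execute("""
        SELECT d.region, d.office_name, a.month, a.claim_count
        FROM agg_claims a JOIN dim_office d ON a.office_id = d.office_id
        ORDER BY d.region, d.office_name"""):
    print(row)
conn.close()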
Dashboard-Level Best Practices
Spending a great deal of time on getting the dashboard
data modeled, stored, automated, and correct will, of
course, all be for naught if the dashboard front end is
not intuitive, does not perform, or otherwise does not
have high availability. To address this, take these steps
throughout the life cycle:
■■
Check the dashboard usability by bringing in end
users who were not involved in the initial project.
Observe how quickly and easily they can meet their
objectives, and remove all bias as you watch. You will
need to plan for their participation well in advance,
and this work should be done early in your testing
(make sure to have production data at this point) so
there is time to react to their input.
■■
Within the dashboard code, implement dynamic
server configuration so all dashboard components can
automatically reference the proper environment for
the database, middle tier, and front end itself. This
reduces reliance on hard-coded server names and can
prevent deployments from accidentally pointing to the
wrong location.
■■
Users will want to use Excel regardless of how
well-designed your dashboard is. Make sure an Excel
export option is available for all the data shown on the
dashboard and any included reports.
■■
For every dashboard component, include a label
referencing the specific data source as well as the data
refresh date. This simple step resolves confusion and
will greatly reduce the number of support questions
you receive post-rollout.
■■
Do everything possible to avoid hard-coding filters,
axes, or any other front-end components that change
based on predictably changing business. The data and
the front end both need to be flexible and dynamic
enough to display information based on a changing
business. The dashboard should not have to display
invalid or stale data for a time period while the devel-
opment team scrambles to implement a production fix.
That would inevitably lead to a drop in user adoption
and reduced confidence in the dashboard’s validity.
■■
Test plans should include scripts for testing the overall
dashboard load time as well as specific load times for
all click events. This will afford the time needed to
tweak code for optimal performance.
■■
Near the end of testing, simulate a performance load
test whether you have automated tools to do this
or you have to do it manually with multiple users.
The purpose is to ensure no part of the underlying
infrastructure has an issue with load.
■■
Test boundary conditions to avoid unforeseen defects
later in the project's life. For example, what happens
when a multi-year trend goes into a new year? Will the
x-axis continue to sort properly? Define all conditions
like this and find a way to test each one. (A short
sketch of such a check follows this list.)
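The year-boundary question in the last bullet is easy to pin down with
a small automated check. This Python sketch, using assumed month
labels, asserts that an axis sorted chronologically keeps its order
when a trend crosses into a new year, something a naive alphabetical
sort silently breaks.

from datetime import datetime

# Month labels as a dashboard might render them, crossing a year boundary.
labels = ["Nov 2011", "Dec 2011", "Jan 2012", "Feb 2012"]

def axis_sort(values):
    # Sort by the real dates behind the labels, not by the label text.
    return sorted(values, key=lambda v: datetime.strptime(v, "%b %Y"))

# Boundary-condition test: chronological order must survive the new year.
assert axis_sort(labels) == labels, "x-axis loses order across the year boundary"

# An alphabetical sort, by contrast, scrambles the trend:
print(sorted(labels))  # ['Dec 2011', 'Feb 2012', 'Jan 2012', 'Nov 2011']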
Running the Project (and Subsequent Projects)
Considering the myriad of complexities involved in
implementing a dashboard, from ensuring correct data
is available when expected, to designing a usable and
innovative front end, to working with the business
through multiple and complex requirements, costs can be
high and timelines can easily be missed if the project is
not carefully managed.
The following procedures will help ensure a successful
dashboard release, all in the context of the best practices
explained so far:
Create and use an estimating model. The model should
cover all aspects of a dashboard release (from data to user
interface), all the technical roles and resources that will
be involved, and be sufficiently detailed to break down
time in hours by both phase and resource type. A model
that can be defined by selecting answers to requirements-
based questions will be the easiest for your analysts to
use, such as: How many metrics and components will be
displayed? How many data sources will be used? Does
data for validation exist?
The model should be refined after each large project by
entering the answers to these questions and determining
how closely the model’s hours match those actually spent.
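A toy version of such a model shows the mechanical idea; every
coefficient below is an assumption that you would calibrate against
your own project actuals, exactly as described above.

def estimate_hours(components, data_sources, validation_data_exists):
    """Toy estimating model; all coefficients are assumed and need calibration."""
    build = 6 * components + 20 * data_sources   # front-end and load build
    data_work = 3 * build                        # ~3/4 of the effort is data work
    validation = 40 if validation_data_exists else 120
    hours = {"build": build, "data": data_work, "validation": validation}
    hours["total"] = sum(hours.values())
    return hours

# Example: 8 components, 3 data sources, an independent validation source exists.
print(estimate_hours(8, 3, True))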
Data validation is your top priority. Plan and allocate the time
with your business partners and understand what data
sources you will use for validation. If there is no indepen-
dent source, you and your business users must reach an
agreement about how validation will be performed.
Share real data as soon as it becomes available and the
team has reasonable confidence in its accuracy. There
is no reason to wait to share data, regardless of how
early in the process this occurs, because the earlier data
defects are identified and resolved, the more smoothly the
subsequent processes will go.
As we’ve mentioned, we recommend you implement your
project with historical data loaded. If this is planned,
ensure that business users are aware and secure their
pledge to spend adequate time comparing and validating
the historical data.
Define phases of work and identify key deliverables for each.
Regardless of the development methodology your depart-
ment uses, you must align milestones to specific dates to
ensure the project does not get out of control and to keep
business users confident in your progress.
Depending on your business client and their expectations,
you may need to blend agile and waterfall methods.
Although this will not satisfy ardent practitioners of the
methodologies, a blended approach can allow for the
iterative testing and discovery that this type of work
requires while ensuring adherence to a strict timeline,
which a release of this complexity also requires.
Implementations are complex, so make a detailed plan. The
manager or lead of the project should define all the steps
needed, assign dates and responsible parties, and build
a T-minus document/project scorecard. These tasks
should be completed during the initial stages of the work,
soon after any intake approval and/or slotting, and the
document should be reviewed with the entire team at least
once a week to ensure the project is consistently on track.
Escalate all identified issues and risks early and often. If your
department already has a process for bringing issues and
risks into the open and to the attention of those who can
mitigate them, use it. Otherwise, create your own process
for the project. Enlist all stakeholders and technology
leaders for support, and do this proactively.
Review, review, review. Plan multiple design and code
reviews, and assume at least a draft and final review
will be needed for each major piece of work. Devote
ample time to design review, because waiting until the
dashboard is built may make recovery impossible if a
fundamental design flaw has gone unnoticed. Formalize
a method for tracking and implementing all changes
identified during reviews.
Keep the development team engaged. For example, if the
development team includes offshore resources, record
key meetings using Webinar technology. This can serve
as both knowledge transfer and training material later.
Make sure everyone knows about the recording and
ensure that no legal or compliance issues will arise.
Even though your work may be completed in phases,
dashboards can rarely be efficiently delivered if a “factory”
approach is used (in which requirements are passed to
designers, and designs passed to builders, without every-
one being involved). When a developer is far removed
from business users working on a dashboard project, this
can lead to project failure.
Implement a formal user-acceptance testing process. Once
the development team has completed all internal testing
of data and functionality, plan time (we recommend two
to three weeks) to allow the business team to complete
their tests. Testing should include as much data valida-
tion as possible. We recommend you formally kick off the
testing phase with business users and employ a docu-
mented process for submitting defects and questions to
the development team. Make this easy for your business
partners. They should focus on testing, not on how to
submit their test results.
Require sponsor/stakeholder approval before rollout. This will
give your dashboard legitimacy to the ultimate end users
and is invaluable for those early weeks when adoption
may hang in the balance. This approval should include
a presentation during which the sponsor can view and
provide feedback about the dashboard, with sufficient
time allotted to make adjustments. As mentioned,
we recommend you conduct sponsor reviews of the
dashboard throughout the project, including during
prototype design.
Post-Implementation (You’re Never Really Done)
After the dashboard is implemented, team members are
often tempted to relax. There may also be new projects
demanding focus.
Do not become distracted or complacent, because
there are certain post-implementation steps that will
ensure both that the few critical months after rollout
go smoothly and that the development team does not
become bogged down by production support or answer-
ing business questions.
First, build post-implementation work into the initial
plan. Sustainability and support should be factors in
scope and technical design.
For larger rollouts, have the sponsoring business group
and the technology team handle presentations together.
This way, both business and technical questions can be
answered accurately, all key partners are included, and
accountability is shared.
Post-rollout sponsorship and change navigation
coordination are crucial. The business unit will likely be
responsible for communications and training, but the
technology team can and should influence this.
If possible, ensure you have a method to collect usage
metrics. If you can identify usage by user ID, that is even
better: you can distinguish business usage from technology
usage, and you can identify groups for training if their
usage is lower than expected.
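If your dashboard tool does not capture usage natively, even a minimal
event log makes this analysis possible. A sketch with an assumed
schema:

import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("dashboard_usage.db")
conn.execute("CREATE TABLE IF NOT EXISTS usage_log "
             "(user_id TEXT, event TEXT, occurred_at TEXT)")

def log_usage(user_id, event):
    # Recording the user ID lets you separate business from technology
    # usage and spot groups whose adoption is lower than expected.
    conn.execute("INSERT INTO usage_log VALUES (?, ?, ?)",
                 (user_id, event, datetime.now(timezone.utc).isoformat()))
    conn.commit()

log_usage("jdoe", "dashboard_open")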
The development team can suggest and implement
innovative ways to communicate with users:
■■
Add a scrolling marquee to the dashboard or use some
other technique for instantly communicating impor-
tant messages. This component should be database
driven, and the technical support team should have
write access into a table separate from the dashboard’s
main tables. This way, announcements such as
planned downtime or key data load dates can be easily
delivered to all users.
■■
Add an e-mail button that goes directly to the
dashboard support team. This may not be a popular
choice for all technology teams, but dashboards are
Create an internal process
for ticket and defect handling,
and implement bug fixes in
small, bundled releases.
Understanding What’s Possible: Getting Business Value from Big Data QuicklyUnderstanding What’s Possible: Getting Business Value from Big Data Quickly
Understanding What’s Possible: Getting Business Value from Big Data Quickly
 
Careers in it
Careers in itCareers in it
Careers in it
 
TDWI Best Practices Report- Achieving Greater Agility with Business Intellige...
TDWI Best Practices Report- Achieving Greater Agility with Business Intellige...TDWI Best Practices Report- Achieving Greater Agility with Business Intellige...
TDWI Best Practices Report- Achieving Greater Agility with Business Intellige...
 
11 ntc sourcingit-final
11 ntc sourcingit-final11 ntc sourcingit-final
11 ntc sourcingit-final
 
Hdi Capital Area Program Slides May 18 2018
Hdi Capital Area Program Slides May 18 2018Hdi Capital Area Program Slides May 18 2018
Hdi Capital Area Program Slides May 18 2018
 
Are you getting the most out of your data?
Are you getting the most out of your data?Are you getting the most out of your data?
Are you getting the most out of your data?
 
INFORMATIONGOVERNANCEFounded in 1807, John W
INFORMATIONGOVERNANCEFounded in 1807, John WINFORMATIONGOVERNANCEFounded in 1807, John W
INFORMATIONGOVERNANCEFounded in 1807, John W
 
HDI Capital Area Local Chapter March 2016 Meeting
HDI Capital Area Local Chapter March 2016 Meeting HDI Capital Area Local Chapter March 2016 Meeting
HDI Capital Area Local Chapter March 2016 Meeting
 
Big Data Webinar 31st July 2014
Big Data Webinar 31st July 2014Big Data Webinar 31st July 2014
Big Data Webinar 31st July 2014
 
Data Science & Big Data Analytics Discovering, Analyzing
Data Science & Big Data Analytics Discovering, AnalyzingData Science & Big Data Analytics Discovering, Analyzing
Data Science & Big Data Analytics Discovering, Analyzing
 
Slalmd2014 cid presentation
Slalmd2014 cid presentationSlalmd2014 cid presentation
Slalmd2014 cid presentation
 

Recently uploaded

Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraDeakin University
 
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your BudgetHyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your BudgetEnjoy Anytime
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2Hyundai Motor Group
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxnull - The Open Security Community
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphNeo4j
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 

Recently uploaded (20)

Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning era
 
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your BudgetHyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
The transition to renewables in India.pdf
The transition to renewables in India.pdfThe transition to renewables in India.pdf
The transition to renewables in India.pdf
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 

BI "Governments" for Healthcare

  • 5. From the Editor
Speed is on everyone's mind these days. From real-time data to on-demand reporting, BI professionals want up-to-the-minute information and they want it now. The authors in this issue of the Business Intelligence Journal understand.
Agile development methodologies have long promised speedier delivery of new applications or features thanks to shorter development cycles and increased user collaboration. Paul DeSarra explains how an agile approach can be leveraged to meet the highly dynamic needs of business; he uses an agile dashboard project to illustrate his ideas. Dashboards are a quick and easy way to communicate key performance indicators, and Doug Calhoun and Ramesh Srinivasan provide tips and best practices for creating a successful dashboard design. An agile approach may also be what's needed for mobile development at a maternity clothes maker, the subject of our Experts' Perspective. Timothy Leonard, William McKnight, John O'Brien, and Lyndsay Wise offer their advice for getting mobile BI up and running quickly.
Of the three leading characteristics of big data (the so-called 3 Vs: volume, variety, and velocity), it's the speed component that is often cited as its downfall. How can you process so much data without becoming bogged down? Jorge A. Lopez describes one approach. John Santaferraro discusses how analytics must be offloaded to separate analytics databases if big data is to provide accelerated queries, faster batch processing, and immediate access to a robust analytics environment.
Senior editor Hugh J. Watson notes that studies suggest enterprises will soon face a shortage of data scientists. He explains that we will have to give business analysts and data scientists wider and more in-depth permissions and provide more training for existing staff if we're to keep up with current trends.
Healthcare organizations face a variety of tough governance challenges. Jason Oliveira explores what can be learned from other governance and services organizations that have adopted business intelligence competency centers (BICCs) and how to apply that knowledge to help improve healthcare's BI disciplines.
Speed can present challenges, which is why our Q&A explores how gaming companies are on the bleeding edge of analytics, using real-time information to improve gameplay (as well as up-sell or cross-sell products or services to players).
How are you keeping up? We welcome your feedback and comments; please send them to jpowell@tdwi.org.
  • 6. The Necessary Skills for Advanced Analytics
Hugh J. Watson
Hugh J. Watson is a C. Herman and Mary Virginia Terry Chair of Business Administration in the Terry College of Business at the University of Georgia. He is a Fellow of TDWI and senior editor of the Business Intelligence Journal. hwatson@uga.edu
Analytics work requires business domain knowledge, the ability to work with data, and modeling skills. Figure 1 identifies some of the skills in each area. The importance of particular skills and the exact forms they take depend on the user and the kind of analytics involved. Let's take a closer look.
It is useful to distinguish among business users, business analysts, and data scientists. Business users access analytics-related information and use descriptive analytics tools and applications in their work—reports, OLAP, dashboards/scorecards, and data visualization. They have extensive business domain knowledge and are probably familiar with the data they are accessing and using but have a limited need for and understanding of modeling.
Figure 1. Skills needed for analytics. Business domain: goals, strategies, processes, decisions, communication of results. Data: access, integration, transformation, preparation. Modeling: methods, techniques, and algorithms; tools and products; methodologies.
  • 7. Business analysts use analytical tools and applications to understand business conditions and drive business processes. Their job is to access and analyze data and to provide information to others in the organization. Most business analysts are located in the functional areas of a business (such as marketing) and perform analytical work (such as designing marketing campaigns), or they may work in a centralized analytics team that provides analytics organizationwide. Depending on their positions, business analysts work with some combination of descriptive, predictive, and prescriptive analytics. They tend to have a good balance of business domain knowledge as well as data and modeling skills.
The data scientist title is taking hold even though it sounds elitist (I've also heard the term data ninja). Data scientists typically have advanced training in multivariate statistics, artificial intelligence, machine learning, mathematical programming, and simulation. They perform predictive and prescriptive analytics and often hold advanced degrees, including Ph.D.s, in fields such as econometrics, statistics, mathematics, and management science. Companies don't need many data scientists, but they come in handy for some advanced work. Data scientists often have limited business domain knowledge, the ability to handle data related to performing analytics (e.g., data transformations), and strong modeling skills. They often move from project to project and are paired with business users and business analysts so that necessary domain knowledge is included on the team.
Most companies have moved along the BI/analytics maturity curve and now have business users who understand and can employ descriptive analytics and business analysts who can deliver descriptive and some predictive analytics. Interest is now focusing on the organizational capability to perform predictive and prescriptive (that is, advanced) analytics to answer why things happen and propose changes that will optimize performance. This explains why enterprises are employing more data scientists. Successful advanced analytics requires a high level of business domain, data, and modeling skills, and a team of people is often required to ensure that all of the skills and perspectives are in place. As an example, consider the following experience.
Southwire: Bringing the Skills Together
Several years ago, I received a call from a manager at Southwire, a leading producer of building, utility, industrial power, and telecommunications cable products and copper and aluminum rods. He wanted help solving an impending problem associated with the production of copper, a key component of many of his company's products. My experience on that project (in particular, how the problem was approached and solved) provides a good example of the skills required to be successful with advanced analytics.
I learned that there is no set "formula" for manufacturing copper. A variety of ores and other ingredients are used depending on what is available. The current approach involved an expert who would look at what materials were on hand and decide what and how much of each ingredient should be used. It was critical that the ingredients produced copper and that the copper would be viscous enough to flow out of the smelter and refining furnace. The problem was that the expert was retiring soon and his expertise was going to be lost. A new solution approach was needed.
Southwire assembled a team of business people, chemical engineers, IT, and me. We had individuals with business knowledge, subject area experts, people who were familiar with available data and systems, and members with modeling expertise. The team contained all the skills needed for advanced analytics. My role was to provide the modeling (data scientist) skills. I saw two possible modeling approaches. The first option was to create an expert/rules-based system based on the knowledge of the retiring expert. We would capture in an application the heuristics that the expert used in deciding what to put into the smelter each day. The model would be descriptive in that it would describe the expert’s thinking.
  • 8. The other alternative, and the one chosen, was to use linear programming. If you are familiar with linear programming, Southwire's problem was the classic production blending application. You create an objective function (that is, an equation) that you want to minimize: the sum of the cost of the various ingredients multiplied by the quantity of each ingredient. You also write constraint equations that reflect the conditions that the solution must satisfy. The output of the analysis is the quantity of each ingredient that will minimize costs while satisfying all of the constraints.
The writing of the constraint equations was fascinating to me. Remember that the solution had to produce copper and it had to be sufficiently viscous. These requirements were handled through the constraint equations and reflected what ingredients were available and the chemical reactions involved. The chemical engineers' input was critical for developing these equations. Remember when you took chemistry in high school or college and studied valences (the number of bonds a given atom has formed, or can form, with one or more other atoms)? This and other factors (such as what ingredients were available each day) were incorporated into the constraint equations.
Data scientists are not a "one-trick pony" when it comes to modeling. They are familiar with multiple modeling approaches and algorithms. They are able to identify and experiment with different models until they find the one that seems most appropriate. At Southwire, a linear programming modeling approach was selected over an expert/rules-based system.
Once the objective function and constraint equations were developed, it was necessary for IT and me to select an appropriate linear programming package, enter the objective function and constraint equations, test the solution, develop a user interface that operational people could easily use for entering data (such as ingredients) and interpreting the output, implement the system, and train people to use it.
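To make the blending formulation concrete: if $x_i$ is the quantity of ingredient $i$, $c_i$ its unit cost, $a_{ij}$ its contribution to requirement $j$ (copper yield, viscosity), $r_j$ the minimum for that requirement, and $u_i$ the amount on hand, the model is

$$\min_x \sum_i c_i x_i \quad \text{subject to} \quad \sum_i a_{ij} x_i \ge r_j \ \text{for each requirement } j, \qquad 0 \le x_i \le u_i.$$

The sketch below solves a toy instance with SciPy. Every ingredient, cost, and coefficient here is hypothetical; the article does not disclose Southwire's actual data.

```python
# Minimal sketch of a production blending LP, assuming invented numbers.
from scipy.optimize import linprog

cost = [420.0, 310.0, 280.0]       # $/ton for three hypothetical ingredients

# linprog minimizes cost @ x subject to A_ub @ x <= b_ub, so each ">="
# requirement is negated. Row 1: at least 50 tons of copper in the charge.
# Row 2: a viscosity index of at least 1.0 for the melt.
A_ub = [
    [-0.60, -0.45, -0.30],         # -(copper fraction per ton of ingredient)
    [-0.020, -0.015, -0.008],      # -(viscosity contribution per ton)
]
b_ub = [-50.0, -1.0]

bounds = [(0, 60), (0, 80), (0, 100)]   # tons of each ingredient on hand today

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x, res.fun)              # optimal tons per ingredient, total cost
```

In day-to-day use, the bounds (and any coefficients tied to that day's materials) would be regenerated each morning, which is precisely the decision the retiring expert had been making by intuition.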
Assembling the Skills
Enterprises have the business domain knowledge for advanced analytics. However, as illustrated at Southwire, a key to success is to make sure that people with business domain knowledge are on the analytics team.
Enterprises also have the required data skills, but a few changes may be necessary to accommodate their need for advanced analytics. Data scientists (and some business analysts) may need to have fewer restrictions on the data they can access and what they can do with it. They may need access to underlying data structures as well as the ability to join, transform, and aggregate data in ways necessary for their work. They may also need the ability to enter new data into the warehouse, such as from third-party demographic data sources. A possible solution to the potential conflict over control versus flexibility is an analytical sandbox, whether it is internal to the warehouse or hosted on a separate platform.
Finding the required modeling skills is a trickier proposition. You can hire consultants, as Southwire did, or use a third-party analytics provider, but these options can become costly over time if your plans include extensive advanced analytics. You can probably coach some of your current business analysts. There are many conferences (such as those offered by TDWI), short courses, and university offerings that teach advanced analytics. As advanced analytics becomes better integrated into application software (for example, campaign management) and easier to use, it is likely that trained business analysts can take on tasks that have skill requirements typically associated with data scientists.
  • 9. You can also hire data scientists. This isn't a new approach; many companies have already done so and have data scientists scattered throughout their business units or in specialized groups such as analytics competency centers. Studies suggest that companies are planning to hire more data scientists and will face a shortage of such resources. For example, the McKinsey Global Institute predicts a shortfall of between 140,000 and 190,000 data scientists by 2018 (Manyika et al., 2011).
Many universities are gearing up to meet the need through degree programs, concentrations, and certificates. These offerings are usually in business, engineering, or statistics and the instructional delivery varies from on campus to online. One of the first and best-known programs is the Master of Science in Analytics at North Carolina State University. SAS has been an important contributor to the program, which is offered through the Institute for Advanced Analytics and has its own facility on campus. Deloitte Consulting has partnered with the Kelley School of Business at Indiana University to offer a certificate in business analytics for Deloitte's professionals. Just this year, Northwestern University rolled out an online Master of Science in Predictive Analytics offered through its School of Continuing Studies.
Will students take advantage of these programs in large enough numbers? Advanced analytics is a tough study, and many students may not have the necessary aptitude, inclination, and drive to complete the programs, even though the career opportunities are great.
Summary
You have already been performing analytics under the BI umbrella. BI includes descriptive analytics, and you have probably also been performing predictive analytics. For more advanced analytics, however, you will need to "ramp up your game" a little. You have the business domain knowledge covered. For the data component, you will need to grant business analysts and data scientists wider or more in-depth permissions and you will likely need to extend and enhance your analytical platforms (such as appliances and sandboxes). For the modeling skills, you will probably need to provide training for existing staff and bring in new people with specialized analytical skills. ■
Reference
Manyika, James, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, and Angela Hung Byers [2011]. Big Data: The Next Frontier for Innovation, Competition, and Productivity, McKinsey Global Institute, May. http://www.mckinsey.com/insights/mgi/research/technology_and_innovation/big_data_the_next_frontier_for_innovation
  • 10. BI Dashboards the Agile Way
Paul DeSarra
Paul DeSarra is Inergex practice director for business intelligence and data warehousing. He has 15 years of BI strategy, development, and management experience working with enterprises. pdesarra@inergex.com
Abstract
Although the concept of agile software development has been around for more than 10 years, organizations only recently began to think about how this methodology can be applied to business intelligence (BI) and analytics. BI teams are continually evolving their rapid delivery of additional value through reporting, analytics, and dashboard solutions. These teams must also discover what types of BI solutions can reinvigorate a BI deployment and produce meaningful results. One way to reinvigorate BI deployments is to take the concept of agile software development and apply it to BI initiatives such as BI dashboard solutions, which can both re-engage the business and drive actionable intelligence and confident decision making. Agile BI replaces traditional BI project methods (heavy in documentation and process) with a user-driven approach. This article discusses an approach to building BI solutions and dashboards using an agile software development methodology.
Introduction
Although the concept of agile software development has been around for more than a decade, it's only been in the last few years that organizations have started to examine how this methodology can be applied to business intelligence and analytics. The constantly changing, highly dynamic needs of business today have increased the demands on BI environments and teams. The pressure to be more organized, turn projects around faster, and ensure user adoption at all levels is increasing. Teams need to be able to react to demands from the business and proactively develop ideas and solutions that give the business more creative ways to think about how to use data. Leveraging an agile software methodology as it applies to business intelligence is a great way to meet these constantly changing business needs.
  • 11. In a nutshell, using an agile software development methodology ("agile") instead of a traditional development methodology allows end users to experience a version of the software product sooner. Instead of adhering to a strict and intensive requirements and design phase before development begins, agile employs a series of shorter development cycles to increase user collaboration. The agile approach welcomes changes during the development process to provide a better product that delivers measurable value quickly and efficiently.
There are four guiding principles for agile software development (according to the Manifesto for Agile Software Development, www.agilemanifesto.org). These can also be applied to business intelligence development efforts.
Principle #1: Value individuals and interactions over processes and tools
Traditional BI development focuses on strong processes and tools to solve development challenges. As a result, many organizations end up creating silos among the business and IT teams. Each team silo focuses on individual responsibilities and objectives and, in effect, each team loses sight of the overall project goal of providing cohesive and comprehensive data-driven solutions that improve performance levels.
When using an agile BI approach, all those involved in the BI initiative work together as one team with one goal and set of objectives. To accomplish this, many organizations create hybrid teams and a business intelligence competency center (BICC) composed of individuals with the necessary skills to define, architect, and deliver analytic solutions. In some cases, many of these teams are organized under business units outside of IT and the program management office.
Principle #2: Value working software over comprehensive documentation
Traditional BI development in a big-bang approach focuses on developing detailed documentation about common metrics, terminologies, processes, governance, support, business cases, and data warehouse architectures. Organizations may create a standardized enterprise data warehouse and then fail because they were focused on the documentation and lost touch with the business and the problems they were trying to solve.
This doesn't mean we should stop creating detailed documentation. BI teams can and should continue to focus on creating documentation that emphasizes the vision and scope as well as the architecture for future support. With agile BI, the focus is not on solving every BI problem at once but rather on delivering pieces of BI functionality in manageable chunks via shorter development cycles and documenting each cycle as it happens.
Principle #3: Value customer collaboration over contract negotiation
Using an agile BI approach does not mean giving users an unlimited budget or tolerance for changes. Instead, users can review changes discussed in the last development cycle to ensure expectations and objectives are being met throughout the project.
Traditional BI development teams use functional documentation to discuss what the solution will look like and how it will operate. Such an "imagine this" method often leaves users to try and visualize what they believe the solution will become. The resulting subjective expectations can quickly derail a BI project.
In contrast, an agile methodology reviews progress during each development cycle using prototypes so that stakeholders and business users can see how the BI solution is expected to look and function.
  • 12. Prototypes put a visual "face" to the project by showing what data is available, how it will be used, and how it will be delivered.
Principle #4: Value responding to change over following a plan
With an agile methodology, traditional BI projects that focus on huge project and resource plans are replaced by shorter development cycles designed to better incorporate changes and keep the project team focused and informed. For BI projects, changes should be expected and welcomed. When users see prototypes and gain a better understanding of what analytic capabilities and information are available, they are better able to communicate how they could use that information to make improved business decisions. Such revelations and ideas only strengthen the final product.
An agile BI project still uses a plan, but its plan is short, manageable, and coupled with a prototype users can see and experience. Changes are jointly reviewed with business sponsors, users, and IT professionals at every project stage.
Example: An Agile Dashboard
To better understand how this methodology can be used, let's look at a real-world example of incorporating agile BI into a BI dashboard project for an executive sales team. The vice president of sales of a large manufacturing organization asked us to help his company gain better insight into its orders, shipments, and pipeline in order to hold the sales teams more accountable. Specifically, he wanted a dashboard that he and his executive team could use to meet accountability objectives. His vision for the dashboard was solid, and our role was to take that vision and boil it down to key metrics that would drive actions.
After a few meetings with the vice president of sales and the IT sponsor (in this case, the IT director), we concluded that an agile BI dashboard project was the best approach. We ensured we had the needed sponsorship from both the business and IT teams. In addition, we confirmed the organization was using a BI tool that was capable of delivering the desired solution. We advanced the project using a hybrid approach to agile development, breaking the project into three phases to quickly and efficiently develop the scope, build prototypes, conduct reviews, develop the solution, and implement it quickly.
Phase 1
This was the foundational phase for our project and focused on the third agile principle ("customer collaboration over contract negotiation"). Phase 1 should last no more than one week and involves identifying, at a high level, the scope of the BI dashboard to ensure that the executive sponsors are engaged and the internal teams are assembled. Phase 1 is essential because it is used to narrow the scope and prioritize what can be delivered in the set time frame.
Figure 1. The agile process: scope, prototype, stakeholder reviews, build, release 1…N.
  • 13. In the first week, we worked with IT and the vice president of sales to ensure that the team had the right people with the right skills who understood the project goals. We outlined roles and responsibilities, opportunity and vision, and the high-level scope—all standard practices for an agile BI project.
We worked with the vice president of sales along with several key business users to identify the metrics of greatest value. We worked diligently to understand what metrics were needed and how they influenced business decisions. A dashboard metric isn't enough; we strived to enable users to respond to each metric to achieve the best business results. For example, we examined what happened after the dashboard highlighted a large gap between what the customer relationship management (CRM) application identified as a sales opportunity and the revenue actually generated. We asked questions about the process of capturing these opportunities in the CRM to better understand leading factors that could influence revenue. Delving into these questions ensured that we understood the full sales engagement process.
We didn't stop there. We identified about 10 metrics for invoicing, orders, shipments, and budgets across four different dimensions—business area, product, customer, and date attributes.
Figure 2. Dashboard prototype examples.
  • 14. In our vision, the dashboard would allow the sales teams to focus more effectively on specific sales opportunities, better track budgets, and confidently predict and forecast sales throughout the year (and know where and how to make necessary adjustments). We held two meetings with the IT team to better understand the ability of the source systems to provide the data elements required.
The Phase 1 deliverables included a high-level vision and scope document that clearly set the stage for the rest of the project. By quickly defining the vision and scope as well as establishing a short time frame, we removed one barrier (long contract negotiations and timelines) so we could focus on having the right people involved and the right team defined. Phase 1 was completed in one week.
Phase 2
Phase 2 is where collaboration, rapid prototyping, whiteboard sessions, and interactive brainstorming take place. This phase applies three of the agile principles ("individuals and interactions," "customer collaboration," and "responding to change"). Phase 2 focuses on using prototyping methods in brainstorming sessions to quickly build and show business users how their ideas and needs are reflected in the proposed solution—sometimes iteratively and on the fly. The prototyping tool may be separate from your BI tool, but it must be able to demonstrate visual elements as well as drill-up, drill-down, and interactivity. This phase requires collaboration between the sponsors, key business users, and IT. A key benefit of this phase is that users "see" the data in action and will know whether the data is being presented in a way that effectively delivers the information they need. In fact, the process often gives users new ideas for using the information to make business decisions (see Figure 2).
In Phase 1 we created our vision and scope, outlined the business problem, and understood the set of metrics and dimensions necessary to reach the desired outcome. We approached Phase 2 with two goals in mind:
■ Collaborate with the vice president of sales and the sales teams to define the "look" of the BI dashboard and the data interactions required to populate it.
■ Work with the IT team to determine the data components and further understand what could be accomplished and delivered by the project deadline. (The overall project length was seven weeks, so we had only six weeks left.)
The collaboration sessions were held with the vice president of sales, several key business users, and individuals from the IT team. The meetings started as whiteboarding sessions. Once we completed the initial design, we built a prototype with phony (but business-sensible) data and set up daily meetings to review and refine our development cycles. In each session, we identified how and why metrics were to be used and outlined the decisions that would be made using the data. We evaluated different ways to display information so it would be most useful to users. We also mocked up the drill-through detail analysis and reporting that would be available via the easy-to-understand dashboard and made sure only a single path led to the detail at each level.
  • 15. The resulting dashboard prototype had four quadrants, each of which was meant to answer a specific question:
■ How are we performing today?
■ Are we on plan and what is our updated forecast?
■ Where are we winning and losing?
■ Who and what is not profitable?
The mockup took the form of charts, regional maps, and dynamic and color-coded lists. It also included detailed drill-through paths and report examples to help guide users in making decisions. For example, a user could click on a troubled region on the map that identified a large revenue gap based on forecasting and get details on current activity within that region as well as open opportunities and win/loss details.
All in all, we held about 10 different business sessions and kept coming back the next day with a refined prototype to generate ideas. As a result, throughout the entire process, users were engaged, excited, and willing to participate in the sessions. They also felt confident that their needs were being addressed and their ideas and feedback were incorporated.
We simultaneously worked on the data components to map the vision to the actual data sources. To do so, we had to remove several roadblocks and make some tough decisions as a team (IT and business) in order to meet our deadline. As the team forged ahead, we uncovered several items that needed to be worked through as quickly as possible.
■ A few financial metrics were not in the current ERP but would be implemented in an upgraded version, which was set to go live the following year. We worked with the business to outline the metrics and ultimately decided to put them on hold so that we could continue building the rest of the dashboard.
■ There was a need to tie in a certain product category captured in a separate data source outside the ERP. The product category was required to ensure we were capturing the full picture. This product was set to be coded in the new ERP. We decided to pull in and map this information from the separate data source and also put in place a process to map it into the new ERP when the time was right.
We uncovered more than 15 potential roadblocks to the initiative, and we worked through them all with the team. We kept everyone informed and made joint IT/business decisions to move forward—accomplishing this with daily status meetings with IT and the business subject matter experts to address issues quickly and outline resolutions. Sponsors and stakeholders were also part of weekly checkpoint meetings.
After we removed all our technical and business roadblocks, we completed Phase 2 and delivered the prototype dashboard, drill-through mockups, and a "Lean Requirements" document that captured the requirements and outlined the assumptions and decisions we had made. We also built a "Lean Design" document that described the database design, data mapping, reporting designs, and ETL construct. Phase 2 was completed in four weeks.
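Read one way, the Phase 2 deliverable amounts to a declarative specification: each quadrant pairs a business question with its metrics and a single drill path. The sketch below captures that idea in plain Python; the quadrant names, metric names, and drill paths are hypothetical, since the article does not publish the team's actual definitions.

```python
# Hypothetical, simplified spec for the four-quadrant sales dashboard.
# All names and paths are illustrative, not the project's real ones.
DASHBOARD = {
    "performance": {
        "question": "How are we performing today?",
        "metrics": ["invoiced_revenue", "orders", "shipments"],
        "drill_path": ["business_area", "product", "customer"],
    },
    "plan_vs_forecast": {
        "question": "Are we on plan and what is our updated forecast?",
        "metrics": ["budget", "forecast", "variance_to_plan"],
        "drill_path": ["business_area", "date"],
    },
    "win_loss": {
        "question": "Where are we winning and losing?",
        "metrics": ["win_rate", "open_opportunities", "revenue_gap"],
        "drill_path": ["region", "customer", "opportunity"],
    },
    "profitability": {
        "question": "Who and what is not profitable?",
        "metrics": ["margin", "cost_to_serve"],
        "drill_path": ["customer", "product"],
    },
}

# One ordered drill path per quadrant mirrors the rule that only a
# single path should lead to the detail at each level.
for quadrant, spec in DASHBOARD.items():
    assert len(spec["drill_path"]) >= 1, quadrant
```

Keeping the specification in one small, reviewable structure gives the daily sessions something concrete to refine and the change control log something concrete to track between development cycles.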
  • 16. Phase 3
Phase 3 is the "build" phase and applies the second agile principle ("working software"). The foundation has been set, the scope has been refined to ensure rapid delivery, IT and business are fully engaged, and now the time has come to take the prototype and construct it within the BI environment. Phase 3 should take no more than a few weeks and involves building integration and ETL procedures, security, and the BI solution itself.
At this point, with two weeks left, we began to build the required dashboard, drill-through reports, and supporting data layers. Building everything in dynamic prototypes made it much easier to ensure expectations were in line as development progressed. During this phase, we continued to show the results of actual development of the dashboard every two days to the business sponsor and key users.
Throughout this process, changes were still submitted. We reviewed all changes and put them into one of two buckets—implement or put on hold—and made notes in our change control log. Some of the change requests that flowed through in this phase revolved around adding different relative time-period buckets for revenue and margin analysis, some minor layout changes, and three changes that were put on hold for future phases around customizing various alternate drill paths from the dashboard based on a user's business unit and region. Phase 3 was completed in two weeks.
Tips for Agile BI Success
In the end, the initial phase of the dashboard was released in seven weeks. The project was a success because of the agile BI processes applied to every aspect of the project. One of the core success factors was the use of prototypes and interactive sessions. Using prototypes enabled us to keep all players involved from the beginning and provided a forum to exchange ideas, discuss issues, and actually "see" the solution as its development progressed.
After reading the case study, perhaps you are now thinking, "Can organizations really implement these types of BI solutions in seven weeks?" You may be asking, "What about data governance, load procedures, ETL, business rules, capacity planning, and security maintenance?" The reality is that you must strike a balance when using the agile software development methodology for your BI initiatives. The process walks the line of ensuring that you are building a solid foundation that has longevity, speed, and strength to weather the dynamic and demanding needs of the business. The following ideas and concepts can help you implement an agile BI process.
Tip #1: Start small, think big
When you begin to build an agile BI solution, it doesn't matter if you have an enterprise data warehouse coupled with a large-scale, mega-vendor BI software stack or a small data mart managed with a niche tool. The key is to focus on the immediate business need and pain, then map that to the ultimate vision. Get the stakeholders to define and work with you to build out what it will look like. Once you have the vision, determine the best approach that completes the work quickly and keeps the long-term picture in mind. If you need to take shortcuts to get the work done, that's fine, as long as everyone approves the shortcuts and you have a process in place to close the loop at a future point.
For example, if you have an ERP application and you have to group some of your sales data into a customized dimension (instead of modifying the ERP source of records) in order to deliver the BI solution, then do so, but ensure that you get approval and that everyone understands the costs and benefits.
  • 17. Although you are building a specific solution, you can still take steps to ensure it is repeatable, scalable, and fits into your overall data architecture. For example, in our case study, we were building a specific BI dashboard solution that was focused on shipments, orders, and pipeline processes specific to the sales functional area. However, in creating the solution, we built a data design that could scale outside of sales by building conformed dimensions and process-driven fact tables. If, for performance reasons, we had to create summary or aggregated tables to support specific business areas, we made sure these mapped back to lower-grain fact tables for data consistency and detail analysis.
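A minimal sketch of that aggregate-to-grain discipline, with pandas standing in for the warehouse's SQL engine; the table and column names are hypothetical. The point is that the summary table is always derived from the lower-grain fact table, never maintained separately, so the two can be reconciled mechanically.

```python
# Sketch: keep an aggregate consistent with its lower-grain fact table.
import pandas as pd

# Grain-level fact table (one row per order line), invented data.
fact_order_line = pd.DataFrame({
    "business_area": ["East", "East", "West", "West"],
    "product":       ["A", "B", "A", "B"],
    "revenue":       [100.0, 250.0, 175.0, 60.0],
})

# Summary table built *from* the grain-level fact for performance.
agg_by_area = (fact_order_line
               .groupby("business_area", as_index=False)["revenue"]
               .sum())

# Consistency check: the aggregate must roll back up to the detail.
assert agg_by_area["revenue"].sum() == fact_order_line["revenue"].sum()

# Detail analysis still drills back to the lower grain.
east_detail = fact_order_line[fact_order_line["business_area"] == "East"]
```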
The difference from traditional BI development is that this document focuses on smaller, shorter deliverables and keeps the documentation "lean." If you need to take shortcuts to get the work done, that's fine, as long as everyone approves the shortcuts and you have a process in place to close the loop at a future point.
■■ A design document describes the database design, data mapping, reporting designs, and ETL constructs. Unlike in a traditional BI project, this design should focus on bringing the technical team together on the architecture and on future support without getting lost in too many details.

■■ A project baseline plan for delivering a piece of functionality quickly, with the longer-term plan represented at a higher level.

■■ A change control log to track which changes are implemented and which are put on hold.

■■ An enhancement log to track enhancements that the team is unable to fit into the first release.

If you have obtained the right sponsorship at the start and ensured everyone has the same vision and understands the project, your ability to remove roadblocks will be improved. Inevitably, however, challenges will arise, so always keep one eye on the vision and one on the scope.

Tip #3: Engage the business

BI professionals sometimes get so focused on the technology that even after the initial meeting with business users they may flip back to thinking mostly about the tools and technology rather than the business's pains, needs, and objectives.

In our case study, we used rapid prototyping and whiteboarding sessions to gather requirements and keep the right people involved and working in unison. We had daily brainstorming sessions to promote collaboration on the design, discuss the metrics and information needed to make business decisions, and show the BI dashboard prototype's progress. From this, we built a requirements document that was focused on the key metrics and data elements, and we incorporated visuals from our prototype to ensure we had everything captured.

Once we completed this phase and moved to full development, keeping the key users involved continued to be highly important. During development, we still met at least twice a week to review progress and update our change control logs as we showed progress on the BI dashboard solution. The prototype became our guide to ensuring the development was on course and meeting all expectations.

Summary

As business becomes more dynamic and social in nature, BI environments need to be prepared to move fast and deliver value in creative ways. Intertwining BI best practices with the agile software methodology is one way to infuse speed, creativity, commitment, and value into any BI project. ■
Best Practices for Turning Big Data into Big Insights

Jorge A. Lopez

Jorge A. Lopez is senior manager, data integration, for Syncsort Incorporated. jlopez@syncsort.com

Abstract

Big data is surfacing from a variety of sources—from transaction growth and increases in machine-generated information to social media input and organic business growth. What does an enterprise need to do to maximize the benefits of this big data? In this article, we examine several best practices that can help big data make a difference. We discuss the role that extract, transform, and load (ETL) plays in transforming big data into useful data. We also discuss how it can help address the scalability and ease-of-use challenges of Hadoop environments.

Introduction

Growing data volumes are not a new problem. In fact, big data has always been an issue. Fifty years ago, "big data" was someone with a ledger recording inventory; more recently, it was a bank's mainframe processing customer transactional data. Today, new technologies enable the creation of both machine- and user-generated data at unprecedented speeds. With the growing use of smartphones and social networks, among other technologies, IDC estimates that digital data will grow to 35 zettabytes by 2020 (IDC, 2011).

These new technologies have turned big data into a mainstream problem. In fact, it is not uncommon to see small and midsize organizations with just a few hundred employees struggling to keep up with growing data volumes and shrinking batch windows, just as large enterprises do. The viability of many businesses will depend on their ability to transform all this data into competitive insights. According to McKinsey (Manyika, et al, 2011), big data presents opportunities to drive innovation, improve productivity, enhance customer satisfaction, and increase profit margins. Although many CIOs and CEOs recognize the value of big data, they have struggled
(through no fault of their own) to handle this new influx of data. The problem isn't information overload; it's the failure to harness, prioritize, and understand the data flowing in. This is why data integration is a critical—yet often overlooked—step in the big data analytics strategy.

Traditional IT approaches will not generate the results businesses expect in this era of big data. Therefore, IT organizations should look at the hype around big data as an opportunity to set a new strategy for harnessing their data to improve business outcomes. As a first step, organizations must examine their existing data strategies and ask: Are these data strategies helping us achieve the objectives of the business? Can our environment economically scale to support the requirements of big data? Can our infrastructure quickly adapt to new demands for information?

To take full advantage of new sources of information, organizations must cut through the buzz that big data creates. There are many definitions of big data, but most experts agree on three fundamental characteristics: volume, velocity, and variety. Another key aspect, often overlooked, is cost. Forrester, for instance, defines big data as "techniques and technologies that make handling data at extreme scale affordable" (Hopkins and Evelson, 2011). This touches on two critical areas that must be addressed to have a successful data management strategy: scalability and cost effectiveness. To scale data volumes 10, 50, or 100 times requires new architectural approaches to the data integration process. Doing so in a cost-effective way has been the biggest challenge to date for organizations.

No matter what kind of IT environment you have or how you label your data (big or small), there are steps you can take to rearchitect and optimize your approach to data management, such as returning your attention to the data integration process in your quest for improved business insights.

Not All Big Data Is Important Data

Sometimes it's easy to get caught up in the hype about big data. However, trying to process larger data volumes can significantly increase the amount of noise, hindering your ability to uncover valuable insights. It's important to remember that not all data is created equal. Any big data strategy must include ways to efficiently and effectively process the required data while filtering out the noise. Data integration tools play a key role in filtering out unnecessary data early in the process to make data sets more manageable and, ultimately, to load only relevant data into the appropriate environment for analysis (whether that is a data warehouse, Hadoop, or an appliance). Organizations can take three approaches:

1. Define a clear data strategy that identifies the users' data requirements. (Why do I need this data? How will it help me accomplish my business objectives?)

2. Build an efficient data model that is adequate to the demands of the business.

3. Have the right data integration tools to do the job.

Ultimately, the data integration tool is the critical component; it can help materialize the strategy and execute on it to build an efficient data model. The tool must have the right capabilities as well as the scalability and performance required to work effectively. A key component is the ease of use that allows developers to focus on business requirements instead of worrying about performance, scalability, and cost.
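To make the filter-early principle concrete, here is a minimal Python sketch. The file names, columns, and thresholds are hypothetical, and pandas stands in for whatever data integration tool is used; the point is that column projection and row filtering happen while the raw feed is still streaming, so only relevant data ever lands downstream.

    import pandas as pd

    # Hypothetical raw feed; keep only the columns the use cases require.
    REQUIRED_COLUMNS = ["event_time", "customer_id", "product_id", "revenue"]

    def filter_early(chunk: pd.DataFrame) -> pd.DataFrame:
        """Drop noise as close to the source as possible."""
        chunk = chunk[REQUIRED_COLUMNS]              # project needed columns only
        chunk = chunk[chunk["revenue"] > 0]          # discard non-revenue noise
        return chunk.dropna(subset=["customer_id"])  # unusable without a customer

    # Stream the source in chunks so the full raw feed never sits in memory,
    # then land only the filtered result for analysis.
    filtered = pd.concat(
        filter_early(chunk)
        for chunk in pd.read_csv("events.csv", chunksize=100_000)
    )
    filtered.to_parquet("curated_events.parquet")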
Bringing Data Transformations Back to the ETL Layer

Data integration and ETL tools have historically focused on expanding functionality. For instance, ETL was originally conceived as a means to extract data from multiple sources, transform it to make it consumable (by sorting, joining, and aggregating the data), and ultimately load and store it within a data warehouse. However, in today's era of big data, this strategy neglects two critical success factors: ease of use and high performance at scale.

As IT organizations confront the accelerating volume, variety, and velocity of data by applying analytics, they have been forced to turn to costly and inefficient workarounds, such as hand-coded solutions or pushing transformations into the database, to overcome their performance challenges. The costs of such scaling approaches can outweigh their benefits. The best example is staging data when joining heterogeneous data sources. This practice alone increases the complexity of data integration environments and adds millions of dollars a year in database costs just to keep the lights on. Accordingly, an Enterprise Strategy Group survey (Gahm, et al, 2011) found "data integration complexity" cited as the number one data analytics challenge.

There are new approaches that don't require big budgets, however. To rectify this situation, we recommend bringing all data transformations back into a high-performance, in-memory ETL engine (see the sketch following this list). This approach addresses four main points:

1. Think about performance in strategic, rather than tactical, terms. This requires a proactive, not reactive, approach. Performance and scalability should be at the core of every decision throughout the entire development cycle, from inception and evaluation to development and ongoing maintenance. Organizations must attack the root of the problem with approaches that are specifically designed for performance.

2. Improve the efficiency of the data integration architecture by optimizing hardware resource utilization to minimize infrastructure costs and complexity.

3. Achieve productivity gains through self-optimization techniques, meaning that little, if any, manual tuning of data transformations should be required. The constant tuning of databases can consume so many hours and resources that it actually hinders the business.

4. Realize cost savings by eliminating the data staging environment, which reduces server and database maintenance costs; by deferring large infrastructure investments through the efficient use of system resources; and by improving developer productivity, because time that would otherwise be spent tuning for growing data volumes can go to strategic projects.
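As a rough sketch of what eliminating the staging step looks like, the following joins and aggregates two heterogeneous extracts entirely in memory. The sources, columns, and file names are hypothetical, and pandas stands in for whatever high-performance ETL engine is used; nothing is landed in a staging database along the way.

    import pandas as pd

    # Two heterogeneous sources that would traditionally be staged in a
    # database before joining.
    orders = pd.read_csv("mainframe_orders.csv")    # e.g., a flat-file extract
    customers = pd.read_json("crm_customers.json")  # e.g., a JSON API extract

    # Join in memory -- no staging tables, no temporary database load.
    enriched = orders.merge(
        customers[["customer_id", "segment", "region"]],
        on="customer_id",
        how="left",
    )

    # Aggregate while still in memory, then land only the finished result.
    summary = (
        enriched.groupby(["region", "segment"], as_index=False)["order_amount"]
        .sum()
    )
    summary.to_parquet("warehouse/order_summary.parquet")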
The high-performance ETL approach should accelerate existing data integration environments where organizations have already made significant investments, and enhance emerging big data frameworks such as Hadoop. IT departments within several companies have initiated high-performance ETL projects to achieve a long-term, sustainable solution to their data integration challenges:

■■ A storage industry pioneer and leading producer of high-performance hard drives and solid-state drives needed to improve its assurance process and inventory management with faster data processing of one million data records from its manufacturing plants. Using a high-performance ETL strategy, the company has reduced data processing times from 5.5 hours to 12 minutes and has released 70 percent of its data warehousing capacity to devote to analytics.

■■ An independent provider of risk assessment and decision analytics expertise to the global healthcare industry needed to process and analyze 40–50 TB of claims data per month to uncover risk-mitigation opportunities. Through a similar approach, the healthcare analytics organization reduced processing
from 2 hours to 2.5 minutes. Further business growth could also be supported by reducing the turnaround time for entering new customers into the system from 5 days to 24 hours.

■■ A mobile advertising platform company needed to quickly analyze large volumes of online activity data (such as views, clicks, and conversion rates) that was doubling every year in order to make important decisions (such as what ad space to bid on and when and where to place ads for customers). The business went from waiting one hour for the information needed to adjust advertising campaigns down to 10 minutes. In addition, its two developers, who had spent most of their time just maintaining the infrastructure, can now work on more valuable projects.

The benefits of a proper ETL process with fast, efficient, simple, and cost-effective data integration translate into benefits across the entire organization, including operational, financial, and business gains, with the ultimate benefit being quicker access to cleaner, more relevant data to drive big data insights and optimize decision making.

Optimize Your Hadoop Environment

In today's world of mobile devices, social networks, and online data, organizations must massively scale data integration and analytics differently than before. According to Forrester (2011), despite the opportunity new data presents, organizations use only a small fraction of the data available to them. A new architecture is necessary to change both the performance and the costs that are driving Hadoop, the open source framework for big data.

Hadoop is designed to manage and process large data volumes. It presents several opportunities but also introduces challenges—including scalability and ease of use—that lead to siloed deployments with limited functionality, which is why Hadoop doesn't provide significant value by itself. Organizations should not expect to rely solely on Hadoop for all their needs; other tools and platforms must complement Hadoop to optimize the data management environment for these data sets.

Hadoop gets its scalability by deploying a significant number of commodity servers. This way, the Hadoop framework can distribute the work among servers for increased performance at scale. Of course, adding commodity hardware running open source software looks like a more cost-effective proposition than adding nodes to a high-end, proprietary database appliance. However, the hardware required to cope with growing data volumes and performance service-level agreements can grow significantly. Therefore, it is not uncommon to find Hadoop deployments with a significant number of nodes. This elevates capital and operational costs because of hardware maintenance, cooling, power, and data center expenses. In addition, the required tuning involves hundreds of configurable parameters, making it difficult to achieve optimum performance.

Such increased complexity is tied to ease of use, which is one of the major challenges facing nearly every organization working with Hadoop. Hadoop is not easy to develop for. For instance, adding new capabilities (such as reverse sorting) and coding MapReduce jobs is typically performed manually, which requires specific skills that are expensive and difficult to find. For many organizations, finding the skill set needed to manage Hadoop is the most significant barrier to Hadoop adoption.
Organizations can overcome these challenges and extend Hadoop's capabilities, maximize its value, and simplify the overall Hadoop experience by integrating the high-performance ETL approach. This approach allows for sorting and organizing the data before it is pushed into the Hadoop environment by leveraging Hadoop Distributed File System (HDFS) connectivity and by creating MapReduce jobs in a separate graphical interface rather than writing Java or Pig scripts. Data integration comes into play after analysis as well; the results of the analyzed data need to be reintegrated into other information systems.
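A minimal sketch of this pre-load organization follows. The file names, sizes, and layout are hypothetical, and it assumes the standard hdfs dfs command-line client is installed and configured; the idea is simply that data is sorted, split into evenly sized units, and compressed before the cluster ever sees it.

    import subprocess
    import pandas as pd

    # Group, split, and compress a large raw extract before handing it to
    # Hadoop. Each gzip file becomes one map task, so evenly sized files
    # balance the work across the cluster and reduce storage costs.
    TARGET_ROWS_PER_FILE = 1_000_000

    raw = pd.read_csv("daily_activity.csv")
    raw = raw.sort_values(["site_id", "event_time"])  # pre-sort off-cluster

    for i, start in enumerate(range(0, len(raw), TARGET_ROWS_PER_FILE)):
        part = raw.iloc[start:start + TARGET_ROWS_PER_FILE]
        local_path = f"activity_part_{i:04d}.csv.gz"
        part.to_csv(local_path, index=False, compression="gzip")
        # Push each compressed partition into HDFS.
        subprocess.run(
            ["hdfs", "dfs", "-put", "-f", local_path, "/data/activity/"],
            check=True,
        )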
For example, comScore, a global digital information provider of online consumer behavior insights, saw its daily data volume increase 72-fold within two years and deployed a Hadoop cluster to better manage the data processing. However, it is challenging to bring Hadoop into an enterprise with heterogeneous operating systems. Moreover, Hadoop lacks critical features such as real-time integration and robust high availability. Therefore, comScore deployed a data integration strategy that groups and splits larger data files so they fit better into Hadoop, which provides a higher rate of parallelism on compressed files and reduces disk costs for the Hadoop cluster. This saved 75 TB of data storage per month and slashed processing time from 48 hours to just 6 hours, so comScore can now process twice the data each month (compared to a year before), allowing the company to provide its customers data insights faster.

Summary

Today's enterprises must make sense of the increasing volume, velocity, and variety of data while maintaining cost and operational efficiencies. Your business intelligence strategy must focus on optimizing the data integration process so it is fast, efficient, simple, and cost effective. This means ensuring you have all the right data at your fingertips by managing the volume and new sources of data, coupled with scalability as data requirements evolve. Quicker access to cleaner, more relevant data is what drives big data insights and what will truly lead your enterprise to faster and more profitable decisions. ■

References

Gahm, Jennifer, Bill Lundell, and John McKnight [2011]. "The Impact of Big Data on Data Analytics," Enterprise Strategy Group, research report, September. http://www.esg-global.com/research-reports/research-report-the-impact-of-big-data-on-data-analytics/

Hopkins, Brian, and Boris Evelson [2011]. "Expand Your Digital Horizon With Big Data," Forrester, research report, September. http://www.forrester.com/Expand+Your+Digital+Horizon+With+Big+Data/fulltext/-/E-RES60751?objectid=RES60751

IDC [2011]. "The 2011 Digital Universe Study: Extracting Value from Chaos," digital iView report, June. http://www.emc.com/collateral/demos/microsites/emc-digital-universe-2011/index.htm

Manyika, James, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, and Angela Hung Byers [2011]. Big Data: The Next Frontier for Innovation, Competition, and Productivity, McKinsey Global Institute, May. http://www.mckinsey.com/insights/mgi/research/technology_and_innovation/big_data_the_next_frontier_for_innovation
Implementing Dashboards for a Large Business Community

A Practical Guide for the Dashboard Development Process

Doug Calhoun and Ramesh Srinivasan

Doug Calhoun is systems analyst, claims technology—data and information delivery at Allstate Insurance Company. dcal9@allstate.com

Ramesh Srinivasan is manager, claims technology—data and information delivery at Allstate Insurance Company. rsri2@allstate.com

Abstract

Dashboards are becoming more prevalent as business intelligence tools, and the reason is obvious: well-designed, accurate dashboards can quickly communicate important business indicators and trends and provide actionable information. However, creating and implementing a successful dashboard involves a great amount of work. It often requires implementing tight controls while allowing the flexibility needed to test and learn with the business. This article outlines tips for how to integrate these seemingly divergent processes as well as how to ensure the data accuracy, ease of use, and optimal performance that make a truly successful dashboard.

Introduction

The use of dashboards as a primary business intelligence tool is expanding quickly. When supporting a business unit fueled by data, how does an application team build dashboards that will provide great business value and be sustainable?

There are many methods for doing this, as we will explain in this article. However, there are also certain fundamental principles that may seem obvious but can be difficult to implement:

■■ Engage business users, not just at the beginning and end of a project, but throughout the entire process. Make business users your partners.
■■ Involve the entire application team throughout your project's life. A factory-like approach of handing off tasks from phase to phase will not work well.

■■ Although design updates may require an iterative approach with business users, the number of components needed should drive the team to define phases and key deliverables early in your project to keep it on track.

■■ Sophisticated user interfaces are great, but in the end, it's really about the data. Ensure that everyone is in agreement about how to define the data from a business point of view, and create a plan for how to validate it.

■■ Ease of use is critical. Make sure your business partners get hands-on opportunities as often as possible.

■■ Design your technology based on the number and types of users. Performance and capacity should be considered when designing and building dashboards, much as they are with more traditional transactional systems.

This article is not intended to serve as a guide to visual design. That topic has already been studied extensively and successfully. We will discuss best practices for the process of creating a successful design. In addition, the word dashboard is used here as a general term for data visualization tools showing at-a-glance trends and other indicators. It is not meant to signify the timing or refresh rate of the data, and could be used interchangeably with "scorecard" depending on how a business unit chooses to define it. In the business intelligence world, "dashboard" has become the most common term, so it will be used here with assumed broader connotations.

Another concern is process methodology. Many companies primarily employ a waterfall life cycle, which can be a difficult fit for a business intelligence implementation. However, a purely agile methodology for dashboards can also lead to trouble, as there are complexities with development and testing that require a certain level of more traditional phase-gating. Essentially, the dashboard needs to be treated both as an application (with all the functional testing required) and as a mechanism for providing data, including iterative testing and prototype updates. A certain level of flexibility in your development process may be required to achieve a happy medium and ensure a successful rollout.

Depending on the size of your company, you may also need to leverage the assistance of other technology groups to implement. Where appropriate, involve groups such as your business intelligence community of practice or center of expertise; data, solution, and/or BI architecture; database administrators; all associated infrastructure/server administrators; change and release coordinators; and any other applicable groups you believe should be enlisted. Do this early.

All of this may require "innovating your process," which might sound like a contradiction in terms to process methodologists but may have practical application to your work. The best practices below will guide you in the direction that best fits your project's needs.

Starting the Project

If you are embarking on a dashboard project for the first time, there are several rules of thumb you should follow at the project's outset.

First, as with any project, you will need to define team roles and lay the groundwork for how the project life cycle will work. At the same time, you will need to
identify and engage all stakeholders and ensure both groups agree on expected outcomes. It is unlikely that you will be working on a dashboard project without a business case behind it, but getting a request from the business and truly engaging users as partners are two very different things. Although it can be easy to take orders and make assumptions, keeping the key business partners involved throughout the entire project life cycle, and beyond, is absolutely essential.

Finally, you will need some vital information before you begin. Some questions are obvious from a technical point of view. What are the data sources? Will data be stored separately? What tools will be used? What environment(s) must be built?

Other questions are just as vital, but may not be so obvious. For example, is the project feasible, especially as an initial effort? We recommend you limit the scope of an initial dashboard to a simple, straightforward first effort that has high business value. This way, a quick win is more possible, success can be attained early, and business trust will be earned as you "learn the ropes."

You will also need to be sure that the project is appropriate for a dashboard or other visualizations. For example, if the business primarily wants to track how hundreds of individual workers are performing, a dashboard is likely not the right vehicle. However, if they want to track how their offices are performing over a period of time, using standard, well-known measures within the company, then a dashboard may be the best option. (You can still consider getting to the individuals' detail, which we'll discuss shortly.)

The main lesson here, and throughout the early phases of your project, is to ask questions and keep on asking them! If something does not make sense or seems impossible, work with business users until you reach a mutually satisfactory agreement.

Once the project looks possible, list all your assumptions—whether business related, technical, or process/project based. You'll need this list to build an order-of-magnitude estimate, define the technical space you will be working within, and help business users understand their role during the project (and how crucial it is). Having everything in order even before detailed requirements are determined will give both you and business users confidence. After all, before you start involving them in detailed requirements meetings, they're going to want some idea of when to expect a finished product.

As you devise this plan, treat the dashboard as a full-blown application. Although the dashboard is built in the business intelligence space, it has both the complexity of a dynamic user interface (with the myriad possibilities of errors on click events) and the need for absolutely exact, gold-standard data. Both the data and the functionality will need to be tested thoroughly, as if you were developing a transactional application. If you release the slickest, most attractive dashboard your business has ever seen but the data is wrong or a button doesn't work, user confidence will quickly erode. Your dashboard may be pretty—pretty meaningless.

Consider the metrics and aggregations needed and what types of structures will be required to support your project.
Depending on your company's standards, you might be using denormalized tables, dimensional tables in a warehouse (or a combination of these), an integration of detailed and aggregated data, OLAP cubes, or many other possible sources and targets. As with any BI solution, you need to choose the appropriate data model.
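As a small, hypothetical illustration of one such choice, the sketch below builds a star-schema slice: a detail-grain fact table, a conformed dimension, and a summary table derived from the lower-grain facts so the two can always be reconciled. The table and column names are invented for the example.

    import pandas as pd

    # Detail-grain fact table plus a conformed dimension (names invented).
    fact_shipments = pd.DataFrame({
        "date_key":   [20120901, 20120901, 20120902],
        "office_key": [10, 11, 10],
        "units":      [120, 80, 95],
    })
    dim_office = pd.DataFrame({
        "office_key": [10, 11],
        "office":     ["Chicago", "Dallas"],
        "territory":  ["Central", "South"],
    })

    # A summary table built *from* the fact table, never maintained
    # independently, so aggregate and detail cannot drift apart.
    agg_by_territory = (
        fact_shipments.merge(dim_office, on="office_key")
        .groupby(["date_key", "territory"], as_index=False)["units"]
        .sum()
    )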
The point here is that performance is paramount for user adoption. Document and agree upon functional requirements and data definitions while offering the flexibility of iterative testing and tweaking that a business intelligence solution should provide. It is critical to lock down the logic behind the displayed metrics early in the project. If that logic keeps changing or is vague to everyone, there is little chance you'll deliver a successful dashboard.

Gathering Requirements

Involving business users in your work is crucial—and most clearly needed—early in a project, especially during requirements gathering and scoping. You may need to remind yourself to keep your business partners actively involved, because it's vital to your project's success! Multiple meetings will certainly be necessary, but make sure to keep users actively engaged via various methods, including whiteboarding at first, dashboard prototyping later, and sharing early data results. This will not only help hone the requirements, but also allow business users to feel they are truly partnering on the project. This will ensure that they know and trust what they will be getting. In addition, the entire development team should be involved from inception through implementation to ensure nothing gets lost in translation. See Figure 1 for a gauge of both business and technical involvement through a general project life cycle (regardless of the specific methodology used).

[Figure 1. Both business users and technical staff should be involved throughout a project's life cycle. The chart, "Dashboard Implementation Effort by Role and Phase," plots effort for each role (business sponsor; business champion/SME; analysts (technical); developers; testing team; architects/BI CoE/DBA) across six phases: scope, data requirements, data and dashboard design, build and UI update, user testing, and implement and change navigation.]

The following best practices can help you avoid pitfalls during requirements gathering, even when the relationship with the business is good.

Know your user. It is possible that your business partner may represent only one part of the larger group using the dashboard, or may be assigned to a project and may not be an ultimate end user at all. Some users may have different business needs from your primary business partner. Make sure that you define all the groups of users who will have access to the dashboard, and ensure all of their voices are heard. This is not as easy as it sounds, but is worth the effort.
Define a use case for every component you build. There is no point in creating a dashboard component unless there is a direct use for it that can be easily defined and documented. Documenting the requirements is crucial to ensure business users get what they have asked for and so developers and testers have a clear guide to what they must build. You want to ensure that the use cases, and the data shown, will stay meaningful over time for each component; it is not a good idea to introduce new or rarely used metrics with a dashboard solution. Finally, require sign-off for all use cases, business requirements, and scope documentation you create. The scope should be limited to the business metrics and granularity of the data at this stage; visualization requirements can be developed later.

Know your data sources and plan your approach. You must understand both where the data initially resides and, if you use an extract-transform-load (ETL) or similar process, where it will eventually reside. If storing the data, you will need to know how it should be stored, how long it will need to be available for access, and how often it needs to be refreshed. Especially if you are using ETL, three-quarters of your work will be spent on the analysis, load build and testing, and validation of the data. Even without ETL, our experience is that the majority of the time should be spent working with the data rather than building the front end. Given the visual nature of dashboards, it is easy to assume that the bulk of your work is spent building attractive, user-friendly interfaces. This is simply not the case with successful implementations, especially when so many easy-to-implement dashboard tool suites are available.

Include only trusted, known metrics whenever possible. Exceptions may arise, but if metrics are well known, the exceptions will be much easier to validate. The sources of the data must also be trusted, and business users should be included in selecting data sources.

Know your refresh rate. Will the dashboard be loaded monthly, weekly, daily, hourly, or a combination of these frequencies? The fundamental dashboard design approach will depend on your answer to this question. Use cases will drive your design. Make sure you have thorough discussions about what is really needed versus what would be nice to have, because the more often the dashboard is refreshed, the more support (and cost) it will require after rollout.

Identify all filters and selections. The earlier in the project's life you can do this, the better. This information has a major influence on your dashboard design and will affect decisions about performance and capacity. If a large combination of multi-select filters can be applied to one component, there will be a multitude of data combinations to validate and possibly many thousands of rows to be stored. Technologists can be tempted to impress their business partners, but be careful not to promise something that is not scalable or sustainable.

Understand what levels of aggregation and detail are required. An early requirements exercise should involve the filters and dimensions that will be used as well as how they should be aggregated. Time periods are a common dimension, as are office or geographical territories. On the flip side, sophisticated business users will inevitably want to know the details behind what is driving their trend or that one outlier metric.
Not having a method of drilling down to (or otherwise easily accessing) the detail behind the aggregation will frustrate users after the post-implementation honeymoon period has ended. Determining aggregation/detail needs should be part of the discussions during requirements gathering, but remember to balance your requirements with development difficulty and desired timelines. If detailed data is provided, it should be accessed directly via the dashboard, whether through sub-reports or drill-down capabilities in the components themselves, depending on your tool set.
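One minimal way to picture the aggregate-to-detail link is a drill-down handler that translates a clicked cell into a filter on the detail-grain table. This is only a sketch; the column names and file are hypothetical, and a real dashboard tool would wire its own click events to its own query layer.

    import pandas as pd

    detail = pd.read_parquet("fact_claims_detail.parquet")

    def drill_down(clicked_cell: dict) -> pd.DataFrame:
        """Return the detail rows behind one aggregate cell,
        e.g. {"month": "2012-09", "territory": "Central"}."""
        mask = pd.Series(True, index=detail.index)
        for column, value in clicked_cell.items():
            mask &= detail[column] == value
        return detail[mask]

    # A click on the Central territory's September cell would run:
    rows = drill_down({"month": "2012-09", "territory": "Central"})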
Identify how much history you need. Some graphical trends will require year-over-year comparisons. Beyond that, it may be worth considering how long any data that no longer appears on the dashboard should be retained. If it does need to be retained for compliance or other purposes, consider an archival strategy, or possibly a view built on top of the base data. The more the dashboard can be limited to querying only the data it needs to display, the better it will perform.

Define the data testing and validation process. It is never too early to address how you will ensure data quality through a validation process. Defining specific responsibilities and expectations, and what methods will be used for validation, should happen even before design. This will also ensure that business users will be ready when they are asked to begin testing. The validity of the data is the most critical factor in the dashboard's success and continued use.

Integrate business users. There are several ways to involve business users in requirements gathering and refinement besides letting them dictate while you take notes. These options include:

■■ Prototype early and often. Prototyping can start with simple whiteboard exercises, and many dashboard tools now lend themselves to quick prototyping so business users can see and play with something similar to the final deliverable. This hands-on method is excellent for rooting out requirements gaps, although it should not replace formal documentation.

■■ Use real data wherever possible when prototyping to give business users better context. It also helps you identify and correct data issues early.

■■ Integrate developers. Requirements gathering should not be done solely by analysts. If separate individuals are responsible for coding, they must be involved at this stage so they truly understand the value and meaning of what they will build.

■■ Set expectations for production support. Agree upon a process for communicating user questions or any defects users discover. Depending on the user, this can be done many ways, although users at the executive level will likely prefer a direct communication path with the team's manager(s). Additional suggestions appear in the post-implementation section later in this article.

■■ Define milestone deliverables. Regardless of the software development methodology you use, defining milestone deliverables is critical for instilling and retaining business confidence in the project. It is also necessary to ensure the development team is progressing as expected. Milestone due dates should be communicated early and deadlines met. If a deadline is at risk of being missed, share this information (as well as the reasons for the problem and the recommended course of action) with business users so new dates and deadlines can be agreed upon or so the team can remove items from the project scope or adjust resource levels and assignments.
Required deliverables from the business requirements-gathering phase may include:

■■ Scope lockdown, with documentation of what is in scope and out of scope.

■■ Final prototype with business sign-off. (Note: This remains a working prototype, and all team members must understand and agree that the design may change later in the project if practical.) The highest-level sponsor of the project should be part of this sign-off, as well as further sign-offs of the actual product prior to rollout.

■■ Detailed requirements definitions, including images from the prototype. Such documents can tie the business definitions of the metrics to the way they will be displayed. Such a connection will bring clarity both to the business client and to the developers/analysts building the solution.

■■ Technical conceptual design. This high-level document defines all data sources and targets, what delivery mechanisms are being used, and the general business use case(s) for the dashboard.

Designing and Building the Dashboard: Soup to Nuts

When dashboard design has begun, all layers should be considered in relation to one another. For example, if the dashboard will be connected to aggregated tables designed for performance, those tables, the way they are loaded (or otherwise populated), and any performance and capacity concerns should be considered. This is just as important as designing the dashboard functionality. In general, the dashboard design should:

■■ Ensure a single, consistent view of the data. This can apply to the visual look and feel as well as to how often the components on a screen are refreshed. The user should not have to think about how to interpret the dashboard; the data presentation should be clear and intuitive.

■■ Keep everything in one place. If detailed data or supplemental reports are needed, use the dashboard like a portal or ensure a centralized interface keeps the data logically consolidated. Also, make sure the same data source is used for both detailed and aggregated data on the dashboard. Keep in mind, however, that business users may expect a snapshot of the dashboard not to change. For example, a monthly metric could vary slightly in the source data, but re-querying every time for the dashboard view with different results could erode confidence and even skew expected trends. Have a conversation with business users early on to discuss such scenarios and determine whether storing point-in-time dashboard snapshots will be required.

■■ Understand the usage scenario. Knowing the size of the user base, as well as the types of users and when they will be accessing the dashboard, can drive design. You should understand the usage volumetrics early in your project and plan accordingly. You must also ensure that any maintenance windows do not conflict with peak-time use. Environment sizing, capacity, and performance will all be critical to ensure a stable tool.

■■ Address multiple environments for development. If your environment has the necessary capacity, build development, test, and production environments. It's worth it.
■■ Plan to validate data accuracy as early as possible, and ensure your design and project plan allow for this. To avoid rework, it is crucial to make every effort to get the data perfect and acquire sign-off in a lower database environment during user testing. This will ensure that the data acquisition process is free of bugs. At the same time, ensure that you validate using data from the production source system(s), because that data will be well defined and will likely have an independent method of validation.

■■ Roll out with historical data available. Plan on migrating all validated data to production tables along with the rest of the code. Implementing a dashboard with predefined history and trends will ensure a great first impression and enhance user confidence.

In addition to these areas of focus, consider several design best practices for both database/data quality and dashboard interfaces.

Database-Level Best Practices

Ideally, your dashboard will be running in a stable database environment. This environment may be managed by your team or may be the responsibility of another area of your company. Either way, your dashboard is meant to provide data for quick and meaningful analysis, so treating the data, and the tables in which it resides, with care is critical. Some best practices (sketched in code after this list) include:

■■ Use ETL or other data acquisition methods to regularly write to a highly aggregated, denormalized table. This will ensure optimal performance, as dashboard click events need to be fast. A good goal is to ensure that no click event takes more than three seconds to return data to the dashboard.

■■ Use predefined and well-maintained dimensional tables wherever possible. This ensures consistency and eliminates redundant data structures.

■■ Store the data using IDs, and reference static code or dimensional tables wherever possible. This way, if a business rule changes, only one table must be modified, and no data has been persisted to a table that is now outdated.

■■ Design and model the data so the front end can dynamically handle any business changes at the source level. This eliminates the need to update the code every time business users make a change, and maintenance costs will be much lower. The development team will then be able to work on exciting new projects rather than just keeping the lights on.

■■ Detailed data should be kept separate and not reloaded anywhere, if possible. However, it should be available in the same database so the aggregate and related detail can easily coexist.

■■ Unless absolutely necessary, do not store calculated values or any data that is prone to business rule changes. If persisted data becomes incorrect, restating it can be a huge effort. Calculated fields can be computed quickly using front-end queries or formulas (if designed properly).

■■ Create a data archival strategy based on business needs for data retention and how much history the dashboard needs to show.

■■ Ensure that any queries from the dashboard to the tables are well tuned and that they will continue to run quickly over time.

■■ Likewise, ensure that any middle-tier environment used for running the dashboard queries is highly stable and can take advantage of any caching opportunities to enhance performance.
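The following sketch pulls several of these rules together: the persisted aggregate carries only dimension keys and additive measures, labels live in one dimension table, and the calculated measure is derived at read time so a business rule change never forces a restatement. All names are hypothetical.

    import pandas as pd

    # Persisted, highly aggregated table: dimension IDs plus additive
    # measures only -- no labels, no precomputed ratios.
    agg = pd.DataFrame({
        "office_key":    [10, 11],
        "claims_open":   [340, 210],
        "claims_closed": [1200, 980],
    })

    # The one place an office rename or re-mapping would happen.
    dim_office = pd.DataFrame({
        "office_key":  [10, 11],
        "office_name": ["Chicago", "Dallas"],
    })

    # Calculated measure derived on read: if the business redefines the
    # close rate, nothing persisted has to be restated.
    view = agg.merge(dim_office, on="office_key")
    view["close_rate"] = view["claims_closed"] / (
        view["claims_open"] + view["claims_closed"]
    )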
Dashboard-Level Best Practices

Spending a great deal of time on getting the dashboard data modeled, stored, automated, and correct will, of course, all be for naught if the dashboard front end is not intuitive, does not perform, or otherwise does not have high availability. To address this, take these steps throughout the life cycle:

■■ Check the dashboard usability by bringing in end users who were not involved in the initial project. Observe how quickly and easily they can meet their objectives, and remove all bias as you watch. You will need to plan for their participation well in advance, and this work should be done early in your testing (make sure to have production data at this point) so there is time to react to their input.

■■ Within the dashboard code, implement dynamic server configuration so all dashboard components automatically reference the proper environment for the database, middle tier, and front end itself. This reduces reliance on hard-coded server names and can prevent deployments from accidentally pointing to the wrong location. (A minimal sketch of this idea follows this list.)

■■ Users will want to use Excel regardless of how well designed your dashboard is. Make sure an Excel export option is available for all the data shown on the dashboard and any included reports.

■■ For every dashboard component, include a label referencing the specific data source as well as the data refresh date. This simple step resolves confusion and will greatly reduce the number of support questions you receive post-rollout.

■■ Do everything possible to avoid hard-coding filters, axes, or any other front-end components that change with predictable business change. The data and the front end both need to be flexible and dynamic enough to display information based on a changing business. The dashboard should not have to display invalid or stale data for a time period while the development team scrambles to implement a production fix. That would inevitably lead to a drop in user adoption and reduced confidence in the dashboard's validity.

■■ Test plans should include scripts for testing the overall dashboard load time as well as specific load times for all click events. This will afford the time needed to tweak code for optimal performance.

■■ Near the end of testing, simulate a performance load test, whether you have automated tools to do this or you have to do it manually with multiple users. The purpose is to ensure no part of the underlying infrastructure has an issue with load.

■■ Test boundary conditions to avoid unforeseen defects later in the project's life. For example, what happens when a multi-year trend goes into a new year? Will the x-axis continue to sort properly? Define all conditions like this and find a way to test each one.
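Here is one minimal way to picture the dynamic-configuration bullet above. The environment names and hosts are hypothetical; the point is that the deployed code never embeds a server name, so the same build promotes cleanly from development to production.

    import os

    # One environment-keyed map instead of hard-coded server names.
    ENDPOINTS = {
        "dev":  {"db": "dev-db.internal",  "mid_tier": "dev-api.internal"},
        "test": {"db": "test-db.internal", "mid_tier": "test-api.internal"},
        "prod": {"db": "prod-db.internal", "mid_tier": "prod-api.internal"},
    }

    def endpoint(service: str) -> str:
        """Resolve a service host for whatever environment we run in."""
        env = os.environ.get("DASHBOARD_ENV", "dev")  # set by the deployment
        return ENDPOINTS[env][service]

    database_host = endpoint("db")  # same code in every environment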
Running the Project (and Subsequent Projects)

Considering the myriad complexities involved in implementing a dashboard—from ensuring correct data is available when expected, to designing a usable and innovative front end, to working with the business through multiple and complex requirements—costs can be high and timelines can easily be missed if the project is not carefully managed. The following procedures will help ensure a successful dashboard release, all in the context of the best practices explained so far:

Create and use an estimating model. The model should cover all aspects of a dashboard release (from data to user interface) and all the technical roles and resources that will be involved, and it should be sufficiently detailed to break down time in hours by both phase and resource type. A model
that can be defined by selecting answers to requirements-based questions will be the easiest for your analysts to use, such as: How many metrics and components will be displayed? How many data sources will be used? Does data for validation exist? The model should be refined after each large project by entering the answers to these questions and determining how closely the model's hours match those actually spent.

Data validation is your top priority. Plan and allocate the time with your business partners and understand what data sources you will use for validation. If there is no independent source, you and your business users must reach an agreement about how validation will be performed. (A sketch of one automated reconciliation check appears below.)

Share real data as soon as it becomes available and the team has reasonable confidence in its accuracy. There is no reason to wait to share data, regardless of how early in the process this occurs, because the earlier data defects are identified and resolved, the more smoothly the subsequent processes will go. As we've mentioned, we recommend you implement your project with historical data loaded. If this is planned, ensure that business users are aware, and secure their pledge to spend adequate time comparing and validating the historical data.

Define phases of work and identify key deliverables for each. Regardless of the development methodology your department uses, you must align milestones to specific dates to ensure the project does not get out of control and to keep business users confident in your progress. Depending on your business client and their expectations, you may need to blend agile and waterfall methods. Although this will not satisfy ardent practitioners of either methodology, a blended approach can allow for the iterative testing and discovery that this type of work requires while ensuring adherence to a strict timeline, which a release of this complexity also requires.

Implementations are complex, so make a detailed plan. The manager or lead of the project should define all the steps needed, assign dates and responsible parties, and build a T-minus document/project scorecard. These tasks should be completed during the initial stages of the work, soon after any intake approval and/or slotting, and the document should be reviewed with the entire team at least once a week to ensure the project is consistently on track.

Escalate all identified issues and risks early and often. If your department already has a process for bringing issues and risks into the open and to the attention of those who can mitigate them, use it. Otherwise, create your own process for the project. Enlist all stakeholders and technology leaders for support, and do this proactively.

Review, review, review. Plan multiple design and code reviews, and assume at least a draft and a final review will be needed for each major piece of work. Devote ample time to design review, because waiting until the dashboard is built may make recovery impossible if a fundamental design flaw has gone unnoticed. Formalize a method for tracking and implementing all changes identified during reviews.

Keep the development team engaged. For example, if the development team includes offshore resources, record key meetings using webinar technology. This can serve as both knowledge transfer and training material later. Make sure everyone knows about the recording and ensure that no legal or compliance issues will arise.
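The reconciliation check referenced above might look like the following sketch. The file names, columns, and tolerance are hypothetical; the idea is simply that after every refresh, the dashboard's aggregates are compared against an independent source total and any drift is surfaced before business users find it.

    import pandas as pd

    dashboard_totals = pd.read_parquet("agg_monthly_metrics.parquet")
    source_totals = pd.read_parquet("source_system_totals.parquet")

    # Line up each dashboard aggregate with its independent source total.
    check = dashboard_totals.merge(
        source_totals, on=["month", "metric"], suffixes=("_dash", "_src")
    )
    check["pct_diff"] = (
        (check["value_dash"] - check["value_src"]).abs() / check["value_src"]
    )

    # The tolerance is a business decision; 0.1 percent is only illustrative.
    failures = check[check["pct_diff"] > 0.001]
    if not failures.empty:
        print(failures)  # surface mismatches for investigation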
Even though your work may be completed in phases, dashboards can rarely be efficiently delivered if a "factory"
approach is used (in which requirements are passed to designers, and designs passed to builders, without everyone being involved). When a developer is far removed from the business users of a dashboard project, this can lead to project failure.

Implement a formal user-acceptance testing process. Once the development team has completed all internal testing of data and functionality, plan time (we recommend two to three weeks) for the business team to complete their tests. Testing should include as much data validation as possible. We recommend you formally kick off the testing phase with business users and employ a documented process for submitting defects and questions to the development team. Make this easy for your business partners. They should focus on testing, not on how to submit their test results.

Require sponsor/stakeholder approval before rollout. This will give your dashboard legitimacy with the ultimate end users and is invaluable in those early weeks when adoption may hang in the balance. This approval should include a presentation during which the sponsor can view and provide feedback about the dashboard, with sufficient time allotted to make adjustments. As mentioned, we recommend you conduct sponsor reviews of the dashboard throughout the project, including during prototype design.

Post-Implementation (You're Never Really Done)

After the dashboard is implemented, team members are often tempted to relax. There may also be new projects demanding focus. Do not become distracted or complacent, because certain post-implementation steps will ensure both that the few critical months after rollout go smoothly and that the development team does not become bogged down by production support or answering business questions.

First, build post-implementation work into the initial plan. Sustainability and support should be factors in scope and technical design.

For larger rollouts, consider best practices for the sponsoring business group and the technology team to handle presentations. This way, both business and technical questions can be answered accurately, all key partners are included, and accountability is shared. Post-rollout sponsorship and change navigation coordination are crucial. The business unit will likely be responsible for communications and training, but the technology team can and should influence this.

If possible, ensure you have a method to collect usage metrics. If you can identify usage by user ID, that is even better, because usage can then be delineated between business and technology users, and groups can be identified for training if usage is lower than expected.

The development team can suggest and implement innovative ways to communicate with users:

■■ Add a scrolling marquee to the dashboard or use some other technique for instantly communicating important messages. This component should be database driven, and the technical support team should have write access to a table separate from the dashboard's main tables. This way, announcements such as planned downtime or key data load dates can be easily delivered to all users.

■■ Add an e-mail button that goes directly to the dashboard support team. This may not be a popular choice for all technology teams, but dashboards are