BSc (Hons) Business Technology
An Exploration of the Gamification of Business Intelligence Tools
and the Effect on User Engagement
Gary Brogan
B00272662
22nd April 2016
Supervisor: Dr Carolyn Begg
Declaration
This dissertation is submitted in partial fulfilment of the requirements for the degree of BSc
Business Technology (Honours) in the University of the West of Scotland.
I declare that this dissertation embodies the results of my own work and that it has been
composed by myself. Following normal academic conventions, I have made due
acknowledgement to the work of others.
Name: GARY BROGAN
Signature:
Date: 22/04/2016
Library Reference Sheet
Surname – Brogan
First Name – Gary    Initials – GB
Borrower ID Number – B00272662
Course Code – BTCS – COMPSCI
Course Description – BSc Hons Business Technology
Project Supervisor – Dr Carolyn Begg
Dissertation Title –
An Exploration of the Gamification of Business Intelligence Tools and the Effect
on User Engagement
Session- 2015/2016
Acknowledgements
I would like to thank both Dr Carolyn Begg and PhD student Stephen Miller for their
continued support and advice throughout this project. A special thank you goes to my wife
Tracey Brogan, who has been completely understanding and supportive during my time at
University, and my two daughters, Abbey and Liara, who have also fully supported me
throughout this journey.
Contents
Abstract.......................................................................................................................................5
Chapter 1: Introduction:............................................................................................................... 6
1.1 Introduction to Key Themes.................................................................................................6
1.2 Aims and objectives of the project........................................................................................ 6
1.3 Research methodology and techniques used ........................................................................7
1.4 Project scope and limitations............................................................................................... 7
Chapter 2: Literature review:........................................................................................................8
2.1 Business Intelligence ...........................................................................................................9
2.1.1 Business Intelligence defined......................................................................................... 9
2.1.2 Current state of Business Intelligence........................................................................... 10
2.1.3 Summary.................................................................................................................... 12
2.2 User Engagement with BI Tools.......................................................................................... 12
2.2.1 What is employee engagement?.................................................................................. 13
2.2.2 What is User Engagement?.......................................................................................... 13
2.2.3 BI Adoption Rate......................................................................................................... 13
2.2.4 Summary.................................................................................................................... 15
2.3 Gamification ..................................................................................................................... 15
2.3.1 Game Elements........................................................................................................... 17
2.3.2 Scope of Gamification ................................................................................................. 18
2.3.3 Successful Gamification............................................................................................... 19
2.3.4 Gamification platforms for Business Intelligence tools................................................... 19
2.3.5 Summary.................................................................................................................... 20
2.4 Literature review conclusion.............................................................................................. 20
2.4.1 User Engagement........................................................................................................ 21
2.4.2 Enterprise Gamification relationship with BI................................................................. 21
2.4.3 Motivational Theory linked to SSBI and Gamification .................................................... 21
2.4.4 Summary.................................................................................................................... 21
Chapter 3: Research methodology: ............................................................................................ 21
3.1 Selection criteria............................................................................................................... 22
3.2 Project GamBIT................................................................................................................. 22
3.3 Ethical debate surrounding gamification and study participation.......................................... 23
3.4 Quantitative Research ....................................................................................................... 24
3.4.1 Measuring User Engagement....................................................................................... 25
3.5 Qualitative Research.......................................................................................................... 25
3.6 Methodological stages....................................................................................................... 26
3.6.1 Steps involved in open coding...................................................................................... 27
Chapter 4: Experimental and Interview Process:......................................................................... 28
4.2 GamBIT Experiment........................................................................................................... 28
4.2.1 Appeal for Volunteers ................................................................................................. 28
4.2.2 Experiment................................................................................................................. 29
4.3 Interview process.............................................................................................................. 29
Chapter 5: Results and Analysis:................................................................................................. 30
5.1 Quantitative data.............................................................................................................. 30
5.1.1 Participant Results and Analysis................................................................................... 31
5.1.2 Survey Information Results and Analysis ....................................................................... 33
5.1.3 UES statistical Results and Analysis............................................................................... 39
5.1.4 User engagement highest ranking factors ..................................................................... 40
5.1.5 User engagement lowest ranking factors ...................................................................... 43
5.1.6 Summary of UES data.................................................................................................. 44
5.1.7 Time taken to complete tasks results and analysis......................................................... 44
5.1.8 Summary of time taken to complete tasks.................................................................... 45
5.2 Qualitative data................................................................................................................. 45
5.2.1 Participant A............................................................................................................... 46
5.2.2 Participant B............................................................................................................... 48
5.2.3 Participant C............................................................................................................... 50
5.2.4 Participant D............................................................................................................... 51
5.3 Summary of Qualitative Data Results.................................................................................. 53
Chapter 6: Conclusion: .............................................................................................................. 54
6.1 Review of research objectives............................................................................................ 55
6.2 Discussion of primary and secondary conclusions................................................................ 55
6.3 Limitations placed on project............................................................................................. 56
6.4 Future research work ........................................................................................................ 57
6.5 Summary .......................................................................................................................... 58
Chapter 7: Critical evaluation: ..................................................................................................... 58
7.1 Reflecting on the initial stages of the project....................................................................... 58
7.2 Approach to project........................................................................................................... 59
7.3 Honours year modules....................................................................................................... 59
7.4 Project aids....................................................................................................................... 59
7.5 Summary .......................................................................................................................... 60
References................................................................................................................................. 60
Appendix:................................................................................................................................... 63
Appendix A Descriptive statistics.............................................................................................. 63
Appendix B Appeal for volunteers............................................................................................ 66
Appendix C Semi-Structured Interviews.................................................................................... 67
Appendix D Interview Guide.................................................................................................... 68
Appendix E Project Specification Form..................................................................................... 68
Appendix F Initial GamBIT development involvement ............................................................... 69
Appendix G User Engagement Scale including Research Survey ................................................. 70
Appendix H GamBIT gamification elements.............................................................................. 70
Abstract
The principal objective of the study is to explore the lack of user engagement with
BI tools. The question it aims to address is whether making BI tools more fun and engaging, by
applying gamification to them, affects user engagement with the tools. This project
explores the gamification of BI tools and the effects on user engagement, to see if there is an
increase in user engagement.
The literature revealed that only 24% of staff who are exposed to BI tools are considered
actively engaged with them. It is also widely acknowledged that BI has not fulfilled its true
potential, with traditional BI best practices considered a bit of a failure. Could
“gamifying” a BI tool affect user engagement with the tool and address the issue of lack of
engagement? To test this theory a prototype gamified BI tool has been developed, namely
Project GamBIT.
To carry out the research on the study objectives a mixed methodology was used which
helped gather both qualitative and quantitative data. This was deemed the most
appropriate approach to a study which is exploratory in nature as each approach has the
potential to enhance and/or complement the other in knowledge gained on the same
research problem.
To gather the quantitative data, the User Engagement Scale (UES) was applied to the GamBIT
prototype and the resulting data analysed. To gather the qualitative data, semi-structured
interviews were conducted with volunteers who had taken part in the GamBIT experiment, in
an attempt to glean more information over and above the quantitative data.
The study is unique in that little credible academic research has been carried out on the lack
of user engagement with BI tools. The results of this study demonstrate that
gamifying a BI tool does increase certain user engagement factors and can increase
motivation to use BI tools more often. Feedback from the interviews conducted highlights
further areas where user engagement was considered to have increased significantly.
Chapter 1: Introduction:
The first chapter will set the scene for the research that has been undertaken and will
provide an introduction to the topics covered in the report. It will cover the aims and
objectives of the report, the inspiration that led to the research being carried
out, and the research problem, that there is a lack of engagement with BI tools, and
aims to provide a brief background on why this is. It will also provide a very brief overview of
the research methods used along with the scope and limitations of the report.
1.1 Introduction to Key Themes
Business Intelligence systems and all their components have been around for a number of
years now. Business Intelligence (BI) has, since the late 1980s, evolved into a multi-billion
dollar industry. Its main purpose is to produce timely, accurate, high-value, and actionable
information. As a technology, BI has been seen to be under used and, as such, has
significant untapped potential. One of the main factors that contribute to it being under
used is a lack of user engagement with BI front end tools. With the adoption rates of BI tools
remaining flat at around 24% over the past few years, many BI initiatives have failed to
deliver the expected results, leading to a common belief that traditional BI best practices
were “a bit of a failure” (Howson, C. 2014). To tackle the issue of lack of user engagement,
applying the concept of gamification to BI tools may offer a solution.
Organisations are increasingly recognising that applying gamification platforms to a wide
variety of business processes may hold the key to increased user engagement. A widely
cited 2011 Gartner study predicted that by 2016 gamification would be widely used, being
applied to 50% of organisations’ business processes. Although this now seems highly
unlikely, the industry does continue to grow, with many gamification platform providers
such as Badgeville, Game Effective, and Bunchball leading the way in “gamifying” business
processes. These gamification practitioners champion the use of techniques such as
rewarding certain behaviours using points and badges, highlighting personal achievements
on leader boards and basically trying to make business processes more fun and rewarding.
Successful gamification practitioners also understand the relationship between psychology
and technology, giving thought to what motivates someone to engage with a certain task,
process, or software tool. Understanding motivational theory and indeed why users engage
with certain tasks, processes, or software tools may provide some answers to the question
of “why individual IT adoption rates are much lower than many organisations originally
forecast?” (Wu, M. 2011).
Early indicators entertain the possibility that the recent trend of enterprise gamification,
which applies gamification to the workplace environment, may become an integral part of
any organisation’s future BI initiatives and a way to further operationalize BI. Could providing
enterprise gamification platforms for BI processes hold the key to tackling the issue of lack
of user engagement with BI tools?
1.2 Aims and objectives of the project
This project aims to address the issues surrounding the lack of user
engagement with BI front-end tools, and asks the question: “Can the gamification of BI
tools affect user engagement?” The objectives are to explore the gamification of BI tools
and the effect, if any, on user engagement with the tools. Once complete this will achieve
the aims of the project.
This has been chosen as the focus of the project as BI and its modular components are
currently playing a major role in the Business Technology sector with the lack of user
engagement with BI tools being a global organisational issue. Combined with the recent
trend of gamification, and its potential to increase user engagement, these subject areas
form an interesting basis of exploration for any Business Technology student.
This project will also form part of an on-going experimental study named Project GamBIT, a
prototype gamified BI tool. Work carried out has aided GamBIT application development
and helped gather evidence on whether or not GamBIT achieved increased user engagement
with a BI tool.
1.3 Research methodology and techniques used
The report will contain details of how primary research will be conducted, providing details
of how user engagement with a BI tool will be measured and analysed. This will provide
both the quantitative and qualitative data needed to address the main points of the report,
whether the gamification of BI tools can affect user engagement. As the objective is to
explore the gamification of BI tools and the effects on user engagement, if any, analysis of
both the quantitative and qualitative data was conducted to aid in the exploration process.
Additionally the report focuses on key academic papers from both the BI and Gamification
sectors, with the emphasis on user engagement within each sector, and draws on findings
from industry experts such as Howson, Werbach, and Zichermann. This forms the
basis for the literature review and aims to address the key areas of the report.
As this report covers areas with few comprehensive academic works it will draw on white
papers, articles and blogs, vendor-specific websites, webinars, studies, and gamification
platform providers where appropriate. As some non-academic literature may be
somewhat biased, criticism will be applied when deemed necessary and
appropriate in an attempt to mitigate as much bias as possible.
1.4 Project scope and limitations
This section will concentrate on the boundaries of the secondary and primary research.
The primary research will explore if the gamification of a front-end BI tool will have any
effect on user engagement with the tool. As this subject is unique, in that no academic
research has yet been done in this area, the project will include both qualitative and
quantitative research methods. Quantitative and qualitative data was collected on one
specific experimental front-end BI reporting tool. The tool was designed using the Eclipse
BIRT platform which is an open source platform for BI tools. It is also worth noting that the
majority of primary data was collected from students of the School of Engineering and
Computing who may already be considered somewhat “engaged” with front-end BI
reporting tools.
Further qualitative data was collected in an attempt to glean more information over and above
the quantitative data collected. It also attempts to gain further insight into users’ thoughts,
feelings and opinions on the future evolution of both the BI and gamification sectors and
identify any correlations between these sectors.
The secondary research of this report explores the concept of BI, its modular components,
the emergence of BI and the factors influencing the BI industry with a focus on the adoption
rate of BI tools. The concept of BI is examined in its broadest sense by reviewing the
published literature with particular reference to material based on user engagement with BI
tools, material which is relatively limited in scope and detail. The report will then centre on the recent
trend of gamification, what it is, and its scope. It will explore the possibility of whether, by
applying gamification to a front-end BI tool, this could have an effect on user engagement
with the tool. When researching user engagement, users mainly fall into two groups,
employees and customers. For the purposes of this study the focus is on the employee user
group.
The subject areas that will form the basis of the literature review are:
 Business Intelligence (BI)
 User engagement with BI tools
 Gamification.
Figure 1-0 A Venn diagram organising the key subject areas of this report visually, so that the
relationships between them can be clearly seen.
Chapter 2: Literature review:
Keywords - Gamification, Employee Engagement, User Engagement, Business Intelligence,
Business Intelligence Tools, Game Elements, Game Mechanics, Intrinsic Motivation,
Enterprise Gamification.
This chapter contains the literature review carried out by the researcher and examines
relevant literature on the BI sector, focusing on the history of BI and the current state of the
industry. The literature review will then concentrate on user engagement with BI and
especially front-end BI tools. This part focuses specifically on the exploration of adoption
rates of BI tools and highlights any potential issues that could lead to “a lack of user
engagement with BI tools”. The focus will then turn to the new trend of gamification and its
potential correlation with BI and user engagement with front-end BI tools.
2.1 Business Intelligence
The term Business Intelligence, or BI, was coined by Howard Dresner of the Gartner Group,
in the late 1980s. BI is a huge and rapidly growing industry that emerged as a result of
organisations beginning to realise and understand that the data stored within their decision
support systems (DSS) had the potential to be of great value to them. Many of the early
adopters of BI were in transaction-intensive businesses, such as telecommunications and
financial services. As the industry matured the BI technical architecture began to include
Data Warehouses, Data Marts, Executive Information Systems (EIS), Online Analytical
Processing (OLAP) and by the mid-1990s BI, along with all its modular components, became
widely adopted (Miller, 2013). As a result, BI became so closely associated with Data
Warehouse technology it became identified as one and the same and is referred to using
the acronym BI/DW. By the mid-1990s two main leaders in the BI industry emerged, Bill
Inmon and Ralph Kimball. Inmon’s philosophy is based on an enterprise approach to data
warehouse design using a top-down design method (Inmon 2005) while Kimball’s offering
consists of a dimensional design method which is considered a bottom-up design approach
(Kimball 2002). Even now a debate still rages on which of these approaches is more
effective. Research points towards both Inmon’s and Kimball’s approaches having advantages
and disadvantages, with many organisations having successfully implemented either
approach. Organisations who are considering implementing a BI infrastructure would have
to give careful consideration to both these approaches and closely align the chosen approach
with the overall high-level business strategy of the organisation. Until recently BI had
adopted a mainly centralised model based around organisations’ IT departments. This meant that
getting information to the right users could take considerable time and the build-up of
requests for reports, analytics and insights from within the organisation could become
“bottlenecked”. The general consensus was that business users viewed BI tools as complex
and left the use of these tools to the “power users” within IT departments. This naturally
evolved into a big disconnect between the IT power users and business users and led to
many problems for what is now referred to as “Traditional BI”. Research suggests that
Traditional BI best practices were considered slow, painful, and expensive, and were therefore
seen as a bit of a failure (Howson, C. 2014).
2.1.1 Business Intelligence defined
As BI has evolved so too has its definition, and as such it can be defined in various ways.
(Howson, C. 2014) defines BI as a “set of technologies and processes that allow people at all
levels of the organisation to access and analyse data”. Gartner (2013), the world's leading
information technology research and advisory company, describes BI as an umbrella term
that includes the applications, infrastructure and tools, and best practices that enable
access to and analysis of information to improve and optimize decisions and performance.
Eckerson, W. (2010), Director of BI Leadership Research, appreciated the need for BI tools to
provide production reporting, end-user query and reporting, OLAP, dashboard/screen tools,
data mining tools, and planning and modelling tools. Research suggests that currently there
are no combinations of hardware and software, any processes, protocols, or architectures
that can truly define BI. What Wu, L., Barash, G., & Bartolini, C. (2007) have made clear
however, is that up until recently BI’s objectives were to:
 Offer an organisation a “single version of the truth”.
 Provide a simplified system implementation, deployment and administration.
 Deliver strategic, tactical and operational knowledge and actionable insight.
2.1.2 Current state of Business Intelligence
The recent unstructured data explosion and the trend towards “Big data” (Davenport, T.H.,
Barth, P. & Bean, R. 2012) has seen BI evolve yet again and as such BI has become
synonymous with Big Data and Big Data analytics. As the volume, velocity and variety of
data (the three V’s) has exponentially increased so too has the demand for cost-effective,
innovative forms of information processing for enhanced insight and decision making (Lohr,
S. 2012). Vast volumes of data are now being captured and stored, but research shows it has
been impossible for traditional BI to analyse and manage this data due to the unstructured
nature of it (figure 2-0). Wixom, B. (2014) highlights how BI responded to the challenges
posed by Big Data by adopting advanced technologies such as:
 Hadoop architectures
 Data visualization and discovery tools
 Predictive analytics
 Rare combinations of user skills (e.g., data scientists)
Figure 2-0 - Graphic: Imex Research
Businesses are now demanding faster time to insight (DiSanto, D. 2012) to stay competitive
in today’s fast-paced, evolving global markets, and BI has to at least try to keep up with the
pace of these demands. Traditional BI tools could take days or weeks to produce reports and
analysis; this is no longer enough. This led to a demand for real-time Business Intelligence
(RTBI). Azvine, B., Cui, Z., & Nauck, D. (2005) agreed that it is
“becoming essential nowadays that not only is the analysis done on real-time data, but also
actions in response to analysis results can be performed in real time and instantaneously
change parameters of business processes”.
As RTBI has evolved, so too has the more recent BI trend of self-service BI. Front-end business
users, who are considered the main information consumers, want to see, analyse and act
upon their data more quickly without having to heavily rely on IT departments making their
data available to them. The shift away from a centralised BI model to a more balanced
centralised/de-centralised BI model (Wu, L., Barash, G., & Bartolini, C. 2007) has seen the
emergence of, and increased organisational involvement with, self-service BI (SSBI). Gartner
(2013) defines SSBI “as end users designing and deploying their own reports and analyses
within an approved and supported architecture and tools portfolio.” Imhoff, C. & White, C.
(2011) define SSBI as the facilities within the BI environment that enable BI users to become
more self-reliant and less dependent on the IT organization. These facilities focus on four
main objectives:
1. Easier access to source data for reporting and analysis,
2. Easier and improved support for data analysis features,
3. Faster deployment options such as appliances and cloud computing, and
4. Simpler, customizable, and collaborative end-user interfaces.
Figure 2-1 - Graphic: BI Research and Intelligent Solutions, Inc.
To help organisations achieve these four main objectives, it is worth exploring later in this
report the concept of intrinsic motivation, which Paharia, R. (2013) argues is
directly linked to SSBI users feeling empowered, and how it fits in with individual adoption
rates of SSBI processes.
Research points towards SSBI lending itself to “multiple versions of the same truth” whereas
traditional BI offered organisations a “single version of the truth”. SSBI has been facilitated
by the increased use of BI front end tools, mainly Visual Data Discovery (VDD) tools
(Howson, C. 2014). Eckerson, W. (2010) defines VDD tools as “self-service, in-memory
analysis tools that enable business users to access and analyse data visually at the speed of
thought with minimal or no IT assistance and then share the results of their discoveries with
colleagues, usually in the form of an interactive dashboard”. SSBI has now become
synonymous with VDD tools and has become a top investment and innovation priority for
businesses over the past few years.
The annual Gartner Business Intelligence and Analytics Summit (2014) looked at the current
trends within the BI industry and highlighted that:
 Self-service analytics is “white hot” and growing while demand for traditional
dashboard BI is in remission.
 BI on Big Data (i.e. Hadoop-based and outside of the data warehouse) is a dynamic
new class of problem that requires a new class of solution.
 Today's buyers are increasingly coming from the business side of the house and not
from corporate IT, which has seen the move away from a centralised BI model to
more decentralized BI model.
2.1.3 Summary
 Traditional BI best practices were considered a bit of a failure.
 Business users viewed BI tools as complex and left the use of these tools to the
“power users” within IT departments, leading to a big ‘disconnect’ between business
and IT staff.
 The de-centralisation of BI has seen the emergence of self-service BI. This new trend
has been facilitated by the increased use of BI front end tools, mainly Visual Data
Discovery (VDD) tools.
2.2 User Engagement with BI Tools
This section of the literature review concentrates on user engagement with BI and looks at
the links between user engagement with BI, or lack of it, and the wider global issue of
employee engagement in the workplace.
Technology is important in any BI initiative, but so too is the need for BI users to be “engaged”
with the BI environment. Having an engaged workforce has proven to help foster an
analytical culture within organisations. Paharia, R. (2013) suggests that engaged workers
“can drive meaningful increases in productivity, profitability, and product quality, as well as
less absenteeism, turnover, and shrinkage”. This is no mean feat to achieve. It is the
combination of people and technology turning data into actionable information that can
be used to enhance the organisation’s decision-making (Miller, A.S. 2013) that lies at the
heart of BI. By getting the right information to the right people at the right time, BI can
become an integral part of improving decision making, providing valuable business
insights, optimising organisational performance and measuring success. However,
employee adoption of and engagement with BI is critical to any BI initiative’s success or
failure.
2.2.1 What is employee engagement?
Employee engagement does not have one simple or accepted definition. The Chartered
Institute of Personnel and Development takes a three-dimensional approach to defining
employee engagement:
• Intellectual engagement – thinking hard about the job and how to do it better
• Affective engagement – feeling positively about doing a good job
• Social engagement – actively taking opportunities to discuss work-related improvements
with others at work
2.2.2 What is User Engagement?
Research has shown that user engagement has several definitions. This highly cited
definition by O'Brien, H.L., & Toms, E.G. (2008) states “Engagement is a user’s response to
an interaction that gains, maintains, and encourages their attention, particularly when they
are intrinsically motivated” while Attfield, S, Kazai, G., Lalmas, M., & Piwowarski, B. (2011)
explain that “User engagement is a quality of the user experience that emphasizes the
positive aspects of interaction – in particular the fact of being captivated by the technology”.
Research points towards user engagement being the determining factor in any successful BI
initiative. Organisations that have more users engaging with BI, with the emphasis on BI
tools, will more than likely see a better Return on Investment (ROI) in their BI ventures than
those whose workforce is lacking in engagement (Howson, C. 2014).
2.2.3 BI Adoption Rate
A recent survey suggests that BI adoption as a percentage of employees remains flat at 22%,
but companies who have successfully deployed mobile BI (Dresner, H. 2012) show the
highest adoption, at 42% of employees (figure 2-2).
Figure 2-2 - Graphic: BI Scorecard
The lack of BI adoption from the employee perspective can be aligned closely with the wider
global problem of “lack of employee engagement” in the workplace. According to Deloitte’s
2015 Global Human Capital Trends survey (figure 2-3), employee and cultural engagement is
the number one challenge companies face around the world.
Figure 2-3 - Graphic: Deloitte University Press
Gallup conducted a study in 2013 into the state of the global workplace. The findings, across
the 142 countries in which employee engagement was measured, show that 13% of employees
are engaged in their jobs, while 63% are not engaged and 24% are actively disengaged. In the
U.S., Dale Carnegie and MSW conducted a study of over 1,500 employees that measured
employee engagement. It revealed that 29% of the workforce is engaged, 45% are not
engaged, and 26% are actively disengaged (Dale Carnegie Training 2012).
As more organisations employ BI and analytics to improve and optimize decisions and
performance, research points towards the question many organisations have asked: “what is
going to make the difference between a successful BI initiative and one that will flat line?”
The need to stay one step ahead in an increasingly competitive global marketplace is
proving harder. Business leaders are looking to technology as the main driver in remaining
competitive in today’s markets. Having the right information technology infrastructure in
place is not enough to give organisations the edge.
What the research leans towards is having an engaged, motivated and collaborative
workforce. This is especially true in the BI environment, where adoption rates of BI tools have
flat lined over the past decade. Some have suggested that those who are exposed to the
front-end tools, and how they engage with them, may make the difference in the success or
failure of any BI initiative. It would seem that organisations looking to take BI adoption
rates, and indeed user engagement with BI tools, to the next level would have to have a
clear strategy that makes user engagement a priority.
Getting the right information to the right person at the right time does not guarantee BI
success; if users are not engaging with BI tools, an organisation’s BI deployment could be
doomed to failure. However, to address this problem an important question should be
asked: “is user engagement with BI at the required level to make BI a success?” If this question
cannot be clearly answered, an organisation’s BI efforts could fail to deliver the results that
were initially predicted.
Senapati, L., (2013) argues that to gain competitive advantage through active user
engagement, organizations must leverage gamification mechanics to influence user
behaviour and drive results.
The summary below gives an indication of why there is a lack of engagement with BI tools.
2.2.4 Summary
 The adoption rate of BI tools has flat lined at 22% over the past decade.
 The issues surrounding user engagement with BI tools can be directly linked to the
wider global issue of lack of employee engagement in the workplace.
 Organisations whose employees are actively engaged with BI tools see a greater return
on investment from their BI initiatives.
 To take user engagement with front-end BI tools to the next level, organisations will
need a clear strategy that makes user engagement a priority.
2.3 Gamification
This section of the literature review will focus on the subject area of gamification. It will
explore its history, how it is defined, and its correlation with BI, in particular exploring
the possibility that the gamification of BI tools could have an effect on user engagement.
Gamification is a relatively new concept that is constantly evolving and has been gaining
popularity over the past few years with many vendors now offering gamification platforms
and solutions. The development of new frameworks, technologies and design patterns has
made gamification scalable and effective (Werbach, K. & Hunter, D. 2012). This has led to it
being applied and utilised throughout organisations to gain business benefits across a wide
range of processes, tasks and tools.
The term “gamification” has been credited to the British-born computer programmer and
inventor Nick Pelling, who coined the phrase in 2002, but it was not until 2010 that articles
and journals based on gamification started to appear. The rise in popularity of gamification
has resulted in it receiving considerable attention over the past few years. Google Trends
shows that search volume for gamification increased significantly from 2010 and spiked in
February 2014 (Figure 2-4). Since then it has stayed at a steadier search volume (December
2015). Gartner’s top 10 strategic technology trends showed gamification as a rising
trend for a number of years (Figure 2-5), but the hype surrounding it has since died down and it
should reach its plateau of productivity in the next 2 to 5 years. Like all trends it has its
champions and its critics, and although gamification has quickly evolved into a multi-million
dollar industry it is still considered to be in its infancy and therefore not fully matured.
Figure 2-4 Google Trend search results for the keywords gamification & business
gamification
Figure 2-5 Gamification in the Gartner 2014 Hype Cycle
There are many schools of thought on the definition of “what” gamification is. Duggan, K. &
Shoup, K. (2013) use this explanation of Gamification to highlight both the human
behavioural and technology elements used in gamification.
“Think of gamification as the intersection of psychology and technology… understanding
what motivates someone to ‘engage’ with certain elements of a website, app, or what have
you… It’s about humanising the technology and applying psychology and behavioural
concepts to increase the likelihood that the technology will be used and used properly”.
Werbach, K. & Hunter, D. (2012) define gamification as “the use of game mechanics and design
in a non-game context to engage users or solve problems”. It is important that the research
does not confuse gamification with “playing games” or “serious games” (Nicholson, S. 2012)
which also applies game elements and design to non-game concepts. Gamification is not
people playing or creating full blown games, whether it be for employees or customers, but
using game elements such as dynamics, mechanics, and components to make an existing
experience, like a task, business process, or software tool more fun, engaging, collaborative,
and rewarding. Gamification uses these motivational factors, based on needs and desires, to
get organisational tasks completed. Organisational tasks with game-like engagement and
actions can make people excited about work and boost productivity (Wu, M. 2011).
2.3.1 Game Elements
Game elements can be thought of as the “toolkit” needed to build and implement successful
gamification. Points, Badges, and Leader boards (PBLs) are common components within the
game elements and are seen as surface-level features of gamification. PBLs are usually a
good place to start when introducing gamification platforms, but research suggests awarding
and rewarding are not enough.
Through the review of literature it would be reasonable to infer that if gamification initiatives
are to succeed, certain other aspects must be considered. The two key questions that
emerged were:
1. What are the motivational factors that drive engagement with a
product/service/process?
2. Why should gamification be taken seriously especially in a business environment?
To answer these questions we must first look at the three key elements of gamification,
namely dynamics, mechanics and components. Figure 2-6 shows how these elements relate
to each other and why they are considered the building blocks to successful gamification.
Figure 2-6 Graphic: Gamification Course 2014
The research will now look at the relationship between these three elements starting with
dynamics.
Kim, B. (2012) states that “the power of game dynamics stems from the fact that it requires
meeting relatively simple conditions in return for attainable rewards. Then gradually, the
tasks become complicated and more challenging for bigger rewards”. This could conceivably
be considered the meaning behind the game.
Game mechanics refers to a set of rules, design and tools, employed by game designers, to
generate and reward activity amongst users in a game that are intended to produce an
enjoyable gaming experience (Senapati, L. 2013). Game mechanics are the elements of the
game that make it fun, drive the action forward, and generate user engagement. Game
mechanics could reasonably be considered the momentum behind the game.
Werbach, K. (2014) describes game components as specific instantiations of mechanics and
dynamics, which can include PBLs, avatars, collectibles, and unlockables. This can be closely
linked to what is considered the motivation to continue with the game.
The objectives of any gamification platform or solution should be aligned directly with the
business objectives and as such an understanding of the primary stakeholders is essential in
creating an experience that engages users while accomplishing the business objectives
(Deterding, S. et al 2012).
To make the experience engaging, research highlighted that three major factors must exist
and be correctly positioned. These are motivation, momentum and meaning. This is
achieved through a combination of carefully crafted game elements and design and a deep
understanding of what motivates the users of the gamified system. Research points to the
Volkswagen (2009) initiative named the “fun theory”. This initiative puts “fun” at the heart
of seemingly mundane tasks, such as using a set of stairs or disposing of litter, turning them into
an engaging and somewhat rewarding experience. Gamification practitioners have learned
from this and as a result the fun theory is considered a driving factor for successful
gamification and should never be far from the thoughts of any gamification designer
(Werbach, K. & Hunter, D. 2012).
Underlying the concept of gamification is motivation. Research suggests that people can be
driven to do something because of internal or external motivation (Nicholson, S. 2012).
Paharia, R. (2013) adds to this by stating “Knowing what truly motivates people, and what
doesn’t, enables us to create stronger engagement and true loyalty”.
2.3.2 Scope of Gamification
The extremely broad and expanding range of ways gamification has been successfully
utilized in recent years has led to its increase in scope. The frameworks, technologies and
design expertise are readily available to introduce gamification platforms or solutions into
organisations business processes. With the trajectory of gamification constantly changing
some Industry experts have argued that each and every business process or problem has a
“gamified” solution (Zichermann,G. Linder, J. 2013). Although this may seeman exaggerated
statement it would be worth future consideration and exploration because as of yet there is
no credible academic research been done on the subject. If what Zichermann,G. Linder,
J.(2013) say is the case, then gamification has massive scope but the legal, moral and ethical
implications of gamification put forward by Kumar,J.M. & Herger,M. (2013) could affect its
future scope. As gamification is still in its infancy and not fully matured, research suggests
gauging its scope may raise more questions than answers.
2.3.3 Successful Gamification
Gamification has proven to be successful in many diverse business fields and because it can
provide quantitative data, organisations can measure engagement with whatever process,
task or tool that has been gamified. With more and more organisations realizing
gamification’s potential, the type of data collected can lead to valuable insights for
organisations.
Zichermann, G. (2013) describes how in 2006 Nike introduced gamification to tackle the issue
of why its business had fallen to its lowest market share in the influential running
shoe category. By 2009 Nike had reversed the trend, due in no small part to its gamification
platform, Nike+, which featured social networking and location-based technology and relied
heavily on games. Individuals who went for a run could now track the number of steps
they took, calories burned and routes they ran by attaching the Nike Fuelband round their
wrists. Zichermann, G. (2013) goes on to explain that “once downloaded this data could be
compared to that of others and the experience of going for a run became much richer”. This
created a whole new level of social engagement with running challenges being issued, prizes
such as electronic badges being awarded, and videos of praise from celebrity athletes for
reaching certain goals. By 2012 Nike+ had over five million users. By leveraging a simple
concept, “beating your best time”, Nike created a gamification platform that encouraged
wellbeing and fitness and in turn saw its market share increase by 10% in a single year.
Stanley, R. (2014) looks at Engine Yard as an example of successful gamification. Engine Yard
is described as a platform for deploying, scaling, and monitoring applications. The company
implemented a Zendesk knowledge base, but didn’t see the levels of engagement they had
hoped for. To encourage participation, Engine Yard incorporated PBLs and other
gamification tactics to boost participation and reward users for making contributions to the
community. These actions successfully increased user-generated content for its customer
self-help portal, decreasing the number of support tickets and reducing the demand on
support staff.
These examples show the diverse range of business processes that have benefited from
gamification. The literature review will now focus on the relationships between BI and
gamification and look to uncover any evidence of front-end BI tools that have been
gamified.
2.3.4 Gamification platforms for Business Intelligence tools
There is considerable overlap between the aims of both gamification and BI. RedCritter, who
offer business solution software that enables enterprises to manage, showcase, and reward
employee achievements, utilize game elements as an integral part of their social enterprise
platform by incorporating Profiles, PBLs, Rewards and Skill tracking into their customers’
existing BI processes. RedCritter works with Tableau, a leading self-service BI visual data
discovery tool vendor, and Microsoft Excel to provide BI and analytics. RedCritter integrates
Tableau and Excel with their enterprise gamification platforms with RedCritter Product
Manager, Jenness, D. (2014), claiming that this type of enterprise gamification of BI leads to
“valuable insights about employee performance and engagement” and “enables self-service
data visualization and behavioural insights”. Swoyer, S. (2012) states in his article for the
TDWI that gamification has particular resonance with BI and analytics, where the search for,
and discovery of, insights already has a game-like feel to it. Gamification advocates want to
amplify this effect to intelligently apply game-like concepts and methods to BI and analytics.
The article continues with: "It's a question of game play: of how we can make [interacting
with] BI more engaging. For example, you want to get people into the flow where they're
asking questions continuously, where they're following [an analysis] from one question to
another. Where questions lead to insights, and vice versa." The lead analyst at the information
management company Ovum, Madan, S. (2013), identified that many BI systems resemble
gamified systems in that they “[seek] to engage business users and change organizational
behaviours to improve business performance and outcomes. Gamified functions also
typically generate a lot of data for analysis. The key is providing users with an immersive
data experience that drives them to improve on that information through exploration and
feedback.”
Madan, S. recognises that gamification and BI “are both highly complementary” and
gamification can be seen as a way to further operationalize BI by embedding it seamlessly
into everyday knowledge work, albeit in a competitively friendly and fun way.
Research points towards a correlation between gamification and SSBI, with a blog post on
Decision Hacker (2012) suggesting SSBI could reasonably be defined as an early attempt to
gamify the workplace, a statement also championed by Werbach, K. (2014). Its overall
goal is to engage the workforce and align organisational behaviours through
carefully designed elements. This statement may seem a little premature, as it is unclear that
using game elements with business processes and applications can become a viable, long-
term concept that meets business objectives (Madan, S. 2013).
2.3.5 Summary
 There is considerable overlap between the aims of gamification and BI.
 Enterprise gamification platforms are now being integrated with BI tools such as
Tableau.
 Gamification has been proven to increase user engagement with business processes,
tasks and tools.
 Gamification must be closely aligned with business objectives to be successful in the
workplace.
 As yet there is no credible academic research suggesting gamification can increase
user engagement with individual BI tools.
2.4 Literature review conclusion
The following section contains the findings from the three subject areas discussed in the
literature review and how they are connected. It also gives justification for further research
into the main points the report aims to address.
2.4.1 User Engagement
User engagement with front-end BI tools has flat lined at around 22%-26% for almost a
decade now. The review of literature entertains the idea that adding gamified layers to
front-end BI tools could have an effect on user engagement with the given tools. What this
research has attempted to reveal is that to take user engagement with front-end BI tools to
the next level, organisations will need a clear strategy that makes user engagement a
priority. Gamification platforms and solutions may be one way of addressing this priority, but
no credible academic evidence of this is currently available.
2.4.2 Enterprise Gamification relationship with BI
Many industry leaders agree that gamification may very well change the face of BI. With the
emergence of enterprise gamification platforms from providers such as Badgeville,
Bunchball, and Redcritter, more and more business processes have been successfully
gamified. Research shows little evidence of the gamification of individual BI tools. What is
more relevant is the increasing number of enterprise gamification platforms being provided
for BI vendors, with particular focus on VDD tool vendors. But as this is also a very recent
and still emerging field, it provides very little in the way of measurable results to support the
claims that these platforms will be successfully applied to BI and in particular to BI front-end
tools.
2.4.3 Motivational Theory linked to SSBI and Gamification
The literature review revealed that SSBI and gamification share a common use of motivational
theory, with a focus on intrinsic motivation, in an attempt to increase loyalty,
engagement, and collaboration. The relationship and similarities between both these
subject areas highlight the importance of what motivates individuals to engage with certain
tasks, processes or (more importantly for the purposes of this report) BI tools.
2.4.4 Summary
The key theme of the literature review clearly shows that there is considerable overlap
between the aims of BI and gamification and that BI systems can indeed resemble
gamification systems. Gamification platforms can generate valuable insights into user
engagement and therefore would be a good starting point for exploring the idea of its
potential effects on user engagement with BI tools. The literature review shows early
indications that by gamifying BI tools, especially front-end tools, user engagement with the
tool may very well increase. With the key theme and findings from the literature review,
further research on the exploration of the gamification of BI tools and the effects on user
engagement can be justified.
Chapter 3: Research methodology:
The purpose of this chapter is to define the type of research that was carried out through an
identification and selection process and to explain the research approach, strategy and
associated methods chosen for the data collection and analysis. The challenges and ethical
issues that were encountered as well as the modifications that were made throughout the
research journey are also presented. A discussion on the ‘reliability and validity’ of the
research is provided and latterly, a conclusion is reached.
“Qualitative and quantitative research methods have grown out of, and still
represent, different paradigms. However, the fact that the approaches are
incommensurate does not mean that multiple methods cannot be combined in a
single study if it is done for complementary purposes”
Sale, J, Lohfeld, M, Brazil, K (2002)
3.1 Selection criteria
Quillan (2011) insists that it is good practice and wise to reiterate what the main objective
is, as it serves to reinforce what is being measured and how it fits with the research
questions. The main research objective is to address the question of “whether the
gamification of BI tools can affect user engagement”.
Specific study objectives have been formulated, which are:
 To address the issues surrounding user engagement with BI tools.
 To explore the gamification of BI tools and the effect, if any, on user engagement
with the tools.
To carry out the research on the study objectives it has been decided to use a mixed
methodology which will help gather both qualitative and quantitative data. This was
deemed the most appropriate approach to a study which is exploratory in nature as each
approach has the potential to enhance and/or complement the other in knowledge gained
on the same research problem, while each remains true to its own identity (Salomon, 1991).
The mixed methodology approach adopted throughout is designed to carry out relevant and
valuable research. According to Carey (1993), quantitative and qualitative techniques are
merely tools; integrating them allows us to answer questions of substantial importance.
3.2 Project GamBIT
This section will introduce the experimental study named Project GamBIT which forms part
of the primary research for the report objective. To gain a better understanding of the
research methodology it is important to have a clear understanding of what the prototype
purpose is, how it was developed and how it will be used.
Project GamBIT is centred on the main themes covered in the literature review: BI, user
engagement with BI tools, and gamification. Its objective is to address the worldwide issue of
a lack of user engagement with, and adoption of, BI tools by employees throughout the
business world. The study is unique in that it tests the concept that "gamifying" a BI tool will
increase user engagement with the tool. As yet this subject has attracted little academic
research, which has resulted in a limited existing body of knowledge.
Project GamBIT is a software prototype that has been designed and developed in an
attempt "to apply the concept of gamification to a business intelligence tool and to evaluate
what effect it has on user engagement levels" (Miller, S, 2013). I joined the study at the early
stage of testing and evaluation of the prototype. My part in the study was to aid Project
GamBIT development and to gather evidence on whether or not GamBIT increased user
engagement with a BI tool. To aid GamBIT application development this report will identify,
describe and apply appropriate research methods to gather feedback on early versions of
the GamBIT prototype, covering:
 Use of the GamBIT prototype
 Feedback on the experience
 Ideas on improvements to the prototype
The GamBIT tool was developed using the Eclipse BIRT Java platform
(http://www.eclipse.org/birt/). Eclipse BIRT is an open source technology platform used to
create data visualisations and reports that can be embedded into rich client and web
applications. This tool has advantages over other BI software tools in that it is particularly
suited to being dismantled, rebuilt and customised, which this project required. It allowed
the developer, PhD student Stephen Miller, to strip back and "gamify" the tool. This was
achieved by dismantling the tool's framework and reassembling it with additional layers
which incorporated gamification.
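To illustrate the kind of embedding Eclipse BIRT supports, the short Java sketch below uses the standard BIRT Report Engine API to open a report design and render it to HTML. This is a minimal illustration only: the report file name is a placeholder, and the GamBIT-specific gamification layers are not shown, as that code is not reproduced in this report.

import org.eclipse.birt.core.framework.Platform;
import org.eclipse.birt.report.engine.api.EngineConfig;
import org.eclipse.birt.report.engine.api.HTMLRenderOption;
import org.eclipse.birt.report.engine.api.IReportEngine;
import org.eclipse.birt.report.engine.api.IReportEngineFactory;
import org.eclipse.birt.report.engine.api.IReportRunnable;
import org.eclipse.birt.report.engine.api.IRunAndRenderTask;

public class BirtReportRunner {
    public static void main(String[] args) throws Exception {
        // Start the BIRT platform and create a report engine
        EngineConfig config = new EngineConfig();
        Platform.startup(config);
        IReportEngineFactory factory = (IReportEngineFactory) Platform
                .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
        IReportEngine engine = factory.createReportEngine(config);

        // "tasks.rptdesign" is a placeholder report design, not a GamBIT file
        IReportRunnable design = engine.openReportDesign("tasks.rptdesign");
        IRunAndRenderTask task = engine.createRunAndRenderTask(design);

        // Render the report to an HTML file that could be embedded in a client application
        HTMLRenderOption options = new HTMLRenderOption();
        options.setOutputFormat("html");
        options.setOutputFileName("tasks.html");
        task.setRenderOption(options);
        task.run();
        task.close();

        engine.destroy();
        Platform.shutdown();
    }
}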
Access to the BI software and the Java developers' platform was given to provide a better
understanding of how the GamBIT tool had been developed and of the stage the project had
reached. To achieve this, an understanding of the Java code used within the developers'
platform was deemed necessary. This included access to, and an understanding of, the Java
files, folders and source code used. Java code was then edited, which created a new
configuration of the code and of the GamBIT front end.
Appendix H shows screenshots from the GamBIT tool. The screenshots highlight the
gamification elements added to the Eclipse platform and the process undertaken by the
volunteers who took part in the gamified experiment, and help to illustrate the steps the
volunteers were asked to follow.
3.3 Ethical debate surrounding gamification and study participation
There are ethical issues surrounding gamification, chiefly the requirement that users of a
gamified system must be treated fairly and with respect. A balance must be struck
between the desired actions or outcomes the gamification system is looking to achieve and
the exploitation of the user. Bogost (2015) has described gamification as a form of
"exploitation-ware". A study into the ethical debate surrounding gamification
within an enterprise concluded that "Gamification could be seen as an unfair mechanism to
increase productivity with no real costs. In addition, it could increase pressure on employees
to achieve more or avoid being in the bottom of the list" (Shahri, Hosseini, Phalp, Taylor &
Ali, 2014).
Some have argued that gamification can be used to confuse users and to ignore what is
"reality". Gamified systems that have been designed without considering the ethical issues
surrounding gamification can fundamentally undermine the business objectives that they
set out to achieve. The counter-argument put forward by DeMonte (2014) of
Badgeville states that:
"Gamification can never be successful exploitationware, because it only works when the
behaviours that are motivated are behaviours that the user wants to perform in the first
place. It's not some magic solution where you can manipulate users to perform behaviours
against their will.”
As gamification matures the ethical and legal issues surrounding it will undoubtedly
become clearer (Kumar & Herger, 2013). For the purposes of the research carried
out in this report the ethical debate surrounding gamification was therefore carefully considered, as
there are no clear best practices relating to the subject area.
The research involved groups of students who volunteered to take part in the experimental
stage. There are ethical considerations to take into account and as such all volunteers were
given an information sheet that fully explained their involvement in the study and gave them
the freedom to opt out of the study at any point. Great care and consideration was
taken to put volunteers at ease and to make them fully aware of what was expected during
the experimental stage of the GamBIT prototype and during the interview process. The
intention was to protect the confidentiality of, and give anonymity to, volunteers.
3.4 Quantitative Research
This section discusses what quantitative research is, its goals, and how this approach was
applied to the aims and objectives of the report. Quantitative research is the systematic
empirical investigation of observable phenomena via statistical, mathematical or
computational techniques (Given, 2008). Quantitative research methods have been
chosen as a means of “collecting ‘facts’ of human behaviour, which when accumulated will
provide verification and elaboration on a theory that will allow scientists to state causes and
predict human behaviour” (Bogdan & Biklen, 1998, p. 38). The ontological position of the
quantitative paradigm is that there is only one truth, an objective reality that exists
independent of human perception (Sale, Lohfeld & Brazil, 2002). This type of research
fits with the aims of the report inasmuch as it is a research method that can help facilitate
the process of measuring user engagement.
The approach applied to the quantitative research methods is as follows:
1. Apply the User Engagement Scale (UES) to the GamBIT prototype to measure user
engagement.
2. Analyse the data collected from the UES.
3. Document the results and findings using tables, charts and/or graphs
4. Interpret and summarise the results
3.4.1 Measuring User Engagement
To develop an approach to measuring user engagement the question of “how can we
measure user engagement?” must be answered. O'Brien, H.L., & Toms, E.G. (2008) have
conducted several studies focusing on the assessment of engagement and believe the
following factors are considered to be most relevant in measuring user engagement:
 Perceived usability - user’s affective (e.g. frustration) & cognitive (e.g. effort)
responses
 Novelty - user’s level of interest in the task and the curiosity evoked
 Aesthetic appeal - user’s perceptions of the visual appeal of the user interface
 Focused attention - the concentration of mental activity, flow, absorption etc…
 Felt involvement - user’s feelings of being ‘drawn’ in, interested and having ‘fun’
 Endurability - user’s overall evaluation of the IS e.g. likely to return/recommend
Given that the factors listed are considered the most relevant to the GamBIT tool, the User
Engagement Scale (O'Brien & Toms, 2008; 2013) was chosen to collect the quantitative data.
The UES was modified to fit the needs of the GamBIT tool.
Research suggests there is no "perfect" or "complete" way of measuring user engagement,
and there are several different methods that could have been applied to Project GamBIT to
produce the quantitative data needed for this study. Through research the UES was
considered best as it covers the most relevant factors in measuring user engagement.
Others, such as the System Usability Scale (SUS), were considered, but the developer
dismissed this "quick and dirty" scale as it was considered "one-dimensional" and the
questionnaire is, by its nature, quite general. The User Engagement Scale (UES)
(Appendix G) was applied to the GamBIT software prototype to measure user engagement,
and the quantitative data collected was used to test whether or not GamBIT increased user
engagement with a BI tool.
3.5 Qualitative Research
Qualitative research methods have been chosen as a way to produce findings not arrived at
by means of quantification i.e. the UES. Qualitative research is based on interpretivism
(Altheide and Johnson, 1994; Kuzel and Like, 1991; Secker et al., 1995) and constructivism
(Guba and Lincoln, 1994). Interpretivism naturally lends itself to qualitative methods. It is, in
its simplest form, an ideal means of exploring individuals’ interpretations of their
experiences when faced with certain situations or conditions (Woods & Trexler, 2001).
The qualitative research will attempt to understand an area about which little is known, in this
case the main theme of the report, exploring the gamification of BI tools and its effects on
user engagement, and to obtain intricate details about the feelings, thoughts, and emotions
that are difficult to extract and/or learn about through quantitative research methods; in
this case, the feelings, thoughts and emotions of the volunteers who took part in the GamBIT
experiment. Strauss and Corbin's (1998) study of the basics of qualitative research
points to three major components of qualitative research. The three points below
highlight how these components relate to this project:
1. The data, which will come from semi-structured interviews.
2. The procedures used to interpret and organise the data, i.e. coding.
3. The analytical process: taking an analytical approach to interpreting the results and
findings and including these in the report.
Qualitative data analysis consists of identifying, coding, and categorizing patterns or themes
found in the data. Data analysis was an ongoing, inductive process where data was sorted,
sifted through, read and reread. With the methods proposed in this report, codes are
assigned to certain themes and patterns that emerge. Categories are formed and
restructured until the relationships seem appropriately represented, and the story and
interpretation can be written (Strauss & Corbin, 1998)
The following section describes the methodological stages undertaken during the qualitative
research and can be loosely attributed to the grounded theory approach (Strauss, A, and
Corbin,J, 1998).
3.6 Methodological stages
This section contains a step by step process on the methodological stages used to conduct
the qualitative research. The methodological stages, and how they are connected, are shown in
figure 3.6 below.
Figure 3.6 Qualitative research methodological stages
The first part of the process was identifying the substantive area. The area of interest for
this report is the exploration of the gamification of BI tools and the effects on user
engagement.
The study is about the perspective of one (or more) of the groups of people within the
substantive area, who comprise the substantive population. In this study these were
university students from the School of Engineering and Computing at UWS, Paisley.
To collect data pertaining to the substantive area, conversing with individuals face-to-face
by means of a semi-structured interview was considered most appropriate.
The process of open coding was carried out as the data was collected. Open coding and data
collection are integrated activities; therefore the data collection and open coding stages
occur simultaneously and continue until the core category is recognised/selected. Eventually
the core category and the main themes became apparent; the core category explains the
behaviour in the substantive area, i.e. it explains how the main concern is resolved or
processed. This project's main concern was the lack of user engagement with BI tools and the
core category was "whether the gamification of BI tools affects user engagement".
3.6.1 Steps involved in open coding
The following section gives an overview of the steps involved during the process of open
coding.
1. The transcripts were read and first impressions noted. The transcripts were then read
again, with microanalysis of each line carried out.
2. Relevant pieces were then labelled: words, sentences, quotes and phrases. These
were selected based on what was deemed relevant to the study and included
thoughts, concerns, opinions, experiences and actions.
This type of analytical process aims to address what is considered relevant to exploring the
gamification of BI tools and the effects on user engagement. During this process the
following possibilities were looked at:
 Repeating data.
 Surprises in the data.
 Relevance to objectives.
3. The next step focused on deciding which codes were most important and on creating
categories by bringing codes together. Some codes were combined to create new
codes. At this point some of the codes deemed less relevant were dropped. Codes
considered important were then grouped together, allowing for the creation of the
categories.
4. The next step focused on labelling relevant categories and identifying how they are
connected. Comparative analysis was used as a means of labelling. The data
contained within the categories made up the content of the main results.
5. The results and analysis were written up.
Memos were written throughout the entire process. This helped in the interpretation of
the results and analysis, with some memos written directly after the semi-structured
interviews were conducted.
Chapter 4: Experimental and Interview Process:
During the initial development of the GamBIT prototype a number of tests were conducted
to help evaluate the prototype. An approach was made to a number of students from the
School of Engineering and Computing at the University of the West of Scotland (UWS),
Paisley, who had shown an interest in the work being carried out in this report. Volunteers
who agreed to test the prototype were directly observed, in an attempt to observe their
interaction with the prototype and with the Eclipse platform. The main areas under
observation were:
 Length of time to complete the tasks
 Navigation of the platform
 Reaction to the gamification elements
After the tests were conducted, feedback was given by the volunteers, which included:
 Incorporate rewards such as badges when a task is complete
 Simplification of the game based rules
 Reworking of the tutorial to highlight every step of the process involved in carrying
out the tasks.
Time taken for the volunteers to complete the tasks varied from 50 to 75 minutes. The
estimated time to be applied to the actual experiment was around 45 to 60 minutes. This
gave the developer time to re-evaluate the prototype and make the necessary changes prior
to the experiments being carried out.
4.2 GamBIT Experiment
To appeal for volunteers, students from the School of Engineering and Computing at the
University of the West of Scotland (UWS), Paisley, were approached to take part in the
GamBIT experimental study. The following section covers how the appeals were made, the
justification for selection, and the estimated duration of the experiment.
4.2.1 Appeal for Volunteers
The GamBIT developer approached the lecturer of a 1st year class studying the module
‘Introduction to Computer Programming’ and asked if he could appeal to students to
volunteer for the experiment. These students were familiar with the Eclipse software
platform, as they were learning Java programming through the use of this platform, and
were therefore familiar with the layout of its Graphical User Interface (GUI). It is
worth noting that many of the students had little experience using BI tools.
The second group consisted of 3rd year students who were currently studying a BI
module and were therefore familiar with BI and had experience of using a BI tool. The
researcher approached the lecturer of the BI class to ask if an appeal to students from the
class was possible. The lecturer agreed, and all students were subsequently emailed prior to
the appeal to give notice of it (Appendix B). A five-minute
overview of the project and the experimental study was given and then an appeal for
volunteers was made. Students were given the opportunity to ask any questions or state
any concerns. They were then advised of the time and location of the experiment and finally
thanked for their time.
The last group consisted of 4th year (Honours) students who were studying Business
Technology. These students were chosen as they would (hopefully) provide a more critical
viewpoint and assessment of the tool as they were in the last year of their studies and had a
broader experience of BI, BI applications and associated tools.
One-hour time slots were booked in the UWS labs for the GamBIT experiment to take place. The
estimated completion time was forty minutes. Given scope for late arrivals and varying
completion times, one hour was deemed sufficient for all volunteers to fully
carry out the experiment.
Further experiments were undertaken by other volunteers who showed an interest in the
project. These experiments were conducted over several days in the labs at UWS.
4.2.2 Experiment
Volunteers were randomly split into Group A (control - BI tool only) and Group B
(experimental - 'GamBIT' tool). The random split was deemed necessary as it was a
fundamental requirement of the test design. On arrival, both groups were issued envelopes
containing a USB stick (with the Java code installed), a pen, a guide to launching the
software, a guide to completing the exercise and a User Engagement Scale.
Group A were given USB sticks with a JAR file named NonGambit.install.data. This file, once
installed, integrated new Java programming code that generated text files (.txt extensions)
on the USB stick whenever a user clicked certain buttons during each of the six BI tasks.
Group B were given USB sticks that contained a JAR file named Gambit.Install. This file,
once installed, integrated new Java programming code that created the 'GamBIT'
gamification techniques on all six BI tasks in the exercise tutorial. It also created text
files for the collection of a range of qualitative and quantitative data and wrote
this data to the new text files on the USB stick during the experiment.
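The instrumentation code itself is not reproduced in this report, but the hypothetical Java sketch below shows one simple way a button click could be appended to a text file on the USB stick, in the spirit of the logging described above. The class name, file path and task/button labels are illustrative assumptions rather than the actual GamBIT implementation.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.time.LocalDateTime;

// Hypothetical click logger: appends one line per button press to a .txt file
public class ClickLogger {
    private final Path logFile;

    public ClickLogger(Path logFile) {
        this.logFile = logFile;
    }

    public void logClick(String taskId, String buttonName) {
        String line = LocalDateTime.now() + "," + taskId + "," + buttonName + System.lineSeparator();
        try {
            Files.writeString(logFile, line,
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        } catch (IOException e) {
            // A logging failure should not interrupt the volunteer's task
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        // Illustrative path and labels only
        ClickLogger logger = new ClickLogger(Path.of("E:/gambit_task_log.txt"));
        logger.logClick("T2", "BuildDataSource-Finish");
    }
}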
The volunteers were briefed on the support available during the experiment and advised
that help was available at any time from the three observers present (researcher, developer
and moderator).
On completion of the experiment every volunteer was thanked for their time and
participation. All UES forms, USB sticks and pens were then collected, sealed in their given
envelopes, and split into two piles, Group A and Group B. The data was then collated and
analysed over the next few weeks (the results and analysis are covered in Chapter 5).
4.3 Interview process
The semi-structured interviews were conducted with four participants who had taken part in the
experiment. Each interview followed a similar theme based around three main objectives
(Appendix D).
1. To understand what each participant felt about the application of gamification
techniques to a business intelligence tool and to determine what effect it had on their
level of user engagement.
2. To ascertain how each participant felt during the test, their reasons for feeling the way
they did and to glean further information from them over and above the survey data.
3. To gather qualitative evidence from each participant on a wide range of relevant
issues concerning lack of user engagement with BI tools and to use quotes by them
as to their opinions, views, suggestions and constructive criticism.
Initial contact with each participant was made through a response to feedback given after
the experiment in which they expressed an interest in taking part in the interview process.
An email was sent stating the following points that were to be addressed prior to the semi-
structured interviews being conducted.
• Explanation of the purpose of the interview
• Addressed terms of confidentiality - The participant’s informed consent was
voluntarily achieved by means of a disclaimer attached to the (UES).
• Explained the format of interview
• Indicated how long the interview may take
• Asked them if they have any questions
• Asked for consent to record the session
The email contained details of the proposed dates and times, approximate duration and location
of each interview. Further correspondence took place until pre-determined times
and dates were agreed with each participant. Given the busy schedules of the
participants, each interview was conducted at a different place within the University campus
and on a separate day. It was necessary to follow up with the participants as quickly as
possible after the experiment so that their thoughts and feelings remained as fresh in their
memory as possible.
Chapter 5: Results and Analysis:
This chapter will document the findings gathered from the collection of quantitative and
qualitative data. It will focus on the results and analysis from the experiment and then
document the results and interpretation of the semi-structured interviews that were carried
out with four participants.
5.1 Quantitative data
This section contains the results and analysis of the survey data, the UES data, and the data
collected relating to the participants' time spent during the experiment.
5.1.1 Participant Results and Analysis
The experiment attracted a total of 68 participants (n = 68), who were randomly split into
one of two groups (A/B). Table 5.1 shows that there was an almost even split, with slightly
more participants (two more) using the BI tool only (the non-gamified version), reflecting
the random nature of the group split. All statistical analyses have been conducted with this
slight differentiation in mind.
Group   Group name                                   No. of participants (N)   %age
A       Control group using the BI tool only         35                        51.5%
B       Experimental group using the GamBIT tool     33                        48.5%
Table 5.1 Group A/B split
Table 5.2 shows the spread of participants across the three UWS classes/courses. The
largest group was the 1st year students, of whom 46 took part. The 3rd and 4th year students
accounted for 17 and 5 participants respectively. When the initial approach was made to the
3rd year students the class consisted of around 40 students; however, fewer than half of those
invited took part (n = 17), accounting for 25% of the total. The 4th year
students consisted of 5 participants (n = 5, 7%).
Participants                        Frequency   Percent   Valid Percent   Cumulative Percent
1st year - Intro to Programming     46          67.6      67.6            67.6
3rd year - BI class                 17          25.0      25.0            92.6
4th year Hons - Comp. Science       5           7.4       7.4             100.0
Total                               68          100.0     100.0
Table 5.2 - University Course distribution
Figure 5.1 shows the spread among the groups of participants by UWS class/course in a bar
chart.
Figure 5.1 University Course distribution
Table 5.3 shows how the three groups of volunteers were divided and allocated to the two
groups (Group A/B) during the experiment by their different university courses. This helps to
demonstrate the randomisation of the participants. The table shows a very close split
across the two groups for each of the three student courses. From the optimum
50/50 split, the largest group (1st year students) shows a +/- 2% (52%/48%) difference, with
the other groups following a similar pattern.
Figure 5.2 shows the same information in a Bar Chart.
Table 5.3 Group type A/B * University Course - Cross-tabulation
Figure 5.2 Group type A/B * University Course - Cross-tabulation Bar Chart
Summary
 The grouping of participants was completely random.
 There was an almost even group A/B split.
 1st year students made up the majority of participants (68%).
 3rd year students made up 25% of participants with a total of 17 taking part. The
number was lower than expected given the class size of 40+ students.
5.1.2 Survey Information Results and Analysis
The following section gives an overview of the survey information gathered from each
participant prior to completing the UES (Appendix G). The data is based on the responses to
four questions (Q).
1. What is your gender?
2. What is your age?
3. On average, how often have you used a business intelligence (BI) tool at work or
study before?
4. On average, how often have you played any kind of video/app/mobile game before?
Q1.
Table 5.4 shows the gender split with only 8 females participating (12%) in the experimental
study compared to a larger male participation of 60 (88%). To show the randomness of how
males and females were allocated to their respective groups (control and experimental),
Table 5.5 shows the cross-tabulated distribution.
Table 5.4 Gender Split
Table 5.5 Gender * Group type A/B Cross-tabulation
Because the allocation to test groups was random, no prior consideration was
given to ensuring a more even distribution of males and females within the groups.
Figure 5.3 highlights the lack of female participants; this was an unfortunate circumstance
that was outwith the scope and control of the researcher.
Figure 5.3 Gender distribution among the represented UWS courses
Q2.
The age distribution of participants is shown in table 5.6, which clearly shows that the 18-24 age
range represented the largest proportion (68%). The distribution was more even between the 25-
29 and 30-39 years age ranges.
Table 5.6 Age range distribution
Figure 5.4 shows the same data in a pie chart.
Figure 5.4 Age range distribution Pie Chart
Q3.
Figure 5.5 shows the cross-tabulation results that emerged when participants were asked how
frequently they had used a business intelligence (BI) tool.
Figure 5.5 BI usage by University course
More detailed analysis can be seen in Table 5.7. The fact that 0% of 4th year students had
never used a BI tool was an expected result, given that they would be considered the most
experienced in using BI tools. What was surprising is that 5% of the 3rd year students had
never used a BI tool, given the course content for 3rd year BI students. A high percentage of 1st
year students (25%) had never used a BI tool before. A more even distribution can be seen
between the 1st year students' BI tool usage of '2 or 3 times a week' and 'once or twice before'.
Table 5.7 BI Usage by University Course
Q4.
Participants' answers to the final survey question, about how frequently they had played video,
application or mobile games before, are shown in table 5.7.
Table 5.7 Games usage
Figure 5.6 shows the same information in a pie chart.
Figure 5.6 Frequency of games usage
Summary of survey data:
 68 people participated in the experiments
 They were split into the 2 groups almost equally: group A (51.5%), group B (48.5%)
 3 UWS classes were selected from the 1st year (68%), 3rd year (25%) and 4th year -
Honours (7%) within the School of Engineering and Computing
 The gender split was: male (88%) and female (12%)
 The majority age group was in the ‘18-24 years’ category (68%)
 44% of participants had ‘never’ used a BI tool before and a further 20% only ‘once
or twice’ before
 As expected, most of the participants play video/mobile games on a ‘daily’ or ‘two
or three per week’ basis (c.80%)
5.1.3 UES statistical Results and Analysis
All of the UES data from the 68 participants was inputted into a statistical package software
tool known as SPSS (version 23) by the GamBIT Developer. This allowed for a wide range of
statistical testing to be conducted on the survey data (Appendix A), an overview is provided
in the tables, charts and statements below.
The median (middle) score was found for each variable for all 68 cases. The mean of the
medians were calculated for all of the 6 sub-scales (factors to be measured in the UES) as
shown in tables 5.8 and 5.9.
Table 5.8 User Engagement (UE) factor scores for Group A: Control – BI tool only
Table 5.9 User Engagement (UE) factor scores for Group B: Experimental – GamBIT
The mean of all six factor mean scores for both groups can be seen in table 5.10.
Table 5.10 Mean of all 6 factor mean scores for Groups A/B
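As a minimal illustration of the mean-of-medians step described above (the analysis itself was carried out in SPSS rather than in code), the hypothetical Java sketch below computes the median response for each UES item and then averages those medians for a single sub-scale. The item scores used are invented for illustration.

import java.util.Arrays;
import java.util.List;

public class UesSummary {

    // Median of one UES item's responses across all participants
    static double median(double[] responses) {
        double[] sorted = responses.clone();
        Arrays.sort(sorted);
        int n = sorted.length;
        return (n % 2 == 1) ? sorted[n / 2] : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;
    }

    // Sub-scale score = mean of the item medians belonging to that sub-scale
    static double subScaleScore(List<double[]> itemResponses) {
        return itemResponses.stream()
                .mapToDouble(UesSummary::median)
                .average()
                .orElse(Double.NaN);
    }

    public static void main(String[] args) {
        // Invented scores for three Novelty items (five participants each)
        List<double[]> novelty = List.of(
                new double[]{2, 1, 3, 2, 4},
                new double[]{3, 2, 2, 1, 3},
                new double[]{1, 2, 2, 3, 2});
        System.out.printf("Novelty sub-scale score: %.2f%n", subScaleScore(novelty));
    }
}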
5.1.4 User engagement highest ranking factors
Based on the results and analysis of the mean scores, experimental group B (GamBIT) had
Perceived Usability (PU) and Novelty (NO) ranked 1 and 2 respectively.
Table 5.11 Ranking of lowest mean score by factor - Group B (GamBIT)
The score in brackets at the end of each statement is the %age of respondents who either:
strongly agreed (1) or agreed (2) with the statement. The PU and NO statements are:
Perceived Usability (PU):
 PU1 - I felt discouraged using the tool (70%) – 7th
 PU2 - I felt annoyed using the tool (72%) – 6th
 PU3 - Using the tool was mentally taxing (73%) – 5th
 PU4 - I found the tool confusing to use (76%) – 1st
 PU5 - I felt frustrated using the tool (76%) – 1st
 PU6 - I could not do some of the things I needed to on the tool (74%) – 4th
 PU7 - The tool experience was demanding (76%) – 1st
Novelty (NO):
 NO1 - The content of the tool incited my curiosity (52%) – 2nd
 NO2 - I would have continued to use the tool out of curiosity (45%) – 3rd
 NO3 - I felt interested in my BI tasks on the tool (61%) – 1st
Group A (control group) had Perceived Usability (PU) and Endurability (EN) ranked 1 and 2
respectively.
Table 5.12 Ranking of lowest mean score by UES Factor - Group A (control)
The PU and EN statements are:
Perceived Usability (PU):
 PU1 - I felt discouraged using the tool (76%) – 3rd
 PU2 - I felt annoyed using the tool (79%) – 2nd
 PU3 - Using the tool was mentally taxing (67%) – 6th
 PU4 - I found the tool confusing to use (76%) – 3rd
 PU5 - I felt frustrated using the tool (82%) – 1st
 PU6 - I could not do some of the things I needed to on the tool (64%) – 7th
 PU7 - The tool experience was demanding (76%) – 3rd
Endurability (EN):
 EN1 - The tool experience did not work out the way I had thought (64%) – 1st
 EN2 - I would recommend the tool to appropriate others (61%) – 2nd
 EN3 - Using the tool was worthwhile (61%) – 2nd
 EN4 - My tool experience was rewarding (52%) – 4th
The results suggest that the participants in both groups did not find either of the tools a
hindrance, demanding, or confusing in any significant way. They seemed to be able to
accomplish what they were asked to without any great difficulty.
The ‘Endurability’ (EN) aspect is associated with the users’ overall evaluation of the
experience, its worthiness and recommendation value for others to use the tool. For the
control group this factor was ranked 2nd highest. Interestingly, EN4 ranked lowest, suggesting
their experience could have been more rewarding, with EN1 indicating the overall
experience could have been better.
The Novelty factor, which is associated with the curiosity the tool evoked, interest levels,
and surprise elements, ranked higher for GamBIT users. This suggests that GamBIT users
were more interested in the BI tasks they were asked to complete. Interestingly, over half of
the experimental group (52%) stated that the content of the tool incited their curiosity
(NO1). Given that gamification aims to make tasks more fun, engaging and intrinsically
motivating, the results reflect the developer's attempt to add these elements to the
gamified BI tool.
Results for the statement 'I felt interested in my BI tasks' (NO3) scored 61% with the GamBIT
group compared to the control group, who rated this statement at only 45%. The difference of
16 percentage points, a relative increase of approximately 35% (16/45), can be seen in table
5.13. This can be interpreted as a notable difference in the level of interest shown by the two
groups.
Table 5.13 NO3 ranking score Group A/B
5.1.5 User engagement lowest ranking factors
Focused attention (FA) factor scored lowest for both groups. The FA factor is associated with
the concentration of mental activity including elements of flow, absorption and time
dissociation in the tasks. The results highlight that participants appeared to be more
concerned with their tasks than the actual BI tools. This suggests that the gamification
elements did not fully absorb the participants and that they seemed more focused on task
completion.
Focused Attention (FA) statements                                  BI tool only (A)    GamBIT (B)
                                                                   %age    Rank        %age    Rank
FA1 - When using the tool, I lost track of the world around me     21%     7th         21%     7th
FA2 - I blocked out things around me when using the tool           30%     5th         30%     5th
FA3 - My time on the tool just slipped away                        45%     3rd         45%     3rd
FA4 - I was absorbed in my BI tasks                                54%     2nd         58%     1st
FA5 - I was so involved in my BI tasks that I lost track of time   58%     1st         45%     3rd
FA6 - During this experience I let myself go                       33%     4th         33%     4th
FA7 - I lost myself in the tool                                    22%     6th         22%     6th
Table 5.14 Comparison of FA scores by Group A (control) / Group B (experimental)
5.1.6 Summary of UES data
 The highest rated variables (statements on the UES survey) differed between the two
groups, i.e.
Group A - did not find the BI tool frustrating (82% strongly agreed or agreed)
Group B - did not find the GamBIT tool confusing, frustrating or demanding (76%)
 Novelty ranked high with the GamBIT group and showed a notable difference in
results from the control group.
 Perceived Usability was ranked highest by both groups.
 The lowest ranked factor for both groups was Focused Attention. This suggests
participants were not fully absorbed in the gamification elements, with task
completion being of higher importance.
5.1.7 Time taken to complete tasks results and analysis
This section shows the results from the data collected relating to the time taken to complete
each task and includes the optional task 6 results. This section also questions whether the
gamification of a BI tool places additional time constraints on participants.
The results from Group A (control- BI tool) revealed that Task 2 (T2 - building a data source)
was the quickest time at 2 minutes and 11 seconds. There were two tasks that took on
average over 8 minutes i.e. T4 (formatting the data) and T6 (creating a report title) with T4
taking the longest time to complete at 8 minutes and 36 seconds.
The results from Group B (GamBIT) revealed that Task 2 (T2 - building a data source) was
also the quickest at 2 minutes and 30 seconds - the same task as Group A, only a little
slower (19 seconds). Task 6 (T6), creating a report title, was an optional task.
Given that it was introduced at the end of the experiment, participants by this point may
have been somewhat disengaged, so it is a good gauge of whether the participants were still
engaged in the tasks. T6 took the longest time to complete at 8 minutes and 06 seconds,
some 30 seconds quicker than the control group (A) which is a good result for the research.
The overall mean times to complete all six tasks are detailed below:
• Group A (BI only) - 32 minutes 31 seconds
• Group B (GamBIT) - 30 minutes 25 seconds
• Time difference - 2 minutes 6 seconds (in favour of GamBIT)
To answer the question of whether the gamification of a BI tool places additional time
constraints on participants, the evidence of the time differences shows no significant
time disadvantage or distraction. The opposite appears to be true, as the tasks were
completed more quickly with the gamified tool, which is a positive result in regard to the research.
5.1.8 Summary of time taken to complete tasks
 The participants who used the GamBIT tool took less time to complete the six tasks.
 The GamBIT group had more participants complete the additional task (n = 16).
 Task 4 took the longest to complete.
 Using the GamBIT tool led to tasks being completed more quickly compared with the
non-gamified tool.
5.2 Qualitative data
This section will give an overview of each of the interviews conducted and report on the key
findings under each of the main categories.
The interviews were based on the experiences of each participant when carrying out the
GamBIT experiment. They looked to glean more information over and above the quantitative
data collected by the application of the UES. To explore key issues further, questions were
based on:
 Their experiences with BI tools in general.
 Their thoughts on gamification, in particular the gamification of BI and BI tools.
 Their experiences of the use of BI in the workplace with a focus on any issues,
obstacles and concerns.
 Their thoughts on user engagement with BI tools.
Full transcripts of all four interviews can be seen in Appendix C.
The following section will report key findings under each of these main categories.
 Game Elements
 GamBIT experiment
 Concept
 Enterprise Gamification
 Gamification of BI tools
 User engagement
A snippet of the coding process is provided to give a clearer understanding of how the
results of the coding were analysed and then interpreted.
Table 5.2.1 Sample of coding classifications. Taken from a Microsoft Excel file.
5.2.1 Participant A
Game elements
The participant felt the gamification elements added to the BI tool were an unwanted
distraction, taking them away from completing the tasks, stating that "I never really paid
attention" and "I never looked at the leaderboard, never read it to see what it said". When
discussing what the most important features of BI tools are, their response was "functionality
of the tools is most important". All of which suggests the gamification elements were not as
important to them as actually completing the given tasks.
GamBIT
The participant stated that coming into the experiment "I wasn't looking to enjoy it." For them,
the Eclipse platform lacked the visual elements (aesthetics) needed to keep them engaged with
the task, and they experienced issues with the platform layout: "I think it was not very user
friendly everything was clumped together. I lost one of the elements when carrying out the
task of sorting and it proved hard to find. I could not move the element back to where they
should be". This proved to be a major issue with the Eclipse platform. This suggests that the
participant is a visual person who likes software platforms that have a familiar GUI and are
easy to navigate. It would be safe to assume that the Eclipse platform was not as user
friendly or aesthetically appealing as other BI platforms they had used. This
contributed to a lack of engagement with the gamified BI tool.
Concept
The concept of the mountain climber “bagging a ben” was not something they were
particularly interested in. The following quote highlights this: "Maybe if it was
something different (concept), as bens and mountains I am not interested in. Maybe if it
was focuses along with something that interested me a bit more maybe I would have
focused but I just clicked through it". This suggests that if the concept had been more tailored
to them, the overall experience could have been more engaging.
Enterprise Gamification
The participant stated a personal view on how enterprise gamification could benefit an
organisation “it would really depend on staff’s attitude to the software or tools”. Asked if
this form of gamification could increase user engagement in a BI environment they stated “I
don’t think it is going to create engagement personally”.
Gamification of BI tools
When asked about their wider views on gamifying BI tools the participant states “I think BI
tools are used by professional who know how to use them and realise how critical the
information is. It would be good for learning (gamifying a BI tool)… like teaching people to
use the BI tool. So for learning purposes yes, but on the whole may slow people down”.
The participant explored the idea of gamification as a possible aid in learning to use BI tools,
quoting “As a lot of these new tools can be frustrating and maybe having a pop-up or
reward saying you have achieved may help out there. I see its place as a teaching aid for a
new tool. But using the tool for a long period of time may get more people annoyed”.
User engagement
On the subject of user engagement with gamified BI tools the response was “personally it is
not something I would engage with I don’t think, it’s not something that if added to a (BI)
tool, especially a tool I was not keen on using, would make me use it". When describing
their feelings during the experiment they said "I don't think I was overly engaged or lost track of
myself in it" and, commenting on the concept, "using mountains just didn't engage me". It is
clear that the participant actively "disengaged" from the gamified BI tool; therefore the tool
had no positive effect on their user engagement.
The following points stood out when writing up the memos:
 To engage users, visualisation through the use of colours was important.
 The gamification concept has to resonate with each individual user and provide a
variety of game-based activities that appeal to them.
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub
Final.Dissertation.Sub

More Related Content

Similar to Final.Dissertation.Sub

Mekong Womens Entreprenuer Program
Mekong Womens Entreprenuer ProgramMekong Womens Entreprenuer Program
Mekong Womens Entreprenuer ProgramHetal Patel
 
Application guide 2022 563-kb
Application guide 2022 563-kbApplication guide 2022 563-kb
Application guide 2022 563-kb
Waqas Ahmad
 
gate Exam notification & broucher
gate Exam notification & brouchergate Exam notification & broucher
gate Exam notification & broucher
Jobs Blue
 
Effectiveness of using Facebook on increasing the brand awareness;
Effectiveness of using Facebook on increasing the brand awareness; Effectiveness of using Facebook on increasing the brand awareness;
Effectiveness of using Facebook on increasing the brand awareness;
Tharushika Ruwangi
 
Summer internship project report on online food app- TINYOWL
Summer internship project report on online food app- TINYOWLSummer internship project report on online food app- TINYOWL
Summer internship project report on online food app- TINYOWL
Sahil Jain
 
Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...
Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...
Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...
KBHN KT
 
Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...Dragan Ocokoljic
 
Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...Dragan Ocokoljic
 
EDLD808 Program Evaluation Final Project Final Paper - Online Education
EDLD808 Program Evaluation Final Project Final Paper - Online EducationEDLD808 Program Evaluation Final Project Final Paper - Online Education
EDLD808 Program Evaluation Final Project Final Paper - Online Education
Paul Gruhn
 
Aurora Dental Group Integrated Marketing Campaign
Aurora Dental Group Integrated Marketing CampaignAurora Dental Group Integrated Marketing Campaign
Aurora Dental Group Integrated Marketing CampaignMaureen Lepke
 
Undergraduate Dissertation
Undergraduate DissertationUndergraduate Dissertation
Undergraduate DissertationPatrick Cole
 
Does online interaction with promotional video increase customer learning and...
Does online interaction with promotional video increase customer learning and...Does online interaction with promotional video increase customer learning and...
Does online interaction with promotional video increase customer learning and...rossm2
 
Impact of dividend policy on value of the firm
Impact of dividend policy on value of the firmImpact of dividend policy on value of the firm
Impact of dividend policy on value of the firm
RUPANJAN NAYAK
 
Digital Convergence
Digital ConvergenceDigital Convergence
Digital Convergence
M V
 
Business Plan Group 4 E-Vita copy
Business Plan Group 4 E-Vita copyBusiness Plan Group 4 E-Vita copy
Business Plan Group 4 E-Vita copyAbhishek Patel
 
Summer internship report email marketing and mobile marketing
Summer internship report  email  marketing and mobile marketingSummer internship report  email  marketing and mobile marketing
Summer internship report email marketing and mobile marketingPreeti Verma
 
Tyler Wittkofsky Design Portfolio
Tyler Wittkofsky Design PortfolioTyler Wittkofsky Design Portfolio
Tyler Wittkofsky Design Portfolio
Tyler Wittkofsky
 
Terigal parking consultation report phase 1
Terigal parking consultation report phase 1Terigal parking consultation report phase 1
Terigal parking consultation report phase 1Transport Planning
 
Online shopping-project-documentation-template
Online shopping-project-documentation-templateOnline shopping-project-documentation-template
Online shopping-project-documentation-template
LaibaMalik17
 

Similar to Final.Dissertation.Sub (20)

Mekong Womens Entreprenuer Program
Mekong Womens Entreprenuer ProgramMekong Womens Entreprenuer Program
Mekong Womens Entreprenuer Program
 
Application guide 2022 563-kb
Application guide 2022 563-kbApplication guide 2022 563-kb
Application guide 2022 563-kb
 
Personal Carbon Allowances
Personal Carbon AllowancesPersonal Carbon Allowances
Personal Carbon Allowances
 
gate Exam notification & broucher
gate Exam notification & brouchergate Exam notification & broucher
gate Exam notification & broucher
 
Effectiveness of using Facebook on increasing the brand awareness;
Effectiveness of using Facebook on increasing the brand awareness; Effectiveness of using Facebook on increasing the brand awareness;
Effectiveness of using Facebook on increasing the brand awareness;
 
Summer internship project report on online food app- TINYOWL
Summer internship project report on online food app- TINYOWLSummer internship project report on online food app- TINYOWL
Summer internship project report on online food app- TINYOWL
 
Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...
Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...
Identifying and prioritizing stakeholder needs in neurodevelopmental conditio...
 
Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...
 
Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...Business Development Proposal Project for a Retail Merchandising Service Comp...
Business Development Proposal Project for a Retail Merchandising Service Comp...
 
EDLD808 Program Evaluation Final Project Final Paper - Online Education
EDLD808 Program Evaluation Final Project Final Paper - Online EducationEDLD808 Program Evaluation Final Project Final Paper - Online Education
EDLD808 Program Evaluation Final Project Final Paper - Online Education
 
Aurora Dental Group Integrated Marketing Campaign
Aurora Dental Group Integrated Marketing CampaignAurora Dental Group Integrated Marketing Campaign
Aurora Dental Group Integrated Marketing Campaign
 
Undergraduate Dissertation
Undergraduate DissertationUndergraduate Dissertation
Undergraduate Dissertation
 
Does online interaction with promotional video increase customer learning and...
Does online interaction with promotional video increase customer learning and...Does online interaction with promotional video increase customer learning and...
Does online interaction with promotional video increase customer learning and...
 
Impact of dividend policy on value of the firm
Impact of dividend policy on value of the firmImpact of dividend policy on value of the firm
Impact of dividend policy on value of the firm
 
Digital Convergence
Digital ConvergenceDigital Convergence
Digital Convergence
 
Business Plan Group 4 E-Vita copy
Business Plan Group 4 E-Vita copyBusiness Plan Group 4 E-Vita copy
Business Plan Group 4 E-Vita copy
 
Summer internship report email marketing and mobile marketing
Summer internship report  email  marketing and mobile marketingSummer internship report  email  marketing and mobile marketing
Summer internship report email marketing and mobile marketing
 
Tyler Wittkofsky Design Portfolio
Tyler Wittkofsky Design PortfolioTyler Wittkofsky Design Portfolio
Tyler Wittkofsky Design Portfolio
 
Terigal parking consultation report phase 1
Terigal parking consultation report phase 1Terigal parking consultation report phase 1
Terigal parking consultation report phase 1
 
Online shopping-project-documentation-template
Online shopping-project-documentation-templateOnline shopping-project-documentation-template
Online shopping-project-documentation-template
 

Final.Dissertation.Sub

  • 1. BSc (Hons) Business Technology An Exploration of the Gamification of Business Intelligence Tools andthe Effect onUser Engagement Gary Brogan B00272662 22nd April 2016 Supervisor: Dr Carolyn Begg
  • 2. 1 | P a g e Declaration This dissertation is submitted in partial fulfilment of the requirements for the degree of BSc Business Technology (Honours) in the University of the West of Scotland. I declare that this dissertation embodies the results of my own work and that it has been composed by myself. Following normal academic conventions, I have made due acknowledgement to the work of others. Name: GARY BROGAN Signature: Date: 22/04/2016
  • 3. 2 | P a g e Library Reference Sheet Surname- Brogan First Name- Gary Initials- GB Borrower ID Number- B00272662 Course Code – BTCS – COMPSCI Course Description – BSc Hons Business Technology Project Supervisor-Dr Carolyn Begg Dissertation Title- An Exploration of the Gamification of Business Intelligence Tools and the Effect on User Engagement Session- 2015/2016 Acknowledgements I would like to thank both Dr Carolyn Begg and PhD student Stephen Miller for their continued support and advice throughout this project. A special thank you goes to my wife Tracey Brogan, who has been completely understanding and supportive during my time at University, and my two daughters, Abbey and Liara, who have also fully supported me throughout this journey.
  • 4. 3 | P a g e Contents Abstract.......................................................................................................................................5 Chapter 1: Introduction:............................................................................................................... 6 1.1 Introduction to Key Themes.................................................................................................6 1.2 Aims and objectives of the project........................................................................................ 6 1.3 Research methodology and techniques used ........................................................................7 1.4 Project scope and limitations............................................................................................... 7 Chapter 2: Literature review:........................................................................................................8 2.1 Business Intelligence ...........................................................................................................9 2.1.1 Business Intelligence defined......................................................................................... 9 2.1.2 Current state of Business Intelligence........................................................................... 10 2.1.3 Summary.................................................................................................................... 12 2.2 User Engagement with BI Tools.......................................................................................... 12 2.2.1 What is employee engagement?.................................................................................. 13 2.2.2 What is User Engagement?.......................................................................................... 13 2.2.3 BI Adoption Rate......................................................................................................... 13 2.2.4 Summary.................................................................................................................... 15 2.3 Gamification ..................................................................................................................... 15 2.3.1 Game Elements........................................................................................................... 17 2.3.2 Scope of Gamification ................................................................................................. 18 2.3.3 Successful Gamification............................................................................................... 19 2.3.4 Gamification platforms for Business Intelligence tools................................................... 19 2.3.2 Summary.................................................................................................................... 20 2.4 Literature review conclusion.............................................................................................. 20 2.4.1 User Engagement........................................................................................................ 21 2.4.2 Enterprise Gamification relationship with BI................................................................. 21 2.4.3 Motivational Theory linked to SSBI and Gamification .................................................... 21 2.4.4 Summary.................................................................................................................... 
21
Chapter 3: Research methodology: ............................................................................................ 21
3.1 Selection criteria ............................................................................................................... 22
3.2 Project GamBIT ................................................................................................................. 22
3.3 Ethical debate surrounding gamification and study participation .......................................... 23
3.4 Quantitative Research ....................................................................................................... 24
3.4.1 Measuring User Engagement ....................................................................................... 25
3.5 Qualitative Research .......................................................................................................... 25
3.6 Methodological stages ....................................................................................................... 26
3.6.1 Steps involved in open coding ...................................................................................... 27
Chapter 4: Experimental and Interview Process: ......................................................................... 28
4.2 GamBIT Experiment ........................................................................................................... 28
4.2.1 Appeal for Volunteers .................................................................................................. 28
4.2.2 Experiment ................................................................................................................. 29
4.3 Interview process .............................................................................................................. 29
Chapter 5: Results and Analysis: ................................................................................................. 30
5.1 Quantitative data .............................................................................................................. 30
5.1.1 Participant Results and Analysis ................................................................................... 31
5.1.2 Survey Information Results and Analysis ....................................................................... 33
5.1.3 UES statistical Results and Analysis ............................................................................... 39
5.1.4 User engagement highest ranking factors ..................................................................... 40
5.1.5 User engagement lowest ranking factors ...................................................................... 43
5.1.6 Summary of UES data .................................................................................................. 44
5.1.7 Time taken to complete tasks results and analysis ......................................................... 44
5.1.8 Summary of time taken to complete tasks .................................................................... 45
5.2 Qualitative data ................................................................................................................. 45
5.2.1 Participant A ............................................................................................................... 46
5.2.2 Participant B ............................................................................................................... 48
5.2.3 Participant C ............................................................................................................... 50
5.2.4 Participant D ............................................................................................................... 51
5.3 Summary of Qualitative Data Results .................................................................................. 53
Chapter 6: Conclusion: .............................................................................................................. 54
6.1 Review of research objectives ............................................................................................ 55
6.2 Discussion of primary and secondary conclusions ................................................................ 55
6.3 Limitations placed on project ............................................................................................. 56
6.4 Future research work ........................................................................................................ 57
6.5 Summary .......................................................................................................................... 58
Chapter 7: Critical evaluation: .................................................................................................... 58
7.1 Reflecting on the initial stages of the project ....................................................................... 58
7.2 Approach to project ........................................................................................................... 59
7.3 Honours year modules ....................................................................................................... 59
7.4 Project aids ....................................................................................................................... 59
7.5 Summary .......................................................................................................................... 60
References ................................................................................................................................ 60
Appendix: .................................................................................................................................. 63
Appendix A Descriptive statistics .............................................................................................. 63
Appendix B Appeal for volunteers ............................................................................................ 66
Appendix C Semi-Structured Interviews .................................................................................... 67
Appendix D Interview Guide .................................................................................................... 68
Appendix E Project Specification Form ..................................................................................... 68
Appendix F Initial GamBIT development involvement ............................................................... 69
Appendix G User Engagement Scale including Research Survey ................................................. 70
Appendix H GamBIT gamification elements .............................................................................. 70
Abstract
The principal objective of the study is to explore the issue of lack of user engagement with BI tools. The point it aims to address is whether making BI tools more fun and engaging, by applying gamification to them, affects user engagement with the tools. This project will explore the gamification of BI tools and the effects on user engagement, to see if there is an increase in user engagement. The literature revealed that only 24% of staff who are exposed to BI tools are considered actively engaged with them. It is also widely acknowledged that BI has not fulfilled its true potential, with traditional BI best practices being considered a bit of a failure. Could "gamifying" a BI tool affect user engagement with the tool and address the issue of lack of engagement? To test this theory a prototype gamified BI tool has been developed, namely Project GamBIT. To carry out the research on the study objectives a mixed methodology was used, which helped gather both qualitative and quantitative data. This was deemed the most appropriate approach to a study which is exploratory in nature, as each approach has the potential to enhance and/or complement the other in the knowledge gained on the same research problem. To gather the quantitative data, the User Engagement Scale (UES) was applied to the GamBIT prototype and the resulting data was analysed. To gather the qualitative data, semi-structured interviews were conducted with volunteers who had taken part in the GamBIT experiment, in an attempt to glean more information over and above the quantitative data. The study is unique in that little credible academic research has been carried out on the lack of user engagement with BI tools. The results of this study demonstrate that gamifying a BI tool does increase certain user engagement factors and can increase motivation to use BI tools more often. Feedback from the interviews conducted highlights further areas where user engagement was considered to have increased more significantly.
Chapter 1: Introduction:
The first chapter will set the scene for the research that has been undertaken and will provide an introduction to the topics covered in the report. It will cover the aims and objectives of the report and include the inspiration that led to the research being carried out, along with the research problem, that there is a lack of engagement with BI tools, and aims to provide a brief background on why this is. It will provide a very brief overview of the research methods used along with the scope and limitations of the report.
1.1 Introduction to Key Themes
Business Intelligence systems and all their components have been around for a number of years now. Business Intelligence (BI) has, since the late 1980s, evolved into a multi-billion dollar industry. Its main purpose is to produce timely, accurate, high-value, and actionable information. As a technology, BI has been seen to be under-used and, as such, has significant untapped potential. One of the main factors that contributes to it being under-used is a lack of user engagement with BI front-end tools. With the adoption rates of BI tools remaining flat at around 24% over the past few years, many BI initiatives have failed to deliver the expected results, leading to a common belief that traditional BI best practices were "a bit of a failure" (Howson, C 2014). To tackle the issue of lack of user engagement, applying the concept of gamification to BI tools may offer a solution. Organisations are increasingly recognising that applying gamification platforms to a wide variety of business processes may hold the key to increased user engagement. A widely cited 2011 Gartner study predicted that by 2016 gamification would be applied to 50% of organisations' business processes. Although this now seems highly unlikely, the industry does continue to grow, with many gamification platform providers such as Badgeville, Game Effective, and Bunchball leading the way in "gamifying" business processes. These gamification practitioners champion the use of techniques such as rewarding certain behaviours using points and badges, highlighting personal achievements on leader boards, and generally trying to make business processes more fun and rewarding. Successful gamification practitioners also understand the relationship between psychology and technology, giving thought to what motivates someone to engage with a certain task, process, or software tool. Understanding motivational theory, and indeed why users engage with certain tasks, processes, or software tools, may provide some answers to the question of "why individual IT adoption rates are much lower than many organisations originally forecast?" (Wu, M. 2011). Early indicators entertain the possibility that the recent trend of enterprise gamification, which applies gamification to the workplace environment, may become an integral part of any organisation's future BI initiatives and a way to further operationalize BI. Could providing enterprise gamification platforms for BI processes hold the key to tackling the issue of lack of user engagement with BI tools?
1.2 Aims and objectives of the project
The point this project aims to address is the issue surrounding lack of user engagement with BI front-end tools, and it asks the question, "Can the gamification of BI tools affect user engagement?" The objectives are to explore the gamification of BI tools
and the effect, if any, on user engagement with the tools. Once complete, this will achieve the aims of the project. This has been chosen as the focus of the project because BI and its modular components are currently playing a major role in the Business Technology sector, with the lack of user engagement with BI tools being a global organisational issue. Combined with the recent trend of gamification, and its potential to increase user engagement, these subject areas form an interesting basis of exploration for any Business Technology student. This project will also form part of an on-going experimental study named Project GamBIT, a prototype gamified BI tool. The work carried out has aided GamBIT application development and helped gather evidence on whether or not GamBIT increased user engagement with a BI tool.
1.3 Research methodology and techniques used
The report will contain details of how the primary research was conducted, providing details of how user engagement with a BI tool was measured and analysed. This provides both the quantitative and qualitative data needed to address the main point of the report: whether the gamification of BI tools can affect user engagement. As the objective is to explore the gamification of BI tools and the effects on user engagement, if any, analysis of both the quantitative and qualitative data was conducted to aid in the exploration process. Additionally, the report focuses on key academic papers from both the BI and gamification sectors, with the emphasis on user engagement within each sector, and draws on findings from industry experts such as Howson, C., Werbach, K., and Zichermann, G. This forms the basis for the literature review and aims to address the key areas of the report. As this report covers areas with few comprehensive academic works, it will also draw on white papers, articles and blogs, vendor-specific websites, webinars, studies, and gamification platform providers where appropriate. As some non-academic literature may be somewhat biased, criticism will be applied when deemed necessary and appropriate in an attempt to mitigate as much bias as possible.
1.4 Project scope and limitations
This section will concentrate on the boundaries of the secondary and primary research. The primary research will explore whether the gamification of a front-end BI tool has any effect on user engagement with the tool. As this subject is unique, in that no current academic research has been done in this area, the project will include both qualitative and quantitative research methods. Quantitative and qualitative data was collected on one specific experimental front-end BI reporting tool. The tool was designed using the Eclipse BIRT platform, which is an open source platform for BI tools. It is also worth noting that the majority of primary data was collected from students of the School of Engineering and Computing, who may already be considered somewhat "engaged" with front-end BI reporting tools. Further qualitative data was collected in an attempt to glean more information over and above the quantitative data collected. It also attempts to gain further insight into users' thoughts,
  • 9. 8 | P a g e feelings and opinions on the future evolution of both the BI and gamification sectors and identify any correlations between these sectors. The secondary research of this report explores the concept of BI, its modular components, the emergence of BI and the factors influencing the BI industry with a focus on the adoption rate of BI tools. The concept of BI is examined in its broadest sense by reviewing the published literature with particular reference to material based on user engagement with BI tools which is relatively limited in scope and detail. The report will then centre on the recent trend of gamification, what it is, and its scope. It will explore the possibility of whether, by applying gamification to a front-end BI tool, this could have an effect on user engagement with the tool. When researching user engagement, users mainly fall into two groups, employees and customers. For the purposes of this study the focus is on the employee user group. The subject areas that will form the basis of the literature review are:  Business Intelligence (BI)  User engagement with BI tools  Gamification. Figure 1-0 The Venn diagram organises the key subject areas of this report visually so the similarity between relationships can be clearly seen. Chapter 2: Literature review: Keywords - Gamification, Employee Engagement, User Engagement, Business Intelligence, Business Intelligence Tools, Game Elements, Game Mechanics, Intrinsic Motivation, Enterprise Gamification.
  • 10. 9 | P a g e This chapter contains the literature review carried out by the researcher and examines relevant literature on the BI sector, focusing on the history of BI and the current state of the industry. The literature review will then concentrate on user engagement with BI and especially front-end BI tools. This part focuses specifically on the exploration of adoption rates of BI tools and highlights any potential issues that could lead to “a lack of user engagement with BI tools”. The focus will then turn to the new trend of gamification and its potential correlation with BI and user engagement with front-end BI tools 2.1 Business Intelligence The term Business Intelligence, or BI, was coined by Howard Dresner of the Gartner Group, in the late 1980s. BI is a huge and rapidly growing industry that emerged as a result of organisations beginning to realise and understand that the data stored within their decision support systems (DSS) had the potential to be of great value to them. Many of the early adopters of BI were in transaction-intensive businesses, such as telecommunications and financial services. As the industry matured the BI technical architecture began to include Data Warehouses, Data Marts, Executive Information Systems (EIS), Online Analytical Processing (OLAP) and by the mid-1990s BI, along with all its modular components, became widely adopted (Miller, 2013). As a result, BI became so closely associated with Data Warehouse technology it became identified as one and the same and is referred to using the acronym BI/DW. By the mid-1990s two main leaders in the BI industry emerged, Bill Inmon and Ralph Kimball. Inmon’s philosophy is based on an enterprise approach to data warehouse design using a top-down design method (Inmon 2005) while Kimball’s offering consists of a dimensional design method which is considered a bottom-up design approach (Kimball 2002). Even now a debate still rages on which of these approaches is more effective. Research points towards both Inmon and Kimballs approaches having advantages and disadvantages with many organisations having successfully implementing either approach. Organisations who are considering implementing a BI infrastructure would have to give careful consideration to both these approaches and closely aligned either approach with the overall high level business strategy of the organisation. Until recently BI had adopted a mainly centralised model around organisations IT departments. This meant that getting information to the right users could take considerable time and the build-up of requests for reports, analytics and insights from within the organisation could become “bottlenecked”. The general consensus was that business users viewed BI tools as complex and left the use of these tools to the “power users” within IT departments. This naturally evolved into a big disconnect between the IT power users and business users and led to many problems for what is now referred to as “Traditional BI”. Research suggests that Traditional BI best practices were considered slow, painful, and expensive therefore seen as a bit of a failure (Howson,C 2014). 2.1.1 Business Intelligence defined As BI has evolved so too has its definition and as such can be defined in various ways. (Howson, C. 2014) defines BI as a “set of technologies and processes that allow people at all levels of the organisation to access and analyse data”. 
Gartner (2013), the world's leading information technology research and advisory company, describes BI as an umbrella term that includes the applications, infrastructure and tools, and best practices that enables
  • 11. 10 | P a g e access to and analysis of information to improve and optimize decisions and performance. Eckerson,W (2010), Director of BI Leadership Research , appreciated the need for BI tools to provide production reporting, end-user query and reporting, OLAP, dashboard/screen tools, data mining tools, and planning and modelling tools. Research suggests that currently there are no combinations of hardware and software, any processes, protocols, or architectures that can truly define BI. What (Wu, L, Barash, G, & Bartolini, C. et al 2007) have made clear however, is that up until recently BI’s objectives were to:  Offer an organisation a “single version of the truth”.  Provide a simplified systemimplementation, deployment and administration.  Deliver strategic, tactical and operational knowledge and actionable insight. 2.1.2 Current state of Business Intelligence The recent unstructured data explosion and the trend towards “Big data” (Davenport, T.H., Barth, P. & Bean, R. 2012) has seen BI evolve yet again and as such BI has become synonymous with Big Data and Big Data analytics. As the volume, velocity and variety of data (the three V’s) has exponentially increased so too has the demand for cost-effective, innovative forms of information processing for enhanced insight and decision making (Lohr, S. 2012) .Vast volumes of data are now being captured and stored, but research shows it has been impossible for traditional BI to analyse and manage this data due to the unstructured nature of it (figure 2-0). Wixom, B. (2014) highlights how BI responded to the challenges posed by Big Data by adopting advanced technologies such as:  Hadoop architectures  Data visualization and discovery tools  Predictive analytics  Rare combinations of user skills (e.g., data scientists) Figure 2-0 - Graphic: Imex Research Businesses are now demanding faster time to insight (DiSanto,D, 2012), to stay competitive in today’s fast paced, evolving global markets and BI has to at least try and keep up with the pace of these demands. Traditional BI tools could take days or weeks to produce reports and
  • 12. 11 | P a g e analysis, this is no longer enough. This seen a demand for real time Business Intelligence, (RTBI) Azvine, B, Cui, Z, & Nauck, D. (2005) agreed that it is “becoming essential nowadays that not only is the analysis done on real-time data, but also actions in response to analysis results can be performed in real time and instantaneously change parameters of business processes”. As RTBI evolved, so too has the more recent BI trend of self-service BI. Front-end business users, who are considered the main information consumers, want to see, analyse and act upon their data more quickly without having to heavily rely on IT departments making their data available to them. The shift away from a centralised BI model to a more balanced centralised/de-centralised BI model (Wu, L., Barash, G., and Bartolini,C, 2007) has seen the emergence of, and increased organisational involvement with, self-service BI (SSBI). Gartner (2013) defines SSBI “as end users designing and deploying their own reports and analyses within an approved and supported architecture and tools portfolio.” Imhoff, C. & White, C. (2011) define SSBI as the facilities within the BI environment that enable BI users to become more self-reliant and less dependent on the IT organization. These facilities focus on four main objectives: 1. Easier access to source data for reporting and analysis, 2. Easier and improved support for data analysis features, 3. Faster deployment options such as appliances and cloud computing, and 4. Simpler, customizable, and collaborative end-user interfaces. Figure 2-1 - Graphic: BI Research and Intelligent Solutions, Inc. To help organisations achieve these four main objectives it would be worth exploring the concept of intrinsic motivation later on in this report, which Paharia, R. (2013) argues is directly linked to SSBI users feeling empowered, and how it fits in with individual adoption rates with SSBI processes. Research points towards SSBI lending itself to “multiple versions of the same truth” whereas traditional BI offered organisations a “single version of the truth”. SSBI has been facilitated by the increased use of BI front end tools, mainly Visual Data Discovery (VDD) tools (Howson, C. 2014). Eckerson,W (2010) defines VDD tools as “self-service, in-memory analysis tools that enable business users to access and analyse data visually at the speed of thought with minimal or no IT assistance and then share the results of their discoveries with
  • 13. 12 | P a g e colleagues, usually in the form of an interactive dashboard”. SSBI has now become synonymous with VDD tools and has become a top investment and innovation priority for businesses over the past few years. The annual Gartner Business Intelligence and Analytics Summit (2014) looks at the current trends within the BI industry and highlighted that:  Self-service analytics is “white hot” and growing while demand for traditional dashboard BI is in remission.  BI on Big Data (i.e. Hadoop-based and outside of the data warehouse) is a dynamic new class of problem that requires a new class of solution.  Today's buyers are increasingly coming from the business side of the house and not from corporate IT, which has seen the move away from a centralised BI model to more decentralized BI model. 2.1.3 Summary  Traditional BI best practices considered a bit of a failure.  Business users viewed BI tools as complex and left the use of these tools to the “power users” within IT departments. Leading to a big ‘disconnect’ between business and IT staff.  The de-centralisation of BI has seen the emergence of self-service BI. This new trend has been facilitated by the increased use of BI front end tools, mainly Visual Data Discovery (VDD) tools. 2.2 User Engagement with BI Tools This section of the literature review concentrates on user engagement with BI and looks at the links between user engagement with BI, or lack of it, and the wider global issue of employee engagement in the workplace. Technology is important in any BI initiative but so too is need for BI users to be “engaged” with the BI environment. Having an engaged workforce has proven to help foster an analytical culture within organisations. Paharia,R. (2013) suggests that engaged workers “can drive meaningful increases in productivity, profitability, and product quality, as well as less absenteeism, turnover, and shrinkage”. This is no mean feat to achieve. It’s the combination of people and technology that turn data into actionable information that can be used to enhance the organisations decision-making (Miller, A.S. 2013), that lies at the heart of BI. By getting the right information to the right people at the right time BI can become an integral part when improving decision making, providing valuable business insights, optimising organisational performance and of measuring success. However, employee adoption of and engagement with BI is critical in any BI initiatives success or failure.
  • 14. 13 | P a g e 2.2.1 What is employee engagement? Employee engagement does not have one simple or accepted definition. The Chartered Institute of Personnel and Development take a three dimensional approach to defining employee engagement: • Intellectual engagement – thinking hard about the job and how to do it better • Affective engagement – feeling positively about doing a good job • Social engagement – actively taking opportunities to discuss work-related improvements with others at work 2.2.2 What is User Engagement? Research has shown that user engagement has several definitions. This highly cited definition by O'Brien, H.L., & Toms, E.G. (2008) states “Engagement is a user’s response to an interaction that gains maintains, and encourages their attention, particularly when they are intrinsically motivated” while Attfield, S, Kazai, G., Lalmas, M., & Piwowarski, B. (2011) explain that “User engagement is a quality of the user experience that emphasizes the positive aspects of interaction – in particular the fact of being captivated by the technology” Research points towards user engagement being the determining factor in any successful BI initiative. Organisations that have more users engaging with BI, with the emphasise on BI tools, will more than likely see a better Return on Investment (ROI) in their BI ventures than that of those whose workforce are lacking in engagement (Howson,C. 2014). 2.2.3 BI Adoption Rate Recent survey suggests that BI adoption as a percentage of employees remains flat at 22%, but companies who have successfully deployed mobile BI (Dresner.H, 2012), show the highest adoption at 42% of employees (figure 2-2) Figure 2-2 - Graphic: BI Scorecard The lack of BI adoption from the employee perspective can be aligned closely with the wider global problem of “lack of employee engagement” in the workplace. According to Deloitte’s 2015 Global Human Capital Trends survey (figure 2-3), employee and cultural engagement is the number one challenge companies’ face around the world.
Figure 2-3 - Graphic: Deloitte University Press
Gallup conducted a study in 2013 into the state of the global workplace. The findings, from the 142 countries in which employee engagement was measured, show that 13% of employees are engaged in their jobs, while 63% are not engaged and 24% are actively disengaged. In the U.S., Dale Carnegie and MSW conducted a study of over 1,500 employees that measured employee engagement. It revealed that 29% of the workforce is engaged, 45% are not engaged, and 26% are actively disengaged (Dale Carnegie Training 2012). As more organisations employ BI and analytics to improve and optimize decisions and performance, research points towards the question many organisations have asked: "what is going to make the difference between a successful BI initiative and one that will flat line?" The need to stay one step ahead in an increasingly competitive global marketplace is proving ever harder. Business leaders are looking to technology as the main driver in remaining competitive in today's markets. Having the right information technology infrastructure in place is not enough to give organisations the edge. What the research leans towards is having an engaged, motivated and collaborative workforce. This is especially true in the BI environment, where adoption rates of BI tools have flat lined over the past decade. Some have suggested that those who are exposed to the front-end tools, and how they engage with them, may make the difference between success and failure in any BI initiative. It would seem that organisations looking to take BI adoption rates, and indeed user engagement with BI tools, to the next level would have to have a clear strategy that makes user engagement a priority. Getting the right information to the right person at the right time does not guarantee BI success; if users are not engaging with BI tools, an organisation's BI deployment could be doomed to failure. To address this problem an important question should be asked: "is user engagement with BI at the level required to make BI a success?" If this question cannot be clearly answered, an organisation's BI efforts could fail to deliver the results that were initially predicted.
Senapati, L. (2013) argues that to gain competitive advantage through active user engagement, organizations must leverage gamification mechanics to influence user behaviour and drive results. The summary below gives an indication of why there is a lack of engagement with BI tools.
2.2.4 Summary
 The adoption rate of BI tools has flat lined at 22% over the past decade.
 The issues surrounding user engagement with BI tools can be directly linked to the wider global issue of lack of employee engagement in the workplace.
 Organisations who have employees actively engaged with BI tools see a greater return on investment with their BI initiatives.
 To take user engagement with front-end BI tools to the next level, organisations will need a clear strategy that makes user engagement a priority.
2.3 Gamification
This section of the report will focus on the subject area of gamification. It will explore its history, how it is defined, and its correlation with BI, in particular exploring the possibility that the gamification of BI tools could have an effect on user engagement. Gamification is a relatively new concept that is constantly evolving and has been gaining popularity over the past few years, with many vendors now offering gamification platforms and solutions. The development of new frameworks, technologies and design patterns has made gamification scalable and effective (Werbach, K, Hunter, D, 2012). This has led to it being applied and utilised throughout organisations to gain business benefits across a wide range of processes, tasks and tools. The term "gamification" has been attributed to the British-born computer programmer and inventor Nick Pelling, who coined the phrase in 2002, but it was not until 2010 that articles and journals based on gamification started to appear. The rise in popularity of gamification has resulted in it receiving considerable attention over the past few years. Google Trends shows that search volume for gamification has increased significantly since 2010 and spiked in February 2014 (Figure 2-4); since then it has stayed at a steadier search volume (December 2015). Gartner's top 10 strategic technology trends for 2014 showed gamification as a rising trend for a number of years (Figure 2-5), but the hype surrounding it has since died down and it should reach its plateau of productivity in the next 2 to 5 years. Like all trends it has its champions and its critics, and although gamification has quickly evolved into a multi-million dollar industry it is still considered to be in its infancy and therefore not fully matured.
  • 17. 16 | P a g e Figure 2-4 Google Trend search results for the keywords gamification & business gamification Figure 2-5 Gamification in the Gartner 2014 Hype Cycle There are many schools of thought on the definition of “what” gamification is. Duggan, K. & Shoup, K. (2013) use this explanation of Gamification to highlight both the human behavioural and technology elements used in gamification. “Think of gamification as the intersection of psychology and technology… understanding what motivates someone to ‘engage’ with certain elements of a website, app, or what have you… It’s about humanising the technology and applying psychology and behavioural concepts to increase the likelihood that the technology will be used and used properly”. Werbach,K. Hunter,D.(2012) define gamification as “the use of game mechanics and design in a non-game context to engage users or solve problems”. It is important that the research does not confuse gamification with “playing games” or “serious games” (Nicholson, S. 2012) which also applies game elements and design to non-game concepts. Gamification is not people playing or creating full blown games, whether it be for employees or customers, but using game elements such as dynamics, mechanics, and components to make an existing experience, like a task, business process, or software tool more fun, engaging, collaborative,
and rewarding. Gamification uses these motivational factors, based on needs and desires, to get organisational tasks completed. Organisational tasks with game-like engagement and actions can make people excited about work and boost productivity (Wu, M. 2011).
2.3.1 Game Elements
Game elements can be thought of as the "toolkit" needed to build and implement successful gamification. Points, Badges, and Leader boards (PBLs) are common components within the game elements and are seen as surface-level features of gamification. PBLs are usually a good place to start when introducing gamification platforms, but research suggests awarding and rewarding are not enough. Through the review of literature it would be safe to imply that if gamification initiatives are to succeed, certain other aspects must be considered. The two key questions that emerged were:
1. What are the motivational factors that drive engagement with a product/service/process?
2. Why should gamification be taken seriously, especially in a business environment?
To answer these questions we must first look at the three key elements of gamification, namely dynamics, mechanics and components. Figure 2-6 shows how these elements relate to each other and why they are considered the building blocks of successful gamification.
Figure 2-6 Graphic: Gamification Course 2014
The research will now look at the relationship between these three elements, starting with dynamics. Kim, B. (2012) states that "the power of game dynamics stems from the fact that it requires meeting relatively simple conditions in return for attainable rewards. Then gradually, the tasks become complicated and more challenging for bigger rewards". This could conceivably be considered the meaning behind the game.
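To make the surface-level PBL components described above more concrete, the following is a minimal sketch of how points, badges and a leader board might be represented in a gamified BI reporting tool. It is an illustration only: the class names, badge thresholds and task names are assumptions made for this sketch and are not taken from Project GamBIT's actual implementation.

```java
import java.util.*;

/** Minimal sketch of the PBL (Points, Badges, Leader boards) components
 *  described above, applied to a hypothetical gamified BI reporting tool.
 *  Names and thresholds are illustrative assumptions, not Project GamBIT code. */
public class PblSketch {

    /** A user of the gamified BI tool, holding the points and badges they have earned. */
    static class Player {
        final String name;
        int points;
        final Set<String> badges = new LinkedHashSet<>();
        Player(String name) { this.name = name; }
    }

    /** Mechanic: award points for completing a BI task (e.g. running a report)
     *  and hand out badges when simple, assumed thresholds are crossed. */
    static void awardPoints(Player p, String task, int value) {
        p.points += value;
        if (p.points >= 50) p.badges.add("Report Runner");   // assumed threshold
        if (p.points >= 100) p.badges.add("Data Explorer");  // assumed threshold
        System.out.println(p.name + " earned " + value + " points for '" + task + "'");
    }

    /** Component: a leader board is simply the players ranked by points. */
    static List<Player> leaderboard(Collection<Player> players) {
        List<Player> ranked = new ArrayList<>(players);
        ranked.sort((a, b) -> Integer.compare(b.points, a.points));
        return ranked;
    }

    public static void main(String[] args) {
        Player alice = new Player("Alice");
        Player bob = new Player("Bob");
        awardPoints(alice, "Run monthly sales report", 60);
        awardPoints(bob, "Drill into regional dashboard", 40);
        for (Player p : leaderboard(List.of(alice, bob))) {
            System.out.println(p.name + ": " + p.points + " points, badges " + p.badges);
        }
    }
}
```

Dynamics would sit above such a model, governing, for example, how reward thresholds rise so that tasks gradually "become complicated and more challenging for bigger rewards", as Kim (2012) describes.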
  • 19. 18 | P a g e Game mechanics refers to a set of rules, design and tools, employed by game designers, to generate and reward activity amongst users in a game that are intended to produce an enjoyable gaming experience (Senapati, L., 2013). Game mechanics are the elements of the game that makes it fun, drives the action forward, and generates user engagement. Game mechanics could reasonably be considered the momentum behind the game. Werbach, K. (2014) describes game components as specific instantiations of mechanics and dynamics and can include PBLs, avatars, collectibles, unlockables. This can be closely linked to what is considered the motivation to continue with the game. The objectives of any gamification platform or solution should be aligned directly with the business objectives and as such an understanding of the primary stakeholders is essential in creating an experience that engages users while accomplishing the business objectives (Deterding, S. et al 2012). To make the experience engaging research highlighted that three major factors must exist and be correctly positioned. These are motivation, momentum and meaning. This is achieved through a combination of carefully crafted game elements and design and a deep understanding of what motivates the users of the gamified system. Research points to the Volkswagen (2009) initiative named the “fun theory”. This initiative puts “fun” at the heart of seemly mundane tasks such as using a set of stairs or disposing of litter and turning it into an engaging and somewhat rewarding experience. Gamification practitioners have learned from this and as a result the fun theory is considered a driving factor for successful gamification and should never be far from the thoughts of any gamification designer (Werbach,K. Hunter,D. 2012). Underlying the concept of gamification is motivation. Research suggests that people can be driven to do something because of internal or external motivation (Nicholson, S. 2012). Paharia,R. (2013) adds to this by stating “Knowing what truly motivates people, and what doesn’t, enables us to create stronger engagement and true loyalty”. 2.3.2 Scope of Gamification The extremely broad and expanding range of ways gamification has been successfully utilized in recent years has led to its increase in scope. The frameworks, technologies and design expertise are readily available to introduce gamification platforms or solutions into organisations business processes. With the trajectory of gamification constantly changing some Industry experts have argued that each and every business process or problem has a “gamified” solution (Zichermann,G. Linder, J. 2013). Although this may seeman exaggerated statement it would be worth future consideration and exploration because as of yet there is no credible academic research been done on the subject. If what Zichermann,G. Linder, J.(2013) say is the case, then gamification has massive scope but the legal, moral and ethical implications of gamification put forward by Kumar,J.M. & Herger,M. (2013) could affect its future scope. As gamification is still in its infancy and not fully matured, research suggests gauging its scope may raise more questions than answers.
  • 20. 19 | P a g e 2.3.3 Successful Gamification Gamification has proven to be successful in many diverse business fields and because it can provide quantitative data, organisations can measure engagement with whatever process, task or tool that has been gamified. With more and more organisations realizing gamifications potential the type of data collected can lead to valuable insights for organisations. Zichermann,G.(2013) describes how in 2006 Nike introduced gamification to tackle the issue of why business had fallen to its lowest market share ownership in the influential running shoe category. By 2009 Nike had reversed the trend due in no small part to the gamification platform that featured social networking and location based technology that relied heavily on games called Nike+. Individuals who went for a run could now track the number of steps they took, calories burned and routes they ran by attaching the Nike Fuelband round their wrists. Zichermann,G (2013) goes on to explain that “once downloaded this data could be compared to that of others and the experience of going for a run became much richer”. This created a whole new level of social engagement with running challenges being issued, prizes such as electronic badges being awarded, and videos of praise from celebrity athletes for reaching certain goals. By 2012 Nike+ had over five million users. By leveraging a simple concept “beating your best time” Nike created a gamification platform that encouraged wellbeing and fitness and in turn saw its market share increased by 10% in a single year. Stanley,R (2014) looks at Engine Yard as an example of successful gamification. Engine yard is described as a platform for deploying, scaling, and monitoring applications. The company implemented a Zendesk knowledge base, but didn’t see the levels of engagement they had hoped for. To encourage participation, Engine Yard incorporated PBLs and other gamification tactics to boost participation and reward users for making contributions to the community. These actions successfully increased user-generated content for its customer self-help portal, decreasing the number of support tickets and reducing the demand on support staff. These examples show the diverse range of business processes that have benefited from gamification. The literature review will now focus on the relationships between BI and gamification and look to uncover any evidence of front-end BI tools that have been gamified. 2.3.4 Gamification platforms for Business Intelligence tools There is considerable overlap between the aims of both gamification and BI. RedCritter, who offer business solution software that enables enterprises to manage, showcase, and reward employee achievements, utilize game elements as an integral part of their social enterprise platform by incorporating Profiles, PBLs, Rewards and Skill tracking into their customers’ existing BI processes. RedCritter works with Tableau, a leading self-service BI visual data discovery tool vendor, and Microsoft Excel to provide BI and analytics. RedCritter integrates Tableau and Excel with their enterprise gamification platforms with RedCritter Product Manager, Jenness, D, (2014), claiming that this type of enterprise gamification of BI leads to “valuable insights about employee performance and engagement” and “enables self-service data visualization and behavioural insights”. Swoyer, S. (2012) states in his article for the
  • 21. 20 | P a g e TDWI that gamification has particular resonance with BI and analytics, where the search for, and discovery of, insights already has a game-like feel to it. Gamification advocates want to amplify this effect to intelligently apply game-like concepts and methods to BI and analytics. The article continues with: "It's a question of game play: of how we can make [interacting with] BI more engaging. For example, you want to get people into the flow where they're asking questions continuously, where they're following [an analysis] from one question to another. Where questions lead to insights, and vice versa" lead analyst at the Information Management company, Ovum, Madan S. (2013), identified that many BI systems resemble gamified systems in that they: “Seeks to engage business users and change organizational behaviours to improve business performance and outcomes. Gamified functions also typically generate a lot of data for analysis. The key is providing users with an immersive data experience that drives them to improve on that information through exploration and feedback.” Madan. S, recognises that gamification and BI “are both are highly complementary” and gamification can be seen as a way to further operationalize BI by embedding it seamlessly into everyday knowledge work, albeit in a competitively friendly and fun way. Research points towards the correlation between gamification and SSBI with a blog post on Decision Hacker (2012) suggesting SSBI could reasonably be defined as an early attempt to gamify the workplace this statement is also championed by Werbach, K. (2014). Its overall goal is intended to engage the workforce and align organisational behaviours through carefully designed elements. This statement may seema little premature as it is unclear that using game elements with business processes and applications can become a viable, long- term concept that meets business objectives (Madan, S. 2013). 2.3.2 Summary  There is considerable overlap between the aims of gamification and BI.  Enterprise gamification platforms are now being integrated with BI tools such as Tableau.  Gamification has been proven to increase user engagement with business processes, tasks and tools.  Gamification must be closely aligned with business objectives to be successful in the workplace.  As yet there is no credible academic research suggesting gamification can increase user engagement with individual BI tools. 2.4 Literature review conclusion The following section contains the findings from the three subject areas discussed in the literature review and how they are connected. It also gives justification for further research into the main points the report aims to address.
  • 22. 21 | P a g e 2.4.1 User Engagement User engagement with front-end BI tools has flat lined at around 22%-26% for almost a decade now. The review of literature entertains the idea that adding gamified layers to front-end BI tools could have an effect on user engagement with the given tools. What this research has attempted to reveal is that to take user engagement with front-end BI tools to the next level, organisations will need a clear strategy that makes user engagement a priority. Gamification platforms and solutions maybe one way of addressing this priority but no credible academic evidence of this is currently available. 2.4.2 Enterprise Gamification relationship with BI Many industry leaders agree that gamification may very well change the face of BI. With the emergence of enterprise gamification platforms from providers such as Badgeville, Bunchball, and Redcritter, more and more business processes have been successfully gamified. Research shows little evidence of the gamification of individual BI tools. What is more relevant is the increasing number of enterprise gamification platforms being provided for BI vendors, with particular focus on VDD tool vendors. But as this is also a very recent, and still emerging field it provides very little in the way of measurable results to support the claims that these platforms will be successful applied to BI and in particular BI front end tools. 2.4.3 Motivational Theory linked to SSBI and Gamification The Literature review revealed SSBI and gamification share a common use of the motivation theory, with the focus on intrinsic motivation, in an attempt to increase loyalty, engagement, and collaboration. The relationship and similarities between both these subject areas highlight the importance of what motivates individuals to engage with certain tasks, processes or (more importantly for the purposes of this report) BI tools. 2.4.4 Summary The key theme of the literature review clearly shows that there is considerable overlap between the aims of BI and gamification and that BI systems can indeed resemble gamification systems. Gamification platforms can generate valuable insights into user engagement and therefore would be a good starting point for exploring the idea of its potential effects on user engagement with BI tools. The literature review shows early indications that by gamifying BI tools, especially front-end tools, user engagement with the tool may very well increase. With the key theme and findings from the literature review, further research on the exploration of the gamification of BI tools and the effects on user engagement can be justified. Chapter 3: Research methodology: The purpose of this chapter is to define the type of research that was carried out through an identification and selection process and to explain the research approach, strategy and associated methods chosen for the data collection and analysis. The challenges and ethical issues that were encountered as well as the modifications that were made throughout the research journey are also presented. A discussion on the ‘reliability and validity’ of the research is provided and latterly, a conclusion is reached.
"Qualitative and quantitative research methods have grown out of, and still represent, different paradigms. However, the fact that the approaches are incommensurate does not mean that multiple methods cannot be combined in a single study if it is done for complementary purposes" Sale, J, Lohfeld, M, Brazil, K (2002)
3.1 Selection criteria
Quillan (2011) insists that it is good practice and wise to reiterate what the main objective is, as it serves to reinforce what is being measured and how it fits with the research questions. The main research objective is to address the question of whether the gamification of BI tools can affect user engagement. Specific study objectives have been formulated, which are:
 To address the issues surrounding user engagement with BI tools.
 To explore the gamification of BI tools and the effect, if any, on user engagement with the tools.
To carry out the research on the study objectives it was decided to use a mixed methodology, which would help gather both qualitative and quantitative data. This was deemed the most appropriate approach to a study which is exploratory in nature, as each approach has the potential to enhance and/or complement the other in knowledge gained on the same research problem, while each remains true to its own identity (Salomon, 1991). The mixed methodology approach adopted throughout is designed to carry out relevant and valuable research. According to Carey (1993), quantitative and qualitative techniques are merely tools; integrating them allows us to answer questions of substantial importance.
3.2 Project GamBIT
This section will introduce the experimental study named Project GamBIT, which forms part of the primary research for the report objective. To gain a better understanding of the research methodology it is important to have a clear understanding of what the prototype's purpose is, how it was developed and how it will be used. Project GamBIT is centred on the main themes covered in the literature review: BI, user engagement with BI tools, and gamification. Its objective is to address a worldwide issue of "lack of user engagement and adoption by users of BI tools (employees) throughout the business world". The study is unique in exploring the hypothesis that "gamifying" a BI tool would see an increase in user engagement with the tool. As yet this subject has lacked academic research, which has resulted in a limited existing body of knowledge. Project GamBIT is a software prototype that has been designed and developed in an attempt "To apply the concept of gamification to a business intelligence tool and to evaluate what effect it has on user engagement levels" (Miller, S, 2013). I joined the study at the early
stage of testing and evaluation of the prototype. My part in the study was to aid Project GamBIT development and to gather evidence on whether or not GamBIT increased user engagement with a BI tool. To aid GamBIT application development this report will identify, describe and apply appropriate research methods to gather feedback on early versions of the GamBIT prototype, which includes:
 Use of the GamBIT prototype
 Feedback on the experience
 Ideas on improvements to the prototype
The GamBIT tool was developed using the Eclipse BIRT Java platform http://www.eclipse.org/birt/ . Eclipse BIRT is an open source technology platform used to create data visualizations and reports that can be embedded into rich client and web applications. This tool has many advantages over other BI software tools and is particularly suitable for developing, dismantling, rebuilding and customising, which this project requires. It has allowed the developer, PhD student Stephen Miller, to strip back and "gamify" the tool. This was achieved by dismantling the tool's framework and reassembling the tool with additional layers which incorporated gamification. Access to the BI software and the Java developers' platform was given to provide me with a better understanding of how the GamBIT tool had been developed and of what stage the project was currently at. To achieve these aims an understanding of the Java code used within the developers' platform was deemed necessary. This included access to, and an understanding of, the Java files, folders, and source code used. Java code was then edited, which created a new configuration of the code and of the GamBIT front end. Appendix H shows screen dumps from the GamBIT tool. The screen dumps highlight the gamification elements added to the Eclipse platform and the process undertaken by the volunteers who took part in the gamified experiment. These steps were deemed necessary to help gain a clear understanding of how the GamBIT prototype was developed.
3.3 Ethical debate surrounding gamification and study participation
There are ethical issues surrounding gamification, chiefly that users of the gamified system must be treated fairly and with respect. A balance must be struck between the desired actions or outcomes the gamification system is looking to achieve and the exploitation of the user. Bogost, I (2015) has described gamification as a form of "exploitationware". The results of a study into the ethical debate surrounding gamification within an enterprise concluded that "Gamification could be seen as an unfair mechanism to increase productivity with no real costs. In addition, it could increase pressure on employees to achieve more or avoid being in the bottom of the list" (Shahri, A., Hosseini, M., Phalp, K, Taylor, J. & Ali, R. 2014). Some have argued that gamification can be used to confuse users and ignore what is "reality". Gamified systems that have been designed without considering the ethical issues surrounding gamification can fundamentally undermine the business objectives that they
were set out to achieve. The counter argument, put forward by DeMonte, A (2014) of Badgeville, is that: "Gamification can never be successful exploitationware, because it only works when the behaviours that are motivated are behaviours that the user wants to perform in the first place. It's not some magic solution where you can manipulate users to perform behaviours against their will." As gamification matures, the ethical and legal issues surrounding it will undoubtedly become clearer (Kumar, J.M. & Herger, M. 2013). For the purposes of the research carried out in this report the ethical debate surrounding gamification was carefully considered, as there are no clear best practices relating to the subject area. The research involved groups of students who volunteered to take part in the experimental stage. There are ethical considerations to take into account and as such all volunteers were given an information sheet that fully explained their involvement in the study, giving the volunteers the freedom to opt out of the study at any point. Great care and consideration was taken to put volunteers at ease and to make them fully aware of what was expected during the experimental stage of the GamBIT prototype and during the interview process. The intention was to protect the confidentiality of, and give anonymity to, volunteers.
3.4 Quantitative Research
This section discusses what quantitative research is, its goals, and how this approach was applied to the aims and objectives of the report. Quantitative research is the systematic empirical investigation of observable phenomena via statistical, mathematical or computational techniques (Given, M. 2008). Quantitative research methods have been chosen as a means of "collecting 'facts' of human behaviour, which when accumulated will provide verification and elaboration on a theory that will allow scientists to state causes and predict human behaviour" (Bogdan & Biklen, 1998, p. 38). The ontological position of the quantitative paradigm is that there is only one truth, an objective reality that exists independent of human perception (Sale, J, Lohfeld, M, Brazil, K 2002). This type of research fits with the aims of the report in as much as it is a research method that can help facilitate the process of measuring user engagement. The approach applied to the quantitative research methods is as follows:
1. Apply the User Engagement Scale (UES) to the GamBIT prototype to measure user engagement.
2. Analyse the data collected from the UES (a brief illustrative sketch of this step follows the list).
3. Document the results and findings using tables, charts and/or graphs.
4. Interpret and summarise the results.
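As a brief illustration of steps 2 and 3 above, the sketch below shows one way Likert-style UES responses could be reduced to simple descriptive statistics (mean and sample standard deviation) per engagement factor before being presented as tables or charts. It is a minimal sketch only: the factor names follow O'Brien & Toms, but the response values and the class and method names are invented for illustration and are not the data collected in this study.

```java
import java.util.*;

/** Minimal sketch: descriptive statistics for Likert-style UES responses.
 *  The subscale names follow O'Brien & Toms; the numbers are invented
 *  examples, not the data collected in this study. */
public class UesDescriptives {

    static double mean(List<Integer> scores) {
        return scores.stream().mapToInt(Integer::intValue).average().orElse(Double.NaN);
    }

    static double stdDev(List<Integer> scores) {
        double m = mean(scores);
        double ss = scores.stream().mapToDouble(s -> (s - m) * (s - m)).sum();
        return Math.sqrt(ss / (scores.size() - 1));   // sample standard deviation
    }

    public static void main(String[] args) {
        // One list of 1-5 responses per UES factor (illustrative values only).
        Map<String, List<Integer>> responses = new LinkedHashMap<>();
        responses.put("Perceived usability", List.of(4, 3, 4, 5, 3));
        responses.put("Novelty",             List.of(5, 4, 4, 4, 5));
        responses.put("Aesthetic appeal",    List.of(3, 4, 3, 4, 4));

        responses.forEach((factor, scores) ->
            System.out.printf("%-20s mean=%.2f sd=%.2f%n",
                              factor, mean(scores), stdDev(scores)));
    }
}
```

Figures of this kind are the sort of per-factor summaries that the tables, charts and graphs referred to in step 3 would present.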
3.4.1 Measuring User Engagement
To develop an approach to measuring user engagement, the question of "how can we measure user engagement?" must be answered. O'Brien, H.L., & Toms, E.G. (2008) have conducted several studies focusing on the assessment of engagement and believe the following factors to be most relevant in measuring user engagement:
 Perceived usability - user's affective (e.g. frustration) & cognitive (e.g. effort) responses
 Novelty - user's level of interest in the task and the curiosity evoked
 Aesthetic appeal - user's perceptions of the visual appeal of the user interface
 Focused attention - the concentration of mental activity, flow, absorption etc…
 Felt involvement - user's feelings of being 'drawn' in, interested and having 'fun'
 Endurability - user's overall evaluation of the IS e.g. likely to return/recommend
Given the belief that the factors listed are the most relevant to use with the GamBIT tool, the User Engagement Scale (O'Brien, H.L. & Toms, E.G. 2008; 2013) was chosen to collect the quantitative data. The UES has been modified to fit the needs of the GamBIT tool. Research suggests there is no "perfect" or "complete" way of measuring user engagement; there are several different methods that could have been applied to Project GamBIT to produce the quantitative data needed for this study. Through research the UES was considered best as it covers the most relevant factors in measuring user engagement. Others such as the System Usability Scale (SUS) were considered, but the developer dismissed this "quick and dirty" scale as it was considered "one-dimensional" and the questionnaire is, by its own nature, quite general. The User Engagement Scale (UES) (Appendix G) was applied to measure user engagement with GamBIT's software prototype and to collect the quantitative data used to test whether or not GamBIT increased user engagement with a BI tool.
3.5 Qualitative Research
Qualitative research methods have been chosen as a way to produce findings not arrived at by means of quantification, i.e. the UES. Qualitative research is based on interpretivism (Altheide and Johnson, 1994; Kuzel and Like, 1991; Secker et al., 1995) and constructivism (Guba and Lincoln, 1994). Interpretivism naturally lends itself to qualitative methods. It is, in its simplest form, an ideal means of exploring individuals' interpretations of their experiences when faced with certain situations or conditions (Woods & Trexler, 2001). The qualitative research will attempt to understand an area about which little is known, in this case the main theme of the report, exploring the gamification of BI tools and its effects on user engagement, and to obtain intricate details about the feelings, thoughts, and emotions that are difficult to extract and/or learn about through quantitative research methods; in this case the feelings, thoughts and emotions of the volunteers who took part in the GamBIT experiment. Strauss, A, and Corbin, J's (1998) study of the basics of qualitative research
  • 27. 26 | P a g e points to the three major components of qualitative research. The three points below highlight how these components relate to this project: 1. The data, which will come from semi-structured interviews. 2. The procedures used to interpret and organise the data, i.e. coding. 3. The analytical process: taking an analytical approach to interpreting the results and findings and including these in the report. Qualitative data analysis consists of identifying, coding and categorising patterns or themes found in the data. Data analysis was an ongoing, inductive process in which data was sorted, sifted through, read and reread. With the methods proposed in this report, codes are assigned to certain themes and patterns that emerge. Categories are formed and restructured until the relationships seem appropriately represented, and the story and interpretation can be written (Strauss & Corbin, 1998). The following section describes the methodological stages undertaken during the qualitative research, which can be loosely attributed to the grounded theory approach (Strauss, A. and Corbin, J. 1998). 3.6 Methodological stages This section contains a step-by-step account of the methodological stages used to conduct the qualitative research. The methodological stages and how they are connected are shown in figure 3.6 below. Figure 3.6 Qualitative research methodological stages The first part of the process was identifying the substantive area. The area of interest for this report is the exploration of the gamification of BI tools and the effects on user engagement.
  • 28. 27 | P a g e The study is about the perspective of one (or more) of the groups of people in the substantive area who comprise the substantive population; in this study, university students from the School of Engineering and Computing at UWS, Paisley. To collect data pertaining to the substantive area, conversing with individuals face-to-face by means of a semi-structured interview was considered most appropriate. The process of open coding was carried out as the data was collected. Open coding and data collection are integrated activities; the data collection stage and the open coding stage therefore occur simultaneously and continue until the core category is recognised/selected. Eventually the core category and the main themes became apparent; the core category explains the behaviour in the substantive area, i.e. it explains how the main concern is resolved or processed. This project’s main concern was the lack of user engagement with BI tools, and the core category was “whether the gamification of BI tools affects user engagement”. 3.6.1 Steps involved in open coding The following section gives an overview of the steps involved during the process of open coding (a minimal illustrative sketch of the grouping step is given after the list). 1. The transcripts were read and first impressions noted. The transcripts were then read again, with microanalysis of each line carried out. 2. Relevant pieces were then labelled: words, sentences, quotes and phrases. This was based on what was deemed relevant to the study and included thoughts, concerns, opinions, experiences and actions. This type of analytical process aims to address what is considered relevant to exploring the gamification of BI tools and the effects on user engagement. During this process the following possibilities were looked at: • Repeating data. • Surprises in the data. • Relevance to objectives. 3. The next step focused on deciding which codes were most important and on creating categories by bringing codes together. Some codes were combined to create new codes. At this point some of the codes deemed less relevant were dropped. Codes considered important were then grouped together, allowing for the creation of the categories. 4. The next step focused on labelling relevant categories and identifying how they are connected. Comparative analysis was used as a means of labelling. The data contained within the categories made up the content of the main results. 5. The results and analysis were written up. Memos were written throughout the entire process. This helped in the interpretation of the results and analysis, with some memos written directly after the semi-structured interviews were conducted.
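The grouping of codes into categories described in step 3 was carried out manually, with the codes recorded in a spreadsheet, but the underlying idea can be sketched in a few lines of code. The sketch below is illustrative only: the coded snippets and the code-to-category mapping are hypothetical examples, not the coding scheme actually produced in this study.

// Illustrative sketch of step 3 above: low-level codes assigned during open coding
// are grouped into broader categories and counted. The snippets and the
// code-to-category mapping are invented; the real coding was done manually.
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CodeGrouping {

    record CodedSnippet(String code, String quote) {}

    public static void main(String[] args) {
        // Hypothetical coded interview snippets
        List<CodedSnippet> snippets = List.of(
                new CodedSnippet("leaderboard ignored", "I never looked at the leaderboard"),
                new CodedSnippet("concept mismatch", "using mountains just didn't engage me"),
                new CodedSnippet("learning aid", "I see its place as a teaching aid for a new tool"));

        // Hypothetical mapping of low-level codes to higher-level categories
        Map<String, String> codeToCategory = Map.of(
                "leaderboard ignored", "Game Elements",
                "concept mismatch", "Concept",
                "learning aid", "Gamification of BI tools");

        // Count how many snippets fall under each category
        Map<String, Long> categoryCounts = snippets.stream()
                .collect(Collectors.groupingBy(
                        s -> codeToCategory.getOrDefault(s.code(), "Uncategorised"),
                        Collectors.counting()));

        categoryCounts.forEach((category, count) ->
                System.out.println(category + ": " + count + " snippet(s)"));
    }
}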
  • 29. 28 | P a g e Chapter 4: Experimental and Interview Process: During the initial development of the GamBIT prototype a number of tests were conducted to help evaluate it. An approach was made to a number of students from the School of Engineering and Computing at the University of the West of Scotland (UWS), Paisley, who had shown an interest in the work being carried out in this report. The evaluation was carried out through direct observation of the volunteers who had agreed to test the prototype, in order to observe their interaction with the prototype and with the Eclipse platform. The main areas under observation were: • Length of time to complete the tasks • Navigation of the platform • Reaction to the gamification elements After the tests were conducted, feedback was given by the volunteers, which included: • Incorporating rewards such as badges when a task is complete • Simplification of the game-based rules • Reworking of the tutorial to highlight every step of the process involved in carrying out the tasks. The time taken for the volunteers to complete the tasks varied from 50 to 75 minutes. The estimated time to be applied to the actual experiment was around 45 to 60 minutes. This gave the developer time to re-evaluate the prototype and make the necessary changes prior to the experiments being carried out. 4.2 GamBIT Experiment In an attempt to appeal for volunteers, students from the School of Engineering and Computing at the University of the West of Scotland (UWS), Paisley were approached to take part in the GamBIT experimental study. The following section covers how the appeals were made, the justification for selection, and the estimated duration of the experiment. 4.2.1 Appeal for Volunteers The GamBIT developer approached the lecturer of a 1st year class studying the module ‘Introduction to Computer Programming’ and asked if he could appeal to students to volunteer for the experiment. These students were familiar with the Eclipse software platform, as they were learning Java programming through the use of this platform, and were therefore familiar with the layout of the Graphical User Interface (GUI). It is worth noting that many of the students had little experience of using BI tools. The second group was a 3rd year group of students who were currently studying a BI module and were therefore familiar with BI and had experience of using a BI tool. An approach was made to the lecturer of the BI class by the researcher to ask if an appeal to students from the BI class was possible. The lecturer agreed, and subsequently all students were emailed prior to the appeal to give notice of it (Appendix B). A five-minute overview of the project and the experimental study was given and then an appeal for
  • 30. 29 | P a g e volunteers was made. Students were given the opportunity to ask any questions or state any concerns. They were then advised of the time and location of the experiment and finally thanked for their time. The last group consisted of 4th year (Honours) students who were studying Business Technology. These students were chosen as they would (hopefully) provide a more critical viewpoint and assessment of the tool, as they were in the last year of their studies and had a broader experience of BI, BI applications and associated tools. One-hour time slots were booked in the UWS labs for the GamBIT experiment to take place. The estimated completion time was forty minutes. Given scope for late arrivals and varying completion times by volunteers, one hour was deemed sufficient for all volunteers to fully carry out the experiment. Further experiments were undertaken by other volunteers who showed an interest in the project. These experiments were conducted over several days in the labs at UWS. 4.2.2 Experiment Volunteers were randomly split into Group A (control - BI tool only) and Group B (experimental - ‘GamBIT’ tool). The random split was deemed necessary as it was a fundamental requirement of the test design under scrutiny. Both groups were issued envelopes on arrival containing a USB stick (with Java code installed), a pen, a guide to launching the software, a guide to completing the exercise and a User Engagement Scale. Group A were given USB sticks with a JAR file named: NonGambit.install.data. This file, once installed, integrated new Java programming code that generated text files (.txt extensions) on the USB stick whenever a user clicked certain buttons during each of the 6 BI tasks. Group B were given USB sticks that contained a JAR file named: Gambit.Install. This file, once installed, integrated new Java programming code that created the ‘GamBIT’ gamification techniques on all of the 6 BI tasks in the exercise tutorial. It also created text files for the collection of a range of qualitative and quantitative data and wrote this data to the new text files on the USB stick during the experiment. The volunteers were briefed on the support available during the experiment and advised that help was available at any time from the three observers present (researcher, developer and moderator). On completion of the experiment every volunteer was thanked for their time and participation. All UES forms, USB sticks and pens were then collected, sealed in their given envelopes, and split into 2 piles, Group A and Group B. The data was then collected and analysed over the next few weeks. (The results and analysis are covered in chapter 5).
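The Java source of the two JAR files is not reproduced in this report, but the click-logging behaviour described above can be illustrated with a short sketch. This is a minimal illustration under assumptions: the class name ClickLogger, the file naming convention and the comma-separated line format are invented for this example and are not the actual GamBIT/NonGamBIT implementation.

// Minimal sketch (not the actual prototype code) of writing a .txt log entry to
// the USB stick each time a participant clicks a monitored button during a task.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.time.LocalDateTime;

public class ClickLogger {

    private final Path logFile;

    public ClickLogger(Path usbRoot, String participantId) {
        // One text file per participant, written to the root of the USB stick
        this.logFile = usbRoot.resolve(participantId + "_clicks.txt");
    }

    /** Appends one line per button click: timestamp, task number, button label. */
    public void logClick(int taskNumber, String buttonLabel) {
        String line = String.format("%s,T%d,%s%n",
                LocalDateTime.now(), taskNumber, buttonLabel);
        try {
            Files.writeString(logFile, line,
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        } catch (IOException e) {
            // Logging should never interrupt the participant's task
            e.printStackTrace();
        }
    }
}

In the prototype, something along these lines would be triggered from the button handlers of the six BI tasks, producing the text files that were later collected from the USB sticks.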
  • 31. 30 | P a g e 4.3 Interview process The semi-structured interviews were conducted with 4 participants who took part in the experiment. Each interview followed a similar theme based around 3 main objectives (Appendix D). 1. To understand what each participant felt about the application of gamification techniques to a business intelligence tool and to determine what effect it had on their level of user engagement. 2. To ascertain how each participant felt during the test, their reasons for feeling the way they did, and to glean further information from them over and above the survey data. 3. To gather qualitative evidence from each participant on a wide range of relevant issues concerning the lack of user engagement with BI tools and to use quotes by them as to their opinions, views, suggestions and constructive criticism. Initial contact with each participant was made through a response to feedback given after the experiment in which they expressed an interest in taking part in the interview process. An email was sent stating the following points that were to be addressed prior to the semi-structured interviews being conducted. • Explanation of the purpose of the interview • Addressed terms of confidentiality - the participant’s informed consent was voluntarily achieved by means of a disclaimer attached to the UES • Explained the format of the interview • Indicated how long the interview might take • Asked them if they had any questions • Asked for consent to record the session The email contained details of the proposed dates and times, approx. duration and location of each interview. Further correspondence took place until eventually pre-determined times and dates were agreed with each participant. Given the busy schedules of all the participants, the interviews were conducted at various places within the University campus and on separate days. It was necessary to follow up with the participants as quickly as possible after the experiment was conducted to keep their thoughts and feelings as fresh in their memory as possible. Chapter 5: Results and Analysis: This chapter will document the findings gathered from the collection of quantitative and qualitative data. It will focus on the results and analysis from the experiment and then document the results and interpretation of the semi-structured interviews that were carried out with four participants. 5.1 Quantitative data
  • 32. 31 | P a g e This section contains the results and analysis of the survey data, the UES data, and the data collected relating to the participants’ time spent during the experiment. 5.1.1 Participant Results and Analysis The experiment attracted a total of 68 participants (n = 68), who were randomly split into one of two groups (A/B). Table 5.1 shows that there was an almost even split. Slightly more participants (a difference of n = 2) used the BI tool only (the non-gamified version), reflecting the random nature of the group split. All statistical analyses have been conducted with this slight difference taken into account.
Group A/B | Group name | No. of participants (n) | %age
A | Control group using the BI tool only | 35 | 51.5%
B | Experimental group using the GamBIT tool | 33 | 48.5%
Table 5.1 Group A/B split
Table 5.2 shows the spread among the 3 groups of participants by UWS class/course. The largest group was the 1st year students, of which 46 took part. The 3rd and 4th year students consisted of 17 and 5 participants respectively. When the initial approach was made to the 3rd year students, the class consisted of around 40 students; however, fewer than half of those invited took part (n = 17), which accounted for 25% of the cumulative total. The 4th year students consisted of 5 participants (n = 5, 7%).
Participants | Frequency | Percent | Valid Percent | Cumulative Percent
1st year - Intro to Programming | 46 | 67.6 | 67.6 | 67.6
3rd year - BI class | 17 | 25.0 | 25.0 | 92.6
4th year Hons – Comp. Science | 5 | 7.4 | 7.4 | 100.0
Total | 68 | 100.0 | 100.0 |
Table 5.2 - University Course distribution
  • 33. 32 | P a g e Figure 5.1 shows the spread among the groups of participants by UWS class/course in a bar chart. Figure 5.1 University Course distribution Table 5.3 shows how the three groups of volunteers were divided and allocated to the two groups (Group A/B) during the experiment by their different university courses. This helps to demonstrate the randomisation of the participants. The table shows a very close split between the three student courses. From the optimum 50/50 split, the largest group (1st year students) shows a +/- 2% (52%/48%) difference, with the other groups following a similar pattern. Figure 5.2 shows the same information in a bar chart. Table 5.3 Group type A/B * University Course - Cross-tabulation
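The cross-tabulations reported in tables 5.3 and 5.5 were produced in SPSS, but the counting they involve is straightforward. The sketch below is a hedged illustration only; the participant records are invented examples rather than the study data.

// Illustrative sketch of a cross-tabulation (university course x group), as in table 5.3.
// The participant records are invented; the actual tables were produced in SPSS (version 23).
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class CrossTab {

    record Participant(String group, String course) {}

    public static void main(String[] args) {
        List<Participant> participants = List.of(
                new Participant("A", "1st year"),
                new Participant("B", "1st year"),
                new Participant("A", "3rd year"),
                new Participant("B", "4th year Hons"));

        // Nested map: course -> (group -> count)
        Map<String, Map<String, Integer>> crossTab = new TreeMap<>();
        for (Participant p : participants) {
            crossTab.computeIfAbsent(p.course(), c -> new TreeMap<>())
                    .merge(p.group(), 1, Integer::sum);
        }

        crossTab.forEach((course, counts) ->
                System.out.println(course + " -> " + counts));
    }
}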
  • 34. 33 | P a g e Figure 5.2 Group type A/B * University Course - Cross-tabulation Bar Chart Summary • The grouping of participants was completely random. • There was an almost even group A/B split. • 1st year students made up the majority of participants (68%). • 3rd year students made up 25% of participants with a total of 17 taking part. The number was lower than expected given the class size of 40+ students. 5.1.2 Survey Information Results and Analysis The following section gives an overview of the survey information gathered from each participant prior to completing the UES (Appendix G). The data is based on the responses to four questions (Q). 1. What is your gender? 2. What is your age? 3. On average, how often have you used a business intelligence (BI) tool at work or study before? 4. On average, how often have you played any kind of video/app/mobile game before? Q1.
  • 35. 34 | P a g e Table 5.4 shows the gender split, with only 8 females participating (12%) in the experimental study compared to a larger male participation of 60 (88%). To show the randomness of how males and females were allocated to their respective groups (control and experimental), Table 5.5 shows the cross-tabulated distribution. Table 5.4 Gender Split Table 5.5 Gender * Group type A/B Cross-tabulation The random nature of the allocation to test groups meant that no prior consideration was made to ensure there was a more even distribution of males and females within the groups. Figure 5.3 highlights the lack of female participants; this was an unfortunate circumstance that was outwith the scope and control of the researcher.
  • 36. 35 | P a g e Figure 5.3 Gender distribution among the represented UWS courses Q2. The age distribution of participants is shown in table 5.6 and clearly shows that the 18-24 age range represented the largest proportion (68%). A more even distribution was seen between the 25-29 and 30-39 age ranges. Table 5.6 Age range distribution Figure 5.4 shows the same data in a pie chart.
  • 37. 36 | P a g e Figure 5.4 Age range distribution Pie Chart Q3. Figure 5.5 shows the cross-tabulation results that emerged when participants were asked how frequently they had used a business intelligence (BI) tool.
  • 38. 37 | P a g e Figure 5.5 BI usage by University course More detailed analysis can be seen in Table 5.7. The fact that 0% of 4th year students had never used BI tools was an expected result, given that they would be considered the most experienced in using BI tools. What was surprising was that 5% of the 3rd year students had never used a BI tool, given the course content for 3rd year BI students. A high percentage of 1st year students (25%) had never used a BI tool before. A more even distribution can be seen between the 1st year students’ BI tool usage of 2 or 3 times a week and once or twice before.
  • 39. 38 | P a g e Table 5.7 BI Usage by University Course Q4. When asked the final survey question about how frequently they had used video, application or mobile games before, table 5.7 shows the participants’ answers. Table 5.7 Games usage Figure 5.6 shows the same information in a pie chart. Figure 5.6 Frequency of games usage
  • 40. 39 | P a g e Summary of survey data: • 68 people participated in the experiments. • They were split into the 2 groups almost equally: group A (51.5%), group B (48.5%). • 3 UWS classes were selected from the 1st year (68%), 3rd year (25%) and 4th year - Honours (7%) within the School of Engineering and Computing. • The gender split was male (88%) and female (12%). • The majority age group was the ‘18-24 years’ category (68%). • 44% of participants had ‘never’ used a BI tool before and a further 20% only ‘once or twice’ before. • As expected, most of the participants play video/mobile games on a ‘daily’ or ‘two or three times per week’ basis (c.80%). 5.1.3 UES statistical Results and Analysis All of the UES data from the 68 participants was input into the statistical software package SPSS (version 23) by the GamBIT developer. This allowed a wide range of statistical testing to be conducted on the survey data (Appendix A); an overview is provided in the tables, charts and statements below. The median (middle) score was found for each variable across all 68 cases. The mean of the medians was then calculated for each of the 6 sub-scales (the factors measured by the UES), as shown in tables 5.8 and 5.9 (a minimal illustrative sketch of this calculation is given after table 5.10). Table 5.8 User Engagement (UE) factor scores for Group A: Control – BI tool only
  • 41. 40 | P a g e Table 5.9 User Engagement (UE) factor scores for Group B: Experimental – GamBIT The mean of all six factor mean scores for both groups can be seen in table 5.10. Table 5.10 Mean of all 6 factor mean scores for Groups A/B 5.1.4 User engagement highest ranking factors Based on the results and analysis of the mean scores, experimental group B (GamBIT) had Perceived Usability (PU) and Novelty (NO) ranked 1 and 2 respectively.
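The scoring described above (the median response for each UES item, followed by the mean of the item medians within each sub-scale) was carried out in SPSS, but the calculation itself is simple. The sketch below is illustrative only: the Likert responses and the single ‘Novelty’ sub-scale shown are invented examples under the assumption of a 1-5 response scale, not the study data.

// Minimal sketch of the UES scoring logic: median per item, then the mean of the
// item medians for each sub-scale. Sample responses are invented for illustration.
import java.util.Arrays;
import java.util.List;
import java.util.Map;

public class UesScoring {

    /** Median of one item's responses across all participants (Likert 1-5 assumed). */
    static double median(int[] responses) {
        int[] sorted = responses.clone();
        Arrays.sort(sorted);
        int n = sorted.length;
        return n % 2 == 1 ? sorted[n / 2]
                          : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;
    }

    /** Mean of the item medians that make up one sub-scale, e.g. Novelty. */
    static double subScaleScore(List<int[]> itemResponses) {
        return itemResponses.stream().mapToDouble(UesScoring::median)
                            .average().orElse(Double.NaN);
    }

    public static void main(String[] args) {
        // Hypothetical responses for the three Novelty items (NO1-NO3)
        Map<String, List<int[]>> subScales = Map.of(
                "Novelty", List.of(
                        new int[]{2, 3, 2, 4, 3},
                        new int[]{3, 3, 2, 2, 4},
                        new int[]{2, 2, 3, 3, 3}));
        subScales.forEach((name, items) ->
                System.out.printf("%s factor score: %.2f%n", name, subScaleScore(items)));
    }
}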
  • 42. 41 | P a g e Table 5.11 Ranking of lowest mean score by factor - Group B (GamBIT) The score in brackets at the end of each statement is the %age of respondents who either strongly agreed (1) or agreed (2) with the statement. The PU and NO statements are: Perceived Usability (PU): • PU1 - I felt discouraged using the tool (70%) – 7th • PU2 - I felt annoyed using the tool (72%) – 6th • PU3 - Using the tool was mentally taxing (73%) – 5th • PU4 - I found the tool confusing to use (76%) – 1st • PU5 - I felt frustrated using the tool (76%) – 1st • PU6 - I could not do some of the things I needed to on the tool (74%) – 4th • PU7 - The tool experience was demanding (76%) – 1st Novelty (NO): • NO1 - The content of the tool incited my curiosity (52%) – 2nd • NO2 - I would have continued to use the tool out of curiosity (45%) – 3rd • NO3 - I felt interested in my BI tasks on the tool (61%) – 1st Group A (control group) had Perceived Usability (PU) and Endurability (EN) ranked 1 and 2 respectively.
  • 43. 42 | P a g e Table 5.12 Ranking of lowest mean score by UES Factor - Group A (control) The PU and EN statements are: Perceived Usability (PU): • PU1 - I felt discouraged using the tool (76%) – 3rd • PU2 - I felt annoyed using the tool (79%) – 2nd • PU3 - Using the tool was mentally taxing (67%) – 6th • PU4 - I found the tool confusing to use (76%) – 3rd • PU5 - I felt frustrated using the tool (82%) – 1st • PU6 - I could not do some of the things I needed to on the tool (64%) – 7th • PU7 - The tool experience was demanding (76%) – 3rd Endurability (EN): • EN1 - The tool experience did not work out the way I had thought (64%) – 1st • EN2 - I would recommend the tool to appropriate others (61%) – 2nd • EN3 - Using the tool was worthwhile (61%) – 2nd • EN4 - My tool experience was rewarding (52%) – 4th The results suggest that the participants in both groups did not find either of the tools a hindrance, demanding, or confusing in any significant way. They seemed to be able to accomplish what they were asked to without any great difficulty. The ‘Endurability’ (EN) aspect is associated with the users’ overall evaluation of the experience, its worthiness and recommendation value for others to use the tool. For the control group this factor was ranked 2nd highest. Interestingly, EN4 ranked lowest and suggests their experience could have been more rewarding, with EN1 indicating the overall experience could have been better.
  • 44. 43 | P a g e The Novelty factor, which is associated with the curiosity the tool evoked, interest levels and surprise elements, ranked higher for GamBIT users. This suggests that GamBIT users were more interested in the BI tasks they were asked to complete. Interestingly, over half the experimental group (52%) stated that the content of the tool incited their curiosity (NO1). Given that gamification aims to make tasks more fun, engaging and intrinsically motivating, the results demonstrate the developer’s attempt to add these elements to the gamified BI tool. Results for the statement ‘I felt interested in my BI tasks’ (NO3) scored 61% with the GamBIT group, compared to the control group who only rated this statement at 45%. The difference of 16 percentage points, a relative increase of around 35%, can be seen in table 5.13. This can be interpreted as a significant difference in the level of interest shown by the two groups. Table 5.13 NO3 ranking score Group A/B 5.1.5 User engagement lowest ranking factors The Focused Attention (FA) factor scored lowest for both groups. The FA factor is associated with the concentration of mental activity, including elements of flow, absorption and time dissociation in the tasks. The results highlight that participants appeared to be more concerned with their tasks than with the actual BI tools. This suggests that the gamification elements did not fully absorb the participants and that they seemed more focused on task completion. The Focused Attention (FA) statements and their scores (%age of respondents and rank) for Group A (BI tool only) and Group B (GamBIT) are shown below:
FA1 - When using the tool, I lost track of the world around me | A: 21% (7th) | B: 21% (7th)
FA2 - I blocked out things around me when using the tool | A: 30% (5th) | B: 30% (5th)
FA3 - My time on the tool just slipped away | A: 45% (3rd) | B: 45% (3rd)
FA4 - I was absorbed in my BI tasks | A: 54% (2nd) | B: 58% (1st)
FA5 - I was so involved in my BI tasks that I lost track of time | A: 58% (1st) | B: 45% (3rd)
FA6 - During this experience I let myself go | A: 33% (4th) | B: 33% (4th)
FA7 - I lost myself in the tool | A: 22% (6th) | B: 22% (6th)
  • 45. 44 | P a g e Table 5.14 Comparison of FA scores by Group A (control) / B (experimental) 5.1.6 Summary of UES data • The highest rated variables (statements on the UES survey) for the two groups were different, i.e. Group A did not find the BI tool frustrating (82% strongly agreed or agreed); Group B did not find the GamBIT tool confusing, frustrating or demanding (76%). • Novelty ranked high with the GamBIT group and showed a significant difference in results from the control group. • Perceived Usability was ranked highest by both groups. • The lowest ranked factor for both groups was Focused Attention. This suggests participants were not fully absorbed in the gamification elements, with task completion being of higher importance. 5.1.7 Time taken to complete tasks results and analysis This section shows the results from the data collected relating to the time taken to complete each task and includes the optional task 6 results. This section also questions whether the gamification of a BI tool places additional time constraints on participants. The results from Group A (control - BI tool) revealed that Task 2 (T2 - building a data source) had the quickest time at 2 minutes and 11 seconds. Two tasks took on average over 8 minutes, i.e. T4 (formatting the data) and T6 (creating a report title), with T4 taking the longest time to complete at 8 minutes and 36 seconds. The results from Group B (GamBIT) revealed that Task 2 (T2 - building a data source) also had the quickest time at 2 minutes and 30 seconds, the same task as Group A, only a little slower (19 seconds). Task 6 (T6), creating a report title, was an optional task. Given that it was introduced at the end of the experiment, participants by this point may have been somewhat disengaged. It is a good gauge to measure if the participants were still
  • 46. 45 | P a g e engaged in the tasks. For Group B, T6 took the longest time to complete at 8 minutes and 06 seconds, some 30 seconds quicker than the control group (A), which is a good result for the research. The overall mean times to complete all six tasks are detailed below: • Group A (BI only) - 32 minutes 31 seconds • Group B (GamBIT) - 30 minutes 25 seconds • Time difference - 2 minutes 6 seconds (in favour of GamBIT) To answer the question of whether the gamification of a BI tool places additional time constraints on participants, the evidence of the time differences shows that there are no significant time disadvantages or distractions. The results show the opposite appears to be true, as the times to complete tasks were quicker, which is a positive result with regard to the research. 5.1.8 Summary of time taken to complete tasks • The participants who used the GamBIT tool took less time to complete the six tasks. • The GamBIT group had more participants complete the additional task (n=16). • Task 4 took the longest to complete. • Using the GamBIT BI tool led to tasks being completed more quickly compared to the non-gamified tool. 5.2 Qualitative data This section will give an overview of each of the interviews conducted and report on the key findings under each of the main categories. The interviews were based on the experiences of each participant when carrying out the GamBIT experiment. They looked to glean more information over and above the quantitative data collected by the application of the UES. To explore key issues further, questions were based on: • Their experiences with BI tools in general. • Their thoughts on gamification, in particular the gamification of BI and BI tools.
  • 47. 46 | P a g e  Their experiences of the use of BI in the workplace with a focus on any issues, obstacles and concerns.  Their thoughts on user engagement with BI tools. Full transcripts of all four interviews can be seen in Appendix C. The following section will report key findings under each of these main categories.  Game Elements  GamBIT experiment  Concept  Enterprise Gamification  Gamification of BI tools  User engagement A snippet of the coding process is provided to give a clearer understanding of how the results of the coding were analysed and then interpreted. Table 5.2.1 Sample of coding classifications. Taken from a Microsoft Excel file. 5.2.1 Participant A Game elements The gamification elements added to the BI tool where an unwanted distraction taking them away from completing the tasks, stating that “I never really paid attention” and “ I never looked at the leaderboard, never read it to see what it said”. When discussing what is the most important features of BI tools their response was “functionality of the tools is most important”. All of which suggests the gamification elements where not as important as actually completing the given tasks. GamBIT The participant stated that coming into the experiment “I wasn’t looking to enjoy it.” The Eclipse platform lacked the visual elements (aesthetics) needed to keep them engaged with the task and experienced issues with the platform layout “I think it was not very user friendly everything was clumped together. I lost one of the elements when carrying out the task of sorting and it proved hard to find. I could not move the element back to where they should be”. This proved to be a major issue with the Eclipse platform. This suggests that the
  • 48. 47 | P a g e participant is a visual person who likes software platforms that have a familiar GUI and are easy to navigate. It would be safe to assume that the Eclipse platform was not as user friendly or aesthetically appealing compared to other BI platforms they had used. This contributed to a lack of engagement with the gamified BI tool. Concept The concept of the mountain climber “bagging a ben” was not something they were particularly interested in. The following quote highlights this: “Maybe if it was something different (concept), as bens and mountains I am not interested in. Maybe if it was focuses along with something that interested me a bit more maybe I would have focused but I just clicked through it”. This suggests that if the concept were more tailored to them, the overall experience could have been more engaging. Enterprise Gamification The participant stated a personal view on how enterprise gamification could benefit an organisation: “it would really depend on staff’s attitude to the software or tools”. Asked if this form of gamification could increase user engagement in a BI environment, they stated “I don’t think it is going to create engagement personally”. Gamification of BI tools When asked about their wider views on gamifying BI tools, the participant stated “I think BI tools are used by professional who know how to use them and realise how critical the information is. It would be good for learning (gamifying a BI tool)… like teaching people to use the BI tool. So for learning purposes yes, but on the whole may slow people down”. The participant explored the idea of gamification as a possible aid in learning to use BI tools, quoting “As a lot of these new tools can be frustrating and maybe having a pop-up or reward saying you have achieved may help out there. I see its place as a teaching aid for a new tool. But using the tool for a long period of time may get more people annoyed”. User engagement On the subject of user engagement with gamified BI tools, the response was “personally it is not something I would engage with I don’t think, it’s not something that if added to a (BI) tool, especially a tool I was not keen on using, would make me use it”. When describing their feelings during the experiment they said “I don’t think I was overly engaged or lost track of myself in it” and, commenting on the concept, “using mountains just didn’t engage me”. It is clear that the participant actively “disengaged” with the gamified BI tool; therefore the tool had no positive effect on user engagement. The following points stood out when writing up the memos: • To engage users, visualisation through the use of colours was important. • The gamification concept has to resonate with each individual user and provide a variety of game-based activities that appeals to them.