How Knowledge Management and Big Data Multiply the Impact of CI
1. The Intelligence Collaborative
http://IntelCollab.com #IntelCollab
Powered by
How Knowledge Management
and Big Data Multiply the
Impact of CI
A Complimentary Webinar from Aurora WDC
12:00 Noon Eastern /// Wednesday 14 December 2016
~ featuring ~
Scott Leeb Derek Johnson
2. The Intelligence Collaborative
Scott Leeb
Scott Leeb is the Engagement Director for Iknow, LLC. Over the past 15 years, he has created,
managed and grown the global business intelligence programs at four Fortune 500 companies (Prudential
Retirement, The McGraw-Hill Companies, KPMG and Ingram Micro) and a leading international
philanthropy (The Rockefeller Foundation). In 2011 he served as President of SCIP. He is currently on the
faculty of the University of Johannesburg.
Scott has spoken in Europe, Asia, Africa, Australia, North and South America on a wide range of topics
including competitive intelligence, business intelligence, market intelligence, strategy and knowledge
management. He began his career as a senior intelligence analyst for the US Army, specializing in East
Asian political-military affairs. Scott holds MAs from The Australian National University and Columbia
University, a BA from Yale University and a language certificate from Beijing University.
Email: sleeb@iknow.us
The Intelligence Collaborative is the online learning and networking community powered by Aurora WDC, our clients, partners and
other friends and dedicated to exploring how to apply intelligence methods to solve real-world business problems.
Apply for a free 30-day trial membership at http://IntelCollab.com or learn more about Aurora WDC
at http://AuroraWDC.com.
3. The Intelligence Collaborative
• Use the Questions pane on your GoToWebinar control panel and all questions will be answered in the second half of the hour.
• You are welcome to tweet any comments on Twitter, where we are monitoring the hashtag #IntelCollab, or eavesdrop via http://tweetchat.com/room/IntelCollab
• Slides will be available after the webinar for embedding and sharing via http://slideshare.net/IntelCollab
• To view the recording and download the PPT file, please register for a trial membership at http://IntelCollab.com
Questions, Commentary & Content
25. The Intelligence Collaborative
Thank you!
Now how about a little Q&A?
Email: sleeb@iknow.us
Web: www.iknow.us
See you next time!
Editor's Notes
Good afternoon (and good morning and good evening to those of you from around the world) and thank you to Aurora WDC for the opportunity to be here today to talk about this important topic – How Big Data and Knowledge Management Multiply the Impact of Competitive Intelligence. I am grateful to take part in the Intelligence Collaborative – I have seen the excellent roster and am honored to be considered as a presenter for this forum. My plan is to chat for 40 minutes or so and then open it to Q&A.
Let me begin by simply saying we are in a golden age of competitive intelligence. I have been involved in CI for two decades, and the prospects for success for CI as a respected function in organizations have never been better. This is due to the rise of Big Data, or rather the importance of it, in the past five years. In my view, the equation is simple: Big Data = big opportunity. To explain this requires an understanding of what big data is. Get 20 people in a room and you are likely to have 30 definitions of big data.
#1 – big, focus on behavior
#2 – big, types of data
#3 - traditional and non-traditional
#4 – big, a concept
All talk about big data as a step towards something
Let the sheer size of big data serve as its definition
7.5 billion people on the planet = 12.75 billion megabytes of new information every second
Volume – new terms: GB, TB, PB, Exabyte (18 zeros); zetta (21); yotta (24); Xenottabyte (27); Shilentnobyte (30); Domegemegrottebyte (33)
Velocity – traders – milliseconds save millions
Variety – Save a Word doc and you have 17 format options in Word 2016
Veracity – needle in haystack vs deciding which needle; a lot of white noise out there; sources
Value – Make the business case; big data has no intrinsic value
Variability – Variability is often confused with variety. Say you have a bakery that sells 10 different breads. That is variety. Now imagine you go to that bakery three days in a row and every day you buy the same type of bread, but each day it tastes and smells different. That is variability.
Variability is thus very relevant in performing sentiment analysis. Variability means that the meaning is changing (rapidly). In (almost) the same tweets, a word can have a totally different meaning. In order to perform a proper sentiment analysis, algorithms need to be able to understand the context and decipher the exact meaning of a word in that context. This is still very difficult.
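To make the context problem concrete, here is a minimal Python sketch, with an entirely invented lexicon and context rule, of how the same word can flip polarity depending on its neighbors – the core difficulty real sentiment algorithms face at much larger scale.

```python
# Toy illustration of variability in sentiment analysis: the word "sick"
# is negative literally but positive in slang. The lexicon and the
# context rule here are illustrative assumptions, not a real model.

BASE_LEXICON = {"sick": -1, "great": +1, "terrible": -1}
# Intensifiers that signal the slang (positive) usage of "sick".
POSITIVE_CONTEXT = {"so", "totally", "that's"}

def score(tweet: str) -> int:
    words = tweet.lower().replace("!", "").split()
    total = 0
    for i, w in enumerate(words):
        s = BASE_LEXICON.get(w, 0)
        # Flip "sick" to positive when preceded by a slang intensifier.
        if w == "sick" and i > 0 and words[i - 1] in POSITIVE_CONTEXT:
            s = +1
        total += s
    return total

print(score("that movie was so sick"))       # slang usage: positive (+1)
print(score("i feel sick after that meal"))  # literal usage: negative (-1)
```

A one-word context window is obviously crude; production systems use far richer context, but the shape of the problem is the same.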
Visualization – This is the hard part of big data: making that vast amount of data comprehensible in a manner that is easy to understand and read. With the right analyses and visualizations, raw data can be put to use; otherwise, raw data remains essentially useless. Visualizations of course do not mean ordinary graphs or pie charts. They mean complex graphs that can include many variables of data while still remaining understandable and readable.
Visualizing might not be the most technologically difficult part, but it is certainly the most challenging. Telling a complex story in a graph is very difficult but also extremely crucial. Luckily, more and more big data startups are appearing that focus on this aspect, and in the end, visualizations will make the difference. In the future, this is the direction to go: visualizations that help organizations answer questions they did not know to ask.
The series House of Cards represented Netflix's first foray into original programming. In 2011, the program, a remake of a BBC miniseries, was up for purchase, with director David Fincher attached and Kevin Spacey starring. Netflix spent $100 million for two seasons of episodes (13 episodes per season) – a bold bet considering the company had little experience in original programming, and a figure that amounted to over 25% of its operating income for the year.
Netflix made this decision solely based on looking at their massive stash of data. Subscribers who watched the original series, they found, were also likely to watch movies directed by David Fincher and enjoy ones that starred Kevin Spacey. Considering the material and the players involved, the company was sure that an audience was out there.
Netflix collects a lot of data. Currently, it has many millions of customers worldwide (86 million in 190 countries) and spreads a very wide net to collect data on them. In particular, they capture something they call “user actions.” These include the times of day people watch certain movies (for example, people do not watch horror movies before breakfast). It also logs when you start and stop viewing, what you rewind to watch again, whether you watch on a TV or iPad and so on. It even looks at pirate movie sites to determine what’s trending. [Netflix] already knew that a healthy share had streamed the work of David Fincher, the director of The Social Network, from beginning to end. And films featuring Kevin Spacey had always done well, as had the British version of “House of Cards.”
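The co-viewing analysis described above can be sketched as a conditional probability over a log of "user actions." The user IDs, titles, and events below are invented; this shows only the shape of the computation, not Netflix's actual system.

```python
from collections import defaultdict

# Hypothetical viewing log of (user_id, title) events.
events = [
    (1, "The Social Network"), (1, "UK House of Cards"),
    (2, "The Social Network"), (2, "UK House of Cards"),
    (3, "The Social Network"),
    (4, "Some Rom-Com"),
]

# Invert the log: for each title, the set of users who watched it.
watched = defaultdict(set)
for user, title in events:
    watched[title].add(user)

def p_given(a: str, b: str) -> float:
    """P(user watched a | user watched b)."""
    b_users = watched[b]
    return len(watched[a] & b_users) / len(b_users)

# "Subscribers who streamed Fincher's film were also likely to watch
# the British original" becomes a number you can act on:
print(p_given("UK House of Cards", "The Social Network"))  # 2/3 ≈ 0.67
```

At Netflix scale the same question is asked across tens of millions of users and every pair of titles, but it is still this inverted-index-plus-intersection pattern underneath.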
Normally, the process for vetting a show would include the creation of a pilot, and then focus groups would be convened. Depending on the findings, additional pilots and focus groups might be needed. The pilot would be tweaked depending on audience comments, and perhaps another pilot done and consumer feedback analyzed. Some of the more progressive content companies would post episodes online and invite viewers to respond. This cut the costs of focus groups, but only marginally served to streamline the process. Netflix, on the other hand, entirely bypassed this lengthy and expensive process.
And the result?
On February 1, 2013, the TV series debuted. It proved to be an immediate hit and quickly became the most watched show in the Netflix library. In 2015, Kevin Spacey won a Golden Globe award for Best Actor in a TV Series (Drama). Furthermore, it was estimated that for the show to break even, an additional 600,000 customers would need to be added each year. The first year, Netflix added over 3 million new customers in the U.S. alone.
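As a rough sanity check of that break-even estimate, assuming (for illustration only) the roughly $7.99/month U.S. streaming price of the time:

```python
# Back-of-envelope break-even arithmetic for the $100M House of Cards bet.
# The $7.99/month price is an assumption for illustration.
price_per_month = 7.99
content_cost = 100_000_000
subscribers_needed = 600_000      # additional customers per year, per the estimate

annual_revenue = subscribers_needed * price_per_month * 12
print(f"${annual_revenue:,.0f} per year")
```

600,000 extra subscribers generate roughly $57.5 million per year, or about $115 million over the two seasons – in the neighborhood of the $100 million outlay, which is consistent with the break-even estimate.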
This illustrates how big data is re-writing the corporate playbook. (2:45)
“Netflix and House of Cards and Big Data”
Organizations have always used conventional data to develop high-level metrics and business intelligence. Smart organizations have long relied on data to help make strategic business decisions. But the power and allure of Big Data is that it allows us to make decisions that we had not previously been able to make or decisions that we were making with incomplete or inaccurate data. As in the case with Netflix, we are stacking the odds in our favor of making a smart decision. Questions that had previously not been raised can now be asked and answered as hidden patterns are uncovered and unknown correlations observed.
Peter Drucker was often quoted as saying that "you can't manage what you can't measure." Drucker's point is that you can't know whether or not you are successful unless success is defined and tracked. The allure of big data is that it provides organizations with an unprecedented capability to measure.
As the slide shows, there are three key benefits of Big Data. Let’s quickly examine each:
Big Data allows us to make new decisions. For retailers, for example, it is about targeting customers for goods that they need before they even realize they need it. Famously, Target tried to answer the question of determining if its customers were pregnant so that they could begin targeting them with ads for such things as diapers, formula, pacifiers, etc. Ultimately, news of this leaked to the public and there was an uproar about retailers peering too deeply into the private lives of their customers, but the fact remains that large retailers such as Home Depot, Best Buy and Safeway are all making big bets about customer preferences that they did not previously make due to a lack of information.
Big Data creates an environment for better decision making– Let’s take the case of Amazon, which seems to have an almost magical power to offer you additional products based on its knowledge of previous sales and customer reviews of the products you’re looking at and/or purchasing. For Amazon the value of big data in this case is obvious: better recommendations equals more product sales. Amazon’s value proposition to its customers (i.e. us) in this case is that we will always find exactly what we need as well as things we “forgot” we needed, but want to buy now.
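A minimal sketch of the "customers who bought X also bought Y" idea, using simple pairwise co-occurrence counts over invented baskets. This is the standard starting point for this style of recommendation, not Amazon's actual algorithm.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase baskets (one set of items per order).
baskets = [
    {"hdmi cable", "tv mount", "soundbar"},
    {"hdmi cable", "soundbar"},
    {"hdmi cable", "tv mount"},
    {"hdmi cable", "soundbar"},
    {"garden hose"},
]

# Count, for every ordered pair of items, how often they were bought together.
co_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item: str, k: int = 2):
    """Top-k items most often co-purchased with `item`."""
    scored = [(pair[1], n) for pair, n in co_counts.items() if pair[0] == item]
    return [other for other, _ in sorted(scored, key=lambda t: -t[1])[:k]]

print(recommend("hdmi cable"))  # ['soundbar', 'tv mount']
```

Real systems normalize these counts (e.g. by item popularity) and blend in ratings and reviews, but raw co-occurrence already captures the "also bought" intuition.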
As for new product and service offerings, let’s use the case of Verizon, which collects a wealth of data on and about its wireless customers. Verizon Wireless is pursuing new offerings based on its extensive mobile device data. In a business unit called Precision Market Insights, Verizon is selling information about how often mobile phone users are in certain locations, their activities and backgrounds. Customers thus far have included malls, stadium owners and billboard firms. For the Phoenix Suns, an NBA basketball team, Verizon’s Precision Market Insights offered information on where people attending the team’s games live, what percentage of game attendees are from out of town, and how often game attendees combine a basketball game with a baseball spring training game or a visit to a fast food chain. Such insights are obviously valuable to the Suns in targeting advertising and promotions. (3:10)
The dirty little secret of Big Data is that it has no intrinsic value. So now let’s turn to the core of this presentation, which is talking about how knowledge management is a strategic enabler that helps unlock the value of Big Data, turn big data into big intelligence and ultimately result in better organizational decision making. There are two areas in which knowledge management plays a prominent role.
First, as we talked about earlier, Big Data is big... and getting bigger. The more data there is, the greater the requirement to integrate and organize it. More than half of big data projects (55%) never get completed, according to a widely quoted survey by Infochimps. This is in large part due to the data being inaccessible or in a format that is unusable.
Second, knowledge management helps promote data democracy. This is the idea that data is the great equalizer. Data is power, and organizations need to understand that wherever data lies (or is analyzed), regardless of where that sits in the organization chart, becomes a critical decision-making node. This flies in the face of traditional hierarchical organizations, but organizations need to evolve to grow, and knowledge management can play an important role in facilitating this transition. (0:55)
Ok, so for the past three slides I have been talking about integrating data. But once it has been integrated it needs to be organized in a manner where it can easily be searched and retrieved. The explosion of available data and information creates massive findability, navigation, interpretation, and content management problems. The vast amounts of data mean that simply relying on human tagging is not sufficient. It will need to be supplemented with automatic tagging.
Taxonomy helps with interoperability and synchronization across the many data forms. As we discussed earlier, data can take many forms – video (.mov), audio (.wav), pictures (.jpeg, .png). Even a simple Word document can take different forms – as I was preparing some notes for this presentation, I went to hit save and saw that there were 17 different formats in which I could save my file. There needs to be a way to standardize the data and create such things as a commonality of definitions. For example, you need a system that recognizes the term Microsoft but also that MSFT (its ticker symbol) refers to Microsoft. Or the ability to discern between Paris Hilton the hotel and Paris Hilton the socialite. Or the ability to recognize that green onion and scallion are in fact two different terms for the same food, so that if someone searches on green onions, search hits with the term scallions also appear.
A good taxonomy tool will enrich content with metadata for process automation, extract facts, entities and relationships for better analytics and harmonize all information sources for key business insights.
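The commonality-of-definitions idea can be sketched with a synonym ring: map variant terms to one canonical label so a search on any variant matches documents tagged with any other. The rings, documents, and search function below are illustrative assumptions, not any particular taxonomy product.

```python
# Minimal synonym-ring sketch of taxonomy-driven search expansion,
# using the MSFT/Microsoft and scallion/green onion examples above.

SYNONYM_RINGS = [
    {"microsoft", "msft"},
    {"green onion", "scallion"},
]

# Build a lookup from every variant to a deterministic canonical form.
CANONICAL = {}
for ring in SYNONYM_RINGS:
    label = sorted(ring)[0]
    for term in ring:
        CANONICAL[term] = label

def normalize(term: str) -> str:
    return CANONICAL.get(term.lower(), term.lower())

# Toy document index: document id -> tagged terms.
docs = {"doc1": ["msft", "earnings"], "doc2": ["green onion", "recipe"]}

def search(query: str):
    q = normalize(query)
    return [d for d, terms in docs.items()
            if q in (normalize(t) for t in terms)]

print(search("Microsoft"))  # ['doc1'] — matches the MSFT-tagged document
print(search("scallion"))   # ['doc2'] — matches the green-onion document
```

Note this handles synonyms only; the Paris Hilton case (one term, two meanings) is the harder disambiguation problem and needs context, not just a lookup table.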
But this is not enough. Now at least the data is organized, but now it needs to be converted into something that will ultimately help executives make better decisions. It needs to be converted into intelligence. For this we need CI. (1:15)
Some of you might recognize this as the famous Japanese painting entitled “The Great Wave off Kanagawa,” by the painter Katsushika Hokusai. It depicts an enormous wave threatening boats off the coast of the prefecture of Kanagawa. Mount Fuji is in the background. Imagine that we, as CI professionals, are in the boat. How are we going to ride this wave and come out not only unscathed, but stronger?
There are, I believe, 5 ways we can leverage Big Data and KM to help improve the visibility of the CI function.
One reason projects fail is people get so frustrated with all the data
Need to put in place a data collection strategy
There is data, information and intelligence. Need to focus on the end game: valuable insights that help drive decision making. There is a saying: GIGO – garbage in, garbage out.
Proselytizer for data discipline.
According to Forrester, firms use only five percent of the data available to them, while created data is growing at 40 percent to 50 percent annually and only 25 percent to 30 percent of that total is being captured. This places a premium on constantly monitoring the technological landscape for tools that address the many dimensions of big data. Improved predictive analytics, real-time computational capabilities and modelling techniques all dictate that technology upgrades be a significant part of any big data strategy. The movement towards open source software has kept the cost of change comparatively low. Organizations need to develop an adaptive data strategy to address emerging technologies.
Exascale computing – one billion billion calculations per second – scheduled for 2023 (originally 2018)
The rise of big data has led to calls for changes in traditional modelling techniques
The new breed of analytics specialists need to have a combination of skills including statistical techniques, applied mathematical methods, advanced machine learning algorithms, data visualization, and business and communications skills. Many of the key techniques for using big data are rarely taught in traditional university courses. Perhaps even more important are skills in cleaning and organizing large data sets; the new kinds of data rarely come in structured formats. Not surprisingly, people with these skills are hard to find and in great demand. Human Resource departments need to develop new strategies and approaches to acquiring, retaining and developing talent.
1. Data Scientist
2. Data Engineer
3. Big Data Engineer
4. Machine Learning Scientist
5. Business Analytics Specialist
6. Data Visualization Developer
7. Business Intelligence (BI) Engineer
8. BI Solutions Architect
9. BI Specialist
10. Analytics Manager
11. Machine Learning Engineer
12. Statistician
IBM, for example, recently announced that it has committed $100 million to educating and training data scientists in China to head off an anticipated workforce shortage.
Emerging skill sets
Metrics and measurement – much more quantifiable
Software savvy – project management tools, resource management tools
Sensitivity analysis – competing priorities
ETL - extract, transform and load data into the repository
OLAP – Online analytical processing – data cube structure that allows for fast processing
Multi source – leads to specialization
Coding/Programming – sorting through vast amounts of data – writing algorithms
Quality control – vetting
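The ETL bullet above can be sketched end to end in a few lines: extract rows from a source (here a made-up CSV string), transform them by cleaning types and normalizing names, and load them keyed into an in-memory "repository." File contents and field names are invented for illustration.

```python
import csv
import io

# Raw source data: note the messy whitespace and inconsistent casing
# that the transform step must clean up.
RAW = """company,revenue_musd
MSFT,198
 Microsoft ,198
Acme,12
"""

def extract(text: str):
    """Extract: parse the raw CSV into row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: strip whitespace, normalize case, coerce types."""
    return [{"company": r["company"].strip().lower(),
             "revenue_musd": float(r["revenue_musd"])}
            for r in rows]

def load(rows, repository):
    """Load: upsert rows into the repository keyed by company name."""
    for r in rows:
        repository[r["company"]] = r   # last write wins
    return repository

repo = load(transform(extract(RAW)), {})
print(sorted(repo))  # ['acme', 'microsoft', 'msft']
```

Even this toy version shows why ETL and taxonomy belong together: without a synonym step, "msft" and "microsoft" still land as separate records in the repository.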
One size does not fit all
One characteristic of data is that it is omnidirectional.
In my view, most institutions are by definition siloed.
Work with complete data sets and not duplicate effort.
Google 20% time – Gmail, AdSense. Codified in its 2004 IPO letter.
US Army scenario planning always had a science fiction writer.
Data has always been used to develop high-level metrics and business intelligence. Smart organizations have long relied on data to help make strategic business decisions. But the power and allure of Big Data is how it enables organizations to leverage unconventional data points (such as unstructured text): the information that was previously ignored because there was no reasonable way to process it. Questions which had previously not been raised can now be asked and answered as hidden patterns are uncovered and unknown correlations observed. The context for decision making is transformed from linear extrapolation (“What does this mean?”) to dynamic supposition (“What could this mean?”). Organizations need to recognize this subtle, yet significant, shift, embrace it, and promote it through the management ranks.
There are, I believe, 5 ways we can leverage Big Data and KM to help improve the visibility of the CI function.
As technology and systems improve to better address and analyze increasing amounts of data, it is important to remember that machines can only do part of the real work. Machines can do a great deal of the work humans used to do or could never do economically, but they cannot replace human knowledge workers, as it's people who translate data and insights from analytics into business outcomes. It is the people who determine what questions to ask, where to find data that shines a light on that question, and only then conduct analysis to learn about the story the numbers are telling. In short, organizations need to remember that solving the big data equations is as much about people as it is about technology.
Big Data brings with it big promise. There are many challenges, but the rewards are clear. As has been often noted, “You can’t manage what you don’t measure.” The allure of big data is that it provides organizations with an unprecedented capability to measure.
Experimental environment – fail fast. When we err, we learn from our mistakes.