The traditional measures of organizational performance – the balance sheet, income statement, sales pipeline, operations – have been around for decades, even centuries. In fact, the balance sheet and income statement, the two most fundamental measures of business performance, were standardized in their modern form in the 1930s, well before computers were invented. Today we live in a completely new, digital world, yet the fundamental measures of performance and the ‘information tools’ we use to improve our organizations have barely changed. Leading companies are discovering that Big Data can have big payoffs for their business. In essence, capturing and analyzing massive amounts of data allows organizations to track new ‘signals’: new measures of performance, new classifications and understanding of their customers, new insights into operations, and new, measurable ways of doing business they have never had before. The question you need to ask yourself is: ‘What new signals could help me dramatically improve my business?’
Going back to the income statement and balance sheet – they are at best rear-view measures of the top line and bottom line. They provide a snapshot in time of all that has happened, but very little, if any, indication of what is happening in the enterprise right now. For example, an online retailer with a subscription model experiences a massive drop in stock price because of poor income statement results. Further analysis shows a dramatic drop in subscribers (churn) in one of its most profitable segments. It stands to reason that a pre-emptive pulse-check on churn could have helped stem the bleeding and perhaps prevented the reaction on Wall Street. Churn is an example of a new signal that could have helped this enterprise run its business proactively rather than look for explanations with a rear-view perspective.
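To make the churn signal concrete, here is a minimal sketch of the kind of pulse-check described above. The data layout and threshold are hypothetical, purely for illustration – not any vendor's actual API:

```python
# Illustrative churn-signal sketch. Each row is a hypothetical record:
# (customer_id, segment, active_last_month, active_this_month).
from collections import defaultdict

def churn_by_segment(rows):
    """Churn rate per segment: share of last month's subscribers
    who did not renew this month."""
    lost = defaultdict(int)
    base = defaultdict(int)
    for _, segment, was_active, is_active in rows:
        if was_active:
            base[segment] += 1
            if not is_active:
                lost[segment] += 1
    return {s: lost[s] / base[s] for s in base}

def alert_spikes(rates, threshold=0.05):
    """Flag segments whose churn exceeds a tolerance threshold –
    the 'pre-emptive pulse-check' before results hit the income statement."""
    return [s for s, r in rates.items() if r > threshold]
```

Run monthly, a check like this surfaces a profitable segment bleeding subscribers weeks before the quarterly numbers do.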
To look forward, companies need a real-time data platform that can ingest the ever-growing data elements that influence the business and compare them with traditional measures from the balance sheet and income statement, giving insight now rather than months down the road. This real-time data platform must be able to collect, move, store and process massive amounts of data from any source and discover new trends, patterns and behaviors in real time – all critical to gaining competitive advantage.
We need to consider the new analysis that will be possible and put our creativity to work. There are many existing and new use cases for this information. Each can improve our business. Each can optimize our performance. Each can identify opportunities we had not seen before. But if we create a unique application for every one, where will we end up? Will we end up with clarity or confusion? Will we be agile, or trapped arguing about which of a multitude of applications is correct? The organizations that learn not just to employ this information, not just to find these new signals, but to make sense of them and to tame the growth – these are the organizations that will win in the future. These are the organizations that will lead the market in every arena.
Facebook datifies our friends, LinkedIn our professional contacts, Spotify our listening, and Kindle our book reading – the morning after I finished the latest Ken Follett novel, I got a special offer for the next in the series.
McLaren has a project to use HANA both for its operations and to get real-time information from the car’s 120 sensors. Here is a vision dashboard. For more information, see this video: http://blogs.sap.com/jonathanbecher/2012/11/15/jonathan-becher-cmo-at-sap-talks-about-the-challenge-of-big-data-in-formula-1/
Faster – faster queries (customers have seen speed increases of over 1000%) and no latency (all transaction data instantly available for analysis). It’s fast in the way digital cameras are faster than old film cameras. Film cameras take pictures very quickly – but if you want to see the results, you face days of expensive, specialized developing. And when you get the pictures back, they’re often not what you want, so you have to take more pictures than you need, in the hope that one might come out. With a digital camera, we get the result instantly, and can take the picture again in real time until we get what we want.
Today’s #analytics examines the broken glass. Wouldn’t it be better to catch the vase? Smarter – make smarter decisions, earlier in the business process. Today, all of our analytic systems are focused on analyzing the broken glass lying on the floor after the vase has toppled, and trying to figure out how to stop it toppling in the future. But imagine if your analysis were so fast you could catch it BEFORE it hit the ground? That would be a dramatic change to the “process” – and it’s the equivalent of what customers are doing with in-memory technology today: spotting and fixing problems before they negatively affect the customer experience.
In-memory computing means "fewer moving parts" and much lower TCO. Simpler – no data duplication, one version of the truth. Changes to metadata, not data, mean much lower cost of operations. Here is some of the data from the IDC SAP HANA market assessment survey, available online.
In-memory is breaking down long-standing barriers to effective use of information. SAP HANA comes with embedded algorithms for business calculations like budgeting, text analytics to turn survey or social data into sentiment, and powerful predictive algorithms. There is no longer a need to move data around and invest in multiple expensive systems for different types of data analysis.
We've spent the last 40 years thinking of analytics and operations, OLTP and OLAP, as separate. That's changing, and it's a big deal. What are the big changes to the information infrastructures supporting this move from backwards-looking transactions to “signal”? This is the biggest change in analytics in forty years – breaking down the barrier between operations and analytics, between OLTP and OLAP. Yes, we’ve had operational and embedded analytics, but always with trade-offs. Now we can use the same data for both, without compromise – a big jump towards the single source of truth we’ve been looking for. This is the architecture of the future. All your future applications are being engineered to this standard, and it’s already the architecture of choice for the latest wave of cloud providers. The analysts agree – Gartner’s Donald Feinberg: “By 2016, in-memory column-store DBMS using storage-class memory will replace 25% of traditional DW and OLTP systems.” We can provide it to you today, on-premise AND in the cloud. What are the benefits? Faster, smarter, simpler systems.
A Real-Time Data Platform is the first requirement. Until today, the technologies that have typically supported transactional storage and data warehousing have been distinct and separate, each optimized for its particular niche and neatly divided between the different teams responsible for these different applications. Looking into the future, we now see a new possibility: a real-time platform that can serve both. Donald Feinberg from Gartner recently outlined how he believes that in only a few years more than a quarter of organizations will have begun replacing these traditional methods with an in-memory technology that can provide a different solution. But this is not all that is involved in the Real-Time Data Platform. The RTDP must be able to handle both the analytical and transactional load of the organization, true. But to deliver on the “Network of Truth”, how it does that is key. A true RTDP allows any analytical view to be generated without incurring additional storage. Using in-memory technology allows analytical content to be endlessly replicated, modified and reimagined without incurring the traditional storage and processing costs associated with classic data warehousing and analytical storage. It does this by allowing any view to be dynamically generated from the original records: without requiring aggregations to be pre-planned and pre-calculated, without requiring the data to move into a separate local store, and without requiring complex indexing operations. This ability to model every variant of truth, without incurring prohibitive cost, is the first requirement for the “Network of Truth”.
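The idea of generating any analytical view on demand from the base records, rather than maintaining pre-calculated aggregates, can be sketched in a few lines. This is a toy illustration with invented sample data, not HANA's actual implementation:

```python
# Toy illustration: every "view" is computed fresh from the raw
# transaction records, so no pre-built cube or summary table is needed.
from collections import defaultdict

transactions = [
    # (region, product, amount) – hypothetical sales records
    ("EMEA", "widgets", 120.0),
    ("EMEA", "gadgets", 80.0),
    ("APJ",  "widgets", 200.0),
]

def view(records, *dimensions):
    """Aggregate total amount by any combination of dimensions,
    derived on the fly from the original records each time."""
    idx = {"region": 0, "product": 1}
    totals = defaultdict(float)
    for rec in records:
        key = tuple(rec[idx[d]] for d in dimensions)
        totals[key] += rec[2]
    return dict(totals)

# Two different "variants of truth" over the same single copy of the data:
by_region = view(transactions, "region")
by_product = view(transactions, "product")
```

The point of the sketch: adding a new analytical view costs nothing in storage – it is just a different grouping of the same records, which is what the column-store, in-memory approach makes fast enough to do at scale.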
Fixing data quality is important :-) Can you spot the problem with these profiles on LinkedIn?
The US submitted a request to the Hong Kong government for the extradition of Edward James Snowden. It was rejected on the grounds that he is actually Edward Joseph Snowden, giving him time to flee to Russia. Savor the irony: the NSA is capable of collecting data on millions of people, yet the State Department couldn’t get a single, high-profile target’s name right. Data quality thwarted one of the government’s most cherished goals.
You should invest in data quality "performance management" tools for data stewards – like SAP Information Steward. We have analytic applications to help you with information governance…
What are the new user expectations?
Data Discovery is vital – but it shouldn’t be yet another silo. People need:
Agility = Visual Intelligence etc.
Power = Predictive
Maturity = Agility and power without being another silo; can take from desktop to department to country to global
And of course, on-premise, on-demand, across all data, integrated with collaboration… And not just for employees – increasingly, it’s about your customers…
SAP Lumira is a powerful tool for business analysts, integrated with corporate systems
New Visualization extensions for SAP Lumira
Mobile #BI is now a given. People want tight links to existing infrastructure, and full applications
A selection of different visualizations possible with the new SAP BusinessObjects Mobile BI application, available for iPad and Android.
Dulux won this year’s BI Excellence award, in part because of intuitive mobile reporting for sales staff, from SAP.
In the past, electricity grids were “one way only” and changed only when somebody built a new power plant. With the introduction of wind and wave power, and the increasing desire to let people push electricity back to the grid, electric companies have to rethink the way they run the network. Something similar is happening in BI. It’s no longer enough to just “push data out” from a few central hubs. Now every user and department needs to be able to “create their own electricity” by grabbing and combining data on the fly, and making it available to others in the organization. We call this moving from a “single source of the truth” to a “network of truth”, and it’s a great opportunity to rethink the BI strategy of the organization.
So we need something new. We do not want to forgo the benefits of a “Single Version of TRUTH”. But we want to connect with changing user expectations, and we want to sustainably deliver all the value from the ever-growing BIG DATA we can reach. We need to build something new: a “NETWORK of TRUTH”. Something that engages every user to participate in the definition of truth, that allows every user to connect new information into the network, and that lets everyone share each other’s insights, knowledge and efforts.
So what is a “Network of TRUTH”, and why do we care? Let’s take a closely related analogy. When people first tried to write down facts about the world, they started by writing individual books on one topic at a time. We still have these references. But as those references expanded, querying and retrieving information slowed down. To get a holistic view, we created the “Encyclopedia”, which collected knowledge into a single place. Amazing. But think about the encyclopedia you had as a kid. Sure, it had a lot of information, but how often did it get updated? What was the process? Revisions took years and were heavily governed. Even updates to dictionaries were slow and governed. The amount of information was still limited by the process. The next evolution took the encyclopedia digital (Encarta), and even with the internet, the governance process and manufactured nature meant the scale was limited and updates were still slow. Today none of us use Encarta or a paper encyclopedia. We all use Wikipedia. It is massively bigger than any prior encyclopedia, it is more up to date, and it changes constantly. This is where we need to evolve Business Intelligence. Not back to individual books, but forward. We need to take the value of an encyclopedia – a single collection of knowledge – and create something much bigger for our organizations. Wikipedia required multiple things to be true before it became a practical option: two technologies and a cultural change. The internet had to exist and flourish, connecting everyone to a shared space. Technology had to support constantly changing documents (with all the wiki machinery for editing and logging). And people had to be comfortable using technology for their research and willing to engage with the site.
And SAP believes we have reached that tipping point – the point, as with Wikipedia, where this is now possible. And just as with Wikipedia, we think two technologies and a cultural change are required. First, we need a platform fast enough and agile enough to allow specialization on vast amounts of information. We call this the “Real-Time Data Platform”. SAP has had a 40-year mission to bring the world to “real time”, and we will likely spend another 40 years continuing to prove how valuable that is. Second, we need the experience of interacting with the platform to be optimized for everyone – one that allows them to truly engage with the system and is designed to capture their contributions. Third, we need organizations to have a culture that puts the right value on analytical information. Put these together and organizations will create “Networks of Truth” – and the organizations that do will be the winners.
Finance, in particular, stands to benefit from the new advances in business intelligence and analytics:
Run every aspect of your financial operations better, with in-memory, mobile-first interfaces, and collaboration. Examples of iPad applications for financial staff (some of these are lab previews).
Question: What makes VELUX a connected enterprise? Answer: Over the past years we have gone through a transformation towards becoming an effective global enterprise. We have outsourced and professionalized IT, outsourced most of our finance transactions, and globalized several other functions. This development forces us to share information and knowledge across departments, borders and cultures at an unprecedented level. Business analytics has moved to the very top of our agenda and is a very important part of the transformation process. Question: How have you worked with a more adaptive response to changing market conditions? Answer: The turbulent economic climate increases our demand for fast and flexible analytics across our business. This applies to our everyday reporting, but especially to our current budgeting and planning process. We are moving away from a traditional budgeting approach towards a rolling forecasting model where planning is more frequent but less detailed. HANA and SAP BPC are the enablers that will allow us to change and adapt our plans and targets as the surrounding market conditions change. This adaptive approach would never have been possible a few years ago, simply because of the lack of technology. Now we find that HANA and the offerings in the BusinessObjects suite help us stay ahead of our competition. Question: How are you applying predictive analytics to your business? Answer: VELUX provides an outstanding warranty on our products. This in turn means that predicting flaws and potential warranty cases becomes important for us, not just in relation to the bottom line but also to provide the best service to our customers if something breaks. For years we have been gathering detailed service registration information without any real tools to spot or predict patterns, apart from Excel.
We are now working on applying HANA and predictive analysis on top of all of that data to spot trends much sooner, to the benefit of our customers and our bottom line. We see several other use cases for predictive analysis in future, in relation to quality optimization, customer profitability and segmentation analysis, and better unit forecasting. Our expectation is that analytics will stay at the top of our agenda for many years to come. But it's not just about the technology; we also realise that we have a big task in lifting our own internal analytical skills. If we succeed in this, we will be able to benefit even further from the technologies available to us.
HSE24 could only do cross-sell by product, not customer. Now changed with in-memory, real-time predictive
L’Oreal wants to be a “trusted advisor” for its customers, and is looking into opportunities such as augmented reality.
Coinstar uses SAP predictive algorithms to optimize the profitability of its DVD rental machines.
Boston uses SAP Analytics to give citizens up-to-the-minute information about the city. See here for more information: http://www.cityofboston.gov/bar/home.asp
Keeping citizens informed
Keeping the city aligned
Adapting to real-time feedback
The NBA uses in-memory to let fans access every statistic since 1964, in real-time: nba.com/stats
Bigpoint gaming’s Battlestar Galactica Online – real-time optimization of the gaming experience using HANA. http://www.sap.com/asset/index.epx?id=6c0a1c3d-f80e-4634-abdf-982d7eba4e13
Examples of companies that have used SAP HANA for innovative new applications, without any existing links to SAP. Visit http://www.saphana.com/community/learn/startups for more information.
Genome analysis – MKI: https://www.youtube.com/v/U6dA41_ulxo?autoplay=1&rel=0
AlertEnterprise uniquely recognizes that effectively addressing insider threat requires analyzing risks across IT security, physical security and operational systems like SCADA to safeguard critical assets. Security convergence has not been very effective in the past: the volume of data and the number of disparate sources of information, ranging from structured to unstructured, tend to scare off the uninitiated. AlertEnterprise possesses the secret sauce to bring such divergent sources of data together and make sense of it all. So why include SAP HANA in the mix? Predictive risk analytics for security took too long to process. http://www.saphana.com/community/blogs/blog/2012/05/01/insider-threat-at-airports
Taulia – http://www.saphana.com/community/blogs/blog/2012/05/03/using-sap-hana-for-real-time-financial-risk-assessments
Jerome – http://www.saphana.com/community/learn/startups/marketplace/jerome
Qunb – http://www.saphana.com/community/learn/startups/marketplace/qunb and http://www.lemondeinformatique.fr/actualites/lire-qunb-combine-son-cluster-hadoop-avec-les-calculs-en-memoire-de-hana-51346.html
Sensen Networks – http://www.saphana.com/community/learn/startups/marketplace/sensennetworks and http://sensennetworks.com/senstore – counting people, vehicle tracking, etc.
The Globe and Mail has used SAP HANA to analyze optimal strategies for paywalls on their web site.
Chef Jerome uses in-memory to do a sophisticated real-time mix of structured and unstructured data. The recipe on the left is automatically analyzed semantically and combined with information from the Casino supermarket inventory systems to come up with a shopping basket (right) that you can click to have delivered to your home.
Product: Real-time Big Data (R + Hadoop + HANA)
Business challenges: Long wait times (days) for patient results at hospitals that perform cancer detection based on DNA sequence matching. Delays in new drug discovery, and higher associated costs, due to lack of insight into patient data.
Technical challenge: Big Data – lack of speed, accuracy and visibility into data analysis results in huge costs and longer turnaround times for drug discovery and the identification of disease factors.
Benefits – for hospitals: real-time DNA sequence data analysis makes it faster and easier to identify the root cause. Patient care based on genome analysis results can happen in one doctor visit, versus waiting several days or making multiple follow-up visits. For pharmaceutical companies: provide required drugs in time and help identify “driver mutations” for new drug targets.
Competition: MKI and SAP HANA could alter the course of cancer research in human history. It currently takes 2–3 days for a person to find differences in genome data between cancer patients and healthy people; MKI anticipates HANA will reduce this to 20 minutes – 216x faster. HANA is about 408,000 times faster than a traditional disk-based system (60 million records) when performing independent data analysis, and about 5–10 times faster than another competitor (190 million records).
R + Hadoop + SAP HANA: HANA provides powerful real-time computation capability, R offers easy ways to model and analyze the data, and Hadoop is the platform with distributed pre-processing and storage capabilities. Combining all three, we can store, pre-process, compute, and analyze huge amounts of data.
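The core comparison MKI describes – finding where patient genomes differ from healthy ones – can be illustrated with a deliberately tiny sketch. The sequences below are invented toy data; real pipelines work on millions of records, which is where the in-memory speedups matter:

```python
# Hedged toy sketch: compare a patient sequence and a control sequence
# against a reference, and keep only variants unique to the patient.
def variants(reference, sample):
    """Positions where a sample sequence differs from the reference."""
    return {i for i, (r, s) in enumerate(zip(reference, sample)) if r != s}

reference = "ACGTACGTAC"
patient   = "ACGAACGTTC"   # differs from the reference at positions 3 and 8
healthy   = "ACGTACGTTC"   # differs from the reference at position 8 only

# Candidate disease-associated variants: present in the patient,
# absent in the healthy control.
candidates = variants(reference, patient) - variants(reference, healthy)
```

The set-difference step is the analytical kernel; at genome scale it is exactly the kind of scan-heavy workload that takes days on disk and minutes in memory.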
Pirelli gathers data from sensors in truck tyres and gives it to fleet managers to optimize their fleets’ performance.
Our Big Data offerings begin with our portfolio of analytic applications designed to address information challenges specific to particular industries and lines of business. These analytic applications can be rapidly deployed, enabling you to achieve results quickly. The applications are built using our leading analytic solutions for business intelligence, enterprise performance management, and governance, risk and compliance. We offer analytic solutions for all stakeholders, including tools for data scientists to carry out predictive analysis and data mining, as well as advanced visualization tools enabling business analysts to explore data in depth. The analytic solutions from SAP are built on top of the SAP real-time data platform, which is based on market-leading data management solutions with SAP HANA at its center. With SAP you can rapidly deploy applications that also help you build out a Big Data architecture that delivers business value at each step along the way.
Analytics and information are not only about making decisions faster or doing business better. The reality is that information is now a core part of your customers’ experience – it is part of your PRODUCT. For ideas, please visit: http://www.saphana.com/community/implement
SAP Forum UK, July 2013
Transform Your Business With Analytics
NEW INFORMATION SIGNALS
360° Customer View
Propensity to Churn
Risk Mitigation, Real-time
Retain Market Value
What signals are you
70% of respondents can envisage a “killer application” for big data that would be “very useful” or “spectacular” for their business. The majority chose not to disclose what that application would be because it would provide a competitive advantage.
AIIM survey of 345 Information Professionals
REAL-TIME DATA PROCESSING PLATFORM
Fewer Layers
Same Core Data
Simpler Landscape