As the scope of big data rapidly expands, so does the scope of the analytics needed to extract insight from that data. It is simply impossible for humans, or indeed rules-based engines, to turn all of that information into action. More and more, clients need analytics to make the best decisions possible; or better yet, to embed those analytics into processes to automate decision making, delivering answers to the questions being asked at the point of impact. To address these rapidly evolving needs, we need to ensure the right analytics capabilities are deployed to suit each situation, each point of interaction and each decision point within a process. Join this session and learn how IBM can provide a solution for the varying types of analytics: from descriptive to predictive to prescriptive to cognitive.
Cognitive analytics: What's coming in 2016? – IBM Analytics
Cognitive analytics is innovating and evolving rapidly. Expert predictions in this area are essential for organizations that plan to leverage cognitive analytics in their big data analytics strategies in 2016 and beyond. It is the core investment that organizations everywhere should make to stay relevant in the insight economy. IBM is the premier solution provider, with IBM Watson as its flagship cognitive analytics platform, for realizing the opportunities this innovative technology makes possible.
Learn more about IBM Analytics at http://ibm.co/advancedanalytics
SmartData Webinar: Cognitive Computing in the Mobile App Economy – DATAVERSITY
Mobility is transforming work and life throughout the planet. Mobile apps--built for a growing range of handhelds, wearables, Internet of Things, and other platforms--are becoming the universal access paths to commerce, content, and community in the 21st century. The app economy refers to this new world where every decision, action, exploration, and experience is continuously enriched and optimized through the cloud-served apps that accompany you everywhere. In this webinar, James Kobielus, IBM's Big Data Evangelist, will discuss the potential of cognitive computing to super-power the emerging app economy. In addition to providing an overview of IBM's Watson strategy for cognitive computing, Kobielus will go in-depth on IBM's strategic partnership with Apple to draw on the strengths of each company to transform enterprise mobility through a new class of apps that leverage IBM’s Watson-based big data analytics cloud and add value to Apple's iPhone and iPad platforms in diverse industries.
IBM Academy of Technology & Cognitive Computing – Nico Chillemi
I delivered this presentation at the University of Chieti-Pescara in Abruzzo (Italy) in September 2015, introducing the IBM Academy of Technology and talking about Cognitive Computing and Analytics with IBM Watson and IBM IT Operations Analytics Log Analysis (ITOA). The video, in Italian, is available on YouTube; please contact me if you are interested. Thanks to Amanda Tenedini for the help with social media and to Piero Leo for the help with IBM Watson.
IBM cognitive business strategy presentation – diannepatricia
IBM Cognitive Business Strategy presentation. Presented by Dianne Fodell and Jim Spohrer at the Cognitive Systems Institute Group Speaker Series call on October 8, 2015.
Thank you for your interest in the recent NY Outthink breakfast on July 19th at the Rainbow Room. The presentations shared highlighted how cognitive computing is being applied today in a variety of business situations, in many industries, and across multiple business functions. Presentation by Jason Kelley.
Deloitte's report and point of view on IBM's Watson. IBM Watson, AI and cognitive computing are rapidly evolving technologies that can support and enhance enterprise solutions. Learn about IBM Watson: the why and the how.
A presentation given in Denmark, introducing cognitive computing, highlighting potential benefits and early use-cases in insurance with IBM Watson. The presentation included demos.
Link to youtube video of FlexRate Insurers self-service demo: https://www.youtube.com/watch?v=8xRN9RzpVBE&spfreload=10
Link to IBM Watson white paper on Cognitive Computing in Insurance:
Cognitive Business: Where digital business meets digital intelligence – IBM Watson
Ravesh Lala, Vice President, IBM Watson Solutions, provided a high-level overview of IBM Watson on Monday, August 22, 2016 at the Electronics event in NY. Ravesh shared insights into what Watson is and how organizations have leveraged the power of Watson to advance their place in the market.
A great big data eBook giving a perspective on big data analytics predictions for 2016. Learn about the milestones, landmarks and future of this fast-growing arena.
Data science is highly resourceful when it comes to understanding the public and their decisions about service providers' products and services. The top 8 exciting trends that our world will see in data science in the coming year of 2021 are discussed here.
Introduction to Cognitive Computing: the science behind and use of IBM Watson – Subhendu Dey
The lecture was given at a Cognitive and Analytics workshop at the Indian Institute of Management. Topics covered were:
1) Understanding Natural Language Processing, Classification, Watson & its modules
2) Industry applications of Cognitive Computing
3) Understanding Cognitive Architecture
4) Understanding the disciplines / tools being used in Cognitive Science
Intelligent enterprise: Cognitive Business presentation from World of Watson – Nancy Pearson
How Companies Are Using Cognitive Computing to Drive Tangible Results including information from the 2016 Cognitive Advantage Report: http://www.ibm.com/cognitive/advantage-reports/
Listen to an experienced, global panel of insurance professionals present, discuss and answer your questions on the theme of “AI & Machine Learning”.
Brought to you by The Digital Insurer and sponsored by KPMG.
Data science provides businesses with advanced tools and technologies that allow them to automate complicated business processes linked with extracting, analyzing, and presenting raw data.
With so much happening in the technical field, and the data being generated at a rapid speed, it is crucial to know about the latest as well as the upcoming trends in data science.
On December 9 & 10, Deloitte hosted over 20 business executives and thought leaders at the Internet of Things (IoT) Grand Challenge Workshop at the Tech Museum of Innovation in San Jose. The objective of the gathering was to work collectively on one of the largely unexplored areas of IoT: revenue-generating IoT use cases. The following report captures what was discussed during this extraordinary event, where an open, collaborative dialogue focused on advancing the field of IoT.
Explore the key findings here or learn more at www2.deloitte.com/us/IoT-challenge.
1- Inescapable in everyday life
2- Chief focus on innovation
3- Converge all big data
4- Hottest specialty in data science
5- Root in global governance
6- Principal personalization tool
7- Automate most data analysis
8- Drive scaling in cloud data services
Jerry Chetty - Myth About Data Investigation – Saratoga
Jerry Chetty is a data detective working for the Santam Group to develop and implement strategies against the threat of economic crime. Jerry will be busting myths about data and the investigation of economic crime.
Slides from talks presented at Mammoth BI in Cape Town on 17 November 2014.
Visit www.mammothbi.co.za for details on the event. Follow @MammothBI on twitter.
#MITXData 2014 - Leveraging Self-Service Business Intelligence to Drive Marke... – MITX
2014 MITX Data & Analytics Summit
"Leveraging Self-Service Business Intelligence to Drive Marketing Analytics & Insight"
Speaker: Carmen Taglienti (@carmtag), Business Intelligence & Data Management Practice Lead, Slalom Consulting
Advancements in the BI technology ecosystem and the application of these capabilities to marketing analytics have enabled better, faster and more accurate insight. In addition to the advancements in technology, marketing organizations look to embrace analytics and put the tools that support them into the hands of the decision makers in a “self-service” way. Typically organizations adopt analytics (and the supporting technology) across the enterprise according to the principles of "the analytics-driven organization." This session will introduce an Analytics Maturity model that enables an analytics-driven marketing organization to assess current proficiencies and understand the capabilities required to achieve its desired state of analytics maturity. This discussion will also cover the alignment of technology solutions at the various levels of the Analytics Maturity model, as well as the drive toward “self-service,” easy-to-use analytics. Finally, the presenter will demonstrate the use of real-time data acquisition and analytics to drive marketing insight.
http://blog.mitx.org/2014-data-summmit/
Predictive Analytics - Big Data Warehousing Meetup – Caserta
Predictive analytics has always been about the future, and the age of big data has made that future an increasingly dynamic place, filled with opportunity and risk.
The evolution of advanced analytics technologies and the continual development of new analytical methodologies can help to optimize financial results, enable systems and services based on machine learning, obviate or mitigate fraud and reduce cybersecurity risks, among many other things.
Caserta Concepts, Zementis, and guest speaker from FICO presented the strategies, technologies and use cases driving predictive analytics in a big data environment.
For more information, visit www.casertaconcepts.com or contact us at info@casertaconcepts.com
Presentation given as part of the workshop: OPEN DATA AND CLOUD COMPUTING: BUSINESS OPPORTUNITIES. An international view - 15 September 2014, Hall 152 of the Puglia Region - 78th Fiera del Levante, Bari
Leveraging IBM Bluemix for Conversation and Personality Insights – Handly Cameron
An overview of the IBM Bluemix service and how to get started leveraging the Watson APIs for Conversations and Personality Insights. Presented to the Atlanta Collaboration Users Group (ATLUG) for their virtual meeting on August 11, 2016.
This is the presentation I gave to the HIMSS Management Engineering and Process Improvement (ME-PI) Community on the use of predictive analytics in healthcare.
Keynote presentation from IBM Solutions Connect 2013 covering topics such as the changing business world today and how technologies can help organisations cope with this change and move forward.
Why Everything You Know About bigdata Is A Lie – Sunil Ranka
As a big data technologist, you can bet that you have heard it all: every crazy claim, myth, and outright lie about what big data is and what it isn't that you can imagine, and probably a few that you can't. If your company has a big data initiative or is considering one, you should be aware of these false statements and the reasons why they are wrong.
Transforming Business with Smarter Analytics – CTI Group
Transforming Business with Smarter Analytics by Deb Mukherji @ BPT IBM Innovative Indonesia with Smarter Analytics, 12 June 2013, Shangri-La Hotel Jakarta
Entry Points – How to Get Rolling with Big Data Analytics – Inside Analysis
The Briefing Room with Robin Bloor and IBM
Live Webcast Sept. 24, 2013
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?AT=pb&SP=EC&rID=7501927&rKey=664935ceb7de1aec
Where to begin? That question remains prominent for many organizations that are trying to leverage the value of big data analytics. Most sources of big data are quite different from traditional enterprise data systems. This requires new skill sets, both for the granular integration work and for the strategic business perspective required to design useful solutions.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor as he explains the pain points associated with modern data volumes and types. He will be briefed by Rick Clements of IBM, who will tout IBM's big data platform, specifically InfoSphere BigInsights, InfoSphere Streams and InfoSphere Data Explorer. He will also present specific use cases that demonstrate how IT and the line of business can springboard over existing challenges, gain insight and improve operational performance.
Visit InsideAnalysis.com for more information
IBM's big data seminar programme - moving beyond Hadoop - Ian Radmore, IBM – Internet World
Big Data Meets Big Analytics Theatre - June 18th, 15:00-15:30
Eighty percent of the world's data is unstructured, and most businesses don't even attempt to use this data to their advantage. Imagine if you could afford to keep and analyse all the data generated by your business. Imagine that you had a way to analyse and exploit that data as it is created! Whether you're a telecoms provider trying to minimise customer churn, a utility company looking to exploit the potential of smart-metering or a surveillance company ensuring the security of clients' premises, there are genuine business opportunities from deploying big data analytics in real-time. Using live client examples, Ian will show how real-time analytics provide a powerful extension to any big data platform and is applicable across many types of information and real-world problems to deliver tangible business value.
Ian Radmore, an IBM Big Data Specialist, spoke about the velocity aspect of the 4Vs associated with Big Data at the recent Internet World conference; this is the supporting presentation.
Fuel for the cognitive age: What's new in IBM predictive analytics – IBM SPSS Software
IBM recently launched an updated version of its predictive analytics platform. Explore the latest features, including R, Python and Spark integration and more powerful decision optimization.
IBM's Watson is a machine-learning platform that’s been built to mirror the same learning process that humans have: Observe, Interpret, Evaluate and Decide. Through the use of this cognitive framework, Watson can search through a database of information and pull out key insights to bridge gaps in human knowledge. It’s expertise scaling for enterprise.
Watson has already helped businesses across a variety of industries increase their customer engagement, data discovery and informed decision making abilities. Is your business next?
Dr Christoph Nieuwoudt - AI in Financial Services – itnewsafrica
Dr. Christoff Nieuwoudt delivered a keynote on AI in Financial Services at Digital Finance Africa 2023 on 2 August 2023 at the Gallagher Convention Centre in Midrand, Johannesburg.
The customer journey can essentially be divided into 7 elements. We’ll touch upon the issue of privacy and how one balances social and commercial value. Practical examples of customer analytics at its best will be discussed, as well as the importance of the ecosystem.
IBM Solutions Connect 2013 - Getting started with Big Data – IBM Software India
You've heard of Big Data for sure. But what are the implications of this for your organisation? Can your organisation leverage Big Data too? If you decide to go ahead with your Big Data implementation where do you start? If these questions sound familiar to you then you've stumbled upon the right presentation. Go through the presentation to:
a. Learn more about Big Data
b. How Big Data can help you outperform in your marketplace
c. How to proactively manage security and risk
d. How to create IT agility to underpin the business
Also, learn about IBM's superior Big Data technologies and how they are helping today's organisations take smarter decisions and actions.
Robert Lecklin - BigData is making a difference – IBM Sverige
What can Big Data do for your company? Be inspired by Robert Lecklin, who has helped several customers implement their Big Data strategies. In doing so, they have managed to turn worthless data into valuable insights. In this session he will share experiences from customer cases where a Big Data strategy made a decisive difference...
IBM Solutions Connect 2013 keynote presentation talking about emerging challenges in today's business world and various technologies that can help decision makers address these challenges and drive their business forward. Smart ideas. Smarter actions.
Grow smarter project kista watson summit 2018_tommy auoja-1 – IBM Sverige
Avicii at Tele2 Arena, Drake at Globen and AIK vs. Luleå at Hovet set the stage for a crowded Saturday afternoon in the Globen area... (SVT Nyheter, 1 March 2014) ...and the problems remain to this day
Speaker: Tommy Auoja, Account Manager for the Public Sector and contact person for the EU project GrowSmarter, IBM
Presentation from Watson Kista Summit 2018
Staffing planning, Axfood and Houston, final – IBM Sverige
Automated budgeting – let the mathematics do the heavy lifting to ensure optimized staffing
Speakers: Niklas Westerholm, Axfood & Robert Moberg, Chief Analyst, Houston Analytics
Presentation from Watson Kista Summit 2018
File share and sync (alone) is so 2017!
Sharing files conveniently and securely was only the beginning. Box has moved on to integrating shared files into applications and process flows, revolutionizing both internal and external work. How could it revolutionize things for you?
Speaker: Jan Hygstedt, Director Nordic, Box
Presentation from Watson Kista Summit 2018
Watson Kista Summit 2018: A better workday for the many people – IBM Sverige
First we were forced to adapt to the computers. Then we used them to collaborate with each other. Now it is time for the computers to understand us. What does that mean for our everyday work?
Speaker and moderator: Peter Bjellerup, Executive Consultant - Social Business, Collaboration & Knowledge Sharing, IBM
Presentation from Watson Kista Summit 2018
IWCS and Cisco, Watson Kista Summit 2018 v2 – IBM Sverige
Collaborate both over time and in real time
Cisco Spark and IBM Connections – together! Combine the leader in real-time conversations – text and video, one-on-one and in teams – with the industry leader of the past seven years in internal collaboration, transparency and networking.
Speaker: Bo Holtemann, Solution Specialist, IBM Collaboration Solutions
Presentation from Watson Kista Summit 2018
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. for more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
Adjusting primitives for graph: SHORT REPORT / NOTES – Subhajit Sahu
Graph algorithms, like PageRank, often use Compressed Sparse Row (CSR), an adjacency-list based graph representation.
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... – John Andrews
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Techniques to optimize the PageRank algorithm usually fall into two categories. One is to reduce the work per iteration, and the other is to reduce the number of iterations. These goals are often at odds with one another. Skipping computation on vertices which have already converged has the potential to save iteration time. Skipping in-identical vertices, which share the same in-links, helps reduce duplicate computations and thus could help reduce iteration time. Road networks often have chains which can be short-circuited before PageRank computation to improve performance; the final ranks of chain nodes can be easily calculated. This could reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order. This could help reduce the iteration time and the number of iterations, and also enable multi-iteration concurrency in PageRank computation. The combination of all of the above methods is the STICD algorithm. [sticd] For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
To put cognitive systems into the proper context, let's take a look at some of the differences from a more traditional programmatic approach to problem solving. What Google is to search, Watson is to discovery. We have all entered keywords into a search bar only to have millions of entries returned for our review. Unfortunately, the majority of the information retrieved is not what we were looking for, so we start over. Watson looks to bring back relevant results, with confidence, putting content into context. Unlike deterministic systems, Watson is probabilistic in nature.
Take a simple question like 2+2. The precise answer is 4. That is exactly how a deterministic system would respond. However, Watson is not so sure. It may have high confidence that 2+2=4 is the right answer; however, if the context of the question was automotive, 2+2 could have been a car configuration – two front seats, two back seats. If we had been talking to a family psychologist, 2+2 could have been referencing a family unit with 2 parents and 2 children. You can quickly see how things have varying meanings which need to be analyzed and properly considered in the context of the broader questions being asked.
Unlike traditional systems, which thrive on structure, where information is stored in a binary fashion all neatly organized into rows and columns, Watson can tackle unstructured data spread across disparate sources to unlock patterns and possibilities. And we have already touched on the importance of working in natural language.
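The deterministic-versus-probabilistic contrast above can be made concrete with a toy sketch. This is a hypothetical illustration only, not Watson's actual pipeline: the candidate answers and keyword-overlap scoring are invented for the example. A deterministic system maps the 2+2 question to one fixed answer, while the probabilistic version scores every interpretation against the surrounding context and reports a normalized confidence for each.

```python
def deterministic_answer(question):
    # One fixed "right" answer, regardless of context.
    return {"2+2": "4"}[question]

def probabilistic_answers(question, context):
    # Hypothetical candidate interpretations of "2+2", each with
    # context keywords that would raise its confidence.
    candidates = {
        "4": {"arithmetic", "math", "sum"},
        "a car with two front and two back seats": {"automotive", "car", "seats"},
        "a family with two parents and two children": {"family", "psychology"},
    }
    words = set(context.lower().split())
    # Score = 1 (small prior) + keyword overlap with the context.
    raw = {ans: 1 + len(kw & words) for ans, kw in candidates.items()}
    total = sum(raw.values())
    # All candidates, ranked by normalized confidence (sums to 1).
    return sorted(((ans, s / total) for ans, s in raw.items()),
                  key=lambda p: p[1], reverse=True)
```

Given the context "we were discussing automotive seats", the car-configuration reading comes out on top with the highest confidence, while "4" and the family reading remain as lower-confidence alternatives rather than being discarded outright.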
This chart illustrates the evolution from descriptive to predictive and prescriptive to cognitive analytics, and lists the key characteristics of each phase or analytics domain.
It is important to highlight the different possible analytics journeys and entry points. Clients will not always need a mature prescriptive analytics platform in place to launch a cognitive analytics initiative.
We define Cognitive Systems as those systems that can navigate the complexities of human language and understanding, ingest and process vast amounts of structured and unstructured data, generate and evaluate countless possibilities, and scale in proportion to the task. These systems apply human-like characteristics to conveying and manipulating ideas, that when combined with the inherent strengths of digital computing can solve problems with higher accuracy, more resilience, and on a massive scale.
Watson is an example of a Cognitive System. It is able to tease apart human language to identify inferences between text passages with human-like accuracy, and at speeds and scale far faster and far bigger than any person could manage on their own. Watson doesn’t really understand the individual words in the language, but it does understand the features of language as used by people, and from that it is able to determine whether one text passage (call it the ‘question’) infers another text passage (call it the ‘answer’) with incredible accuracy under changing circumstances. In Jeopardy! we had to determine whether the question, “Jodie Foster took this home for her role in ‘The Silence of the Lambs’” inferred the answer “Jodie Foster won an Oscar for her role in ‘The Silence of the Lambs’”. In this case, taking something home inferred winning an Oscar. But it doesn’t always. Sometimes, ‘taking something home’ infers ‘a cold’, or ‘groceries’, or any number of things. Context matters. Temporal and spatial constraints matter. All of that adds to enabling a Cognitive System to behave with human-like characteristics. A rules-based approach would require a near-infinite number of rules to capture every case we might encounter in language.
This chart illustrates the technical capabilities that characterize the various analytics areas. The reality of analytics-related use case scenarios is that clients may have requirements across the entire continuum of analytics, i.e. the focus may be on descriptive analytics with the need to implement all corresponding technical capabilities, but also analytical requirements from the remaining three analytics domains, including cognitive analytics. For instance, some clients may have a rather mature descriptive analytics platform and require natural language processing capabilities for a sentiment analytics project to derive brand sentiment and affinity analytical insight, without necessarily implementing a sophisticated predictive analytics platform.
When walking through this chart, explain the individual technical capabilities within each analytics domain.
This chart lists some of the clients that IBM has worked with. In order to become familiar with the details of these projects, please visit the “IBM client reference Database”.
Here is the link: http://w3-01.ibm.com/sales/references/crdb/ibmref.nsf/winsubmit?openform
The key message of this chart is the role of the blue areas in enabling the various analytics domains. These blue areas are BI Data Infrastructure and Big Data Analytics, with their capabilities to embrace mobile needs, social media analytics, and cloud deployment models. Together these blue domains enable predictive, prescriptive, and cognitive analytics (and descriptive analytics as well, although not explicitly mentioned on the chart). These capabilities translate into key initiatives such as Smarter Commerce, Smarter Workforce, Smarter Analytics, and Smarter Cities, and provide key business value to the C-level stakeholders listed at the top of the chart. The business value is delivered for all industries, which is illustrated by the 12 little symbols at the top of the figure. So the key message of the chart is the illustration – or rather the transformation – of key technical domains such as BI Data Infrastructure and Big Data & Analytics capabilities into a broad, industry-relevant set of business values.
Following are the 3 key messages of the chart:
Clients are leveraging various types of analytics to solve real business challenges. They soon realize there is no single solution to address all of their analytics requirements. Businesses in different industries will have specific needs that applying analytics can address, but there is no one size fits all.
This is why IBM offers many different analytics offerings, including industry-specific solutions that address unique needs in major industries, as well as optimized business and predictive analytics solutions. Cognitive computing, such as IBM Watson, is another example.
IBM is delivering these solutions on Power Systems because of the platform’s design points and capabilities. Power was built from the ground up to handle data-related applications and analytics workloads.
The key point of the chart is to highlight the 4 Vs that represent an essential way to characterize Big Data: veracity, variety, velocity, and volume.
Volume is about rising volumes of data in all of your systems – which presents a challenge both for scaling those systems and for the integration points among them.
Variety is about managing many types of data, and understanding and analyzing them in their native form.
Velocity is about ingesting data in real time and in motion.
Veracity deals with the certainty, or truthfulness, of big data. Veracity is a big issue – and one that directly relates to confidence. In fact, as the complexity of big data rises (as the first 3 Vs grow), it actually becomes harder to establish veracity.
The left part of the chart illustrates just one dimension (volume) in the context of increasing analytical complexity.
This chart puts into perspective three key areas that influence and drive the need for cognitive systems and analytics:
Big Data: highlight the 4 Vs again as a key driver, especially the need to analyze text, speech, video content, and other non-structured data such as logs, call center transcripts, etc. Also highlight the veracity – meaning trustworthiness – of the data, which requires reasoning and other cognitive analytical capabilities to put insight into context and provide contextual meaning.
Cloud: as a key deployment model, cloud represents a driving force to also take cognitive analytics in the cloud into consideration. Highlight the need to provide analytics capabilities that can be deployed and leveraged in the cloud. As an example, point out IBM's Social Media Analytics (SMA) v1.2 – the former Cognos Consumer Insight – which is not only cloud-enabled but is offered as a cloud service by IBM.
SoLoMo: still a rather new term, SoLoMo (Social, Local, Mobile) is increasingly used to describe these three aspects as a combined area that characterizes today's consumer lifestyles. As such, all three aspects drive specific requirements and influence the technical and business capabilities that cognitive analytics needs to deliver. Social means, for instance, understanding contributions to social media networks and their meaning in context, and being able to analyze natural language and text in all languages and dialects, including the sometimes unique style of communication that takes place in social media networks. Local requires locality awareness, for instance to deliver location-based services, preferences, and culture-awareness when running cognitive analytics. Mobile, in regard to cognitive analytics, requires the inclusion and understanding of the mobile lifestyle, mobility patterns, and preferences.
This chart focuses on the veracity and trustworthiness of big data, and introduces some dimensions of trustworthiness and veracity (right side of the chart). One of the key aspects of big data is the analytics of social media networks. Contributions via social media networks, however, need to be analyzed by taking the listed dimensions into consideration. For instance, what was the usage intention of a social media network contribution, and what is its relevance for the given analytics scope or use case scenario? The left side of the chart lists some of the challenges that require sophisticated, state-of-the-art cognitive analytics capabilities in order to understand, for instance, whether a statement or contribution was made in a certain mood or emotional state, whether it is meant as a joke, whether it represents a sarcastic statement, and so forth.
Watson – the computer system we developed to play Jeopardy! – is based on the DeepQA software architecture. Here is a look at the DeepQA architecture. This is like looking inside the brain of the Watson system from about 30,000 feet.
Remember, the intended meaning of natural language is ambiguous, tacit and highly contextual. The computer needs to consider many possible meanings, attempting to find the evidence and inference paths that are most confidently supported by the data.
So, the primary computational principle supported by the DeepQA architecture is to assume and pursue multiple interpretations of the question, to generate many plausible answers or hypotheses and to collect and evaluate many different competing evidence paths that might support or refute those hypotheses.
Each component in the system adds assumptions about what the question might mean, what the content means, what the answer might be, or why it might be correct.
DeepQA is implemented as an extensible architecture and was designed at the outset to support interoperability.
<UIMA Mention>
For this reason it was implemented using UIMA, a framework and OASIS standard for interoperable text and multi-modal analysis contributed by IBM to the open-source community.
Over 100 different algorithms, implemented as UIMA components, were integrated into this architecture to build Watson.
In the first step, Question and Category Analysis, parsing algorithms decompose the question into its grammatical components. Other algorithms identify and tag specific semantic entities like names, places, or dates. In particular, the type of thing being asked for, if it is indicated at all, will be identified. We call this the LAT, or Lexical Answer Type – like this "FISH", this "CHARACTER", or this "COUNTRY".
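As a rough illustration (not Watson's actual algorithm, which relies on full grammatical parsing), a naive LAT detector might simply look for the noun following a determiner like "this" or "these" in the clue:

```python
import re

# Naive Lexical Answer Type (LAT) detection: grab the word following
# "this"/"these" in a clue. Real LAT detection uses deep parsing.
def lexical_answer_type(clue):
    m = re.search(r"\bth(?:is|ese)\s+(\w+)", clue.lower())
    return m.group(1) if m else None

print(lexical_answer_type("This country borders France and Spain"))  # → country
```

Note how this heuristic would already fail on "took this home" (returning "home" as the type), which is exactly why Watson needs grammatical analysis rather than pattern matching.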
In Query Decomposition, different assumptions are made about whether and how the question might be decomposed into sub-questions. The original question and each identified sub-part follow parallel paths through the system.
In Hypothesis Generation, DeepQA performs a variety of very broad searches for each of several interpretations of the question. Note that Watson, to compete on Jeopardy!, is not connected to the internet.
These searches are performed over a combination of unstructured data (natural language documents) and structured data (available databases and knowledge bases) fed to Watson during training.
The goal of this step is to generate possible answers to the question and/or its sub-parts. At this point there is very little confidence in these possible answers, since little intelligence has been applied to understanding the content that might relate to the question. The focus at this point is on generating a broad set of hypotheses – or, for this application, what we call "candidate answers".
To implement this step for Watson we integrated and advanced multiple open-source text and KB search components.
After candidate generation, DeepQA also performs Soft Filtering, where it makes parameterized judgments about which and how many candidate answers are most likely worth further computation, given specific constraints on time and available hardware. Based on a trained threshold that optimizes the tradeoff between accuracy and speed, Soft Filtering uses different lightweight algorithms to judge which candidates are worth gathering evidence for and which should get less attention and continue through the computation as-is. In contrast, if this were a hard filter, the candidates falling below the threshold would be eliminated from consideration entirely at this point.
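A minimal sketch of the soft-filtering idea (the threshold and scores below are invented): candidates above a trained threshold proceed to expensive evidence gathering, while the rest continue through the pipeline unchanged rather than being discarded.

```python
# Soft filter: route candidates by a lightweight score. Low scorers are
# not eliminated (as a hard filter would do); they just skip deep scoring.
def soft_filter(candidates, threshold=0.15):
    deep, shallow = [], []
    for answer, light_score in candidates:
        (deep if light_score >= threshold else shallow).append(answer)
    return deep, shallow

candidates = [("Oscar", 0.62), ("Groceries", 0.05), ("A Cold", 0.09), ("Award", 0.31)]
deep, shallow = soft_filter(candidates)
print(deep, shallow)
```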
In Hypothesis & Evidence Scoring the candidate answers are first scored independently of any additional evidence by deeper analysis algorithms. This may for example include Typing Algorithms. These are algorithms that produce a score indicating how likely it is that a candidate answer is an instance of the Lexical Answer Type determined in the first step – for example Country, Agent, Character, City, Slogan, Book etc.
Many of these algorithms may fire using different resources and techniques to come up with a score. What is the likelihood that “Washington” for example, refers to a “General” or a “Capital” or a “State” or a “Mountain” or a “Father” or a “Founder”?
For each candidate answer, many pieces of additional evidence are searched for. Each of these pieces of evidence is subjected to more algorithms that deeply analyze the evidentiary passages and score the likelihood that the passage supports or refutes the correctness of the candidate answer. These algorithms may consider variations in grammatical structure, word usage, and meaning.
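To illustrate the idea of multiple independent evidence scorers (both scorers below are hypothetical stand-ins for DeepQA's far deeper algorithms), each passage can be scored by several algorithms, each producing its own feature score:

```python
# Two toy passage scorers: exact term match, and word proximity between
# the candidate answer and a question keyword within the passage.
def term_match(passage, candidate):
    return 1.0 if candidate.lower() in passage.lower() else 0.0

def proximity(passage, candidate, keyword):
    words = passage.lower().split()
    try:
        d = abs(words.index(candidate.lower()) - words.index(keyword.lower()))
        return 1.0 / (1 + d)
    except ValueError:
        return 0.0  # candidate or keyword absent from the passage

def score_passage(passage, candidate, keyword):
    return {"term_match": term_match(passage, candidate),
            "proximity": proximity(passage, candidate, keyword)}

print(score_passage("foster won an oscar for the film", "Oscar", "won"))
```

Each score becomes one feature for the final merging step; no single scorer is trusted on its own.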
In the Synthesis step, if the question was decomposed into sub-parts, one or more synthesis algorithms will fire. They apply methods for inferring a coherent final answer from the constituent elements derived from the question's sub-parts.
Finally, the last step, Final Merging and Ranking, receives many possible answers, each paired with many pieces of evidence, and each of these scored by many algorithms to produce hundreds of feature scores – all giving some evidence for the correctness of each candidate answer.
Trained models are applied to weigh the relative importance of these feature scores. These models are trained with machine learning methods to predict, based on past performance, how best to combine all these scores to produce a final, single confidence number for each candidate answer and to produce the final ranking of all candidates.
The answer with the strongest confidence would be Watson's final answer. And Watson would try to buzz in, provided that top answer's confidence was above a certain threshold.
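The final step can be sketched as a trained model that combines feature scores into a single confidence, plus a buzz-in threshold. The weights, bias, threshold, and feature names below are all invented for illustration; Watson learns the real ones from prior games.

```python
import math

WEIGHTS = {"type_match": 1.6, "passage_support": 2.3, "popularity": 0.4}
BIAS = -2.0
BUZZ_THRESHOLD = 0.5  # only buzz in if the top confidence clears this

def confidence(feature_scores):
    # logistic regression: weighted sum of features squashed into (0, 1)
    z = BIAS + sum(WEIGHTS[name] * s for name, s in feature_scores.items())
    return 1 / (1 + math.exp(-z))

def rank_and_decide(candidates):
    ranked = sorted(((confidence(f), a) for a, f in candidates.items()), reverse=True)
    top_conf, top_answer = ranked[0]
    return top_answer, top_conf, top_conf >= BUZZ_THRESHOLD

candidates = {
    "Oscar":     {"type_match": 0.9, "passage_support": 0.8, "popularity": 0.7},
    "Groceries": {"type_match": 0.1, "passage_support": 0.1, "popularity": 0.2},
}
print(rank_and_decide(candidates))
```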
----
The DeepQA system defers commitments and carries possibilities through the entire process while searching for increasingly broader contextual evidence and more credible inferences to support the most likely candidate answers.
All the algorithms used to interpret questions, generate candidate answers, score answers, collect evidence, and score evidence are loosely coupled but work holistically by virtue of DeepQA's pervasive machine learning infrastructure.
No one component could realize its impact on end-to-end performance without being integrated and trained with the other components – and they are all evolving simultaneously. In fact, what had a 10% impact on some metric one day might, one month later, contribute only 2% to overall performance due to evolving component algorithms and interactions. This is why the system, as it develops, is regularly trained and retrained.
DeepQA is a complex system architecture designed to extensibly deal with the challenges of natural language processing applications and to adapt to new domains of knowledge.
The Jeopardy! Challenge greatly inspired the design and implementation of the Watson system.
IBM Watson is the very embodiment of the new era of cognitive systems. It represents a new category of solutions that leverages deep content analysis and evidence-based reasoning to accelerate and improve decisions, reduce operational costs, and optimize outcomes. Cognitive Systems offer a whole new way of computing. Keeping pace with the demands of an increasingly complex business environment requires a paradigm shift in what we should expect from IT. We need an approach that recognizes today’s realities and treats them as opportunities rather than challenges.
Main point: At the core of what makes Watson different are three powerful technologies – natural language, hypothesis generation, and evidence-based learning. But Watson is more than the sum of its individual parts. Watson is about bringing these capabilities together in a way that's never been done before, resulting in a fundamental change in the way businesses look at quickly solving problems.
Further speaking points: Looking at these one by one, understanding natural language and the way we speak breaks down the communication barrier that has stood between people and their machines for so long. Hypothesis generation bypasses the historically deterministic way that computers function and recognizes that there are various probabilities of various outcomes rather than a single definitive "right" response. And adaptation and learning help Watson continuously improve in the same way that humans learn: it keeps track of which of its selections users chose and which responses got positive feedback, thus improving future response generation.
Additional information: The result is a machine that functions alongside us as an assistant rather than something we wrestle with to get an adequate outcome.
This section introduces the Big Data analytics reference model and serves as an introduction to the use case scenarios, which illustrate the various stages of analytics.
Best in Breed Analytics Placed On Top
Fuel all decision-making with powerful analytics & analytic adoption without silos
Analyze all data wherever it lives
Accelerate business value with solutions that leverage all data types, with predictive insight to let you know what has happened, what is happening and what is likely to happen next
Delivering optimized decisions at point of impact through business applications
Empower end business users with the information to deliver the best decision every time
All touchpoints are managed in real time, via the appropriate channel
Feedback loop ensures all decisions are accurate and dynamic
Business rules integrated with analytics and optimization
All of these different technologies come together (“an integrated platform”) to create decision services for the different LOB areas (e.g. marketing)
The depicted Big Data analytics reference model serves as an introduction to this section and illustrates the key capabilities. In presenting this chart, explain the capabilities in the order listed here:
Heterogeneous data sources
Data transformation and integration layer
Data persistency layer
Business analytics and application layer
Visualization and reporting layer
Infrastructure services
Highlight the message that these capability categories need to be addressed in every project. The focus, however, can vary depending on specific project requirements.
This figure describes the Big Data analytics reference model in a slightly different way and lists the different technical capabilities within the various layers. We are listing essentially the same components as on the previous chart, highlighting the breadth of different technical capabilities that each component – or layer – needs to comprise. Point out that not all capabilities need to be included in every project. The concrete sub-list of technical capabilities is derived from the concrete requirements and set of use cases for an individual project.
This chart contains a product mapping to the Big Data analytics reference model, which has been further customized for CSPs (Communication Service Providers). This chart and the two previous ones also serve as an introduction to the examples described in the following section. The presenter should become familiar with all products and tools referenced in this product mapping chart.
This section describes – at a very high level – sample projects for all four analytics areas: descriptive, predictive, prescriptive, and cognitive. All examples and corresponding use case scenarios in this section are derived from real customer engagements in Asia Pacific.
This is an example of a descriptive analytics project, where a telco service provider is interested in competitive analytics based on CDRs (Call Detail Records). Analysis of CDR records was optionally enhanced with analytics from social media networks. The data sources are depicted on the left side of the chart. The component in the center of the chart comprises the core capabilities derived from BigInsights and BigInsights applications, such as Customer Modeler (an IBM Research asset). Analytical insight is derived from the combined components in this central box. The analytics can optionally be enhanced with SPSS to deliver predictive analytics. In the real customer project, however, this was not part of the use case scenario.
The left side of the chart illustrates the data warehouse and the descriptive BI analytics component, Cognos BI.
This example also illustrates that descriptive analytics is very much a part of Big Data. CDR records are very large in volume and semi-structured, and especially the combination with non-structured data from social media networking sites makes descriptive analytics very much a Big Data use case scenario. It illustrates the changing paradigm of the role descriptive analytics plays in Big Data.
This sample project is geared towards determining demographic information for unknown pre-paid subscribers. The first main heading on the chart (gain analytical insight for pre-paid demographics) explains the logical flow and main steps that need to be performed. The second main heading (required data sources) lists the data sources, such as voice and data CDRs, behavioral data, and so forth. This is also a very nice example illustrating that predictive analytics – as well as descriptive analytics – is part of Big Data, i.e. can be seen from a Big Data angle.
The main step in this analytics flow is to predict demographics information for pre-paid subscribers by correlating and mapping post-paid with pre-paid subscribers.
This sample project is further described on the following 2 charts with:
a contextual diagram and
an architecture overview diagram
This chart contains a high-level contextual diagram with the key components, such as the data sources on the left, the cloud-based analytics system that leverages the IBM SmartCloud at IBM Singapore, and the key products on the right of the chart.
The blue figure at the lower right corner is an illustration of the analytics and admin roles and responsibilities that exist in operating the environment.
The yellow figure at the upper left corner illustrates the LoB user using the system and deriving predictive insight.
This chart contains an architecture overview diagram that contains the key components and the component interaction at a high-level.
Public data sources: will be used in the scenario to gain analytical insight and to leverage existing categorization of, for instance, websites visited by subscribers
Post-paid data sources: will be used to understand preferences, interest, websites visited, performing micro-segmentation, etc. for post-paid subscribers
Pre-paid data sources: the same data sources will be used for pre-paid subscribers, where the same analytical insight is derived for this category of subscribers
Post-paid demographics information: will be used and correlated with the analytical insight that is derived from post-paid data sources. This allows a comprehensive view on post-paid subscribers, which includes knowledge on demographics.
The analytics engine – depicted in the center of the chart – is used to correlate post-paid with pre-paid segments, clusters, behavior, interests, etc., and to map known demographics for post-paid subscribers to corresponding pre-paid subscribers. This allows prediction of demographics for pre-paid subscribers, e.g. age, gender, income, and other demographic measures.
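A highly simplified sketch of that correlation step (illustrative only; the feature vectors and demographic labels below are invented, and the real engine uses micro-segmentation over far richer data): map each pre-paid subscriber to the most behaviorally similar post-paid subscriber and borrow that subscriber's known demographics.

```python
# Nearest-neighbor demographics propagation over behavioral feature vectors.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def predict_demographics(prepaid_features, postpaid):
    # postpaid: list of (feature_vector, known_demographics) pairs
    best_features, best_demo = max(postpaid,
                                   key=lambda p: cosine(prepaid_features, p[0]))
    return best_demo

postpaid = [
    ([0.9, 0.1, 0.4], {"age_band": "18-25", "gender": "F"}),
    ([0.2, 0.8, 0.7], {"age_band": "36-45", "gender": "M"}),
]
print(predict_demographics([0.85, 0.15, 0.5], postpaid))
```

In practice one would segment both populations first and propagate demographics per segment rather than per individual neighbor.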
Client Name:
XO Communications
Case study Link:
http://www-01.ibm.com/software/success/cssdb.nsf/CS/STRD-9E4L7Y
Pull Quote:
"We are only just starting to realize the true potential that IBM analytics holds across the business."
—Bill Helmrath, Director of Business Intelligence,
XO Communications
Company Background:
XO Communications is one of the United States’ largest communications service providers, offering a comprehensive portfolio of communications, network and hosted IT services through a 19,000-mile nationwide inter-city network and over 1,000 office locations. Priding itself on superior customer experience, the company is always looking for ways to raise the bar.
Solution components:
Software
• IBM® SPSS® Analytics Catalyst
• IBM SPSS Modeler
• IBM SPSS Modeler Server
• IBM SPSS Statistics
• IBM InfoSphere® BigInsights™
Business challenge:
XO Communications had already taken the first steps in identifying customer retention risks through analytics; now it wanted to seize the opportunity to put these insights into action more effectively.
The benefit:
142 percent estimated reduction in revenue erosion for customers at most risk of churning.
$10 million+ estimated savings per year from increased customer retention and reduced customer service costs
5 months to achieve full return on investment
Link to reference profile: http://w3-01.ibm.com/sales/ssi/cgi-bin/ssialias?infotype=CR&subtype=NA&htmlfid=0CRDD-8C53TV&appname=crmd
Solution synopsis
A global provider of information management and electronic commerce services for financial institutions in the United States anticipates increased revenue and a stronger competitive edge from working with IBM Global Technology Services – Integrated Technology Services and IBM Software Services for Information Management to develop a powerful predictive analytics service for small to midsize banks, comprising IBM Power Systems technology and IBM Information Management software.
This chart describes at high-level a sentiment analytics project with ABS-CBN in the Philippines.
The objective of the project was to analyze social media about election candidates and the issues that impact them:
Buzz – Candidates, topics, personalities, broadcasters. How much / what is being said about the candidates (ongoing and for key "events" like debates, advertisements, etc.), the different shows, and news anchors. How does this change over time – what is trending?
Sentiment – Popular opinion. What do voters like or dislike about the candidates, the parties, campaigns, constituents, etc.? How does this sentiment break down by the different groups (voters, political affiliation, news professionals, demographics, affinity groups, etc.)? Understand brand sentiment – whether ABS-CBN is perceived as unbiased and trusted, and how the different news personalities are perceived – credible, neutral, and fair.
Intent – What is the intent to act (support / vote) for each candidate? What election outcomes can be predicted (shifts in candidate sentiment, voter intent, etc.)?
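To make the buzz and sentiment measures concrete, here is a toy sketch (purely illustrative; the actual project used IBM's social media analytics tooling, and the lexicon, posts, and candidate name below are invented) that counts mentions and tallies lexicon hits per candidate:

```python
# Toy buzz + sentiment: count posts mentioning a candidate and tally
# positive/negative lexicon words within those mentions.
POSITIVE = {"credible", "fair", "trusted", "like"}
NEGATIVE = {"biased", "dislike", "unfair"}

def analyze(posts, candidate):
    buzz = pos = neg = 0
    for post in posts:
        words = set(post.lower().split())
        if candidate.lower() in words:
            buzz += 1
            pos += len(words & POSITIVE)
            neg += len(words & NEGATIVE)
    return {"buzz": buzz, "positive": pos, "negative": neg}

posts = ["Santos is credible and fair", "Santos seems biased", "nice weather today"]
print(analyze(posts, "Santos"))
```

Real sentiment analytics must go far beyond a word lexicon – handling sarcasm, dialects, and context, as the trustworthiness discussion earlier in this deck makes clear.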
Main point: How does Watson work? It's not a simple answer, but Watson solutions are built on a set of repeatable assets that draw from decades of market leadership, research, and best practices. Beginning at the bottom:
Watson solutions are implemented with customers through a full lifecycle: readiness preparation, building the solution itself, teaching Watson about the industry, use case, and data involved, and finally putting it into production, during which it continues to improve through experience and feedback loops.
The basic platform of Watson operation is built on a core of ingestible natural language content, tooling to train and utilize Watson's functionality, proven methods of successful lifecycle operation, algorithms for analytic parsing of language and identification of responses, and APIs to allow other modular functionality to interact with Watson.
Built on this platform of core function is a set of capabilities used across Watson solutions. These include natural language processing capabilities to understand human communication (both from a user interface perspective and, more importantly, as a source of information upon which to draw for evidence-based responses) and machine learning capabilities to learn from experience. Data is the fuel of Watson's engine, and a curated data corpus of structured and unstructured data is where Watson draws evidence for its responses. Watson draws on IBM's leadership in analytics (predictive, business, etc.) to find patterns and relationships invisible to the naked eye. Watson solutions use cloud-based delivery to help scale their reach, optimize utilization of the required infrastructure, and improve accessibility for users. With cloud-based delivery comes mobile accessibility, since processing requirements on the user interface device itself are minimized. And finally, Watson infrastructure is optimized for the unique workloads it requires, yet Watson runs on commercial off-the-shelf IBM p-series hardware.
Drawing upon these capabilities are the Ask, Discover, Decide services discussed previously
Actual Watson solutions are developed in close collaboration with industry and domain leaders. IBM has partnered with leaders in healthcare, financial services, and other areas to develop Watson Advisor Solutions that help professionals make better use of available information to improve outcomes. Early brainstorming has led to initial pilots, which have led to full-production Watson Advisor Solutions, which in turn are leading to expansion into new use cases, industries, and domains. The future of Watson and Cognitive Systems is as bold and compelling as the imagination itself.
This chart elaborates on an IBM research effort to use BigInsights as a platform for massive scale Social Network Analytics (SNA).
Further description of X-RIME can be found here: