Anecdotes about real-life usage of analytics, compiled from research done on Google; hence no claims of accuracy. Please use this as directional insight into the applications and benefits.
Health Care Analytics
Table of Contents:
What is Healthcare Analytics?
Objectives of Healthcare Analytics
Types of Analytics
Sources of Data
What do healthcare companies achieve with healthcare analytics?
Booming technologies in the healthcare industry, with some of their uses
Existing healthcare analytics tools in the market
-----------------------------------------------------------------------
Objectives of Healthcare Analytics
The fundamental objective of healthcare analytics is to help people make and execute rational decisions.
Data-Driven
Analytics in healthcare can help ensure that all decisions are made on the best available evidence, derived from accurate and verified sources of information.
Transparent
Healthcare analytics can break down silos based on program, department, or even facility by promoting the sharing of accurate, timely, and accessible information.
Verifiable
The selected option can be tested and verified, based on the available data and the decision-making model, to be as good as or better than the alternatives.
Robust
Healthcare is a dynamic environment; decision-making models must be robust enough to perform in non-optimal conditions such as missing data, calculation errors, and failure to consider all available options.
-------------------------------------------------------------------------------
Types of Analytics
Descriptive Analytics
Uses business intelligence and data mining to ask: “What has happened?”
Diagnostic Analytics
Examines data to answer: “Why did it happen?”
Predictive Analytics
Uses statistical models and forecasting to ask: “What could happen?”
Prescriptive Analytics
Uses optimization and simulation to ask: “What should we do?”
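To make the distinction concrete, here is a minimal sketch contrasting a descriptive summary with a simple predictive forecast. All admissions numbers are invented for illustration:

```python
# Minimal sketch: descriptive vs. predictive analytics on hypothetical
# monthly hospital admissions data (all numbers invented).
import numpy as np
import pandas as pd

admissions = pd.DataFrame({
    "month": range(1, 13),
    "patients": [310, 295, 330, 342, 351, 360, 372, 365, 380, 391, 402, 410],
})

# Descriptive: summarize what has happened.
print("Mean monthly admissions:", admissions["patients"].mean())

# Predictive: fit a simple trend and forecast the next month.
slope, intercept = np.polyfit(admissions["month"], admissions["patients"], deg=1)
print("Forecast for month 13:", slope * 13 + intercept)
```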
----------------------------------------------------------------------------------
Sources of Data
Human-generated data
Web and social media data
Machine-to-machine data
Transaction data
Biometric data
---------------------------------------------------------------------------------
What do healthcare companies achieve with healthcare analytics?
Hospitals
Reducing cost
Reducing the cost of analytics by building an easy-to-use analytics platform
Identifying and preventing anomalies such as fraud
Automating external and internal reporting
Improving patient outcomes
Clinical decision support
Pharmacy
Randomized clinical trials are expensive to conduct and are not effective at identifying rare events, heterogeneous treatment effects, or long-term outcomes. Pharma companies rely on healthcare analytics to identify such relationships. However, inferring causal relations can be difficult, as data can easily be misinterpreted so that unrelated factors appear interdependent.
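As a small illustration of that caveat, the following sketch (entirely synthetic data) shows how a shared driver can make two otherwise unrelated quantities look correlated:

```python
# Minimal sketch of confounding: "severity" drives both drug dosage and
# mortality, making dosage and mortality look related even though, in this
# simulated world, dosage has no effect at all. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
severity = rng.normal(size=10_000)
dosage = severity + rng.normal(scale=0.5, size=10_000)      # sicker -> more drug
mortality = severity + rng.normal(scale=0.5, size=10_000)   # sicker -> worse outcome

# Naive analysis: dosage and mortality appear strongly correlated...
print("corr(dosage, mortality):", np.corrcoef(dosage, mortality)[0, 1])

# ...but within a narrow severity band the apparent relationship vanishes.
band = np.abs(severity) < 0.1
print("corr within one severity band:",
      np.corrcoef(dosage[band], mortality[band])[0, 1])
```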
Interesting Applications of Analytics
In the shortest terms, analytics is “answering questions with data”. With progress in the technology for capturing data and for accessing and analyzing it efficiently, analytics has become mainstream. Gone are the days when it was an esoteric science that required scientists to find anything in millions of rows and columns of numbers. Today everyone can do “analytics”, from simple aggregated analysis to advanced predictive modeling, with some training and orientation. As a wise analyst quipped: bring me the right question and I will give you the right answer through data. Analytics today is leveraged across functions, domains, industries, and day-to-day activities, from a smart energy meter at home to predicting presidential election results.
Here we bring you some stories where analytics was leveraged in a non-traditional way in non-traditional industries to achieve success.
Obama’s re-election campaign
The problem:
Given the losses during the first term of the presidency, re-election was no longer a sure thing. The Obama team had a simple fundamental principle: get everyone who voted the first time to vote again, and also attempt to bring in new voters, whether from new, growing demographics or by swaying the undecideds into their fold through targeted messaging. To them it became a task of reorganizing the coalition of supporters one by one, through personal touchpoints.
The solution:
Obama For America (OFA) 2012 will be remembered for leveraging best practices in analytics, surveys, testing, visualization, and reporting, and for how it molded them into a coherent “Data-Driven Strategic Framework”.
“The Cave” in OFA’s Chicago headquarters housed the campaign’s analytics team. Behind closed doors, more than 50 data analysts used big data to predict the individual behavior of tens of millions of American voters.
Data Collection (Dan Wagner):
o Survey Manager: A series of surveys on voters’ attitudes and preferences fed, as tables, into software called “Survey Manager”. Surveys could be short-term or long-term interviews with voters; at one point, the campaign completed 8,000 to 9,000 such calls per night.
o Constituent Relationship Management system: Captured everything about each interaction with a voter, volunteer, donor, or website user. Vertica software from Hewlett-Packard allowed combined access to the party’s 180-million-person voter file and all the other data systems.
o The analytics staff also routinely aggregated all the varied data sources -- Benenson’s aggregate battleground survey, the state tracking polls, the analytical calls, and even public polling data -- and fed them to predictive models.
Micro-targeting models (Predictive Analysis):
o The electorate could be seen as a collection of individual citizens who could each be measured and assessed on their own terms: their likelihood of casting a ballot and of supporting Obama. These models were adjusted weekly based on new data. Applying them identified which non-registrants were most likely to be Democrats and which Republicans, informing the subsequent “get-out-the-vote” and “persuasion” campaigns.
o Models estimated support for Obama and Romney in each state and media market. They controlled for the “house effects” of each pollster or data-collection method, and each nightly run of the model involved approximately 66,000 “Monte Carlo” simulations, which allowed the campaign to calculate its chances of winning each state (a minimal simulation sketch follows this list).
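The source does not describe the simulation mechanics, but a minimal sketch of the general idea -- drawing state outcomes from per-state support estimates and counting how often a candidate clears 270 electoral votes -- might look like this (all states, support levels, and uncertainties below are invented):

```python
# Minimal Monte Carlo sketch: estimate the chance of winning the Electoral
# College from per-state support estimates. All numbers are hypothetical.
import random

states = {  # name: (electoral votes, estimated support, std. error)
    "Ohio": (18, 0.51, 0.02),
    "Florida": (29, 0.50, 0.02),
    "Colorado": (9, 0.52, 0.025),
    "SafeBlue": (200, 0.60, 0.01),
    "SafeRed": (282, 0.40, 0.01),
}

def simulate_once() -> int:
    """Return electoral votes won in one simulated election."""
    won = 0
    for ev, support, sigma in states.values():
        if random.gauss(support, sigma) > 0.5:
            won += ev
    return won

runs = 66_000  # the campaign reportedly ran ~66,000 simulations nightly
wins = sum(simulate_once() >= 270 for _ in range(runs))
print(f"Win probability: {wins / runs:.1%}")
```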
Experiment-informed programs (EIPs): Designed by the Analyst Institute, these are A/B tests (or correlation analyses) used for various purposes:
o Testing “resonating” messages: Measure the effectiveness of different types of messages at moving public opinion. Experimenters would randomly assign voters to receive varied sequences of direct mail -- four pieces on the same policy theme, each making a slightly different case for Obama -- and then use ongoing survey calls to isolate the attributes of those whose opinions changed as a result; e.g., the 45-65 age group responded better to Medicare messages than the 65+ group, who were already in the program.
o Fundraising: Testing different variations of fundraising e-mails to find the ones with the best response rate. In one campaign, they tested 18 variations of subject line, email copy, and number of mails sent. When they rolled out the winning variation, “I will be outspent”, to the broader email base, it raised more than $2.6 million on June 26th, 2012 (a sketch of the underlying comparison follows this list).
o User interface optimization: The campaign conducted 240 A/B tests on its donation page, resulting in a 49% increase in conversion rate. By making the platform 60% faster, they saw a 14% increase in donations. In June 2012 the campaign switched to a four-step donation process and saw a 5% increase in conversions (donations).
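The deck does not show how winners were picked; a minimal sketch of the standard comparison behind such an email test -- a two-proportion z-test on donation rates, with hypothetical counts -- could look like this:

```python
# Minimal A/B test sketch: compare donation rates of two email variants
# with a two-proportion z-test. All counts are hypothetical.
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for H0: both variants convert equally."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Variant A: "I will be outspent"; Variant B: a control subject line.
p = ab_test(conv_a=540, n_a=20_000, conv_b=430, n_b=20_000)
print(f"p-value: {p:.4f}")  # a small p-value -> roll out the winner
```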
Campaign Response Analytics (D’Agostino) (Predictive Analysis):
o Customized donation requests: Instead of soliciting a fixed amount like $25, the campaign tested asking for different percentages of each donor’s highest previous donation, and found that all versions of those requests did better than a set amount (a small sketch follows this list).
o Targeted communications: Models based on recipients’ responses to past e-mail campaigns helped organizers better target specific communications; e.g., certain recipients were more open to volunteering than to donating online. This was facilitated by a new system called Narwhal, running the “analytics” algorithms described above.
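A minimal sketch of that first idea -- scaling the ask to a donor’s giving history rather than using a flat amount; the percentages and donor data below are invented:

```python
# Minimal sketch: personalize the donation "ask" as a percentage of each
# donor's highest previous gift instead of a flat $25. Data are invented.
donors = {"alice": 200.0, "bob": 40.0, "carol": 1000.0}  # highest prior gift

def ask_amounts(highest_gift: float, fractions=(0.25, 0.5, 1.0)) -> list[float]:
    """Return a ladder of suggested amounts scaled to giving history."""
    return [round(highest_gift * f, 2) for f in fractions]

for name, gift in donors.items():
    print(name, ask_amounts(gift))
# alice [50.0, 100.0, 200.0], bob [10.0, 20.0, 40.0], ...
```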
Optimizer (Davidsen) (Aggregate Analysis): A product that helped with behaviorally targeted TV buys. It coordinated the model predictions above with users’ TV-viewing behavior and identified the quarter-hour segments of the day with the greatest number of persuadable targets per dollar across 60 channels. It was developed in coordination with a company called Rentrak. The campaign estimated that this made the TV buy as a whole 10-20% more efficient -- the equivalent of between $40 million and $80 million in added media.
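The underlying selection is essentially a ranking over (channel, time-slot) cells; a minimal sketch with invented figures:

```python
# Minimal sketch of the Optimizer idea: rank (channel, quarter-hour) slots
# by persuadable viewers per dollar. All figures are invented.
slots = [
    # (channel, quarter-hour, persuadable viewers, cost in $)
    ("ESPN", "20:15", 12_000, 9_000),
    ("HGTV", "22:30", 7_500, 2_500),
    ("CNN",  "19:00", 9_000, 6_000),
]

ranked = sorted(slots, key=lambda s: s[2] / s[3], reverse=True)
for channel, slot, viewers, cost in ranked:
    print(f"{channel} {slot}: {viewers / cost:.2f} persuadables per dollar")
```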
Social Analytics (Aggregate Analysis): OFA scored 50,000 Twitter accounts by political affiliation. They used Twitter influence (looking at the number of tweets and followers) to target direct messages asking people to get involved.
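The exact scoring formula is not given; one hypothetical way to combine the two signals the deck mentions:

```python
# Hypothetical influence score built from the two signals named in the deck
# (tweet count and follower count); the log-weighting is an assumption.
from math import log10

def influence(tweets: int, followers: int) -> float:
    return log10(1 + tweets) * log10(1 + followers)

accounts = {"@organizer": (5_000, 20_000), "@lurker": (40, 150)}
for handle, (tweets, followers) in accounts.items():
    print(handle, round(influence(tweets, followers), 2))
```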
Communication Analytics (Matthew Rattigan) (Aggregate Analysis): OFA had a tool to look at the coverage of speeches in local newspapers and understand people’s reactions across geographic regions, and which parts were quoted most. Speechwriters were therefore able to see whether the messages they wanted to convey were actually the ones that were covered.
Dashboard was the campaign’s grassroots organizing platform, mapped directly to how the campaign was structured in the field. It provided a unified view of the team, activity (calls, messages), voting info, fundraising, etc. A mobile app allowed a canvasser to download and return walk sheets without ever entering a campaign office.
Tools Used: R for analytics projects throughout the campaign; D3 for data visualization.
The Results:
Some of the congressional predictions from the models were in the +/-2.5% range; e.g., the final predicted margin for a 2009 special election for an open congressional seat in upstate New York was 150 votes, well before Election Day.
The final simulations were accurate to within 0.2% in Ohio and 0.4% in Florida, but were 1% too cautious in Colorado.
OFA’s final projection was a 51-48 battleground-state margin for the president, which is approximately where the race ended up.
As OFA summarized: “Data analytics made a national presidential campaign run the way of a local ward campaign.”
References:
http://www.technologyreview.com/featuredstory/509026/how-obamas-team-used-big-data-to-rally-voters/
http://engagedc.com/download/Inside%20the%20Cave.pdf
http://techpresident.com/news/23214/how-analytics-made-obamas-campaign-communications-more-efficient
http://www.businessweek.com/articles/2012-11-29/the-science-behind-those-obama-campaign-e-mails
Operation Blue Crush (Crime Reduction Utilizing Statistical History), 2005
The problem:
Blue Crush began as the brainchild of University of Memphis professor Richard Janikowski, who met with then Memphis Police Department (MPD) director Larry Godwin to talk about new ways to reduce crime. With MPD open both to a new strategy and to sharing data, “Operation Blue Crush” was born as a predictive-analytics-based crime-fighting effort in one of the most crime-ridden cities in America. The University of Memphis actively coordinates with MPD on this program.
The solution:
The underlying philosophy is to pre-emptively identify places to dedicate police resources, in order to prevent and/or reduce crime.
Data collection/monitoring: The Memphis Police Department gathers data on every crime reported in the city and then tracks and maps all crimes over time. When patterns of criminal activity emerge from the data, officers are assigned to “details” and sent to the areas that the data show are being hardest hit. Hand-held devices help MPD file reports on the spot -- making them available to detectives within minutes -- and check for local and national outstanding warrants instantly.
Aggregate Analysis: Past criminal-event statistics are used to create maps, with crosstabs used to create “focus areas” (a minimal sketch of such a crosstab follows).
Trend & Correlation Analysis: Long- and short-term trends in crimes, by drivers such as time of day, day of week, etc.
The above analyses help police proactively deploy resources -- from organized crime and special-ops units to the mounted patrol, K-9, traffic, and DUI enforcement.
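The deck gives no implementation detail; as a hedged illustration, a crosstab of incident counts by beat and shift -- with the heaviest cells flagged as “focus areas” -- might be built like this (all incident records invented):

```python
# Minimal "focus area" sketch: crosstab incident counts by beat and shift,
# then flag the heaviest cells. Incident records are invented.
import pandas as pd

incidents = pd.DataFrame({
    "beat":  ["B1", "B1", "B2", "B2", "B2", "B3", "B1", "B2"],
    "shift": ["night", "night", "day", "night", "night", "day", "day", "night"],
})

counts = pd.crosstab(incidents["beat"], incidents["shift"])
print(counts)

# Flag beat/shift cells above a threshold as focus areas for extra patrols.
focus = counts.stack()
print("Focus areas:")
print(focus[focus >= 2])
```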
Tool Used: IBM SPSS is a partner.
The Results:
In its first seven years, violent crime was down 23%, and burglaries went down at five times the national average rate. The area most impacted by Blue Crush was apartments around the city, where violent crime was cut by more than 35%.
IBM is also circulating a June case study that says Memphis made an 863 percent return on its investment, calculated using the percentage decline in crime and the number and cost of additional officers that would have been needed to match the declining rate.
A study by Nucleus Research said Memphis has paid on average $395,249 a year for the initiative, including personnel costs, for a $7.2 million return (which appears to contradict the IBM number).
Blue Crush has become a department-wide philosophy, facilitating effective deployment of resources and a higher level of accountability and responsibility from all officers of MPD.
Blue Crush has now evolved into a more intense and targeted community-policing strategy at chronic crime hotspots.
References:
http://www.commercialappeal.com/news/2010/sep/19/blue-crush-gives-ibm-a-boost/
http://www.memphispolice.org/BLUE%20Crush.htm
http://www.commercialappeal.com/news/2013/jan/27/blue-crush-controversy/
http://wreg.com/2013/05/01/the-brain-behind-operation-blue-crush-retires/
Global Warming Prediction Report
The problem:
Global warming is a global problem with global ramifications. The IPCC is the leader in this domain and had come up with a model based on CO2 concentrations back in 2007. With the aim of modeling and predicting global temperature anomalies through “self-organizing knowledge extraction” from public data, Insights (formerly KnowledgeMiner), a research, consulting, and software development company in the field of high-end predictive modeling, initiated this project. Insights presented a “6-year monthly global mean temperature prediction” in September 2011, which was then discussed on Climate Etc. in October 2011.
The solution:
The philosophy of this model is to let the data tell the story -- don’t start with hypotheses to test, since there is a lot we humans don’t know and so can’t use to predict. One technique based on this philosophy, “self-organizing modeling”, works on adaptive networks, where self-organization over the predictive variables yields a mathematical equation of optimal complexity and reliable predictive accuracy.
Data collection: Data comes from public sources. Inputs include sun, ozone, cloud, aerosol, and CO2 concentrations.
Predictive Analysis: The self-organized model builds a dynamic system model -- a system of nonlinear difference equations -- obtained from monthly observation data covering the past 33 years. The model, once built, exposed interdependencies in the system (e.g., ozone affects other variables), and these interdependencies then merge together in a fashion that predicts global temperatures. A rough sketch of the general idea follows.
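KnowledgeMiner’s actual algorithm is proprietary; the sketch below only gestures at the general “self-organizing” idea -- automatically selecting, from candidate driver variables, the model terms that best predict the target on held-out data (all data synthetic):

```python
# Toy sketch of self-organizing model selection: greedily add input terms
# to a linear model while held-out error improves. Data are synthetic;
# the real KnowledgeMiner algorithm is proprietary.
import numpy as np

rng = np.random.default_rng(1)
n = 396  # 33 years of monthly observations
inputs = {name: rng.normal(size=n) for name in ["sun", "ozone", "cloud", "co2"]}
temp = 0.6 * inputs["sun"] + 0.3 * inputs["ozone"] + rng.normal(scale=0.2, size=n)

train, valid = slice(0, 300), slice(300, n)

def valid_error(features: list[str]) -> float:
    """Fit on the training window, return MSE on the validation window."""
    X = np.column_stack([inputs[f] for f in features] + [np.ones(n)])
    coef, *_ = np.linalg.lstsq(X[train], temp[train], rcond=None)
    return float(np.mean((X[valid] @ coef - temp[valid]) ** 2))

chosen: list[str] = []
best = float("inf")
while len(chosen) < len(inputs):
    trials = {f: valid_error(chosen + [f]) for f in inputs if f not in chosen}
    f, err = min(trials.items(), key=lambda kv: kv[1])
    if err >= best:
        break  # added complexity no longer improves held-out error
    chosen, best = chosen + [f], err
print("Selected drivers:", chosen, "| validation MSE:", round(best, 4))
```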
Tool Used: KnowledgeMiner
The Results:
The model shows that sun, ozone, aerosols, and clouds are primary drivers of global warming. It also acknowledges that there could be outside forces that haven’t been accounted for here.
The model shows an accuracy of 75%, given the noise and uncertainty in the observation data. It was also tested between April and December 2012, and KnowledgeMiner regularly updates the performance and predictions.
The model also predicts that global mean temperatures are going to stabilize near current levels, although there may be regional variations.
The above model could have ramifications for the debates/discussions/controversies around current strategies to fight global warming, especially the role of greenhouse gases and how they are combated.
References:
http://www.climateprediction.eu/cc/Main/Entries/2011/9/13_What_Drives_Global_Warming.html
http://www.knowledgeminer.eu/about.html
http://www.climateprediction.eu/cc/About.html
Predictive Analytics at Delegat’s Wine Estates, a listed New Zealand wine company
The problem:
Delegat’s is New Zealand’s largest listed wine company; in 2012 Delegat’s alone sold nearly two million cases of wine worldwide. The entire winemaking process is managed in-house by 350 staff globally, from growing the grapes to producing, distributing, and selling the wine, with direct sales teams in each country. The winemaking business is demand/supply sensitive -- a change in one area can impact the ability to serve customers in another. It’s also time sensitive: highly specific growing and harvest seasons give winemakers only a brief window to find and fix supply problems at the vineyards. Predictive analytics imparts the unique advantage of being prepared for such fluctuations.
The solution:
Together with IBM Business Partner Cortell NZ, Delegat’s deployed an integrated planning and reporting tool.
Data collection: Internal data (production, supply, demand, product, sales, and viticulture inputs) and market data.
Reporting: An integrated standard planning and monitoring suite to keep track of all aspects of the business using KPIs.
Aggregate Analysis: Supply/demand profiling (what product for which region) and elasticity studies (changes and response strategies) of markets and consumers.
Predictive Analysis: Net-profitability modeling based on yield, production, supply, and demand; system modeling of how one component affects others in the chain.
Trend & Correlation Analysis: Short- and long-term changes in the company and the industry.
Sizing/Estimation: What-if scenarios on profitability, brand, and other KPIs (a minimal sketch follows).
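No model details are published; purely as a hedged illustration, a what-if profitability scenario over yield and demand assumptions (every figure invented) might look like:

```python
# Toy what-if sketch: net profit under combinations of yield and demand
# assumptions. Every figure below is invented for illustration.
from itertools import product

PRICE, COST_PER_CASE, FIXED_COST = 90.0, 55.0, 20_000_000.0  # NZ$, assumed

def net_profit(cases_grown: float, cases_demanded: float) -> float:
    sold = min(cases_grown, cases_demanded)
    return sold * PRICE - cases_grown * COST_PER_CASE - FIXED_COST

yields = {"poor harvest": 1.6e6, "baseline": 2.0e6, "bumper": 2.3e6}
demand = {"soft market": 1.8e6, "strong market": 2.2e6}

for (y_name, y), (d_name, d) in product(yields.items(), demand.items()):
    print(f"{y_name} / {d_name}: NZ${net_profit(y, d):,.0f}")
```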
Tool Used: IBM Cognos TM1
The Results:
Time to produce reports was reduced by 90 percent, and planning cycles were shortened to six weeks.
More efficient day-to-day and strategic business decisions, e.g.:
o The decision to acquire the remaining shares of Oyster Bay Marlborough, based on insights from “what-if” scenario models on top of conventional business analysis.
o Expansion from ten vineyards to 15, based on insights from scenario modeling.
References:
http://asmarterplanet.com/blog/2012/10/new-zealand-vintner-taps-predictive-analytics-for-global-wine-markets.html
http://www.huffingtonpost.com/paul-chang/ibm-analyze-this-how-tech_b_1967131.html
Predictive Energy Analytics to reduce Operational Costs
The problem:
Energy cost mitigation at a California wastewater facility was extremely difficult due to many factors
outside the control of plant personnel, including the dynamics of wastewater flow, the mix of energy
sources and the requirement to integrate effluent from multiple municipal agency treatment facilities
delivered at various levels of treatment. Dissimilar data collection platforms and “raw data only”
reporting compounded the issue.
The solution:
Mike Murray and TR Bietsch of the HelioPower group of companies implemented a data analytics
framework to overcome these challenges. The primary goal of the project was to provide operators
with real-time, on-demand energy analytics.
Understanding of requirements:
o Requirement Sessions: HelioPower (creator of PredictEnergy) conducted an energy audit
to understand costs, energy utilization patterns and other business requirements.
o KPI: The primary philosophy was to increase output per energy cost by combining energy
data with financial information. Consumption, demand and cost baselines were
established, and KPIs were quantified with targets set.
o Energy sources (utility, co-gen and solar) were paired against uses (facility processes).
Data collection: Data came from energy sources, production information and the utility tariff cost
structure. PredictEnergy, the tool used for the analysis, combined current SCADA
(Supervisory Control and Data Acquisition) and metering systems with historical, current and
predictive energy data from the utility and distributed (in-house co-gen and solar) energy sources,
with meters installed at key points such as the main power meter and load centers like the co-gen unit.
Reporting: Dashboards to monitor and analyze KPIs across various slices of the data.
Aggregate Analysis: Profiled real-time energy costs for pumping and processing, optimized the co-
gen energy cost offset and quantified the cost avoidance provided by solar.
Trend Analysis: Short and Long term changes in KPIs.
Predictive Analysis: Patent-pending algorithms that compare actual energy consumption and
demand data against utility billings and baselines, iterate toward best-outcome predictions, and
apply constant feedback error correction.
Sizing/Estimation: What-if scenarios on KPIs and iterations for best performance.
Tool Used: PredictEnergy from HelioEnergySolutions
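For intuition, the following is a minimal sketch of the output-per-energy-cost KPI and a tariff-aware check for load shifting. The hourly readings and time-of-use rates are invented, and the logic is deliberately far simpler than the patent-pending PredictEnergy algorithms mentioned above.

# Minimal sketch of the output-per-energy-cost KPI and a tariff-aware check
# for load shifting (assumed tariff periods and meter readings; not the
# actual PredictEnergy algorithms, which are described as patent pending).
import pandas as pd

# Hourly meter data: energy used (kWh) and output processed (arbitrary units)
df = pd.DataFrame({
    "hour": range(24),
    "kwh": [180] * 8 + [260] * 10 + [200] * 6,
    "output": [1.0] * 8 + [1.3] * 10 + [1.1] * 6,
})
# Assumed time-of-use tariff: peak hours cost more per kWh
df["rate"] = [0.12] * 8 + [0.28] * 10 + [0.12] * 6
df["energy_cost"] = df["kwh"] * df["rate"]

kpi = df["output"].sum() / df["energy_cost"].sum()   # output per energy dollar
print(f"output per energy dollar: {kpi:.4f}")

# Flag peak hours whose load exceeds the off-peak average: candidates for
# shifting deferrable process loads into cheaper tariff periods.
off_peak_avg = df.loc[df["rate"] < 0.2, "kwh"].mean()
candidates = df[(df["rate"] >= 0.2) & (df["kwh"] > off_peak_avg)]
print("peak hours worth reviewing:", list(candidates["hour"]))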
The Results:
The analyses enabled operators to shift process loads and energy source usage to minimize
operational expenses (15+% via man-hour reduction, process load reduction and cross-billing
error reduction) and energy costs (3-5%). The findings were shared with four other facilities as
best practices to reduce their own costs.
References:
http://www.heliopower.com/wp-content/uploads/2013/01/Predictive-Energy-Analytics-to-Reduce-Operational-Costs.pdf
http://www.heliopower.com/wp-content/uploads/2013/01/PredictEnergy-Product-Overview.pdf
http://heliopower.com/wp-content/uploads/2013/03/PredictEnergy-Implementation-Phases.pdf
Predicting High School Graduation and Dropout
The problem:
Every education board needs to understand the drivers of student graduation and dropout so that
it can tailor its programs to address the needs of at-risk students, increase gain scores and
reform schools. Raj Subedi, a researcher with the Department of Educational Psychology and Learning
Systems, Florida Department of Education, submitted a dissertation in educational research which
could inform the Department’s efforts to address this problem.
The solution:
Predictive models were built to understand the effect of student-, teacher- and school-level variables
on each student’s graduation outcome (yes or no).
Data collection: This study used data on 6,184 students and 253 mathematics teachers from all middle
schools in Orange County Public Schools (OCPS), the tenth largest of the roughly 14,000 school
districts in the USA.
o Outcome variable: Grades 6–8 mathematics YoY (’05 vs. ’04) gain scores in NRT-NCE
(Norm Referenced Test-Normal Curve Equivalent) portion of the Florida Comprehensive
Assessment Test, a state mandated standardized test of student achievement of the
benchmarks in reading, mathematics, science, and social studies in Florida schools.
o Student level predictors: Pretest scores and socio-economic status (participation in the
free and reduced lunch program).
o Teacher level predictors: Mathematics content-area certification, advanced
mathematics or mathematics education degree, and experience.
o School level predictors: School poverty is defined as the percent of free and reduced
lunch students in each school, and teachers’ school mean experience is defined as the
average number of years taught by middle school teachers in a given school.
Predictive Analysis: A three-level Hierarchical Linear Model (HLM) built through a Value Added Model
(VAM) approach to investigate student-, teacher- and school-level predictors. The HLM method
checks for interactions between the predictors at the various levels (student, teacher and school).
Value Added Modeling estimates a teacher’s contribution in a given year by comparing students’
current-year test scores both with their scores from the previous year and with the scores of other
students in the same year.
Tool Used: SAS
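To show the shape of such a model, here is a minimal two-level approximation (students nested in schools) of the three-level HLM described above, using Python’s statsmodels; the original study was run in SAS, and the synthetic data and variable names here are assumptions for illustration.

# Minimal sketch of a multilevel (mixed-effects) gain-score model, a two-level
# approximation of the three-level HLM/VAM described above (the study itself
# used SAS). Synthetic data; all variable names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, n_students = 20, 300
school = rng.integers(0, n_schools, n_students)
school_effect = rng.normal(0, 2, n_schools)

df = pd.DataFrame({
    "school": school,
    "pretest": rng.normal(50, 10, n_students),
    "low_ses": rng.integers(0, 2, n_students),          # free/reduced lunch
    "cert": rng.integers(0, 2, n_students),             # content certification
    "experience": rng.normal(10, 4, n_students),        # teacher years taught
})
df["gain"] = (0.3 * df["pretest"] - 2.0 * df["low_ses"] + 1.5 * df["cert"]
              + 0.1 * df["experience"] + school_effect[df["school"]]
              + rng.normal(0, 3, n_students))

# A random intercept per school captures school-level variation; the fixed
# effects estimate student- and teacher-level predictors of the gain score.
model = smf.mixedlm("gain ~ pretest + low_ses + cert + experience",
                    df, groups=df["school"]).fit()
print(model.summary())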
The Results:
Findings suggested that already high-performing students can be expected to score better.
Students from poorer economic backgrounds performed poorly on their own, but improved under
the guidance of content-certified teachers.
However, such students did not show performance improvement when paired with teachers
possessing longer work experience.
The findings also indicated that content certification and teaching experience, two of the
accountability requirements under the “No Child Left Behind” Act, had a positive impact. In other
words, teacher competency had a substantial impact on student performance.
References:
http://www.hindawi.com/journals/edu/2011/532737/
http://diginole.lib.fsu.edu/cgi/viewcontent.cgi?article=4896&context=etd
http://www.palmbeachschools.org/dre/documents/Predicting_Graduation_and_Dropout.pdf
Oklahoma Town Uses Workforce Analytics to Lure Manufacturer to Region
The problem:
As the U.S. skilled labor supply continues to tighten, economic development groups struggle to
demonstrate the availability of skilled workers to investors (site selectors and businesses). The
Belgian high-tech materials company Umicore decided to expand its germanium wafer production to
the U.S. The initial search parameters narrowed the location to three cities: Phoenix, Albuquerque and
Quapaw in Oklahoma. Quapaw is a small town of 966 residents. Even though Umicore’s Opticals division
had an existing plant in Quapaw and expanding there would be cost efficient, the company could not
be sure of the availability of the requisite workforce.
When the site selector approached Judee Snodderly, executive director of the Miami Area Economic
Development Service, Oklahoma’s public data did not have sufficient detail or flexibility to answer the
questions. To complicate the problem, Miami’s regional labor force is shared by four states (Oklahoma,
Kansas, Missouri and Arkansas), and integrating their data to get an accurate and reliable picture was
difficult.
The solution:
Data collection:
o EMSI (Economic Modeling Specialists International) captures labor market data and
stores it within “Analyst”, a web-based market research, visualization and analytics tool.
EMSI’s broad array of products, from Analyst (labor markets) to Career Coach (career
vision) and Economic Impact Studies, delivers substantial value to analysts and
decision makers.
o The tool taps into a composite of more than 90 federal, state and private data sources,
refreshed quarterly and available at many granularities: county, ZIP, MSA or multi-state region.
o Internationally, it also covers the UK and Canada.
Aggregate Analysis: Analyst tool has dashboard capabilities with multiple visualization options
like maps, charts, tables, etc.
o Gary Box, the business retention coordinator at the Workforce Investment Board of
Southwest Missouri, had access to EMSI’s Analyst tool, which provided him with industry
and occupation reports highlighting the availability of high-tech manufacturing skills. It also
enabled him to emphasize the availability of “compatible” talent for Umicore in the region.
Tool Used: EMSI’s web-based labor market tool, Analyst
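As a rough illustration of the underlying data problem, the sketch below assembles a multi-state labor-shed profile from county-level records, the kind of aggregation Analyst performs across its many data sources. The table, county list and worker counts here are entirely hypothetical.

# Minimal sketch of assembling a multi-state labor-shed profile: county-level
# occupation counts are filtered to a custom region and aggregated.
# The table, region definition and numbers are hypothetical.
import pandas as pd

# Hypothetical county-level table: one row per (state, county, occupation)
data = pd.DataFrame({
    "state": ["OK", "KS", "MO", "AR", "OK", "MO"],
    "county": ["Ottawa", "Cherokee", "Newton", "Benton", "Ottawa", "Jasper"],
    "occupation": ["Machinists"] * 4 + ["Electricians"] * 2,
    "workers": [410, 220, 350, 180, 130, 240],
})

# The Miami, OK labor shed spans four states; define it as a county list.
labor_shed = {("OK", "Ottawa"), ("KS", "Cherokee"),
              ("MO", "Newton"), ("MO", "Jasper"), ("AR", "Benton")}
in_region = data.apply(lambda r: (r["state"], r["county"]) in labor_shed, axis=1)

profile = (data[in_region].groupby("occupation")["workers"]
           .sum().sort_values(ascending=False))
print(profile)   # regional supply of each skill across the four states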
The Results:
In late June 2008, Umicore chose Quapaw as the new location for its germanium wafer
production site, resulting in an investment of $51 million in the region and 165 new jobs with
an average salary of $51,000 a year, not including benefits.
Construction began in 2008 and continued through late 2009. EMSI estimates that the total
impact on the economy of Ottawa County during the construction phase alone was more than
160 jobs and nearly $9 million in earnings annually.
Once the plant began operating, that impact rose to more than 250 jobs and more than $12
million in earnings annually.
References:
http://thehiringsite.careerbuilder.com/2013/03/04/workforce-data-case-study/
http://www.economicmodeling.com/wp-content/uploads/Analyst_onepage2013_v1b.pdf
http://www.economicmodeling.com/analyst/
Optimization of “Procure-to-Pay” process - Strategic Customer benefit initiative by VISA
The problem:
A $2 billion U.S. construction company was exploring options to maximize its card program by analyzing
its processing costs and efficiency across its entire Procure-to-Pay process. Its Visa issuer (issuing bank)
introduced it to Visa’s Procure-to-Pay offerings and Visa PerformSource, a consultative service aimed at
maximizing value from the commercial card program and the Procure-to-Pay process.
The solution:
Through a new program, Visa and its issuing banks (issuers) helped the US construction company
identify opportunities to improve Procure-to-Pay operations and increase savings through its card
programs. The Optimization Review utilized analytical tools (the Procure-to-Pay Performance Gauge
and the Accounts Payable tool) designed to identify, benchmark and improve Procure-to-Pay
practices. These tools helped define a plan and a financial impact estimate for the expansion of
Visa Commercial card programs.
Data collection: Procurement and card operations data
Aggregate Analysis:
o Procure-to-Pay Performance Gauge: This tool is designed to assist a company in
understanding how to improve its current Procure-to-Pay processes and technology. A
customized diagnostic report comparing the company against best-practice companies of
the same revenue size was developed.
o The Accounts Payable Analysis Tool: Helped the company analyze spend patterns and
develop both strategic and tactical commercial card program implementation or
expansion plans organized by commodity, business unit and supplier. Additionally, the
built-in ROI calculator estimated the financial benefits they could realize through the
card program. The tool also allows companies to set program goals over a three-year
time frame for the Visa Commercial card program expansion.
Tool Used: VISA Performsource service – toolkit (Procure-to-Pay Performance Gauge & Accounts
Payable tool)
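The built-in ROI calculator mentioned above can be pictured with a minimal sketch like the one below. All rates and per-transaction costs are assumptions for illustration, not Visa’s actual figures; real savings depend on negotiated rebates and each company’s processing costs.

# Minimal sketch of the kind of estimate a card-program ROI calculator makes
# (all rates and costs below are assumptions, not Visa's figures): savings
# come from cheaper transaction processing plus issuer rebates.
def card_program_roi(invoices_per_year: int,
                     avg_invoice: float,
                     share_moved_to_card: float,
                     check_cost: float = 8.0,    # assumed cost per check/invoice
                     card_cost: float = 2.5,     # assumed cost per card payment
                     rebate_rate: float = 0.01): # assumed issuer rebate
    moved = invoices_per_year * share_moved_to_card
    process_savings = moved * (check_cost - card_cost)
    rebate = moved * avg_invoice * rebate_rate
    return process_savings + rebate

# Example: 50,000 invoices/year averaging $1,200, with 30% shifted to card
annual = card_program_roi(50_000, 1_200.0, 0.30)
print(f"estimated annual benefit: ${annual:,.0f}, "
      f"3-year: ${3 * annual:,.0f}")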
The Results:
The company scored an overall Advanced rating (51-75). It had a good foundation: a clear Procure-
to-Pay strategy, ROI analysis of all initiatives, automated processes and inclusion of the payment
method in preferred vendor contracts.
It also implemented a best-practice commercial card program (distribution of an approved vendor
list, accounts payable interception of invoices and distribution of cards to super users).
However, there was still scope for further cost reductions, greater control and process efficiencies:
ongoing vendor list management; communication, audit and reporting of non-compliance; and
regular reports to senior management. Visa projected net process efficiency savings of $0.6
MM over 3 years from the program.
References:
http://usa.visa.com/corporate/corporate_solutions/perform_source/case_study_construction.html
http://usa.visa.com/corporate/corporate_solutions/perform_source/index.html
The US Women’s Cycling Team’s Big Data Story at the London 2012 Olympics
The problem:
The last time the US women’s team had won an Olympic track medal was two decades earlier. The
team was outfunded by the British, who spent $46 MM, and outstaffed 10 to 1. They entered the
tournament 5 seconds away from even being considered for the medals. Many considered closing
this gap for a medal an almost impossible task.
The solution:
Sky Christopherson, himself a recent world record setter and a heavy user of Quantified Self data,
was an instrumental force in helping the team do the impossible.
Data collection: Quantified Self data from sensors, cameras and iPads covering environment, sleep
patterns, genetics, blood glucose and just about everything that matters to the cyclists. The data
was collected every second, 24 hours a day, 7 days a week.
Reporting: Visualization of key metrics on charts, tables, etc.
Aggregate Analysis: Profiling and drill-down of various key metrics by other levers.
Correlation Analysis: Among various drivers – lifecycles, routines with performances.
Trend Analysis: Impact of changes on the performance over time.
Tool Used: Datameer
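To illustrate the correlation analysis described above, here is a minimal sketch that ranks daily Quantified Self metrics by their correlation with lap times. The data is synthetic and the relationships are assumed for demonstration; the actual analysis ran on Datameer.

# Minimal sketch of the correlation analysis described above: daily Quantified
# Self metrics against lap times (synthetic data; not the Datameer pipeline).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
days = 60
sleep_hours = rng.normal(7.5, 0.8, days)
glucose = rng.normal(95, 8, days)
training_load = rng.normal(300, 40, days)
# Assumed relationship for illustration: more sleep -> faster laps
lap_time = 14.0 - 0.12 * sleep_hours + 0.01 * glucose + rng.normal(0, 0.1, days)

df = pd.DataFrame({"sleep_hours": sleep_hours, "glucose": glucose,
                   "training_load": training_load, "lap_time": lap_time})
# Which routines move the needle? Rank drivers by correlation with lap time
# (negative = the metric is associated with faster laps).
print(df.corr()["lap_time"].drop("lap_time").sort_values())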
The Results:
Race strategies, health and recovery routines, and changes to day-to-day lifestyle patterns and
habits; in short, data-driven actions across the board to improve the cyclists’ performance.
The US team beat Australia, the overwhelming favorites, by 0.08 seconds in the semi-final round
and went on to win the silver medal.
References:
http://www.datameer.com/learn/videos/us-womens-olympic-cycling-team-big-data-story.html
http://fora.tv/2013/06/27/The_US_Womens_Cycling_Teams_Big_Data_Story
http://en.wikipedia.org/wiki/Quantified_Self
Helping Keep the Manitoba Food Chain Safe by Tracking Disease Outbreaks
The problem:
The Manitoba Agriculture, Food & Rural Initiatives (MAFRI) ministry, under Chief Veterinary Officer
Dr. Wayne Lees, is responsible for safeguarding the agri-food chain in Manitoba, Canada and beyond.
It either actively manages animal disease outbreaks or strategizes on how best to prevent or
effectively control them the next time. The electronic tracking system, built around the livestock
premises identification system, tracks the movement of livestock across the food chain, making it
possible to pinpoint risks from animal-to-animal exposure. It also enables effective inter-agency
collaboration for a rapid and effective response in the event of an outbreak.
Manitoba is the largest pork-exporting province in Canada, and almost two-thirds of its hog
production is exported to the US for finishing. Any unmanaged outbreak could be financially
catastrophic. Information on an outbreak can come from multiple sources: people, rumour,
diagnostic labs and veterinary practitioners. Within months of the launch of the tracking system,
there was an outbreak of transmissible gastroenteritis (TGE) in pigs within a cluster of three farms.
The solution:
Data collection: Premises ID database collects all information on the livestock.
Aggregate Analysis: Visualization of disease trackers on maps (origin location, proximity and size
of the outbreak, herds at risk, and livestock movements).
Sizing & Estimation: Calculation of optimal buffer zone, what-if scenario planning of spread
patterns and response strategies.
Correlation Analysis: Identification of potentially exposed herds, animals that had come into
and out of the affected farms.
Predictive Analysis: Modeling of outbreak spreads based on farm locations, animal densities
and other factors.
Tool Used: IBM Maximo Asset Management & IBM Global Business Services –Industry Consulting
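A minimal sketch of the buffer-zone and contact-tracing logic described above is shown below. The farm coordinates, buffer radius and movement records are invented for illustration; the production system combines the premises ID database with far richer movement and density data.

# Minimal sketch of buffer-zone and contact-tracing analysis (invented farm
# coordinates and movement records; not the actual Maximo-based system).
import math

def km_between(a, b):
    """Great-circle distance between two (lat, lon) points in km (haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

farms = {"A": (49.90, -97.10), "B": (49.95, -97.20),
         "C": (50.40, -96.80), "D": (49.88, -97.12)}
outbreak_at, buffer_km = "A", 25.0

# Farms inside the buffer zone around the index farm
in_buffer = {f for f, loc in farms.items()
             if km_between(farms[outbreak_at], loc) <= buffer_km}

# Movement records (from_farm, to_farm): trace contacts with the index farm
movements = [("A", "C"), ("B", "A"), ("D", "B")]
exposed = {dst for src, dst in movements if src == outbreak_at}
exposed |= {src for src, dst in movements if dst == outbreak_at}

print("inside buffer:", sorted(in_buffer - {outbreak_at}))
print("traced contacts:", sorted(exposed))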
The Results:
By identifying, quantifying and analyzing risk factors at the time of detection, MAFRI reduced
the average disease control cycle by 80% - in this case by weeks - with no additional farms affected.
Almost halved the “downstream” containment costs, such as manpower and transport.
Able to do a more targeted and accurate application of epidemic responses such as quarantines.
Reduced the risk of export restrictions and cull-related losses resulting from animal disease
epidemics, which represents millions of dollars in direct losses to the Manitoba livestock
industry and local economies.
Able to execute a more efficient deployment of animal disease control specialists in the field
during outbreaks.
References:
http://www.ibm.com/smarterplanet/us/en/leadership/mafri/assets/pdf/MAFRI_Paper.pdf
http://www.ibm.com/smarterplanet/us/en/leadership/mafri/
Chapter Summary
The intent of this chapter was to illustrate the applications of data analysis principles, from visual
analytics to advanced analytics, in various walks of our lives. Our hope is that the above stories have
struck a chord with your imagination and that, the next time you look at the life around you, you are
able to relate to it at an “analytical” level. Given the rapid pace of digitization in our lives, data
generation will only keep growing, and the analysis of such data will only make our lives better. Do
not be surprised tomorrow if your self-driving, self-thinking car recommends a great exotic restaurant
after analyzing your food channel viewing patterns.