This document outlines a scientific, hypothesis-driven approach to product management. It discusses focusing on root causes, eliminating bias, and providing real user value. The process involves making observations, forming a hypothesis, running experiments to test the hypothesis, refining the hypothesis based on results, and validating findings with metrics. Key steps are defining the problem, building testable hypotheses, analyzing relevant data, refining over time, and measuring success. Forming good hypotheses requires identifying clear questions and variables. The goal is to objectively solve real problems through replicable, user-centric experiments.
This PPT provides knowledge about Human Resource Models.
After completing the presentation, we will know:
What is HRM?
Outlines to keep in mind about the HR Models
Goals
Hard Approach
Soft Approach
The Ulrich Model (The Business Partner Model)
The ASTD Competency Model
Conclusion
Contact for Further Details or any query
Amit Giri
(Teacher at The Changers Academy)
Email: edu.thechangers@gmail.com
Phone: 87108-20002
aka "Agile adoption stories from highly varied organisational cultures"
Why is the culture change that genuine Agile requires so difficult in most army- or machine-like corporate cultures, yet quite natural for certain organisations whose culture resembles a family or living organism? It turns out that the type of Agile your organisation adopts corresponds with its dominant world view or stage of consciousness. Drawing on 15 years of experience with Agile in Australia and the UK, we describe how Agile was interpreted quite differently by organisations classed as Amber, Orange, Green and Teal in Frederic Laloux’s model.
Familiarise yourself with the characteristics of the four stages of Frederic Laloux’s consciousness model.
You will become aware of:
* The stage that your own organisation is at
* How your organisation is likely to interpret and ‘bend’ Agile to fit its world view
* Specific beliefs and motivations that make high agility difficult in organisations with Amber and Orange stages of consciousness
* The Green and Teal beliefs and leadership styles that are genuinely transformational in achieving and sustaining high agility and customer-centric Agile adoptions.
KRA & KPI (Key Result Areas and Key Performance Indicators), by Sagar Paul
The presentation is a creative representation that simplifies the concept of KPIs and KRAs and their importance in an organisation.
Note: View the slide only in SLIDESHOW MODE!
Slides from a product management training workshop with our partners at the Department of the Interior's Office of Natural Resources Revenue, as part of our work together on revenuedata.doi.gov.
I’m not going to spend any time on Google Analytics.
How healthy is this business?
It’d be great to track metrics like these: (1) MRR, churn, LTV, acquisition cost; (2) virality, DAU, MAU; (3) average order value, repurchase rate; (4) funnels and conversions.
But you don’t have any data yet
Your data is in a constant state of decay
Your data is messy
Use metrics that measure your biggest problem. Ignore the rest.
Gateway Metrics
When picking metrics, always ask yourself: What’s my biggest constraint right now and which metric will tell me if I’m making progress?
You need to do the right things in the right order.
Gateway #1: Is your idea any good?
Your main constraint: Getting anyone to care about your idea.
Your main metric: Get someone to pay or use your product regularly.
Bad metrics for this gateway: 1 Asking people if they’ll pay 2 AdWords clicks 3 Beta or waiting list signups 4 Traffic
Gateway #2: Is your product good enough?
Your main constraint: Having a product that’s good enough to build a business on.
Your main metric: Ask 500 users the Product/Market Fit Question
What is the P/M Fit Question?
Your goal for the P/M Fit Question: At least 40% of users should say “Very disappointed.” *Sean Ellis and Hiten Shah get credit for this one.
How do you get to the first 500 users/customers? Hustle.
The P/M Fit Question isn’t perfect; verify with a retention metric.
Gateway #3: Can you grow?
Your main constraint: Acquiring customers consistently from at least one channel.
You have plenty of options to choose from: 1 Inbound (Google, Content, Social) 2 Paid (PPC, Affiliates) 3 Virality (Invites, Referrals)
Pick just one to start. Work on your channel for at least 3 months. Assume it’ll work and get the resources needed to execute.
Your main metrics: Your main business metric and acquisition funnel.
Main business metrics: 1 SaaS: Monthly Recurring Revenue 2 Ecommerce: Monthly Revenue 3 Consumer Tech: Monthly Active Users
Why not cost per acquisition or lifetime value? You have no idea how much it costs to acquire customers or how much they’ll spend (yet).
Gateway #4: Do you have a stable model?
Your main constraint: In order to keep scaling, you need a stable model for your business.
Your main metrics: Depends entirely on what business model you have.
The SaaS Model
The Ecommerce Model
The Consumer Tech Model
Find someone in your industry that knows the key benchmarks.
Finally, get serious with data.
If you have a sales team, pile data into your CRM.
If consumer tech, do everything in-house.
Google Analytics plus an internal database will take you far.
Start with constraints, hack together what you need to measure them.
How to get data you really need: 1 One team owns data quality. 2 Hire a data engineer. 3 Clean up and integrate your data. 4 Use customer analytics. 5 Build a Growth Team.
Every startup faces a few key roadblocks that prevent you from moving forward. Not only will you learn how to grow past each roadblock, you'll also see the same process that KISSmetrics uses to increase conversions, discover new channels of growth, and accelerate acquisition at every step. You don’t need any data or traffic either, you can do it as early as day 1.
Should UI/UX be gut feeling or data-driven? How do you stand out from tough competition by perfecting your owned assets?
A/B testing is a long grind, but it doesn't have to be tough! Demystify the four-step approach to optimization and what it can bring you.
Agile metrics can be used to the advantage or the detriment of teams and an organisation’s Agile success. This session looks at several of the core Agile metrics used to measure success to help you understand what success looks like, why the metric is desirable and what the metrics can tell us.
Understanding why we want these metrics is critical to capturing something of value, rather than just doing 'because'. What will leaders and decision makers do with these metrics? What value do they add?
Steve will also dive into the negative impacts of some of the Agile metrics we are sometimes forced to capture, such as how chasing velocity leads to gaming the system. He’ll look at bad metrics, including the seven deadly sins of Agile measurement, and how to avoid them in your enterprise.
Optimizing your marketing automation program for success, by GetResponse
How to optimize your marketing automation program for success – Kath Pay takes you through this entire process in this actionable webinar organized in partnership with GetResponse. Watch the webinar here – https://getresponse.tv/video/UlCaQ4LLCzM/optimizing-your-marketing-automation-program-for-success-kath-pay-webinar/
Delivered by Kath Pay, CEO of Holistic Email Marketing, at Figaro Digital's Email Marketing & CRM Seminar on Thursday 9 February 2017.
Aimed at marketing professionals looking to push the power of their email marketing, this session reveals how to leverage the testing ability of your emails to improve the performance of your other marketing acquisition and conversion channels. You will learn how to build a hypothesis into your emails, to drive the actions that provide the answers you're looking for. As well as how to use a push channel such as email to inform and improve pull channel performance. Plus, tactics to test for long-term results as well as short-term results.
Tom Willis will cover a range of tools, customer data and analytics that can be used to determine metrics and keep your marketing strategies in line with key business goals, regardless of your budget.
Epistemic Interaction - tuning interfaces to provide information for AI support, by Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Kubernetes & AI - Beauty and the Beast!?! @ KCD Istanbul 2024, by Tobias Schneck
As AI pushes into IT, I found myself wondering, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our beloved cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply them to our own infrastructure from an enterprise perspective. I will give an overview of infrastructure requirements and technologies that could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already got working for real.
JMeter webinar - integration with InfluxDB and Grafana, by RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
UiPath Test Automation using UiPath Test Suite series, part 4, by DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
DevOps and Testing slides at DASA Connect, by Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We finished with a lovely workshop in which participants tried to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
Smart TV Buyer Insights Survey 2024, by 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Search and Society: Reimagining Information Access for Radical Futures, by Bhaskar Mitra
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
"Impact of front-end architecture on development cost", by Viktor Turskyi (Fwdays)
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
PHP Frameworks: I want to break free (IPC Berlin 2024), by Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
GraphRAG is All You Need? LLM & Knowledge Graph, by Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
2. Why should you care?
Focus
• Targets the root cause of problems.
• Ditches irrelevant ideas.
Feasible
• Solves the real problem.
• No wasted effort.
Objectivity
• Eliminates personal bias about what causes what.
Replicable
• Results can be used as facts.
• Results are durable.
User-Centric
• Provides real value to users.
3. The Approach: A Step-by-Step Process (the Scientific Method)
1. Observations: Observing a trend or a problem. What questions do we want to answer?
2. Hypothesis: What is a hypothesis? How do we build one? Good vs. bad hypotheses.
3. Experiment: How do we test the hypothesis? What data do we look at?
4. Refine: Refine the hypothesis based on results.
5. Validate: How do we measure the effect? Success metrics!
FYI: The scientific method was pioneered by Ibn al-Haytham around 1,000 years ago.
8. Prioritization Methods
Relative Weighting
• Based on a business metric.
• Perfect for new features.
• Considers development effort.
RICE Analysis (Reach, Impact, Confidence, Effort)
• Based on usage frequency.
• Perfect for optimizing existing features.
• Considers development effort.
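The slide only names the four RICE factors; for reference, the standard RICE formula multiplies reach, impact, and confidence and divides by effort. A minimal sketch (the numbers below are illustrative, not from the deck):

```python
def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """Standard RICE prioritization score: (Reach * Impact * Confidence) / Effort."""
    return reach * impact * confidence / effort

# Illustrative numbers only: 500 users/quarter reached, medium impact (2),
# 80% confidence, 4 person-weeks of effort.
print(rice_score(reach=500, impact=2, confidence=0.8, effort=4))  # 200.0
```

Higher scores mean more estimated value per unit of effort, which is why RICE suits comparing optimizations of existing features, where reach and confidence can be grounded in usage data.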
9. Relative Weighting
Prioritization criteria (evaluation factors):
• Increase sales; maximize ROI
• Increase customer trust
• Establish a competitive advantage
• Improve productivity
Each item is scored on these criteria to produce a value score, which is then weighed against its cost of development.
10. Prioritized List

| Issue (Type and ID) | ROI | ICS | ECA | IP | Total Value | Value % | Est. Cost | Cost % | Priority |
|---------------------|-----|-----|-----|----|-------------|---------|-----------|--------|----------|
| Add FB sign up      | 1   | 6   | 1   | 3  | 11          | 8%      | 10        | 4%     | 2.00     |
| Feature 10          | 3   | 1   | 6   | 1  | 11          | 8%      | 20        | 8%     | 1.00     |
| Cash on Delivery    | 6   | 6   | 1   | 1  | 14          | 11%     | 20        | 8%     | 1.37     |
| Bug 3               | 1   | 8   | 1   | 1  | 11          | 8%      | 40        | 15%    | 0.54     |
| Feature 7           | 6   | 6   | 6   | 3  | 21          | 16%     | 55        | 21%    | 0.76     |
| Add Gmail sign up   | 3   | 6   | 6   | 6  | 21          | 16%     | 15        | 6%     | 2.66     |
| Pin Address on map  | 6   | 6   | 3   | 6  | 21          | 16%     | 35        | 13%    | 1.23     |
| Bug 5               | 3   | 3   | 3   | 3  | 12          | 9%      | 25        | 10%    | 0.90     |
| Feature 8           | 1   | 3   | 6   | 1  | 11          | 8%      | 40        | 15%    | 0.54     |

ROI, ICS, ECA and IP are the four evaluation criteria from slide 9. Priority = Value % / Cost %.
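The arithmetic behind the table can be sketched as follows; 133 and 260 are the sums of all nine items' value scores and cost estimates, and shares are rounded to whole percents as in the table:

```python
def priority(value_score: int, cost: int, total_value: int = 133, total_cost: int = 260) -> float:
    """Relative-weighting priority: an item's share of total value divided by
    its share of total cost, with shares rounded to whole percents as in the table."""
    value_pct = round(value_score / total_value * 100)
    cost_pct = round(cost / total_cost * 100)
    return round(value_pct / cost_pct, 2)

# "Add FB sign up" row: value 1+6+1+3 = 11, estimated cost 10
print(priority(11, 10))  # 2.0
# "Bug 5" row: value 12, cost 25
print(priority(12, 25))  # 0.9
```

A priority above 1 means an item delivers more than its share of value per unit of cost; that is why "Add Gmail sign up" (2.66) outranks "Feature 7" (0.76) despite identical value scores.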
13. The Approach: Step-by-Step Process (recap): Observations → Hypothesis → Experiment → Refine → Validate.
15. What is a hypothesis?
"If _____[I do this]_____, then _____[this]_____ will happen."
• If I add Gmail login, then the number of registered users will increase.
• If I include a Cash on Delivery payment method during checkout, then the number of purchases will increase.
• If I enable customers to pin their location on a map, then the number of purchases will increase.
16. Hypothesis Tips
• The question comes first: before you make a hypothesis, you have to clearly identify the question you are interested in studying.
• A hypothesis is a statement, not a question: your hypothesis is not the scientific question in your project. The hypothesis is an educated, testable prediction about what will happen.
• Make it clear: a good hypothesis is written in clear and simple language.
• Keep the variables in mind: a good hypothesis defines the variables in easy-to-measure terms, like who the participants are and what changes during the testing.
• Make sure your hypothesis is "testable": to prove or disprove your hypothesis, you need to be able to run an experiment and take measurements to see how two things are related.
• Don't bite off more than you can chew: make sure your hypothesis is a specific statement relating to a single experiment.
17. Good vs. Bad Hypotheses
A good hypothesis:
• Is testable
• Is simple
• Is written as a statement
• Establishes the participants and variables
• Predicts an effect
A bad hypothesis:
• Is not testable
• Is not simply explained
• Is written as a question
• Doesn't identify participants and variables
• Cannot be used to predict an effect
18. Good vs. Bad Hypotheses: Examples
Good:
• If I add Gmail login, then the number of registered users will increase.
• If I include a Cash on Delivery payment method during checkout, then the number of purchases will increase.
• If I enable customers to pin their location on a map, then the number of purchases will increase.
Bad:
• Would adding more login options increase the number of registered users? (a question, not a statement)
• Cash on Delivery payment is requested by users. (not a testable prediction)
• We have 10,000 calls from users to track their orders. (an observation, not a hypothesis)
19. The point is to prove or disprove a hypothesis
Disproving a hypothesis matters as much as proving it. Both lead to better conclusions about the problem at hand!
21. The Approach: Step-by-Step Process (recap): Observations → Hypothesis → Experiment → Refine → Validate.
23. Creating a Mathematical Model
• What questions do you want to answer? Will adding Gmail login improve sign-ups?
• What data do you want to look for? How many customers have signed up using a Gmail email?
• Where is the data available? The customers database.
• How do you use the available data? Find the % of customers who signed up with Gmail: Gmail customers / all customers = % of Gmail customers.
• How do you verify the data? Consider removing dummy created accounts.
24. Creating a Mathematical Model
• What questions do you want to answer? Will adding Gmail login improve sign-ups?
• What data do you want to look for? How many customers have signed up using a Gmail email? 150K.
• Where is the data available? The customers database.
• How do you use the available data? Gmail customers / all customers = 150K / 1.3M = 12%.
• How do you verify the data? Consider removing dummy created accounts.
25. The Approach: Step-by-Step Process (recap): Observations → Hypothesis → Experiment → Refine → Validate.
27. Creating a Mathematical Model (recap of slide 24): Will adding Gmail login improve sign-ups? 150K Gmail customers out of 1.3M total = 12%, after considering the removal of dummy created accounts.
28. We live in a dynamic world, so always consider things in a timely manner!
29. Creating a Mathematical Model (refined with a time window)
• What questions do you want to answer? Will adding Gmail login improve sign-ups?
• What data do you want to look for? How many customers have signed up using a Gmail email in the past 6 months? 120K.
• Where is the data available? The customers database.
• How do you use the available data? Gmail sign-ups / all sign-ups in the past 6 months = 120K / 600K = 25%.
• How do you verify the data? Consider removing dummy created accounts.
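The refined model can be sketched as a small function. The record fields (`email`, `signed_up`, `is_dummy`) are assumed names for illustration, not from the deck:

```python
from datetime import datetime, timedelta

def gmail_share(customers, now, window_days=183):
    """Share of sign-ups in the recent window that used a Gmail address,
    with dummy accounts removed (the verification step from the slides)."""
    cutoff = now - timedelta(days=window_days)  # roughly the past 6 months
    recent = [c for c in customers
              if c["signed_up"] >= cutoff and not c["is_dummy"]]
    gmail = [c for c in recent if c["email"].endswith("@gmail.com")]
    return len(gmail) / len(recent) if recent else 0.0
```

On the slide's numbers, windowing changes the answer materially: 120K Gmail sign-ups out of 600K in the past 6 months gives 25%, versus 12% (150K / 1.3M) over all time.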
30. The Approach: Step-by-Step Process (recap): Observations → Hypothesis → Experiment → Refine (based on test results) → Validate.
35. The Approach: Step-by-Step Process (recap): Observations → Hypothesis → Experiment → Refine → Validate. The remaining question: how do we measure the effect? Success metrics!
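The deck does not prescribe a specific test for the Validate step; one common choice, shown here as a sketch, is a two-proportion z-test comparing a conversion rate before and after a change:

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test: did the conversion rate change?"""
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (success_b / n_b - success_a / n_a) / se
    # two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative numbers only: sign-up conversion 12% before (120/1000)
# vs 16% after (160/1000) adding Gmail login.
z, p = two_proportion_z(120, 1000, 160, 1000)
```

With these illustrative numbers z is about 2.6 and p is below 0.05, so at the usual 5% level the increase would not be attributed to chance; a null result would likewise be evidence to refine or discard the hypothesis, which matters just as much.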