This document outlines the body of a 6-week online business analytics course covering topics like exploratory data analysis, big data, machine learning, and data problem-solving approaches. The course teaches students to understand business problems through interviewing and applying frameworks to hypotheses. It also covers collecting, analyzing, and visualizing data to identify patterns and insights, and using storytelling techniques to effectively present findings to stakeholders. The overall goal is to help students learn how to extract meaningful insights from data to help organizations make more informed strategic decisions.
Business Analytics vs. Data Science: Business Analytics is the statistical study of business data to gain insights and uses mostly structured data. Data science is the study of data using statistics, algorithms, and technology and uses both structured and unstructured data.
Barga, Roger. Predictive Analytics with Microsoft Azure Machine Learning (uploaded by maldonadojorge)
This document provides an overview of a book on data science and Microsoft Azure Machine Learning. It contains front matter materials such as information about the authors, acknowledgments, and an introduction.
The introduction previews that the book will provide an overview of data science and an in-depth view of Microsoft Azure Machine Learning. It will also provide practical guidance for solving real-world business problems such as customer modeling, churn analysis, and product recommendation. The book is aimed at budding data scientists, business analysts, and developers and will teach the reader about data science processes and Microsoft Azure Machine Learning.
Lesson 1 - Overview of Machine Learning and Data Analysis.pptx (cloudserviceuit)
This document provides an overview of machine learning and data analysis. It defines machine learning as a field of artificial intelligence that enables computers to learn from data without being explicitly programmed. The main types of machine learning are supervised, unsupervised, and reinforcement learning. Data analysis is the process of extracting meaningful insights from data through techniques like cleaning, exploring for patterns/trends, statistical analysis, and visualization. Machine learning automates many data analysis tasks and can be applied through techniques like classification, clustering, and regression. The relationship between machine learning and data analysis fuels discovery, with data analysis providing foundation and machine learning generating insights.
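The classification, clustering, and regression techniques named above can be sketched briefly. This is a minimal illustration using scikit-learn (assumed installed); the toy data and model choices are invented for the example.

```python
# A minimal sketch of the three techniques: classification and regression
# are supervised (labels/targets given), clustering is unsupervised.
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeClassifier

# Supervised learning: classification (labels are given).
X = [[1, 1], [2, 1], [8, 9], [9, 8]]
y = [0, 0, 1, 1]
clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[8, 8]]))          # expected: class 1

# Unsupervised learning: clustering (no labels; the model groups points).
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)

# Supervised learning: regression (continuous target).
reg = LinearRegression().fit([[1], [2], [3]], [2.0, 4.0, 6.0])
print(reg.predict([[4]]))             # roughly 8.0
```

All three follow the same fit/predict convention, which is what lets machine learning automate so many data analysis tasks.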
This document provides an overview of data analytics including:
- The basics of data analytics including analytics definitions and the need for data analytics due to increasing data volumes.
- Descriptions of different types of analytics including descriptive, diagnostic, predictive, and prescriptive analytics and their purposes.
- An overview of the data analytics lifecycle including phases such as data preparation, model planning, model building, and communication of results.
introduction, data mining, why data mining, application of data mining, steps of data mining, threat of data mining, solution of data mining, role of data mining, data warehouse, oltp & olap, data warehouse, data mining tools, latest research
A strategy for security data analytics - SIRACon 2016 (Jon Hawes)
A snag list for 'things that can go wrong' with big data analytics initiatives in security, and ways to think about the problem space to avoid that happening.
The document discusses the key steps in an AI project cycle:
1) Problem scoping involves understanding the problem, stakeholders, location, and reasons for solving it.
2) Data acquisition collects accurate and reliable structured or unstructured data from various sources.
3) Data exploration arranges and visualizes the data to understand trends and patterns using tools like charts and graphs.
4) Modelling creates algorithms and models by training them on large datasets to perform tasks intelligently.
5) Evaluation tests the project by comparing outputs to actual answers to identify areas for improvement.
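The five steps of the cycle above can be sketched as a skeleton pipeline. This is a hedged illustration in plain Python; the function names and toy dataset are invented, not part of any specific AI framework.

```python
# Steps 2-5 of the AI project cycle as a toy pipeline (step 1, problem
# scoping, happens before any code is written).

def acquire_data():
    # Step 2: data acquisition (here: a hard-coded toy dataset of (x, y) pairs).
    return [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.1)]

def explore(data):
    # Step 3: data exploration (here: the range of the target values).
    ys = [y for _, y in data]
    return min(ys), max(ys)

def fit_model(data):
    # Step 4: modelling; fit y ≈ a*x by least squares (no intercept).
    num = sum(x * y for x, y in data)
    den = sum(x * x for x, _ in data)
    return num / den

def evaluate(model, data):
    # Step 5: evaluation; compare outputs to actual answers (mean abs. error).
    return sum(abs(model * x - y) for x, y in data) / len(data)

data = acquire_data()
print(explore(data))                  # (2.1, 8.1)
a = fit_model(data)
print(round(a, 2), round(evaluate(a, data), 2))
```

A poor evaluation score would send the project back to earlier steps, which is why the cycle is drawn as a loop rather than a straight line.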
If you’re learning data science, you’re probably on the lookout for cool data science projects. Look no further! We have a wide variety of guided projects that’ll get you working with real data in real-world scenarios while also helping you learn and apply new data science skills.
The projects in the list below are also designed to help you get a job! Each project was designed by a data scientist on our content team, and they’re representative examples of the real projects working data analysts and data scientists do every day. They’re designed to guide you through the process while also challenging your skills, and they’re open-ended so that you can put your own twist on each project and use it for your data science portfolio.
You can complete each project right in your browser, or you can download the data set to your computer and work locally! If you work on our site, you’ll also be able to download your code at any time so that you can continue locally, or upload your project to GitHub.
The sky is the limit here and what you decide to look into further is completely up to you and your imagination!
1. Learning by Doing
Learning by doing refers to a theory of education expounded by the American philosopher John Dewey. It is a hands-on approach to learning: students must interact with their environment in order to adapt and learn. This way of learning sharpens your current skills and knowledge and also helps you gain new skills that can only be acquired by doing.
Driving a car is a perfect example. You can read as much as you like about the theory of driving and the rules of the road, and this matters: the better you understand the theory, the better you get at the practical part. But you will only learn to drive well by applying that knowledge on a real road, and some skills and knowledge can only be gained by actually driving.
Data science is the same as driving. It is very important to have solid theoretical knowledge and to refresh it regularly in order to improve while working on a project. However, you should always apply that theory to projects. Doing so deepens your understanding of the concepts, gives you a better view of how they work in real life, and shows others that you have strong theoretical knowledge and can put it into practice.
There are different types of guided projects. One of them is a guided project for
Guided projects offer several benefits:
They remove the barriers between you and doing projects.
They save you time thinking up the project and preparing the data.
They let you apply your theoretical knowledge without getting distracted by obstacles.
They give you practical tips that can save you effort and time in the future.
Data Science has become one of the most in-demand jobs of the 21st century. It has become a buzzword that almost everyone talks about these days. But what is Data Science? In this article, we will demystify Data Science and the role of a Data Scientist, and look at the tools required to master Data Science.
The Comprehensive Machine Learning Canvas is a tool for scoping machine learning projects and defining solutions.
The Comprehensive Machine Learning Canvas (CMLC) is a tool that helps teams scope machine learning projects and define solutions to business problems. It is based on the idea that machine learning is a creative process, and that the best way to approach it is to start with a hypothesis of how machine learning could help solve a particular business problem. The CMLC helps teams map out the problem, machine learning approach, and potential solutions.
This document provides an introduction to data science. It discusses the different types of data including traditional structured data, big unstructured data, and semi-structured data. It also summarizes the key differences between analysis and analytics, qualitative and quantitative analytics, business intelligence, machine learning, and traditional data science methods. Common data science tools and job positions are also outlined.
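The structured / semi-structured / unstructured distinction above can be made concrete with Python's standard library; all sample values below are invented.

```python
# The three data types, illustrated with the standard library only.
import csv
import io
import json

# Structured: tabular rows with a fixed schema (e.g., CSV from a database).
table = list(csv.DictReader(io.StringIO("id,revenue\n1,100\n2,250\n")))
print(table[0]["revenue"])            # "100"

# Semi-structured: self-describing but flexible shape (e.g., JSON).
record = json.loads('{"id": 1, "tags": ["vip", "eu"]}')
print(record["tags"])

# Unstructured: free text with no schema; any structure must be extracted.
review = "Great product, but shipping was slow."
print("slow" in review.lower())       # True
```

Most real pipelines mix all three, which is why the analysis/analytics distinction the document draws matters in practice.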
1) The document discusses a self-study approach to learning data science through project-based learning using various online resources.
2) It recommends breaking down projects into 5 steps: defining problems/solutions, data extraction/preprocessing, exploration/engineering, model implementation, and evaluation.
3) Each step requires different skillsets from domains like statistics, programming, SQL, visualization, mathematics, and business knowledge.
This document provides an overview of an introductory webinar on AI concepts and terminologies. The webinar agenda includes discussing concepts of AI learning algorithms, understanding the 5 spokes framework of AI, and a team exercise. Key concepts covered are the 7 stages of building intelligence from data, examples of common learning algorithms like neural networks, and the 5 components of the 5 spokes AI framework: sensing, perception, communication, decision making, and interaction. Real-world applications of each component are also examined, such as computer vision, natural language processing, time series analysis, and reinforcement learning.
The Analytics Stack Guidebook (Holistics) (uploaded by Truong Bomi)
Chapter 1: High-level Overview of an Analytics Setup
Chapter 2: Centralizing Data
Chapter 3: Data Modeling for Analytics
Chapter 4: Using Data
+++
Quoted from Huy, the book's author and co-founder & CTO of Holistics
+++
"How do you design a BI stack that fits your own company?"
Have you ever been assigned by your company to set up its BI/analytics stack, then gone online to google it and panicked because every article and every acquaintance recommends a different set of tools and technologies? ETL or ELT, Hadoop or BigQuery, Data Warehouse or Data Lake, ...
Then you wonder: what kind of analytics stack design fits your company's current needs? How do you start quickly while still being able to scale (without tearing everything down and rebuilding) when data demand grows?
Instead of ten people with ten opinions, you wish you had a map to orient yourself in this complex BI/analytics world. A map that shows you the different components of a BI system, how they fit together, and the trade-offs between the different approaches.
Well, after two grueling months, our team has drawn that map in the shape of a... book:
"The Analytics Setup Guidebook: How to build scalable analytics & BI stacks in modern cloud era."
The book is a crash course to help you become a "part-time data architect," giving you a clearer picture of today's complex analytics landscape.
It explains the high-level overview of an analytics system and how its components interact, and goes into enough detail on each component and its best practices.
The book is written for somewhat technical readers who have been handed responsibility for their company's analytics system. You might be a data analyst doing BI, a software engineer pulled in to help with data engineering, or simply a Product Manager wondering why your company's data workflow is so slow...
The book also includes more advanced sections, such as Data Modeling and BI evolution, suited to readers with long experience in BI.
Data science involves extracting meaningful insights from raw data through scientific methods and algorithms. It is an interdisciplinary field that focuses on analyzing large datasets using skills from computer science, mathematics, and statistics. Python is a commonly used programming language for data science due to its powerful libraries for tasks like data analysis, machine learning, and visualization. Key Python libraries include NumPy, Pandas, Matplotlib, Scikit-learn, and SciPy. The document then discusses tools, applications, and basic concepts in data science and Python.
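As a quick taste of the libraries named above (this assumes NumPy and pandas are installed; the sample figures are invented):

```python
import numpy as np
import pandas as pd

# NumPy: fast numeric arrays and vectorized math.
a = np.array([1.0, 2.0, 3.0])
print(a.mean())                       # 2.0

# Pandas: labeled, tabular data with SQL-like operations.
df = pd.DataFrame({"region": ["north", "south", "north"],
                   "sales": [10, 20, 30]})
print(df.groupby("region")["sales"].sum())

# Matplotlib would plot this DataFrame, and scikit-learn would fit models
# on it; both follow the same array/DataFrame conventions shown here.
```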
This document outlines the key steps and analyses involved in developing a business case as a business analyst. It includes sections on feasibility studies, stakeholder analysis, requirements gathering, prioritization, development planning, testing, and deployment. Methodologies covered include PEST analysis, SWOT analysis, Porter's Five Forces, gap analysis, MOSCOW prioritization, and the use of user stories and use cases. The role of the business analyst in justifying the business case and translating requirements between teams is also discussed.
This project is about "Big Data Analytics," and it provides a comprehensive overview of topics related to Data and Analytics and a short note on Cognitive Analytics, Sentiment Analytics, Data Visualization, Artificial intelligence & Data-Driven Decision Making along with examples and diagrams.
What is data analysis? How to process it: types and methods involved in data analysis (Data Analysis Ireland)
Data analysis is the process of cleaning, transforming, and processing raw data in order to extract useful and actionable information that can assist businesses in making better decisions.
Four Types of Business Analytics to Know (uploaded by JeanmarieColbert3)
FOUR TYPES OF BUSINESS ANALYTICS TO KNOW
BUSINESS ANALYTICS
by Anushka Mehta October 13, 2017
At different stages of business analytics, huge amounts of data are processed. Depending on the stage of the workflow and the requirements of the analysis, there are four main kinds of analytics: descriptive, diagnostic, predictive, and prescriptive. Together, these four answer everything a company needs to know, from what is going on in the company to which solutions to adopt to optimize its functions.
The four types of analytics are usually implemented in stages, and no one type is better than another. They are interrelated, and each offers a different insight. With data now important to so many diverse sectors, from manufacturing to energy grids, most companies rely on one or all of these types of analytics. With the right choice of analytical techniques, big data can deliver richer insights for companies.
Before diving deeper into each of these, let’s define the four types of analytics:
1) Descriptive Analytics: Describing or summarizing existing data using existing business intelligence tools to better understand what is going on or what has happened.
2) Diagnostic Analytics: Focuses on past performance to determine what happened and why. The result of the analysis is often an analytic dashboard.
3) Predictive Analytics: Emphasizes predicting the possible outcome using statistical models and machine learning techniques.
4) Prescriptive Analytics: A type of predictive analytics used to recommend one or more courses of action based on analysis of the data.
Let’s understand these in a bit more depth.
1. Descriptive Analytics
This can be termed the simplest form of analytics. The sheer size of big data is beyond human comprehension, so the first stage involves crunching the data into understandable chunks. The purpose of this type of analytics is simply to summarize the findings and understand what is going on.
Among frequently used terms, what people call advanced analytics or business intelligence is basically the use of descriptive statistics (arithmetic operations, mean, median, max, percentages, etc.) on existing data. It is said that 80% of business analytics mainly involves descriptions based on aggregations of past performance. This is an important step in making raw data understandable to investors, shareholders, and managers; it makes it easier to identify and address areas of strength and weakness, which helps in strategizing.
The two main techniques involved are data aggregation and data mining; this method is purely used to understand underlying behavior, not to make estimations. By mining historical data, companies can analyze consumer behaviors and engagement with their businesses, which can help in targeted marketing, service improvement, etc. The tools used in this phase are MS Excel, MATLAB ...
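The kind of aggregation described above can be sketched in a few lines of plain Python; the monthly figures are invented for illustration.

```python
# Descriptive analytics in miniature: aggregate past data into simple
# summaries (total, mean, best performer) with no prediction involved.
import statistics

monthly_sales = {"Jan": 120, "Feb": 95, "Mar": 150, "Apr": 130}

summary = {
    "total": sum(monthly_sales.values()),
    "mean": statistics.mean(monthly_sales.values()),
    "best_month": max(monthly_sales, key=monthly_sales.get),
}
print(summary)
```

This is exactly the shape of output a BI dashboard presents, just wrapped in charts and tables instead of a dictionary.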
The document discusses machine learning and data science concepts. It begins with an introduction to machine learning and the machine learning process. It then provides an overview of select machine learning algorithms and concepts like bias/variance, generalization, underfitting and overfitting. It also discusses ensemble methods. The document then shifts to discussing time series, functions for manipulating time series, and laying the foundation for time series prediction and forecasting. It provides examples of applying techniques like median filtering to smooth time series data. Overall, the document provides a high-level introduction and overview of key machine learning and time series concepts.
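The median filtering mentioned above can be sketched in plain Python: each point is replaced by the median of a sliding window, which suppresses isolated spikes. The window size and sample series are illustrative choices.

```python
# Median filtering for smoothing a time series. Unlike a moving average,
# the median ignores a single outlier inside the window entirely.
import statistics

def median_filter(series, window=3):
    half = window // 2
    out = []
    for i in range(len(series)):
        # Clamp the window at the series boundaries.
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(statistics.median(series[lo:hi]))
    return out

noisy = [1.0, 1.1, 9.0, 1.2, 1.3]     # one spike at index 2
print(median_filter(noisy))           # the spike is suppressed
```

Smoothing like this is a common preprocessing step before time series prediction and forecasting, since models otherwise chase the noise.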
The document provides an overview and introduction to "The Analytics Setup Guidebook". It discusses how the guidebook aims to give readers a high-level framework for building a modern analytics setup by explaining the components and best practices for consolidating, transforming, modeling, and using data. The guidebook is intended for those who need guidance in setting up their first analytics stack, such as junior data analysts, product managers, or engineers tasked with building a data stack from scratch.
Fantastic Problems and Where to Find Them: Daryl Weir (Futurice)
Machine learning and big data have been buzz words for years now, but how do you know you have a machine learning problem on your hands? These slides, taken from a Futurice Beer & Tech talk, describe the types of problems ML methods are well suited to solve, with examples from a wide variety of industries. The deck also tells you where to get started if you want to try solving one of these problems yourself.
Machine Learning with Azure and Databricks Virtual Workshop (CCG)
Join CCG and Microsoft for a hands-on demonstration of Azure’s machine learning capabilities. During the workshop, we will:
- Hold a Machine Learning 101 session to explain what machine learning is and how it fits in the analytics landscape
- Demonstrate Azure Databricks’ capabilities for building custom machine learning models
- Take a tour of the Azure Machine Learning’s capabilities for MLOps, Automated Machine Learning, and code-free Machine Learning
By the end of the workshop, you’ll have the tools you need to begin your own journey to AI.
The document discusses decision making and problem solving. It covers defining problems, gathering relevant information to analyze problems, and generating and selecting alternatives. The problem solving process involves defining the problem, collecting information and measures, analyzing the problem, generating alternatives, selecting alternatives, and deciding on and implementing a solution. Cause and effect diagrams like fishbone diagrams can be used to identify and analyze the root causes of problems. Collecting the right information through questions is important for fully understanding problems before attempting to solve them.
BIG DATA AND MACHINE LEARNING
Big Data is a collection of data that is huge in volume and keeps growing exponentially with time. Its size and complexity are so great that no traditional data management tool can store or process it efficiently.
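One standard answer to data that is too large for traditional tools is to process it in chunks or as a stream rather than loading it all at once. This sketch streams a toy in-memory CSV and keeps only a running aggregate; the data is invented.

```python
# Streaming aggregation: the full dataset never sits in memory at once,
# only one row plus the running totals. Real systems (Spark, BigQuery,
# pandas' chunked readers) scale the same idea across machines.
import csv
import io

big_csv = io.StringIO("value\n" + "\n".join(str(i) for i in range(1000)))

total = count = 0
for row in csv.DictReader(big_csv):
    total += int(row["value"])
    count += 1
print(total / count)                  # 499.5
```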
Discussion - Weeks 1–2: Shared Practice—Role of Business Information Systems.docx (uploaded by cuddietheresa)
Discussion - Weeks 1–2
Shared Practice—Role of Business Information Systems
Note: This Discussion has slightly different due dates than what is typical for this program. Be mindful of this as you post and respond in the Discussion. Your post is due on Day 7 and your Response is due on Day 3 of Week 2.
As a manager, it is critical for you to understand the types of business information systems available to support business operations, management, and strategy. As of 2013, these include, but are certainly not limited to the following:
· Supply Chain Management (SCM)
· Accounting Information System
· Customer Relationship Management (CRM)
· Decision Support Systems (DSS)
· Enterprise Resource Planning (ERP)
· Human Resource Management
These types of systems support critical business functions and operations that every organization must manage. The effective manager understands the purpose of these types of systems and how they can be best used to manage the organization's data and information.
In this Discussion, you will share your knowledge and findings related to business information systems and the role they play in your organization. You will also consider your colleagues' experiences to explore additional ways business information systems might be applied in your colleagues' organizations, or an organization with which you are familiar.
By Day 7
· Describe two or three of the more important technologies or business information systems used in your organization, or in one with which you are familiar.
· Discuss two examples of how these business information systems are affecting the organization you selected. Be sure to discuss how individual behaviors and organizational or individual processes are changing and what you can learn from the issues encountered.
· Summarize what you have learned about the importance of business information systems and why managers need to understand how systems can be used to the organization's advantage.
You should find and use at least one additional current article from a credible resource, either from the Walden Library or the Internet. Please be specific, and remember to use citations and references as necessary.
General Guidance: Your initial Discussion post, due by Day 7, will typically be 3–4 paragraphs in length as a general expectation/estimate. Refer to the rubric for the Week 1 Discussion for grading elements and criteria. Your Instructor will use the rubric to assess your work.
Week 2
By Day 3
In your Week 1 Discussion you described how business information systems have been applied in an organization with which you are familiar. Read through your colleagues' posts and by Day 3 (Week 2), respond to two of your colleagues in one or more of the following ways:
· Examine how the business information systems described by your colleague could be or are being used by your organization. Offer additional ways either organization might take advantage of these systems.
· Examine how the b ...
This document provides an overview of a 5-week online course on digital marketing. The course covers topics such as marketing communications, developing a marketing strategy, digital marketing channels/metrics, social media marketing, search engine optimization, and search engine marketing. It discusses key concepts for each topic at a high level, including the POEM model for digital channels, stages of the marketing funnel, popular social media platforms, SEO best practices, and audience targeting for SEM. The document is intended to introduce students to the main components and learnings that will be covered throughout the course.
The document discusses auditing, auditors, objectives and importance of auditing. It defines auditing as an independent examination of data, statements, records and operations of an enterprise. An auditor evaluates the validity and reliability of a company's financial statements. The main objective of auditing is to verify accounts, examine reliability of financial statements and detect frauds and errors. Auditing helps in detection and prevention of errors and frauds, verification of books and provides an independent opinion. It also ensures compliance and strengthens internal controls. The document then discusses types of audits, a profile of an audit firm and preparation of an audit report.
The document discusses the key steps in an AI project cycle:
1) Problem scoping involves understanding the problem, stakeholders, location, and reasons for solving it.
2) Data acquisition collects accurate and reliable structured or unstructured data from various sources.
3) Data exploration arranges and visualizes the data to understand trends and patterns using tools like charts and graphs.
4) Modelling creates algorithms and models by training them on large datasets to perform tasks intelligently.
5) Evaluation tests the project by comparing outputs to actual answers to identify areas for improvement.
If you’re learning data science, you’re probably on the lookout for cool data science projects. Look no further! We have a wide variety of guided projects that’ll get you working with real data in real-world scenarios while also helping you learn and apply new data science skills.
The projects in the list below are also designed to help you get a job! Each project was designed by a data scientist on our content team, and they’re representative examples of the real projects working data analysts and data scientists do every day. They’re designed to guide you through the process while also challenging your skills, and they’re open-ended so that you can put your own twist on each project and use it for your data science portfolio.
You can complete each project right in your browser, or you can download the data set to your computer and work locally! If you work on our site, you’ll also be able to download your code at any time so that you can continue locally, or upload your project to GitHub.
The sky is the limit here and what you decide to look into further is completely up to you and your imagination!
1. Learning by Doing
Learning by doing refers to a theory of education expounded by American philosopher John Dewey. It is a hands-on approach to learning, meaning students must interact with their environment in order to adapt and learn. This way of learning sharpen your current skills and knowledge and also helps in gaining new skills that could only be acquired by doing.
Car driving is a perfect example of this, you can read as much as you would like about the theory of driving and the rules, and this is very important, and the more you understand the theory the better you get in the practical part. But you will only be able to drive better by applying this knowledge on the real road. In addition to that, there are some skills and knowledge that will be only gained by actually driving.
BODY OF THE COURSE
COURSE NAME – BUSINESS ANALYTICS
ONLINE PLATFORM – UPGRAD
NUMBER OF WEEKS – 6 WEEKS
ABOUT – 01. INTRODUCTION
02. EXPLORATORY DATA ANALYSIS
03. BIG DATA
04. MACHINE LEARNING
05. DEEP LEARNING, NEURAL NETWORKS AND
NATURAL LANGUAGE PROCESSING
06. APPROACHES TO SOLVING PROBLEMS
07. LEARNING OUTCOMES
08. CONCLUSION
LEARNING EXPERIENCE
INTRODUCTION
Business analytics – It is the process by which businesses use statistical
methods and technologies to analyse historical data in order to gain new
insights and improve strategic decision-making.
Data analytics v/s data science
- They are two sides of the same coin. Analytics is the discovery,
interpretation and communication of meaningful patterns in data, whereas
data science also means extracting insights from data and helping to drive
data-driven decisions. Earlier, data sets were not so big, and a knowledge
of statistics was good enough to analyse them; that was the era of
analytics. Over time, data sets have become more complex, and managing
them requires specialised engineering skill sets on top of a strong hold
on statistics. This, therefore, is the era of data science.
What kind of data is captured?
First is transaction history: user ID, date, time, cart value, delivery
date, return tag and more. Next are item meta tags: item ID, name,
category, MRP and selling price, procurement date, etc. The third kind of
data is supply chain history, which captures order ID, item ID, inventory
location, delivery date and time, PIN code and delivery executive. Fourth
are user meta tags, which capture user ID, email ID, gender, date of
birth, etc. Lastly, vendor details: vendor ID, location, rating, category
and much more. These are some of the data that e-commerce sites capture.
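As a rough sketch, the data sets listed above can be pictured as small tables joined on their shared IDs. All column names and values below are made up for illustration; they are not the actual schema of any e-commerce site.

```python
import pandas as pd

# Illustrative (invented) samples of two of the data sets described above.
transactions = pd.DataFrame({
    "user_id":    [101, 102, 101],
    "item_id":    [501, 502, 502],
    "cart_value": [799.0, 1499.0, 1499.0],
    "return_tag": [False, True, False],
})

item_meta = pd.DataFrame({
    "item_id":  [501, 502],
    "name":     ["kettle", "headphones"],
    "category": ["kitchen", "electronics"],
    "mrp":      [999.0, 1999.0],
})

# Joining transaction history with item meta tags on the shared item ID
# gives an analysis-ready view of what was bought, in which category.
orders = transactions.merge(item_meta, on="item_id", how="left")
category_revenue = orders.groupby("category")["cart_value"].sum()
```

The same pattern extends to the supply chain, user and vendor tables, each joined in on its own ID column.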
EXPLORATORY DATA ANALYSIS (EDA)
- It is one of the most important steps in any kind of data analysis. It
refers to the critical process of performing initial investigations on
data so as to discover patterns, spot anomalies, test hypotheses and
check assumptions with the help of summary statistics and graphical
representations.
- Once the data has been explored and some findings made, it is imperative
to be able to present them in a format that the senior management can
understand. This is where reporting comes into the picture. To do further
analysis, cleaning out unnecessary data is a must. Data cleaning and
feature engineering both fall under the broad category of data
preparation. Once the data is prepared, univariate analysis can be
performed on it.
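A minimal sketch of what these first EDA and data-preparation steps look like in practice, using pandas on a made-up sample of cart values (the numbers are invented for illustration):

```python
import pandas as pd

# A tiny EDA pass: summary statistics first, then a check for anomalies
# (an implausible negative cart value) that data cleaning should remove
# before any further analysis.
cart_values = pd.Series([799, 1499, 650, 1200, -50, 980, 15000])

summary = cart_values.describe()          # count, mean, std, min, quartiles, max
anomalies = cart_values[cart_values < 0]  # spot anomalies: negative amounts
cleaned = cart_values[cart_values >= 0]   # data cleaning / preparation
```

In a real analysis, the same series would also be plotted (for example, as a histogram) to inspect its distribution graphically before univariate analysis.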
BIG DATA
- For example, the Amazons and Flipkarts of the world have an extremely
huge amount of data - so huge that it becomes difficult to analyse it on
a single computer, and a whole different infrastructure is needed to deal
with it. This huge amount of data is termed 'big data', and analysing it
is termed big data analytics. It is characterised by the 3 Vs - Volume,
Velocity and Variety. Volume refers to the size of the data, velocity
refers to the rate at which the data is being received, and variety
refers to the different types of data that we may get - images, text,
numbers, speech, videos, etc.
- Data Architecture: Data architecture is a set of rules, policies,
standards and models that govern and define the type of data collected
and how it is used, stored, managed and integrated within an organization
and its database systems.
- Parallel Computing: Parallel computing is a type of computing in which
many calculations or the execution of processes are carried out
simultaneously. Large problems can often be divided into smaller ones,
which can then be solved at the same time.
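The "divide a large problem into smaller ones" idea can be sketched in a few lines. The example below uses a thread pool on a single machine purely to illustrate the split-process-combine pattern; real big data systems distribute the chunks across many processes or machines.

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(data, n_chunks):
    """Split a list into roughly equal chunks."""
    size = -(-len(data) // n_chunks)  # ceiling division
    return [data[i:i + size] for i in range(0, len(data), size)]

def parallel_sum(data, workers=4):
    """Sum the chunks concurrently, then combine the partial results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partial_sums = list(pool.map(sum, chunked(data, workers)))
    return sum(partial_sums)

total = parallel_sum(list(range(1_000_000)))
```

The combine step here is trivial (adding partial sums); in distributed frameworks the same map-then-combine shape applies, only across a cluster.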
MACHINE LEARNING
There are two types of machine learning. The first is supervised
learning: a type of machine learning algorithm in which a system is
taught to classify input into specific, known classes. Classification is
one such technique, which assigns data points to one of several possible
classes. The second is unsupervised learning: a class of machine learning
algorithms designed to identify groupings in data without knowing in
advance what the groups will be.
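A toy sketch of the two learning types, in pure Python so the mechanics stay visible. The "spender" classes, the tiny data set, and the nearest-centroid and 1-D k-means methods are illustrative choices, not part of the course material.

```python
# --- Supervised learning: nearest-centroid classification ---
# Training points are labelled in advance with known classes.
train = {
    "low_spender":  [10.0, 12.0, 11.0],    # monthly cart values
    "high_spender": [90.0, 95.0, 100.0],
}
centroids = {label: sum(xs) / len(xs) for label, xs in train.items()}

def classify(x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda label: abs(x - centroids[label]))

# --- Unsupervised learning: 1-D k-means clustering (k = 2) ---
# No labels are given; the algorithm discovers the groupings itself.
def kmeans_1d(points, iters=10):
    c1, c2 = min(points), max(points)      # initial centre guesses
    for _ in range(iters):
        a = [p for p in points if abs(p - c1) <= abs(p - c2)]
        b = [p for p in points if abs(p - c1) > abs(p - c2)]
        c1, c2 = sum(a) / len(a), sum(b) / len(b)
    return sorted([c1, c2])

clusters = kmeans_1d([10.0, 12.0, 11.0, 90.0, 95.0, 100.0])
```

Note the contrast: `classify` needs the labelled `train` data, while `kmeans_1d` finds the two groups from the raw numbers alone.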
DEEP LEARNING, NEURAL NETWORKS AND NATURAL
LANGUAGE PROCESSING
- Deep Learning: Typically, a multi-level algorithm that gradually
identifies things at higher levels of abstraction. For example, the first
level may identify certain lines, the next level identifies combinations
of lines as shapes, and the next level identifies combinations of shapes
as specific objects. As we might guess from this example, deep learning
is popular for image classification.
- Neural Networks: A robust function that takes an arbitrary set of inputs
and fits it to an arbitrary set of outputs. In practice, neural networks
are used in deep learning research to match images to labels, among other
tasks.
- Natural Language Processing: A branch of computer science for parsing
the text of spoken languages (for example, English or Mandarin) to
convert it into structured data that can be used to drive program logic.
- Artificial Intelligence is essentially teaching a machine to think like
a human, which is surprisingly difficult. Suppose we are watching a
cricket match: we can look at the eyes of the batsman and know what shot
he will play. Now, if we can train a machine to predict the same, we can
imagine what a big breakthrough that would be. That is Artificial
Intelligence.
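The "levels of abstraction" idea can be sketched as a tiny two-layer forward pass, where each layer turns its inputs into a slightly more abstract representation. The weights below are arbitrary illustrative numbers, not a trained model.

```python
import math

def sigmoid(x):
    """Squash a weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One layer: weighted sums of the inputs, passed through sigmoid."""
    return [
        sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
        for ws, b in zip(weights, biases)
    ]

# Level 1 turns two raw inputs into three intermediate features
# (the "lines"); level 2 combines those features into one output
# (the "shape"). Training would adjust these weights; here they are fixed.
x = [0.5, -1.0]
hidden = layer(x, weights=[[1.0, -1.0], [0.5, 0.5], [-1.0, 2.0]],
               biases=[0.0, 0.1, -0.2])
output = layer(hidden, weights=[[1.0, -2.0, 0.5]], biases=[0.0])[0]
```

Deep networks simply stack many more such layers, which is what lets them build up from lines to shapes to whole objects.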
APPROACHES TO SOLVING PROBLEMS
1. UNDERSTANDING THE BUSINESS PROBLEM
- To start any analysis, we should first understand the problems that the
business is facing. After doing plenty of self-research, we need to
understand the problem from the ones who are facing it. Here comes the
part that we term 'interviewing'. To understand a problem completely, we
will always need to interact with multiple people in the company,
interviewing them to gather information. Different job roles will require
interacting with different sets of individuals, but the task of
interviewing remains the same across all of them. Interviewing is an
important segment, and while interviewing, the following things should be
kept in mind: turn off all distractions (mobiles, laptops, etc.) around
us; use pen and paper to prepare notes; be patient and don't be anxious
to reply; pause, think and then ask; and play back our understanding to
the interviewee for their validation.
Frameworks - To overcome this issue, people have developed specific
patterns of asking questions over the years, which we call 'frameworks'.
There are multiple frameworks available at our disposal, and we need to
pick the one that is most suitable for our case. Some important
frameworks are the 5 WHYs, the 5 HOWs, 'So what?' and the 5 Ws. All of
these frameworks are useful for understanding the context of the problem
and are helpful in identifying its root cause. We cannot cover all
domains using these frameworks, but they give a sense of the problem.
- The SPIN framework (Situation, Problem, Implication, Need-payoff
questions) starts by asking about the current situation and helps us to
visualise the entire journey, from when the problem arises to what will
happen when the problem is solved. It is an excellent approach to follow,
as it helps the client (internal or external) realise the extent of both
the problem and the solution.
2. FORMULATING HYPOTHESIS
After understanding the depth of the problem, we should ask questions in an
interview to get insights into the issues the firm is facing. When we
understand the problem, we need to explore the reasons that may be behind it.
If, after the interviewing process, we think that we have identified the
problem, we are mistaken: it is just one possible reason for the problem that
the company or client is facing, and there can be multiple reasons. Even if we
believe we have found the correct reason, we first need to verify whether
there is any data that supports it before employing resources to solve the
issue. This test is required because if a different reason lay behind the
problem, all the effort would go in vain. Therefore, we need to realise two
things:
1. When we are exploring possible reasons for any problem, we need to cover
all the aspects of it. For this, we will again come across various frameworks that
can be applied to follow a structured approach.
2. Also, a possible reason that we discover is what we call a hypothesis. A
hypothesis is a possible explanation prepared on the basis of limited
evidence, which needs to be validated by further investigation.
BUSINESS MODEL CANVAS
The Business Model Canvas is a strategic management template for developing
new business models or documenting existing ones. It helps you cover all the
aspects of a business on a single sheet of paper in a structured format. It is
distributed under a Creative Commons licence by Strategyzer and can be used
without restriction for modelling businesses. The canvas adapts to any
business and covers all its domains in a simple and efficient manner.
FRAMEWORKS
After the interviews, we will have possible reasons or causes for the problem
that the company or the client is facing. The key point is that each is still
only a "possible" cause; therefore, we use the term 'hypothesis'. Now, we
will learn the process of formulating hypotheses using multiple frameworks.
Some of these frameworks are:
1. Issue Tree Framework - The issue tree is one of the most effective
methods for approaching a problem. It works by disintegrating the problem
into sub-components: the big, complex problem is continuously decomposed into
simpler issues, until we end up with a set of hypotheses.
2. Specialised frameworks - The issue tree framework can be combined with
other frameworks to provide a complete view of the problem. Multiple
frameworks are available depending on the domain of the problem we want to
solve. For example, if the company is trying to create awareness in the
market, it is a marketing problem; if the company wants to optimise its
manufacturing process, it is an operations problem. These cases are different
in nature and should be handled differently. Among the specialised
frameworks, some are very effective and help us approach the problem in a
structured manner:
o Business segmentation - This framework is useful when the company or
client operates multiple businesses. As in the issue tree, you will be
required to break the entire company into the respective businesses in
which it operates. For example, e-commerce giants like Amazon, Flipkart,
etc. provide a wide range of products such as clothing, electronics, etc.
If the company is facing a problem, you will analyse these segments
separately, because the characteristics of each business can differ from
one another.
o Profitability Analysis - This is also an extension of the issue tree
framework. When trying to break a branch into further sub-components,
one way to segment is by the profitability of the products or services.
Here, the main aim is to analyse the high-yield products before the
others.
o SWOT Analysis - The SWOT analysis helps you segment the company's
processes based on its capabilities and points out its critical areas of
concern.
o The Balanced Scorecard - The idea behind this framework is that
learning and growth help you handle internal processes better. Better
operations reduce process costs and improve customer experience, and a
better customer experience drives revenues upwards; hence, the financial
aspect of the company grows. You can analyse these four aspects of the
company and formulate hypotheses around them, to be tested in the later
phases.
o 4Ps Framework - The 4Ps framework, or marketing mix model, is a very
powerful tool for checking the marketing strategy of a company. It focuses
on the following Ps: Product, Price, Promotion and Place, with the overall
purpose of solving the customer's problem.
o 5C's Framework - The 5C framework is very useful for understanding both
the internal and external environments in which the firm operates. It
helps you identify what is helping the company succeed and which factors
are restricting it from achieving what is expected.
These frameworks should help us reach the final hypotheses at the end of the
interviewing process.
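The issue tree described above can be sketched in code as a nested structure whose leaves are candidate hypotheses. This is a minimal illustration; the problem and branches below are hypothetical examples, not from the course:

```python
# Sketch: an issue tree as a nested dict; leaf strings are candidate hypotheses.
# The problem statement and branches here are made-up illustrations.
issue_tree = {
    "Profits are declining": {
        "Revenue is falling": {
            "Fewer customers": "Hypothesis: customer churn has increased",
            "Lower spend per customer": "Hypothesis: discounting has eroded basket size",
        },
        "Costs are rising": {
            "Higher input costs": "Hypothesis: supplier prices went up",
        },
    }
}

def leaf_hypotheses(node):
    """Walk the tree and collect every leaf; each leaf is one hypothesis."""
    if isinstance(node, str):
        return [node]
    hypotheses = []
    for child in node.values():
        hypotheses.extend(leaf_hypotheses(child))
    return hypotheses

print(leaf_hypotheses(issue_tree))  # three hypotheses, one per leaf
```

Decomposing the problem this way makes it easy to check that no branch of the problem has been left without a testable hypothesis.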
3. COLLECT DATA
The next step after hypothesis formulation is to collect data to validate the
hypotheses. Once a hypothesis is validated, we can start working on a
solution around that root cause. Collect data relating to the issue: the data
should cover a period of 3 to 6 months on the reported problem as well as on
linked issues. This step is essential for validating the hypothesis. Data can
be gathered by reviewing the business plan, interviewing the data expert,
analysing forms and reports, etc.
There are various types of quality issues when it comes to data, which is why
data cleaning is one of the most time-consuming steps in data analysis. In
real-world scenarios, the data you need to analyse often comes from third
parties, clients, etc., and the data collection and entry methods often
introduce errors, which makes cleaning the data crucial.
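The kinds of quality issues mentioned above (duplicates, missing values, inconsistent entry) can be handled with a few pandas operations. This is a minimal sketch on made-up data, not a complete cleaning pipeline:

```python
import pandas as pd

# Hypothetical raw data with typical quality issues:
# inconsistent text casing, a duplicate row, and missing values.
raw = pd.DataFrame({
    "region": ["North", "north", "South", "South", None],
    "sales":  [100, 100, 250, None, 300],
})

clean = (
    raw.assign(region=lambda d: d["region"].str.title())  # normalise casing
       .drop_duplicates()                                 # remove exact duplicate rows
       .dropna()                                          # drop rows with missing values
)
print(clean)  # two fully clean rows remain
```

Note the order matters: normalising the casing first lets `drop_duplicates()` catch rows that differ only in capitalisation.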
4. ANALYSE DATA
Once we have obtained the data, the next step is to analyse it to observe any
patterns that validate or refute the hypotheses made in the earlier steps. In
the business problem-solving procedure, this part comes immediately after the
collection of data.
o PATTERNS OF INSIGHTS
Irrespective of the business problem any industry is trying to solve,
the insights generated to solve it share some underlying common
patterns.
These patterns can be classified into five categories:
• Unknown Result: when the result was previously unknown and is significant.
• Surprising Extreme: when the result is the highest or the lowest and was
not expected.
• Surprising Comparison: when two values are compared and the resulting
inference is surprising.
• Significant Outliers: when a value is unusually large or unusually small.
• Abnormal Distribution: when a variable shows unusual trends.
Hundreds of results may be generated from the data, but a result needs to
satisfy two criteria to count as an insight: it should be interesting, and it
should be useful.
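Of the patterns above, "significant outliers" is the easiest to illustrate in code. A common way to flag them is the interquartile-range (IQR) rule; this is a minimal sketch on made-up sales figures, not a method prescribed by the course:

```python
# Sketch: flagging "significant outliers" with the 1.5 * IQR rule.
def iqr_outliers(values):
    """Return values lying more than 1.5 * IQR outside the middle 50%."""
    s = sorted(values)
    n = len(s)
    q1 = s[n // 4]          # simple index-based quartiles, fine for a sketch
    q3 = s[(3 * n) // 4]
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < low or v > high]

# Hypothetical monthly sales; the last value looks abnormally large.
monthly_sales = [98, 102, 101, 99, 100, 103, 97, 250]
print(iqr_outliers(monthly_sales))  # → [250]
```

A flagged value like this is only a candidate insight: it still has to pass the "interesting and useful" test before it is worth reporting.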
o DOCUMENTING INSIGHTS
Once the insights are generated, it is crucial that we communicate the
results effectively to our audience. One of the most prevalent ways in
which consultants and analysts communicate insights is the PYRAMID
PRINCIPLE, which conveys the message to the audience quickly and
efficiently. This principle works well for two reasons: it is concise, and
it saves time. The pyramid principle can be utilised in a variety of
formats such as slides, e-mails, etc. Therefore, we must practise the
skill of communicating our insights using this principle.
5. PRESENT FINDINGS
Storytelling is important because research has shown that it is more
persuasive, and people tend to remember stories far better than bare
statistics. The first element of storytelling to learn is effective
visualisation. Visualisation is important because it makes the available
insights more digestible and easier to interpret. Visualisation is possible
for two types of variables: quantitative and qualitative. For quantitative
variables, scatter plots, line charts and histograms are used; for
qualitative variables, pie charts, bar charts and stacked bar charts are used
to analyse the data.
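The quantitative/qualitative split above can be shown with one chart of each kind. This is a minimal matplotlib sketch on made-up figures (the months, revenues, and segment shares are hypothetical):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display window needed
import matplotlib.pyplot as plt

# Hypothetical data: a quantitative series and a qualitative breakdown.
months = ["Jan", "Feb", "Mar", "Apr"]
revenue = [120, 135, 128, 150]                                 # quantitative
segment_share = {"Clothing": 45, "Electronics": 35, "Grocery": 20}  # qualitative

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.plot(months, revenue, marker="o")      # line chart for a quantitative trend
ax1.set_title("Revenue trend (quantitative)")
ax2.bar(segment_share.keys(), segment_share.values())  # bar chart for categories
ax2.set_title("Segment share (qualitative)")
fig.tight_layout()
fig.savefig("charts.png")
```

The design choice mirrors the text: a line chart suits an ordered numeric series, while a bar chart suits unordered categories.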
o VISUAL DESIGN PRINCIPLES - The next important aspect of storytelling is
the set of visual design principles. These rest on two key concepts:
1. the trade-off between accuracy and precision, and 2. drawing attention
with text and visuals.
o STORYBOARDING - Storyboarding uses the pyramid principle to lay out
the main points and supporting arguments, and adds nuances such as
placing information in the right place, removing superfluous
information and visually linking related items. It helps to assimilate
information, identify gaps in the analysis and avoid redundant work.
LEARNING OUTCOMES
This course helped me learn and understand how to analyse basic data.
That skill will ultimately help to spot new business opportunities, cut
costs, or identify inefficient processes that need reengineering.
The course gave us a live case study and a demonstration class for
understanding different types of data and extracting the root cause of a
business problem by interviewing, searching, or collecting data from
clients. When we are interviewing to solve a problem, we come across
people with different traits, and the same approach will not work for all
of them. There are four types of interviewee - the Old Hand, the Weasel,
the Stone Face, and the Know-It-All. Tailoring the conversation to their
type can turn them into a good source of information.
We then used different types of frameworks to formulate hypotheses and
took those hypotheses into consideration to collect the appropriate
data.
Collecting data from the right source is important. Once a hypothesis is
validated, we can start working on a solution around that root cause.
After the data is collected, cleaning it is an important procedure. The
next procedure is to analyse the data, and EXCEL is a good platform for
this.
This course taught us the different types of formulas, graphs and
diagrams that can be used to analyse data. They help us understand every
corner and aspect of the data. One important tool is CONDITIONAL
FORMATTING, which can be used to understand the data more efficiently.
Another important tool is the PIVOT TABLE, which summarises the data in
table form and brings out different kinds of information according to our
needs.
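The same pivot-table summary Excel produces can be reproduced with pandas. This is a minimal sketch on hypothetical transaction data, shown here only to illustrate what a pivot table does:

```python
import pandas as pd

# Hypothetical transactions, summarised the way an Excel pivot table would be.
sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "North"],
    "product": ["A", "B", "A", "B", "A"],
    "amount":  [100, 150, 200, 120, 80],
})

# Rows = region, columns = product, cells = total amount.
pivot = pd.pivot_table(sales, values="amount", index="region",
                       columns="product", aggfunc="sum", fill_value=0)
print(pivot)
# product    A    B
# region
# North    180  150
# South    200  120
```

As in Excel, changing `index`, `columns`, or `aggfunc` re-slices the same data into a different summary without touching the underlying table.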
Lastly, they taught us how to present the findings: we should first
choose the top insights, weave them into a story, validate the outcome of
the message, and finally bring out the impact.
CONCLUSION
1. This course was an opportunity for me to understand BUSINESS
ANALYTICS.
2. It gave me an understanding of the process of analysing data.
3. It is a good platform for further studies.
4. It brings out both the positive and negative aspects of businesses.
5. This course helps us know which skills and knowledge we have to
improve in the time to come.
- Lastly, I would like to thank all the respected teachers and the
commerce department for giving me this opportunity; it was a great
learning experience for me while taking this course.