This event took place on 16 September 2020 and was arranged by EMK Center (Makerlab). The title was 'Elementary Data Analysis with MS Excel', where very basic data analysis with MS Excel was discussed.
On Day 5, hypothesis testing, statistics, regression analysis, t-tests, z-tests, p-values, ANOVA, Goal Seek, Pivot Charts, dashboards, Slicers, Solver, the Data Analysis ToolPak, and related topics were discussed.
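Goal Seek, mentioned above, finds the input value that makes a formula produce a target output. A minimal sketch of the same idea in Python, using bisection; the formula, bounds, and tolerance here are illustrative assumptions, not taken from the slides:

```python
def goal_seek(f, target, lo, hi, tol=1e-9):
    """Find x in [lo, hi] with f(x) ~= target, assuming f is monotonic there."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Example: which x makes x**2 equal 9?  A Goal Seek-style answer: 3.
x = goal_seek(lambda v: v * v, 9, 0, 10)
print(round(x, 6))  # -> 3.0
```

Excel's Goal Seek uses a more general iterative search, but the contract is the same: vary one input cell until a formula cell hits the target.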
Elementary Data Analysis with MS Excel_Day-3, by Redwan Ferdous
This event took place on 9 September 2020 and was arranged by EMK Center (Makerlab). The title was 'Elementary Data Analysis with MS Excel', where very basic data analysis with MS Excel was discussed.
On Day 3, MS Excel formulas and functions were covered. More than 20 functions were practiced live with the class, along with troubleshooting and logical explanations. Error handling, data validation, and macros were also taught in the same class.
Elementary Data Analysis with MS Excel_Day-4, by Redwan Ferdous
This event took place on 12 September 2020 and was arranged by EMK Center (Makerlab). The title was 'Elementary Data Analysis with MS Excel', where very basic data analysis with MS Excel was discussed.
On Day 4, the MS Excel Data, View, Review, and Developer tabs of the top ribbon were discussed, along with the Quick Analysis tools, What-If Analysis, Data Tables, Scenario Manager, and Pareto charts.
Power BI, SSAS Tabular, and Excel all use DAX. This presentation is meant to be used with a PBIX file found here: https://github.com/IkeEllis/democode/blob/master/IntroToDAX/Power%20BI%20Introduction%20to%20DAX.pbix
This document provides an overview of DAX (Data Analysis Expressions) and how it can be used for data analysis in Power BI and Analysis Services Tabular models. It discusses key DAX concepts like calculated columns, calculated measures, and filter context. It also covers common DAX functions and how to work with dates in DAX. The document provides examples of how to define security and write DAX queries against the BI Semantic Model.
The document provides an overview of various Excel functions organized into categories including:
1. Mathematical functions such as ROUND, MOD, INT, GCD, and LOG.
2. Statistical functions such as COUNT, AVERAGE, MAX, MEDIAN, and financial functions such as NPV, PV, PMT.
3. Lookup functions including VLOOKUP, HLOOKUP, MATCH to find data in tables or perform lookups.
4. Date and time functions like DATE, TIME, TODAY, NOW and DATEDIF to work with dates and times.
5. Text functions including LEFT, RIGHT, MID, UPPER, LOWER, and LEN to manipulate text.
This document provides instructions for inputting and managing data in SAS. It discusses creating a SAS library to organize data files. Steps are provided to manually create a SAS data set within a library and input data. Importing data from an external file is also mentioned as an alternative to manual input. The document reviews key SAS concepts like librefs and permanent vs temporary libraries.
This document provides instructions for creating an educational slide presentation using OpenOffice Impress. It outlines how to:
1. Create slides and insert text within text boxes.
2. Format slides by adding numbered lists and changing background colors.
3. Insert graphics and clipart.
4. Save, view, and print the Impress presentation.
The document then provides step-by-step directions for opening an example Impress file, adding two new slides, entering and formatting slide titles, and applying text effects like shadows.
This document discusses creating data visualizations with low-cost tools. It begins by outlining the objectives of understanding the purpose of a visualization, principles of communicating through data, choosing the right visualization, and determining if Excel is suitable. It then covers the eight principles of communicating through data, such as defining the question, using accurate data, and tailoring the visualization to the audience. Next, it discusses choosing the right visualization type based on the purpose, such as line charts, bar charts or tables. The document considers when Excel may not be suitable and introduces specialist tools like Tableau, Microsoft Power BI, and coding options. It concludes with additional resources for data visualization.
This document provides an introduction to statistics. It defines statistics and discusses its importance, limitations, and application areas. It also outlines the main classifications of statistics including descriptive and inferential statistics. Descriptive statistics describes data without making conclusions while inferential statistics makes generalizations beyond the data. The document concludes by defining key statistical terms and outlining the typical steps in a statistical investigation.
** Tableau Certification Training: https://www.edureka.co/tableau-training-for-data-visualization **
"Level of Detail" is a new syntax which, both, simplifies and extends Tableau’s calculation language by making it possible to address level of detail questions directly. In this Edureka tutorial, you’ll gain insights into how LOD Expressions work, along with a more in-depth look at the different types of LOD Expressions and their respective use cases.
Introduction to LOD
Include Calculation
Exclude Calculation
Fixed Calculation
Aggregation & LOD
Nesting in LOD
Data Sources Supported by LOD
How to Create LOD Expressions
Level of Detail vs Table Calculations
Limitations of LOD
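Of the types listed above, the Fixed calculation is the easiest to picture: it computes an aggregate at a stated level of detail regardless of the view's row-level filters. A rough Python sketch of that behaviour — the sales rows are made-up sample data:

```python
from collections import defaultdict

rows = [
    {"region": "East", "sales": 100},
    {"region": "East", "sales": 150},
    {"region": "West", "sales": 200},
]

# {FIXED [region] : SUM([sales])} -- per-region totals, computed once,
# before (and independent of) any row-level filter.
fixed_totals = defaultdict(int)
for r in rows:
    fixed_totals[r["region"]] += r["sales"]

# A filtered view still sees the full fixed total for its region.
filtered = [r for r in rows if r["sales"] > 120]
for r in filtered:
    r["region_total"] = fixed_totals[r["region"]]

print(fixed_totals["East"])  # -> 250
```

Include and Exclude differ in that they adjust the view's own level of detail rather than pinning it, so they do respond to most filters.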
One of the most popular functions in Microsoft Excel is VLOOKUP. Most users are pretty confused the first time they use this function, as it has several options through which it can operate. This slide presentation was created to help people interested in learning this wonderful function.
This document provides an overview of basic database concepts including:
- Definitions of data, information, and databases
- Components of database systems like users, software, hardware, and data
- Data models including entity-relationship, hierarchical, network, and relational models
- Database architecture types such as centralized, client-server, and distributed
- Advantages and disadvantages of database management systems
This document provides instructions for three methods to add dashes to phone numbers in Excel:
1. Use a REPLACE formula to insert dashes at specific character positions.
2. Use the TEXT formula with a phone number format mask to automatically add dashes.
3. Select the phone numbers and use the Format Cells feature to apply a built-in phone number format.
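The REPLACE-formula method above inserts dashes at fixed character positions. The same positional logic in a short Python sketch, assuming 10-digit US-style numbers (an assumption on my part; the slides may target a different format):

```python
def add_dashes(number: str) -> str:
    """Insert dashes after the 3rd and 6th digits: 5551234567 -> 555-123-4567."""
    digits = "".join(ch for ch in number if ch.isdigit())
    if len(digits) != 10:
        return number  # leave anything unexpected untouched
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

print(add_dashes("5551234567"))  # -> 555-123-4567
```

The TEXT-formula method in step 2 does the same thing declaratively with a format mask such as "000-000-0000".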
Introduction to data analysis using Excel, by Ahmed Essam
This document provides an overview of key concepts and techniques for data analysis, including statistics such as mean, median, mode, outliers, and correlation. It also covers data analysis tools like sorting, filtering, pivot tables, VLOOKUP, HLOOKUP, and Match functions. The document aims to explain what data analysis is, its importance, and how it relates to statistics, as well as how data analysis can benefit work. Contact information is provided at the end.
Microsoft Access allows users to create and manage databases. When first opening Access, a dialog box appears with options to create a new database or open an existing one. The user can then select the type of database to create. A database contains tables which hold data in fields with specific data types. Forms and reports allow easy viewing and manipulation of table data.
This document provides an overview of getting started with data analysis using Stata. It discusses what Stata is, describes the Stata screen and interface, and covers first steps like setting the working directory, creating log files, allocating memory, using do-files, opening and saving Stata data files, finding variables quickly, subsetting data using conditional statements, understanding Stata's color-coding system, importing data from other programs like SPSS and SAS, and provides an example of a dataset in Excel. The document serves as an introduction to basic functions and workflows in Stata.
DAX and Power BI Training - 002 DAX Level 1 - 3, by Will Harvey
DAX Level 1 - 3: In this session we explain DAX and cover other foundational concepts in PowerPivot such as the Data Model, Measures and Calculated Columns as well as the important skill of understanding how filtering works in the Data Model.
The document provides an overview of the DAX language. It discusses that DAX is the programming language used in Power BI, Power Pivot, and Analysis Services for data modeling, reporting, and analytics. It describes the basic components of a DAX data model including tables, columns, relationships, measures, and hierarchies. It also covers DAX syntax, functions, operators, and how context and filter context work in DAX calculations and queries.
This document discusses visualizing data in R using various packages and techniques. It introduces ggplot2, a popular package for data visualization that implements Wilkinson's Grammar of Graphics. ggplot2 can serve as a replacement for base graphics in R and contains defaults for displaying common scales online and in print. The document then covers basic visualizations like histograms, bar charts, box plots, and scatter plots that can be created in R, as well as more advanced visualizations. It also provides examples of code for creating simple time series charts, bar charts, and histograms in R.
This document provides an overview of how to create and manage various schema objects in Oracle, including views, sequences, indexes, and synonyms. It discusses how to create simple and complex views, retrieve data from views, and perform DML operations on views. It also covers how to create, use, modify and cache sequences, and how to create nonunique indexes to improve query performance. Finally, it discusses how to create synonyms to provide alternative names for objects.
The document discusses various techniques for handling data in Excel, including entering data manually or importing it, sorting and filtering data, using subtotals and pivot tables to summarize data, and formatting options. Key techniques covered include importing tab-delimited files, sorting data by clicking Data > Sort, filtering data using Data > Autofilter, creating pivot tables by selecting the data source and dragging field buttons, and formatting cells using conditional formats.
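A pivot table, as summarized above, is essentially a group-and-aggregate over a data range. A minimal stdlib Python sketch of that operation — the order records are invented:

```python
from collections import defaultdict

orders = [
    {"rep": "Ana", "amount": 200},
    {"rep": "Ben", "amount": 150},
    {"rep": "Ana", "amount": 300},
]

# "Pivot": row labels = rep, values = SUM(amount).
pivot = defaultdict(int)
for o in orders:
    pivot[o["rep"]] += o["amount"]

for rep in sorted(pivot):
    print(rep, pivot[rep])
# Ana 500
# Ben 150
```

Dragging a field to the Rows area corresponds to the grouping key here; dragging it to Values with Sum corresponds to the accumulation.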
Step-1 Tableau Introduction
Step-2 Connecting to Data
Step-3 Building basic views
Step-4 Data manipulations and Calculated fields
Step-5 Tableau Dashboards
Step-6 Advanced Data Options
Step-7 Advanced graph Options
The document discusses importing and exporting data in R. It describes how to import data from CSV, TXT, and Excel files using functions like read.table(), read.csv(), and read_excel(). It also describes how to export data to CSV, TXT, and Excel file formats using write functions. The document also demonstrates how to check the structure and dimensions of data, modify variable names, derive new variables, and recode categorical variables in R.
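The same import/export round trip described for R can be sketched with Python's stdlib csv module. An in-memory buffer stands in for a file here; a real file path would work the same way:

```python
import csv
import io

# "Export": write a small table to CSV text.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "score"])
writer.writerows([["ana", 90], ["ben", 85]])

# "Import": read it back and check the structure and dimensions,
# much like str() and dim() in R.
buf.seek(0)
rows = list(csv.reader(buf))
header, data = rows[0], rows[1:]
print(header)     # -> ['name', 'score']
print(len(data))  # -> 2
```

read.csv() in R additionally infers column types; csv.reader returns everything as strings, so numeric columns need an explicit conversion.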
Excel 2010 brought with it two new features which extend the usefulness of pivot tables: the slicer and the timeline. They are really useful, among other use cases, when you want to easily monitor indicators in your data. Join our fellow Sheena Opulencia-Calub to learn more about this.
Data validation in Excel allows users to restrict the type of data entered into cells. This includes creating drop-down lists, restricting dates or numbers, and custom validation rules. The document provides steps to apply data validation to a cell by selecting the cell, going to the data validation menu, choosing the type of validation such as a list, selecting the source of the list options, and setting input and error messages. Data validation helps ensure accurate data entry by limiting users to valid options.
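The list-type validation described above can be mimicked in plain Python: accept a value only if it appears in the allowed source list, otherwise surface the error message. The field values and message text below are illustrative:

```python
def validate_list(value, allowed, error_msg="Value not in list"):
    """List-style data validation: return (ok, message)."""
    if value in allowed:
        return True, ""
    return False, error_msg

ok, msg = validate_list("Pending", ["Open", "Closed", "Pending"])
print(ok)   # -> True
ok, msg = validate_list("Oops", ["Open", "Closed", "Pending"])
print(msg)  # -> Value not in list
```

Excel layers the drop-down UI and the input/error message prompts on top of exactly this membership check.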
Elementary Data Analysis with MS Excel_Day-1, by Redwan Ferdous
This document provides an overview of an elementary data analysis course using MS Excel. The 6-day course will introduce basic concepts like data, data types, and data analysis processes. It will cover collecting, cleaning, and analyzing data in Excel. Topics will include functions, formulas, charts, pivot tables, and more. The goal is to help professionals and students better understand and utilize data through hands-on Excel training and examples.
Online analytical processing (OLAP) allows users to easily extract and analyze data from different perspectives. It originated in the 1970s and was formalized in 1993, with OLAP cubes organizing numeric facts by dimensions to enable fast analysis. OLAP provides operations like roll-up, drill-down, slice, and dice to analyze aggregated data across multiple systems. It offers advantages over relational databases for consistent reporting and analysis.
The document provides instructions for formatting cells and cell contents in Excel, including changing cell alignment, merging and splitting cells, wrapping text, applying number formats, borders and styles, setting column width and row height, and other cell formatting options. Key steps include selecting the relevant cells, using formatting tools on the Home tab, and specifying format properties.
In this tutorial, we discuss how to do a regression analysis in Excel. I will teach you how to activate the regression analysis feature, what are the functions and methods we can use to do a regression analysis in Excel and most importantly, how to interpret the regression analysis results. Source: https://tinytutes.com/tutorials/regression-analysis-in-excel/
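Under the hood, Excel's regression output starts from ordinary least squares. A minimal single-variable sketch in Python — the sample points are invented and chosen to lie exactly on a line so the result is easy to check:

```python
def ols(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

intercept, slope = ols([1, 2, 3, 4], [3, 5, 7, 9])  # exactly y = 1 + 2x
print(intercept, slope)  # -> 1.0 2.0
```

Excel's SLOPE and INTERCEPT functions compute these same two quantities; the Data Analysis ToolPak's Regression tool adds R-squared, standard errors, and p-values on top.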
Statistics is a branch of mathematics used to organize, analyze, and interpret data. It helps simplify large amounts of data and make objective decisions. There are two main branches: descriptive statistics, which describes data, and inferential statistics, which makes inferences about populations. Common descriptive statistics tools include measures of central tendency (mean, median, mode) and measures of variability (range, standard deviation). Quality control uses seven tools: histograms, check sheets, Pareto charts, cause-and-effect diagrams, scatter diagrams, stratification diagrams, and control charts. Control charts monitor processes over time to determine if variation is due to chance or assignable causes.
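The descriptive measures named above (mean, median, mode, range, standard deviation) are all one-liners with Python's stdlib statistics module; the sample data is illustrative:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # small illustrative sample

print(statistics.mean(data))    # -> 5
print(statistics.median(data))  # -> 4.5
print(statistics.mode(data))    # -> 4
print(max(data) - min(data))    # range -> 7
print(statistics.pstdev(data))  # population standard deviation -> 2.0
```

Note the population/sample distinction: pstdev divides by n, while statistics.stdev divides by n-1, matching Excel's STDEV.P versus STDEV.S.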
The document provides recommendations for ISO 9001 books and reference materials to help someone prepare for a Lead Auditor course who has not been involved in quality assurance in many years. It suggests downloading free eBooks on ISO 9001 implementation from a listed website. The document also contains sections on quality management tools commonly used in ISO 9001 systems like Ishikawa diagrams, histograms, Pareto charts, scatter plots, check sheets and control charts.
This document provides an overview of quality management presentation tools and resources. It includes a 100-slide PowerPoint presentation on quality management topics for $20. The presentation covers introduction to quality, evolution of quality management, total quality management principles and the quality management system. It also describes commonly used quality management tools like check sheets, control charts, Pareto charts, scatter plots, Ishikawa diagrams and histograms.
Chi square analysis-for_attribute_data_(01-14-06), by Daniel Augustine
This document provides an overview of chi-square analysis, including what a chi-square test is, the different types of chi-square tests, the basics of when and how to apply a chi-square test, and how to use Minitab to conduct a chi-square test. It describes chi-square tests as a way to determine if there are statistically significant differences in the proportions between groups. The document outlines the steps for entering data into Minitab and interpreting the results, and provides tips, examples, and supplemental information on chi-square tests.
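The chi-square statistic behind the Minitab output is simple to compute by hand: expected counts come from the row and column totals, and the statistic is the sum of (observed - expected)^2 / expected over every cell. A stdlib Python sketch for a contingency table — the counts are invented:

```python
def chi_square(table):
    """Chi-square statistic for a contingency table given as a list of rows."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

print(round(chi_square([[10, 20], [30, 40]]), 4))  # -> 0.7937
```

Minitab (or any stats package) then compares this statistic against the chi-square distribution with (rows-1)(cols-1) degrees of freedom to produce the p-value.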
The document provides information about ISO 9001 courses, including an introduction to the standard and certification process. It also lists several quality management tools used in ISO 9001 such as Ishikawa diagrams, histograms, Pareto charts, scatter plots, check sheets, and control charts. Additional related topics like certification, requirements, training, and checklists are also referenced.
This document discusses quality management dashboards and provides resources for creating them. It explains that a quality management dashboard can track key metrics and analyses on a single page report to help focus quality improvement efforts. The dashboard simplifies reporting and allows managers to monitor quality performance and issues at a glance. The document also lists several quality management tools that can be incorporated into a dashboard, such as check sheets, control charts, Pareto charts, scatter plots, and histograms. These tools help identify problems, analyze causes, and prioritize corrective actions.
This document provides information about quality management system diagrams including definitions, examples, and tools. It discusses the contents of quality management system diagrams and provides examples created in ConceptDraw software. Six common quality management tools are also defined - check sheets, control charts, Pareto charts, scatter plots, Ishikawa diagrams, and histograms. Other related quality management topics that can be downloaded as PDFs are also listed.
The document provides an explanation of ISO 9001 requirements for design processes and lists several quality management tools including Ishikawa diagrams, histograms, Pareto charts, scatter plots, check sheets, and control charts. It also includes links to additional resources on topics related to ISO 9001 certification such as requirements, training, auditing, and procedures.
7 QC (quality control) tools for continuous improvement of manufacturing... - Chandan Sah
This document discusses 7 quality control tools used for continuous improvement in manufacturing processes. It provides details on each tool: Pareto diagrams identify the few vital problems from many trivial ones. Cause and effect diagrams systematically identify possible causes for problems. Histograms show data distribution patterns. Control charts separate random and assignable variations. Scatter diagrams study relationships between two variables. Graphs provide pictorial data representation. When used together, these 7 tools can solve 95% of quality problems and improve manufacturing processes.
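The control-chart idea of separating random from assignable variation can be illustrated with a small sketch: points outside mean plus or minus three standard deviations are treated as assignable-cause signals. This is a simplification with invented measurements; real individuals charts usually derive limits from the average moving range rather than the plain standard deviation.

```python
# Simplified sketch of 3-sigma control limits on invented data.
from statistics import mean, pstdev

def control_limits(samples, k=3):
    center = mean(samples)
    sigma = pstdev(samples)
    return center - k * sigma, center, center + k * sigma

data = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]
lcl, center, ucl = control_limits(data)
# Points outside [lcl, ucl] are candidate assignable-cause signals.
signals = [x for x in data if not lcl <= x <= ucl]
```

With this stable, tightly clustered sample, no point falls outside the limits, so all observed variation would be attributed to random causes.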
Data can provide information about the natural world. It can be collected through observation and measurement, and expressed quantitatively or qualitatively. Data patterns and relationships can be identified through graphing and analysis. The scientific method involves making observations, developing hypotheses, designing experiments to test hypotheses, and drawing conclusions. Data is key to advancing scientific theories and applying research findings.
This document provides an overview of the algorithms used in the IBM SPSS Statistics Algorithms procedure. It begins with an introduction to algorithms and notes that algorithms are avoided in documentation to promote readability. It then discusses algorithms used across multiple procedures and factors that influence the choice of formulas. The document outlines algorithms for various statistical tests and procedures, including two-stage least squares, autocorrelation/partial autocorrelation, attribute importance testing, and ALSCAL multidimensional scaling. Notation is provided and each algorithm is explained step-by-step with details on computational details, references, and terminology.
The document discusses ISO 9001 templates and provides information on template contents and features, including example ISO 9001 documents and forms. It also outlines several quality management tools used in ISO 9001 such as Ishikawa diagrams, histograms, Pareto charts, scatter plots, check sheets, and control charts. Other related topics like ISO 9001 certification, requirements, training, and standards are also listed.
Project two guidelines and rubric.html - POLY33
This report analyzes housing price and square footage data from a regional real estate market compared to national averages. Two hypotheses will be tested: 1) whether regional housing prices are higher than the national average, and 2) whether regional square footage differs from the national average. A random sample of 100 housing listings from the regional data set will be analyzed using graphical displays, summary statistics, and hypothesis tests. Results will include p-values and decisions to reject or fail to reject the null hypotheses. A 95% confidence interval will also be calculated and interpreted for the regional square footage data. Key findings will be summarized and any surprises discussed.
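The first hypothesis in the report, regional prices higher than the national average, follows the usual one-tailed test pattern. The numbers below are invented for illustration; the report's actual sample statistics are not reproduced here.

```python
# One-tailed z-test sketch for "regional mean price > national mean".
# All figures are hypothetical.
from math import sqrt

def z_statistic(sample_mean, pop_mean, pop_sd, n):
    return (sample_mean - pop_mean) / (pop_sd / sqrt(n))

z = z_statistic(sample_mean=288_000, pop_mean=280_000,
                pop_sd=50_000, n=100)
Z_CRIT_ONE_TAILED_05 = 1.645
reject_null = z > Z_CRIT_ONE_TAILED_05  # z = 1.6, so fail to reject
```

With these made-up figures the statistic falls just short of the critical value, illustrating the "fail to reject the null hypothesis" decision the report describes.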
This document provides information about quality management consulting services. It discusses Delpha Quality Consulting, which offers quality management system consulting, auditing, training and other services. They serve clients across various industries, including manufacturing, education, aerospace and more. The document also outlines several quality management tools, such as check sheets, control charts, Pareto charts, scatter plots and Ishikawa diagrams that are useful for quality management consultants.
The document presents an overview of seven quality tools: cause and effect diagrams, flow charts, checksheets, histograms, Pareto charts, control charts, and scatter diagrams. It describes the purpose, benefits, and how to construct each tool. Cause and effect diagrams help identify root causes of problems. Flow charts visually illustrate processes to find improvements. Checksheets organize data collection. Histograms, Pareto charts, and control charts are used for statistical process control and identifying sources of variation. Scatter diagrams identify correlations between factors. The seven tools can be used together in the six step problem solving process of identify, define, investigate, analyze, solve, and confirm to improve quality.
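The "vital few" logic of a Pareto chart can be sketched numerically: rank causes by frequency and accumulate percentages until the top causes explain most of the defects. The defect counts below are made up for illustration.

```python
# Pareto analysis sketch: rank causes by count and accumulate
# percentages to find the "vital few". Defect counts are hypothetical.
def pareto(counts):
    total = sum(counts.values())
    running, rows = 0, []
    for cause, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        running += n
        rows.append((cause, n, round(100 * running / total, 1)))
    return rows

defects = {"scratch": 50, "dent": 30, "misalignment": 12,
           "discoloration": 5, "other": 3}
for cause, n, cum_pct in pareto(defects):
    print(f"{cause:15s} {n:3d} {cum_pct:5.1f}%")
# Here the top two causes already account for 80% of all defects.
```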
This document provides information about quality management books and tools. It discusses 10 components of a quality management program according to ISO 9001 standards. It then describes 6 commonly used quality management tools - check sheets, control charts, Pareto charts, scatter plots, Ishikawa diagrams, and histograms. For each tool it provides a brief definition and example of how it is used. The document is intended to provide resources and information about quality management books and tools.
This document provides a summary of a 4-part training program on using PASW Statistics 17 (SPSS 17) software to perform descriptive statistics, tests of significance, regression analysis, and chi-square/ANOVA. The agenda covers topics like frequency analysis, correlations, t-tests, ANOVA, importing/exporting data, and more. The goal is to help users answer research questions and test hypotheses using techniques in PASW Statistics.
Similar to Elementary Data Analysis with MS Excel_Day-5
Workshop on IoT and Basic Home Automation_BAIUST.pptx - Redwan Ferdous
A 2-day hands-on workshop on IoT and Basic Home Automation was held on 16-17th August 2023 on the campus of Bangladesh Army International University of Science & Technology (BAIUST), Cumilla.
There were two workshop experts: Mr. Redwan Ferdous, Director, FronTech Ltd., and Mr. Nahidul Alam, Lecturer, Dept. of EEE, BAIUST.
Across the two days, 12 projects were practiced hands-on by the 45 participants. Those who completed the full two-day workshop received participation certificates, and the three best performers received crests.
Media Coverages:
1) https://l.facebook.com/l.php?u=https%3A%2F%2Fwww.jagocomilla.com%2F%25e0%25a6%25ac%25e0%25a6%25be%25e0%25a6%2587%25e0%25a6%2589%25e0%25a6%25b8%25e0%25a7%258d%25e0%25a6%259f%25e0%25a7%2587-%25e0%25a6%2586%25e0%25a6%2587%25e0%25a6%2593%25e0%25a6%259f%25e0%25a6%25bf-%25e0%25a6%258f%25e0%25a6%25ac%25e0%25a6%2582-%25e0%25a6%25ac%25e0%25a7%2587%25e0%25a6%25b8%25e0%25a6%25bf%25e0%25a6%2595%2F%3Ffbclid%3DIwAR0SwfJEKqiPmXZDYbjTudVSnb6pZCI9ZTun7LRe3C5YG7U7Uj9ciU6VH7w&h=AT04ghurTxB171Iqx9jkDjpRmC4s-ljW1QDziJeDEULOwH0gTQixVxt5Ck9cDEFgcv3Lx1JFplwTnnMp9sawh2X6pjjWAe7fvReKUZMdjHDhAkRIIXvSnstZtKRbU6VfHDVG&__tn__=R]-R&c[0]=AT1AFR198waQ2tUPRvRRROX7ncbzJPe01QTOoyAp0vtDadgjaX7kIyG2tVauxEuHXt0GcbkCuvoTWd7FJ78-WSRlVRj3Fs0-FpgvP_K8F2QVuHNM_QN6yIGvcjCqTxvdMDkMDrHJ-OpuFlZc4p2HqsJ7vcqZpXslk7biWJNWJaO6k4yQrrm1Hg
2) https://l.facebook.com/l.php?u=https%3A%2F%2Fgonomanusherawaj.com%2Fothers%2F89168%2F%3Ffbclid%3DIwAR3Jut2kJ0DuS8t5e5TQYG3rMGt1eA63wdWHr2KzeXJtACRoYc45FAbBnHQ&h=AT1x-PBWSNW6TQxeOMRbtEsZE51yAd-UWWotcD8s9RU_G8khaJhWMHxWyrsZk32nB64v0Sk2dW1Stq9p68HBJbrMJX6TemYsk4TwR7yx2OrQ_N9q9PmhaFXTODlfNjjTRJoK&__tn__=R]-R&c[0]=AT1AFR198waQ2tUPRvRRROX7ncbzJPe01QTOoyAp0vtDadgjaX7kIyG2tVauxEuHXt0GcbkCuvoTWd7FJ78-WSRlVRj3Fs0-FpgvP_K8F2QVuHNM_QN6yIGvcjCqTxvdMDkMDrHJ-OpuFlZc4p2HqsJ7vcqZpXslk7biWJNWJaO6k4yQrrm1Hg
3) https://l.facebook.com/l.php?u=https%3A%2F%2Fcomillanews.com%2F%25e0%25a6%25ac%25e0%25a6%25be%25e0%25a6%2587%25e0%25a6%2589%25e0%25a6%25b8%25e0%25a7%258d%25e0%25a6%259f%25e0%25a7%2587-%25e0%25a6%25b8%25e0%25a6%25ae%25e0%25a7%258d%25e0%25a6%25aa%25e0%25a6%25a8%25e0%25a7%258d%25e0%25a6%25a8-%25e0%25a6%25b9%25e0%25a6%25b2-%25e0%25a6%2586%25e0%25a6%2587%25e0%25a6%2593%25e0%25a6%259f%3Ffbclid%3DIwAR0bFEj8N9UuFmg_WL5PPlWZefIqEohMedvFfqFKacfl8xLAmyRxb0nE6gg&h=AT0ZNYsvxPvJGFPLPch3kK_67c_W4FtZv01NUSckrCT7HPNzbhpr7TeZ26PDSMG903Vz70Lk3y5ldoTbBDV_Dz-fSU8mohXGt6jLV8yM_N1gEJ4WBP0qjQkMnyRUS4CQfF02&__tn__=R]-R&c[0]=AT1AFR198waQ2tUPRvRRROX7ncbzJPe01QTOoyAp0vtDadgjaX7kIyG2tVauxEuHXt0GcbkCuvoTWd7FJ78-WSRlVRj3Fs0-FpgvP_K8F2QVuHNM_QN6yIGvcjCqTxvdMDkMDrHJ-OpuFlZc4p2HqsJ7vcqZpXslk7biWJNWJaO6k4yQrrm1Hg
Hands On Workshop on IoT: From Arduino to JRC Board - Redwan Ferdous
The workshop 'Hands On Workshop on IoT: From Arduino to JRC Board' was held at the University of Rajshahi on 12th August 2023. A total of 68 students from four universities in North Bengal participated and experienced 14 projects hands-on.
A webinar on 'Amazing IoT' was conducted on 22nd December 2021 (Wednesday) by the Maker Lab of EMK Center. Young learners, especially high school students, were introduced to different commercial IoT and home appliances such as smart switches, sockets, dimmers, Google Home, and Amazon Alexa. World trends and future career scopes were also discussed.
Smart life: Hands on training on property automation design and commissioning... - Redwan Ferdous
Engr. Redwan Ferdous took a session on "Smart Life: Hands-On Training on Property Automation Design and Commissioning through IoT" in collaboration with the Dept. of CSE and BHTPA at Chittagong University of Engineering and Technology (CUET) on 27th December 2021 (Monday).
Across two sessions, almost 120 students participated in the hands-on training; the first session ran from 2:00 pm to 4:50 pm and the second from 5:00 pm to 6:50 pm. All participants were from Level-3, Term-2 of the Dept. of CSE, CUET.
In the session, different commercial IoT-based home automation products were shown along with their control mechanisms, demonstrated practically. Different career and research opportunities were also discussed.
Opportunities in Robotics for High School Students - Redwan Ferdous
On 28th December 2021 (Tuesday), MakerLab, a subsidiary of EMK Center, Bangladesh, arranged a 1.5-hour webinar on "Opportunities in Robotics for High School Students", targeted at high school and college students.
The main agenda was to introduce the young generation to different local (Bangladeshi) and international competitions based on robotics, rover challenges, and robotics idea competitions. Out of the many competitions available, almost 10 well-known ones were discussed in detail, along with how to prepare for them.
Around 80 participants from different parts of the country joined. The session was held from 7:30 pm to 9:00 pm over Zoom. In the Q&A session at the end of the webinar, almost 20 different queries from the audience were addressed.
The session content was prepared, and the session fully conducted, by Mr. Redwan Ferdous, with moderation from EMK Center Maker Lab. Most of the content was collected from internet searches for non-commercial purposes.
An initiative named 'Road to 4IR' was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 4th of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-04, students were introduced to basic electronics and the Raspberry Pi. Topics included current, voltage, diodes, capacitors, and inductors.
The program was held on 7th December 2021. Both the 3rd and 4th Cohorts attended from 7:30 pm to 9:00 pm.
Both sessions were taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center, Dhaka, Bangladesh.
An initiative named 'Road to 4IR' was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 3rd of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-03, the students were introduced to basic electronics components including Arduino, breadboards, jumper wires, buzzers, LCD displays, ultrasonic sensors, IR sensors, the LM35 temperature sensor, and other passive components. They were also trained to find the datasheet of a specific component and look up its relevant specs and characteristics, and were shown how to start and complete a project/solution by the end of the session.
Moreover, the students were introduced to 'Tinkercad', where they can simulate projects without buying components physically.
The 4th and last session of the 2nd phase will continue from the 3rd session, after which the teams will be assigned mentors with different expertise.
The program was held on 5th December 2021 (Sunday). 3rd and 4th Cohort students attended from 7:30 pm to 9:00 pm.
Both sessions were taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center, Dhaka, Bangladesh.
An initiative named 'Road to 4IR' was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 2nd of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-02, the students were taught methodical problem-solving steps, the different themes and subthemes of national and international competitions, and sample problem ideas, from which each student will choose one to work on over the next 4-5 weeks to build a tangible solution.
The program was held on 4th December 2021 (Saturday). 3rd and 4th Cohort students attended from 7:30 pm to 9:00 pm.
Both sessions were taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center, Dhaka, Bangladesh.
An initiative was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 1st of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-01, participants were introduced to different national and international competitions where they can demonstrate their learning and knowledge (e.g., BdRO, WRO, IROC, ACM-ICPC, BIG, a2i, CODE RACE, etc.).
The program was held on 2nd December 2021 (Thursday). 3rd and 4th Cohort students attended from 7:30 pm to 9:00 pm.
The session was taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center, Dhaka, Bangladesh.
An initiative named 'Road to 4IR' was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 4th and last of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-04, students were introduced to basic electronics and the Raspberry Pi. Topics included current, voltage, diodes, capacitors, and inductors.
The program was held on 29th August 2021. Both the 1st and 2nd Cohorts attended from 7:15 pm to 9:30 pm.
Both sessions were taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center.
An initiative named 'Road to 4IR' was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 3rd of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-03, the students were introduced to basic electronics components including Arduino, breadboards, jumper wires, buzzers, LCD displays, ultrasonic sensors, IR sensors, the LM35 temperature sensor, and other passive components. They were also trained to find the datasheet of a specific component and look up its relevant specs and characteristics, and were shown how to start and complete a project/solution by the end of the session.
Moreover, the students were introduced to 'Tinkercad', where they can simulate projects without buying components physically.
The 4th and last session of the 2nd phase will continue from the 3rd session, after which the teams will be assigned mentors with different expertise.
The program was held on 24th August 2021. Both the 1st and 2nd Cohorts attended from 7:15 pm to 9:30 pm.
Both sessions were taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center.
An initiative named 'Road to 4IR' was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 2nd of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-02, the students were taught methodical problem-solving steps, the different themes and subthemes of national and international competitions, and sample problem ideas, from which each student will choose one to work on over the next 4-5 weeks to build a tangible solution.
The program was held on 18th August 2021. The 1st Cohort attended from 7:15 pm to 8:30 pm and the 2nd Cohort from 8:45 pm to 10:00 pm.
Both sessions were taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center.
An initiative was taken by EMK Center Maker Lab to train 120 students (60 male and 60 female), in four cohorts of 30, in the technologies of the 4th Industrial Revolution.
This session was the 1st of four sessions, in which the initially trained participants are further trained under the supervision of expert mentors to turn the training into a scientifically grounded concept paper, robot, or solution through planning, designing, and making.
In Session-01, participants were introduced to different national and international competitions where they can demonstrate their learning and knowledge (e.g., BdRO, WRO, IROC, ACM-ICPC, BIG, a2i, CODE RACE, etc.).
The program was held on 9th August 2021. The 1st Cohort attended from 7:15 pm to 8:30 pm and the 2nd Cohort from 8:45 pm to 10:00 pm.
Both sessions were taken by Redwan Ferdous, Mentor of Embedded Systems at MakerLab, EMK Center.
This deck was prepared for the 1st and 2nd cohorts of the "Road to 4IR" program, initiated by EMK Center in collaboration with Birshreshtha Munshi Abdur Rouf Public College. A notable aspect of both cohorts was that all attendees were from Class 7; the 1st cohort was all girls and the 2nd cohort was all boys. The session took place on 27th June 2021.
'Intro to Digital Citizenship' was the 2nd class of the program. In this class, the students learned different terminologies and uses of digital technologies for practicing appropriate cybersecurity and defending themselves against cyber-bullying.
This deck was prepared for the 1st and 2nd cohorts of the "Road to 4IR" program, initiated by EMK Center in collaboration with Birshreshtha Munshi Abdur Rouf Public College. A notable aspect of both cohorts was that all attendees were from Class 7; the 1st cohort was all girls (date: 23rd May 2021) and the 2nd cohort was all boys (date: 24th June 2021).
'Introduction to 4th IR' was the first session of the program, where students were introduced to different topics and terminologies of 4IR.
Career as Project Manager for Electrical Engineer_PUC - Redwan Ferdous
Premier University, Chittagong (Chattogram) arranged a webinar on "Career as Project Manager for Electrical Engineer" on 16th June 2021 from 7:30 pm to 9:15 pm. It was a Facebook Live session attended by the Dean of the engineering faculty, the Chairman of the EEE department, an Assistant Professor of PUC, and other faculty members. More than 100 viewers watched live, and more than 10 questions were answered during the session.
Redwan Ferdous was the keynote speaker of the session.
'Fundamentals of Arduino' is a five-day consecutive training program, and this slide deck is for the 2nd of those five days. The session took place on 29th December 2020. The program is designed for school and college students interested in Arduino, robotics, and other 4th Industrial Revolution topics.
In this session, the basics of Arduino coding, an introduction to Proteus, and simulation using Proteus were discussed.
The event was organized by EMK Center and the trainer was Mr. Redwan Ferdous.
The webinar "IoT and 5G: Future Career" took place online on 30th December 2020, organized by EMK Center MakerLab. The session was taken by EMK Center Mentor (Embedded Systems) Mr. Redwan Ferdous, and more than 90 participants attended.
IoT, IIoT, robotics, RPA, PLC, SCADA, and other technologies related to the 4th Industrial Revolution were covered. The session aimed to inform school and college students about future technologies and thereby help them decide on their future careers.
'Fundamentals of Arduino' is a five-day consecutive training program, and this slide deck is for the 1st of those five days. The session took place on 28th December 2020. The program is designed for school and college students interested in Arduino, robotics, and other 4th Industrial Revolution topics.
In this session, the basics of electronics, the basics of Arduino, technical specs, basic coding, and the IDE were discussed.
The event was organized by EMK Center and the trainer was Mr. Redwan Ferdous.
Threats to mobile devices are increasingly prevalent and growing in scope and complexity. Users want to take full advantage of the features available on their devices, but many of those features trade security for convenience and capability. This best-practices guide outlines steps users can take to better protect personal devices and information.
What do a Lego brick and the XZ backdoor have in common? - Speck&Tech
ABSTRACT: At first glance, a Lego brick and the XZ backdoor might seem to have in common only that both are building blocks, or dependencies, of creative and software projects. In reality, a Lego brick and the case of the XZ backdoor have much more in common than that.
Join the presentation to dive into a story of interoperability, standards, and open formats, and then discuss the important role that contributors play in a sustainable open source community.
BIO: An advocate of free software and of standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several LibreOffice-related events, migrations, and training activities. She previously worked on LibreOffice migrations and training courses for several public administrations and private organizations. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when not pursuing her passion for computers and for Geeko, she cultivates her curiosity about astronomy (which is where her nickname deneb_alpha comes from).
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Taking AI to the Next Level in Manufacturing.pdf - ssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
5. Ideas and approaches to help build your organization's AI strategy.
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU - panagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and licensing under the CCB and CCX model have been a hot topic for many in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new kind of licensing works and what benefit it brings you. Above all, you certainly want to stay within your budget and save costs wherever possible. We understand that, and we would like to help!
We explain how to resolve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove superfluous or unused accounts to save money. There are also some practices that can lead to unnecessary spending, for example using a person document instead of a mail-in for shared mailboxes. We will show you such cases and their solutions. And of course we will explain the new license model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It gives you the tools and know-how to keep track of everything. You will be able to reduce your costs through an optimized Domino configuration and keep them low in the future.
These topics will be covered:
- Reducing license costs by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how best to use it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Practical examples and best practices to implement right away
TrustArc Webinar - 2024 Global Privacy Survey - TrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack - shyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Main news related to the CCS TSI 2023 (2023/1695) - Jakub Marek
An English 🇬🇧 translation of the presentation accompanying the speech I gave about the main changes introduced by CCS TSI 2023 at the biggest Czech conference on communications and signalling systems on railways, held at the Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). It was attended by around 500 participants, with 200 following online.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Monitoring and Managing Anomaly Detection on OpenShift.pdf - Tosin Akinosho
Monitoring and Managing Anomaly Detection on OpenShift
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
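As a toy illustration of the anomaly-detection idea the tutorial builds on (independent of its actual models, Kafka pipeline, or OpenShift deployment), a sliding-window z-score rule flags readings that are far from recent history:

```python
# Toy sliding-window anomaly detector: flag a reading more than k
# standard deviations from the mean of the previous `window` readings.
# Illustrative only; the tutorial's real models are not shown here.
from collections import deque
from statistics import mean, pstdev

def detect(stream, window=5, k=3.0):
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(stream):
        if len(history) == window:
            m, s = mean(history), pstdev(history)
            if s > 0 and abs(x - m) > k * s:
                anomalies.append((i, x))
        history.append(x)
    return anomalies

readings = [20.0, 20.1, 19.9, 20.0, 20.2, 35.0, 20.1]
print(detect(readings))  # [(5, 35.0)]
```

On a resource-constrained edge device, a cheap statistical rule like this often serves as a first-pass filter before readings are streamed (e.g., via Kafka) to heavier models.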
HCL Notes and Domino License Cost Reduction in the World of DLAU - panagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Project Management Semester Long Project - Acuityjpupo2018
Acuity is an innovative learning app designed to transform the way you engage with knowledge. Powered by AI technology, Acuity takes complex topics and distills them into concise, interactive summaries that are easy to read & understand. Whether you're exploring the depths of quantum mechanics or seeking insight into historical events, Acuity provides the key information you need without the burden of lengthy texts.
1. Elementary Data Analysis
with MS Excel
Redwan Ferdous
Electrical Engineer | Tech Enthusiast | Robotics | Automobile | Data Science | Tech-Entrepreneur & Investor
redwan.contact@gmail.com | ferdousr@emk.com.bd
https://sites.google.com/view/redwanferdous
Day: 05 out of 06 | September 16, 2020
2. Today’s Agenda
- Statistics, Hypothesis
- Regression Analysis, Trend Line [Practical x 02]
- T-Test [Practical]
- Solver [Practical]
- Goal Seek [Practical x 02]
- Idea : Z-Test, ANOVA Test, P-Value
- Pivot Table and Interactive Dashboard, Charts [Practical x 02]
- Pivot Table with Open Office Platform
- Dashboard
20-Jul-20
All content collected from the internet is credited with
sources at the bottom of each slide
4. Statistics
• One of the main purposes of statistics is to test a hypothesis. For example,
you might run an experiment and find that a certain drug is effective
at treating headaches. But if that experiment can’t be repeated, no one
will take your results seriously.
5. Hypothesis
Hypothesis testing helps identify ways to reduce costs and improve
quality. It asks the question: are two or more sets of
data statistically the same or different?
A hypothesis is an educated guess about something in the world
around you. It should be testable, either by experiment or
observation. For example:
• A new medicine you think might work.
• A way of teaching you think might be better.
• A possible location of a new species.
• It can really be anything at all as long as you can put it to the test.
6. Hypothesis
Three Types of Hypothesis Tests:
• Classical Method - compare the test statistic to a critical value
• p-value Method - compute the probability of observing a test statistic at
least this extreme if the null hypothesis were true
• Confidence Interval Method - check whether the test statistic falls inside
or outside the confidence interval
7. Hypothesis
Conducting a Hypothesis Test:
• Define the null (H0) and an alternate (Ha) hypothesis
• Conduct the test
• Using data from the test:
• Calculate the test statistic and the critical value (t-test, F-test, z-test,
ANOVA, etc.)
• Calculate a p-value and compare it to a significance level (α) or confidence
level (1−α)
• Interpret the results to determine whether you can reject or fail to reject
the null hypothesis
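The steps above can be sketched in code. Below is a minimal Python sketch of the classical method, using hypothetical sample data and a critical value looked up from a t table; everything in it is illustrative, not part of the slides:

```python
import math
import statistics

# Hypothetical samples from two processes we want to compare
sample_a = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3]
sample_b = [12.6, 12.9, 12.5, 12.8, 12.7, 13.0]

n1, n2 = len(sample_a), len(sample_b)
mean1, mean2 = statistics.mean(sample_a), statistics.mean(sample_b)
var1, var2 = statistics.variance(sample_a), statistics.variance(sample_b)

# Two-sample t statistic with a pooled variance (equal variances assumed)
pooled_var = ((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2)
t_stat = (mean1 - mean2) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))

# Classical method: compare |test statistic| with the critical value
# (2.228 is the two-tailed t critical value for df = 10, alpha = 0.05)
t_crit = 2.228
reject_null = abs(t_stat) > t_crit
```

Here |t| comes out around 5.5, which is above 2.228, so the null hypothesis of equal means is rejected.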
8. Hypothesis in Lean Management/Manufacturing
• In Six Sigma, hypothesis tests help identify differences between
machines, formulas, raw materials, etc. and whether the differences
are statistically significant or not. Without such testing, teams can run
around changing machine settings, formulas and so on causing more
variation. These knee-jerk responses can amplify variation and cause
more problems than doing nothing at all.
• In manufacturing, you might want to compare two or more types of
raw materials and determine if they produce the same quality. In
other words, do the products have the same or different means and
variances? If they are the same, which one is less expensive to
produce? If they are different, which one best meets the customer's
requirements?
9. Hypothesis-Test Cheat Sheet
Hypothesis Test | Compare | Result | What Does It Mean?
Classical Method | test statistic > critical value (i.e. F > F crit) | Reject the null hypothesis | Means or Variances are Different
Classical Method | test statistic < critical value (i.e. F < F crit) | Cannot Reject the null hypothesis | Means or Variances are the Same
p-value Method | p-value < α | Reject the null hypothesis | Means or Variances are Different
p-value Method | p-value > α | Cannot Reject the null hypothesis | Means or Variances are the Same
12. Data Analysis: Regression Analysis
• Linear regression is a statistical technique that examines the linear
relationship between a dependent variable and one or more
independent variables.
• A linear relationship means a straight-line association: a change in the
independent variable(s) is accompanied by a proportional change in
the dependent variable.
• There are two basic types of linear relationships:
• Positive Linear Relationship: when the independent variable increases, the
dependent variable increases too.
• Negative Linear Relationship: when the independent variable increases, the
dependent variable decreases.
13. Data Analysis: Regression Analysis (cont.)
• These are some of the prerequisites before you actually proceed
to regression analysis in Excel.
• There are two basic ways to perform linear regression in Excel:
• the Regression tool in the Analysis ToolPak
• a scatter chart with a trendline
• There is actually one more method: using manual formulas to
calculate linear regression.
14. Data Analysis: Regression Analysis (cont.)
• Explanation of Regression Mathematically
We have a mathematical expression for linear regression as below:
Y = aX + b + ε
Where,
• Y is the dependent variable, or response variable.
• X is the independent variable, or predictor.
• a is the slope of the regression line: when X changes by one unit, Y changes
by “a” units.
• b is the intercept. It is the value Y takes when the value of X is zero.
• ε is the random error term. It occurs because the predicted value of Y will never be
exactly equal to the actual value for a given X. You don’t need to worry about this
error term: software such as Excel calculates it in the background for you.
15. Data Analysis: Regression Analysis (cont.)
In that case, the equation becomes,
Y = aX + b
Which can be represented as:
Weight = a*Height + b
We’ll try to find out the values of these a and b using methods we have
discussed above.
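The "manual formulas" route mentioned earlier can be sketched in Python. The height/weight numbers below are hypothetical (so the fitted a and b differ from the slide's later example), but the closed-form least-squares formulas are the standard ones Excel's tools also compute:

```python
# Hypothetical Height (X) and Weight (Y) observations
heights = [63, 64, 66, 69, 69, 71, 71, 72, 73, 75]
weights = [127, 121, 142, 157, 162, 156, 169, 165, 181, 208]

n = len(heights)
mean_x = sum(heights) / n
mean_y = sum(weights) / n

# Least squares: a = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2),
#                b = y_bar - a * x_bar
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(heights, weights))
sxx = sum((x - mean_x) ** 2 for x in heights)

a = sxy / sxx            # slope
b = mean_y - a * mean_x  # intercept
```

With these hypothetical points, a ≈ 6.14 and b ≈ −266.5; plugging any Height into Y = aX + b then gives the predicted Weight.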
16. Data Analysis: Regression Analysis (cont.)
• #1 – Regression Tool Using Analysis ToolPak in Excel
• We’ll try to fit regression for Weight values (which is dependent
variable) with the help of Height values (which is an independent
variable).
17. Data Analysis: Regression Analysis (cont.)
• In the Excel spreadsheet, click on Data Analysis (present under the
Analysis group) on the Data tab.
• Search for Regression. Select it and press OK.
18. Data Analysis: Regression Analysis (cont.)
• Use the following inputs under Regression pane which opens up.
• Input Y Range: Select the cells which contain your dependent variable
(in this example B1:B11)
19. Data Analysis: Regression Analysis (cont.)
• Input X Range: Select the cells which contain your independent
variable (in this example A1:A11).
• Check the box named Labels if your data has column names (in this
example we have column names).
• Confidence Level is set to 95% by default, which can be changed as
per the user’s requirements.
• Under Output options, you can customize where you want the
regression output to appear. In this case, we want the output on the
same sheet, so a range on the same sheet is given.
21. Data Analysis: Regression Analysis (cont.)
• Under Residuals option, you have optional inputs like Residuals,
Residual Plots, Standardized Residuals, Line Fit Plots which you can
select as per your need.
• Under Normal Probability option, you can
select Normal Probability Plots which can
help you check the normality of predictors.
Click on OK.
22. Data Analysis: Regression Analysis (cont.)
• Excel will compute the regression analysis for you in a fraction of a
second.
23. Data Analysis: Regression Analysis (cont.)
• One important part of this entire output is R Square / Adjusted R
Square under the SUMMARY OUTPUT table, which indicates how well
the model fits the data. In this case, the R Square value is 0.9547,
meaning the model explains 95.47% of the variation (a good fit). In
other words, 95.47% of the variation in the Y variable is explained by
the X variable.
24. Data Analysis: Regression Analysis (cont.)
• The other important part of the entire output is the table of coefficients. It
gives the coefficient values which can be used to build the model for future
predictions.
• Now our regression equation for prediction becomes:
• Weight = 0.6746*Height – 38.45508 (the slope for Height is 0.6746… and
the intercept is -38.45508…)
• We have defined a function in which you now just have to put in the value of
Height and you’ll get the Weight value.
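As a code sketch, the fitted equation above is literally a one-line function (coefficients copied from this example's SUMMARY OUTPUT; the height 170 is just an illustrative input):

```python
def predict_weight(height):
    # Weight = 0.6746*Height - 38.45508, from the coefficients table
    return 0.6746 * height - 38.45508

w = predict_weight(170)  # predicted weight for a height of 170
```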
25. Data Analysis: Regression Analysis (cont.)
#2 – Regression Analysis Using Scatterplot with Trendline in Excel
• Select your entire two-column data
(including headers).
• Click on Insert and select Scatter Plot
under the Charts section as shown in the
image below.
27. Data Analysis: Regression Analysis (cont.)
• Now, we need to have a least squared regression line on this graph.
To add this line, right click on any of the data points on the graph and
select Add Trendline option.
28. Data Analysis: Regression Analysis (cont.)
• It will enable you to have a trendline of least square of regression like
below.
29. Data Analysis: Regression Analysis (cont.)
• Under the Format Trend line option, check the box for Display
Equation on Chart.
30. Data Analysis: Regression Analysis (cont.)
• It enables you to see the equation of least squared regression line on
the graph.
• This is the equation using which we can predict the weight values for
any given set of Height values.
31. Data Analysis: Regression Analysis (cont.)
• It is always recommended to have a look at residual plots while you
are doing regression analysis using the Data Analysis ToolPak in Excel. They
give you a better understanding of the spread between the actual and
estimated Y values.
• Simple linear regression in Excel does not require checking ANOVA and
Adjusted R Square. These features come into play for
multiple linear regression.
32. Data Analysis: T-Test Analysis
• A t-test returns the probability associated with the test. Look at the below
data of two teams’ scoring patterns in the tournament.
33. Data Analysis: T-Test Analysis (cont.)
• Step 1: Select the Data Analysis option under the DATA tab.
34. Data Analysis: T-Test Analysis (cont.)
• Step 2: Once you click on Data Analysis you will see a new dialogue
box. Scroll down and find t-Test. You will see three kinds of
t-test; select the first one, i.e. t-Test: Paired Two Sample for Means.
35. Data Analysis: T-Test Analysis (cont.)
• Step 3: After selecting the first t-Test you will see below options.
36. Data Analysis: T-Test Analysis (cont.)
• Step 4: Under Variable 1 Range, select team 1 score and under
Variable 2 Range select team 2 score.
37. Data Analysis: T-Test Analysis (cont.)
• Step 5: Under Output Range, select the cell where you want to display the
results.
38. Data Analysis: T-Test Analysis (cont.)
• Step 6: Click on Labels because we have selected the ranges including
headings. Click on Ok to finish the test.
39. Data Analysis: T-Test Analysis (cont.)
• Step 7: From the D1 cell it will start showing the test result
40. Data Analysis: T-Test Analysis (cont.)
• Analysis:
• The result shows the mean
value of the two teams, the variance,
how many observations were
taken into consideration, the Pearson
correlation, etc.
• The P (T<=t) two-tail value is 0.314, which is higher than the conventional
significance level of 0.05. This means the difference is not statistically significant.
• We can also do the t-test by using the built-in function T.TEST.
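The same paired test can be sketched outside Excel with the standard library. The team scores below are hypothetical; the paired t statistic is built from the per-match differences, which is what t-Test: Paired Two Sample for Means does internally:

```python
import math
import statistics

# Hypothetical scores of two teams over the same 8 matches
team1 = [210, 185, 190, 225, 205, 198, 215, 200]
team2 = [205, 190, 185, 230, 200, 195, 210, 210]

# Paired test: work on the per-match differences
diffs = [a - b for a, b in zip(team1, team2)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)

# t statistic for the null hypothesis "mean difference = 0"
t_stat = mean_d / (sd_d / math.sqrt(n))

# Two-tailed critical value for df = 7 at alpha = 0.05 (from a t table)
significant = abs(t_stat) > 2.365
```

Here |t| ≈ 0.17, well inside the critical value, so — like the slide's 0.314 p-value — the difference between the teams is not significant.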
41. Data Analysis: Solver
• Solver is, as the name suggests, a tool for solving problems. SOLVER works like
Goal Seek in Excel, but with more flexibility.
• Look at the below image: I have data on product units, unit price,
total cost, and total profit.
42. Data Analysis: Solver (cont.)
• The units sold quantity is 7550 at a selling price of 10 per unit. The total cost
is 52500 and the total profit is 23000.
• As a proprietor, I want to earn a profit of 30000 by increasing the unit
price. As of now, I don’t know by how much I have to
increase the unit price. SOLVER will help me solve this problem.
44. Data Analysis: Solver (cont.)
• Step 2: Set the objective cell as B7 with
a value of 30000, by changing
cell B2. Since I don’t have any other
special criteria to test, I click
the SOLVE button.
45. Data Analysis: Solver (cont.)
• Step 3: The result will be as below:
So, to make a profit of 30000 I need to sell the products at 11 per unit instead of 10 per unit.
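What Solver does with this model can be sketched numerically. Below is a minimal Python sketch using the numbers from this example (7550 units, total cost 52500, target profit 30000), assuming profit = units × price − total cost; the bisection loop is only an illustration of the kind of search an optimizer performs:

```python
UNITS = 7550
TOTAL_COST = 52500
TARGET_PROFIT = 30000

def profit(price):
    return UNITS * price - TOTAL_COST

# Bisection over the unit price: narrow the interval until
# profit(price) hits the target.
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = (lo + hi) / 2
    if profit(mid) < TARGET_PROFIT:
        lo = mid
    else:
        hi = mid
price_needed = (lo + hi) / 2
```

With these numbers price_needed ≈ 10.93, i.e. about 11 per unit as in the Solver result above.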
46. Data Analysis: Goal Seek
• A Goal Seek is a tool that is used to find an unknown value from a set
of known values.
• It comes under the What If Analysis feature of Microsoft Excel, which
is useful to find out the value that will give the desired result as a
requirement.
• This function instantly recalculates the output when a value in a cell is
changed. You specify the result you want the
formula to generate, and Goal Seek determines the input value that
will generate that result.
47. Data Analysis: Goal Seek (cont.)
• Example #1
• In the above figure, there are two numbers, A and B, with values of
9 and 6 respectively. Their product is computed using the function
=PRODUCT(B1,B2), resulting in 54.
• If A is 9, what must the second number B be to get the result 72?
48. Data Analysis: Goal Seek (cont.)
• Following are the steps:
• Click on Data Tab
• Under Data tools group
• Click on What if Analysis drop-down menu
• Click on Goal Seek
49. Data Analysis: Goal Seek (cont.)
• In the Goal Seek dialog Box, select B3 in the ‘Set Cell’
• Enter 72 in the ‘To Value’
• Select B2 in ‘By Changing Cell’
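For this example, the arithmetic Goal Seek performs can be sketched directly: find B such that =PRODUCT(9, B) returns 72. The Python below is a hypothetical stand-in for the worksheet:

```python
def product(a, b):
    # stand-in for the worksheet formula =PRODUCT(B1, B2)
    return a * b

A = 9
TARGET = 72

# Goal Seek varies the 'By Changing Cell' input until the 'Set Cell'
# formula reaches the 'To Value'. For this linear formula it is algebra:
B = TARGET / A
result = product(A, B)
```

Goal Seek reports B = 8, since 9 × 8 = 72.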
51. Data Analysis: Goal Seek (cont.)
• Example #2
Let us take the example of EMK Ltd., which trades in generators. The price of each generator is BDT 18,000 and the
quantity sold is 100 nos.
52. Data Analysis: Goal Seek (cont.)
• We can see that the company is suffering a loss of 13.8 lacs. It is
identified that the maximum price for which a generator can be sold
is BDT 18000. Now it is required to identify the number of generators
that must be sold to reach the break-even value (No Profit No
Loss). So the profit value (Revenue – (Fixed Cost + Variable Cost)) needs
to be zero to attain break-even.
53. Data Analysis: Goal Seek (cont.)
• Following are the steps:
• Click on Data Tab
• Under Data tools group
• Click on What if Analysis drop-down menu
• Click on Goal Seek
54. Data Analysis: Goal Seek (cont.)
• Select C8 in the ‘Set Cell’
• Enter 0 in the ‘To Value’
• Select C3 in ‘By Changing Cell’
Then press OK
56. Other Tests of Data Analysis (Z Test)
• Z-Test:
With the help of a Z-test, we check in Excel whether the means of two
datasets are equal or not.
• In Excel, we have a function for the Z-test named Z.TEST where, as per the
syntax, we need an Array and an X value (hypothesized sample
mean), and optionally a Sigma value.
• Typically a 95% confidence level is used, i.e. a significance level of 5%.
• For the two-sample version we need 2 variable ranges and the variance of each range.
• If |Z| > Z Critical, then we reject the null hypothesis.
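The calculation behind Excel's Z.TEST can be sketched with the standard library. The data below are hypothetical; Z.TEST's documented behaviour is to return the one-tailed probability, using the sample standard deviation when the optional sigma is omitted:

```python
import math
import statistics

# Hypothetical sample and hypothesized mean (the 'x' argument of Z.TEST)
sample = [22.1, 23.4, 21.8, 24.0, 22.7, 23.1, 22.5, 23.8]
mu0 = 22.0

n = len(sample)
mean = statistics.mean(sample)
sigma = statistics.stdev(sample)  # used when the optional sigma is omitted

# z statistic and one-tailed p-value
z = (mean - mu0) / (sigma / math.sqrt(n))
p_one_tail = 1 - statistics.NormalDist().cdf(z)
```

Here z ≈ 3.3 and the one-tailed p is well below 0.05, so the hypothesized mean of 22.0 would be rejected for this sample.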
57. Other Tests of Data Analysis (ANOVA)
• ANOVA Test:
ANOVA (Analysis of Variance) in Excel comes in single-factor and two-factor
forms and is used to test the null hypothesis
that all the population means are exactly equal to each other.
• If at least one mean is different from the others, then the null
hypothesis is rejected (the test FAILS for the null hypothesis).
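The F statistic that Anova: Single Factor reports can be computed by hand. Below is a stdlib sketch with three hypothetical groups; the critical value 4.26 for df = (2, 9) at α = 0.05 is taken from an F table:

```python
import statistics

# Hypothetical measurements from three machines
groups = [
    [10.2, 10.4, 10.1, 10.3],
    [10.8, 11.0, 10.9, 11.1],
    [10.3, 10.2, 10.4, 10.1],
]

k = len(groups)                   # number of groups
n = sum(len(g) for g in groups)   # total observations
grand_mean = sum(sum(g) for g in groups) / n

# Between-group and within-group sums of squares
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)

# F = MS_between / MS_within
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
reject_null = f_stat > 4.26  # F crit for df = (2, 9), alpha = 0.05
```

Here F ≈ 39, far above 4.26, so the null hypothesis of equal means is rejected — at least one machine differs from the others.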
58. Other Tests of Data Analysis (P-Value)
• P-values in Excel can be called probability values; they are used to understand
the statistical significance of a finding.
• The P-value is used to weigh the evidence against the null hypothesis
we are testing. If the null hypothesis is considered improbable
according to the P-value, then it leads us to believe that the alternative
hypothesis might be true. Basically, it tells us whether the observed
results could have been caused by chance. So the P-value is
an investigator, not a judge.
• A P-value is a number between 0 and 1, but it’s easier to think about it
as a percentage (i.e. a P-value of 0.05 is 5%). A smaller P-value leads to the
rejection of the null hypothesis.
• One Excel formula to calculate a P-value is TDIST(x, deg_freedom, tails).
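To see what TDIST(x, deg_freedom, tails) returns, note that the t-distribution happens to have a closed-form tail probability at 2 degrees of freedom. The sketch below is limited to df = 2 for illustration (Excel handles any df):

```python
import math

def tdist_df2(x, tails=2):
    # Upper-tail probability of the t-distribution with 2 degrees of
    # freedom; mirrors TDIST(x, 2, tails) for x >= 0.
    one_tail = 0.5 * (1 - x / math.sqrt(x * x + 2))
    return tails * one_tail

p = tdist_df2(2.92, tails=2)  # two-tailed p for t = 2.92, df = 2
```

p comes out near 0.10, matching the familiar t-table entry: t ≈ 2.92 is the 10% two-tailed cutoff for df = 2.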
59. Pivot Table
• Pivot Meaning:
• According to dictionary: “the central point, pin, or shaft on which a
mechanism turns or oscillates.”
• Meaning in Bengali [According to Google Translate]:
“অপরিহার্য গুরুত্বপূর্ণ ব্যক্তি” — roughly, “an indispensably important person” (!!)
60. Pivot Table
• A Pivot Table in Excel is used to categorize, sort, filter, and summarize
a data table of any length, giving counts, sums, or other values
either in tabular form or as two-column sets.
• To insert a pivot table, select the Pivot Table option from the Insert
menu tab, which will automatically find the table or range. We can also
use the shortcut keys Alt + D + P, which will detect
the range of cells and take us to the final pivot option.
• We can also create a customized table by considering only those columns
which are actually required.
61. 1st Step
• Find all the available properties of the dataset in Word Format.
• Example:
• Number of Rows, Columns
• What is this dataset about [Place, Date, Time]
• Number of Unique Properties of the dataset [columns]
• If there is any missing data
• Sorted/ Curated
• Objective of the analysis
62. 2nd Step
• Dataset Curation
• Sorting, Filtering
• Missing Data (if any) handling
• Pivot Table loading: check whether the unique properties identified earlier
are available for conditioning/pivoting.
• Use: Conditional Formatting, knowledge of Quick Analysis, Filters, Sort.
63. Pivot Table (cont.)
Example #1
• Consider a company department whose work is to
mark whether certain IDs are correct or not.
They process data which has some sort of
IDs and mark each as correct or not.
• Below is the screenshot of raw data:
64. Pivot Table (cont.)
• Suppose the department’s manager wants to know the
count of how many Prop_IDs were correct and incorrect. He could
manually count those values, but for a large set of data that would be
slow. There is an easier way.
• In his Excel workbook, he will hit the Insert tab and click on
PivotTable, the leftmost button on the ribbon. He can then choose
the same worksheet or a new worksheet for the pivot table.
• To get the count of Correct and Incorrect values: the
Status field (correct or incorrect) is dragged to the Rows
section, and Prop_ID is dragged to the Values section so the
property IDs are counted.
65. Pivot Table (cont.)
• Select the data, go to the Insert tab, and click on PivotTable under the
Tables section.
66. Pivot Table (cont.)
• A dialog box appears. In the above image
there are a few options; the first is to select
the table range, which we did by selecting
the data.
• Next, choose where to insert the pivot table:
the same worksheet or a different
worksheet.
• If the data is very large, it is convenient
to insert the pivot table in a
new worksheet. Click OK.
67. Pivot Table (cont.)
• We get the below result, on the
right-hand side we have our fields
of the pivot tables which will be
moved to rows and columns as the
desired report and on the
left-hand side, the pivot table will
be created.
68. Pivot Table (cont.)
• Our Task is to check how many
property ids were marked as correct
and how many were marked as
incorrect by the auditor.
• Drag Auditor fields to Rows section,
Property_id to value section whereas
the status field to filters section.
69. Pivot Table (cont.)
• We have made our pivot table which
currently shows the total count of
property id’s marked by the auditors.
• Now, to check how many
property IDs were marked as correct and
how many were marked as incorrect,
click on the Status filter in the pivot table.
70. Pivot Table (cont.)
• Now tick Select Multiple Items, check Correct, and click
OK.
71. Pivot Table (cont.)
• Now we have the count of property IDs
marked as correct by each auditor.
• Similarly, we can get the count for the
incorrect ones.
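The count the pivot table produces can be sketched in plain Python; the auditor/status rows below are hypothetical stand-ins for the raw data in this example:

```python
from collections import Counter

# Hypothetical raw rows: (auditor, prop_id, status)
rows = [
    ("Alice", "P-001", "Correct"),
    ("Alice", "P-002", "Incorrect"),
    ("Bob",   "P-003", "Correct"),
    ("Alice", "P-004", "Correct"),
    ("Bob",   "P-005", "Correct"),
    ("Bob",   "P-006", "Incorrect"),
]

# Rows section = Auditor, Values = count of Prop_ID,
# Filters section = Status (here filtered to "Correct")
correct_counts = Counter(auditor for auditor, _, status in rows
                         if status == "Correct")
```

correct_counts maps each auditor to the number of property IDs they marked correct, just like the filtered pivot table.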
72. Pivot Table (cont.)
Example #2
• In a sales company, we have
transactional sales data which
shows how many sales each product
made in which quarter
and in what year.
• Below is the screenshot of raw data:
73. Pivot Table (cont.)
• This data is 66 rows in this example
(real data can go down even further), and it would
be a tedious task to check for sales
of a specific product under any quarter.
• Select the data to insert a pivot table.
This time we will use a shortcut to
insert the pivot table: press
Alt, then “D”, then “P”.
• Another dialog box appears. We have
our data in Excel and we want to create a
pivot table, so we click the Next button.
74. Pivot Table (cont.)
• In the next step, it asks for a range of data. But as we had already
selected the data so it is prefilled. Click on Next.
75. Pivot Table (cont.)
• Now the last dialog box asks where we want our pivot table: the
same worksheet or another. Select New worksheet and
then click Finish.
76. Pivot Table (cont.)
• Drag Product into the Rows
section, Sales under Values,
and Quarter under the Columns
section, with Year in the
Filters.
• And we have our report.
• The above pivot table shows which product made how many sales in
which quarter.
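The same rows/columns/values/filter layout can be sketched as a plain-Python aggregation. The transactions below are hypothetical; in pandas, the one-liner `pivot_table(index="product", columns="quarter", values="sales", aggfunc="sum")` produces the equivalent report:

```python
from collections import defaultdict

# Hypothetical transactional rows: (year, quarter, product, sales)
rows = [
    (2018, "Q1", "Pen", 100),
    (2018, "Q1", "Book", 250),
    (2018, "Q2", "Pen", 150),
    (2018, "Q2", "Pen", 50),
    (2017, "Q1", "Book", 300),
]

YEAR_FILTER = 2018  # the field placed in the Filters section

# Rows = product, Columns = quarter, Values = sum of sales
pivot = defaultdict(lambda: defaultdict(int))
for year, quarter, product, sales in rows:
    if year == YEAR_FILTER:
        pivot[product][quarter] += sales
```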
77. Pivot Table (cont.)
• If we want to check the sales of the products in the year 2017, we
simply uncheck the year 2018 in the Year filter.
79. Pivot Table (cont.)
Explanation of the Pivot Table in Excel
• Basically, the pivot table is a powerful Excel tool which helps us
summarize large amounts of data and saves us a lot of time.
• Pivot tables are a reporting tool with a fields section which
contains four areas:
• Rows: data taken as a specifier.
• Values: the count (or sum) of the data.
• Filters: filters to hide certain data.
• Columns: values under different conditions.
80. Pivot Table (cont.)
Things to Remember About Excel Pivot Table:
• Pivot tables do not change the values in the database.
• Pivot tables can be inserted in the same worksheet with the data or in
another worksheet.
• For convenience, we add pivot tables in a new worksheet.
81. PowerPivot
• PowerPivot is a popular add-in for Microsoft Excel which can be used to
import data sets with millions of rows from different sources
and helps us do quick analysis of large data sets in Excel in a jiffy.
• This add-in was first introduced in Microsoft Excel 2010 and became a
native feature from Excel 2013 onwards, and in Office 365 as well.
• The power of PowerPivot lies in its own data models, which
can be considered databases. The data models are data
tables similar to those we use in SQL. We can slice and dice these
data tables, create relationships between them, combine different data
tables, create calculated columns for advanced analysis and, of course, do
advanced reporting as well.
82. MS Excel-Sample Pivot Table Report Format
• Please Google sample report formats and practice presenting analyzed data yourself.
• Let’s check a sample:
https://towardsdatascience.com/building-interactive-dashboards-
using-excel-3b5402da5d22
83. Pivot Table in Libre Office [Open Office]
To create a Pivot Table in Open Office (Calc):
• Select only one cell from your data.
• Choose the Insert Pivot Table
command from the main menu or
click the corresponding button on the Standard toolbar.
• Calc automatically selects all the
cells and opens the Select Source
dialog.
• Click OK to continue.
84. Pivot Table in Libre Office [Open Office]
• In the Pivot Table Layout Dialog
you set up the pivot table.
• In general you drag fields from
the Available Fields pane to the
other white areas.
85. Pivot Table in Libre Office [Open Office]
• Drag the employee field to the Row Fields area and the sales field to
the Data Fields area, then click OK.
86. Pivot Table in Libre Office [Open Office]
• The pivot table is created in a new sheet. Now we get a summary of
the sum of sales for each employee.
87. Pivot Table in Libre Office [Open Office]
Pivot Table Layout
• The layout of the pivot table is divided into 4 parts: Rows, Columns,
Data and Page. If you understand the layout you will be able to create
more complex pivot tables and extract important information from
your data.
88. Bonus: A Good Resource for Pivot Table
• Please Check: https://www.contextures.com/pivottableindex.html
89. Dashboard of MS Excel
• Dashboards track KPIs, metrics, and other data points in one visual,
central place. They give you a high-level view of work, helping you
make quick decisions and keeping everyone up to date. A
dashboard’s visual nature simplifies complex data and provides an
at-a-glance view of current status or performance in real time.
• Dashboards are made up of tables, charts, gauges, and numbers. They
can be used in any industry, for almost any purpose. For example, you
could make a project dashboard, financial dashboard, marketing
dashboard, and more.
91. Dashboard of MS Excel (cont.)
• General steps of making Dashboard in MS Excel:
• Sourcing/ Jotting the raw data / dataset
• Create Excel Dashboard file/ Sheet
• Sort and sync data in tabular format
• Analyze the Data using the tools [pivot/table/validation/range/chart/macro…]
• Build the Dashboard [Interactive Charts, Pivot Table-Charts]
• Customization [Color, Interpretation, Macro, representation]
92. Dashboard of MS Excel (cont.)
Creating Dynamic Chart:
• Drop-down list/data validation list: Using VLOOKUP, you can use a drop-down list
(also known as a data validation list) to create interactive charts. With this
drop-down list, viewers can select the criteria they want to filter on and the
chart will automatically change to reflect those criteria.
• Macros: You can write a macro using Excel’s coding language (called Visual Basic)
to automate a task. For example, you can use macros to create a button on
your dashboard: when you click that button and select certain criteria, all the
charts will automatically change to represent those criteria.
• Slicers: If you want to add another layer of filtering to your pivot table, you can
use slicers. Slicers contain a set of buttons that let you filter the data and also
show you which filter you are viewing.
93. Dashboard of MS Excel (cont.)
• Let’s Check Some Sample Dashboard of:
• HR Management
• Project Management
• Ecommerce Order Management and Advertisement Dashboard
• Social Media Platform
• Web Analytics
• Supply Chain Dashboard
• DevOps Dashboard
• Sales Management Dashboard