This document provides an overview of statistical software packages including their common features, advantages, types, and most common packages used in social science. It discusses features of Microsoft Excel, SPSS, SAS, Stata, R, Minitab, and GraphPad Prism. These software packages vary in their ease of use, available statistical analyses, programming capabilities, and cost. Excel is best for basic statistics while SAS and Stata provide more advanced analyses but are harder to use. SPSS and Minitab offer a good balance of capabilities and usability.
Statistical analysis and its applications can be used in many fields including pharmaceutical research, clinical trials, public health, epidemiology, genetics, and demographics. Some key uses of statistics include evaluating drug effects, comparing drug treatments, exploring associations between diseases and risk factors, and analyzing clinical trial and genomics data. Measures of central tendency, dispersion, and other statistical methodologies help researchers draw conclusions from collected data.
Statistics is the discipline dealing with the collection, manipulation, analysis, and interpretation of data to draw conclusions and make decisions. Statistical software packages are essential tools that allow statisticians to efficiently analyze large datasets using computers for simulation, data storage, calculations, analysis, and presentation of results. Some common statistical software packages include Excel, which supports basic statistical functions, and more specialized packages like Costat, Minitab, SAS, SPSS, and R, which provide advanced statistical analysis capabilities.
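As a minimal illustration of the kind of basic descriptive functions these packages all provide, here is a sketch using Python's standard-library `statistics` module; the blood-pressure values are invented purely for illustration:

```python
# Basic descriptive statistics of the sort Excel, Minitab, or R compute,
# done here with Python's standard library. Sample values are hypothetical.
import statistics

blood_pressure = [118, 122, 130, 125, 119, 134, 128, 121]

print("mean:  ", statistics.mean(blood_pressure))
print("median:", statistics.median(blood_pressure))
print("stdev: ", statistics.stdev(blood_pressure))  # sample standard deviation
```

A dedicated package adds what this sketch lacks: data management, inferential tests, and presentation-quality output.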
This document discusses data collection in quantitative studies. It explains that data are facts that provide information about the phenomenon being studied. There are several steps to collecting data quantitatively: identifying data needs like variables to measure or hypotheses to test; selecting appropriate measurement tools; pretesting instruments; developing data collection forms and procedures; implementing a data collection plan including selecting and training personnel; and addressing issues that may arise during the process like maintaining controls. The goal is to gather information consistently and validly to address the research questions.
This document discusses computer assisted qualitative data analysis software (CAQDAS). It defines CAQDAS as software packages that help organize and archive qualitative text data, supporting administration and archiving rather than analysis itself. Some examples of CAQDAS packages include Atlas-ti, Winmax, Nvivo, NUDIST, and Ethnograph. The document outlines the nature of qualitative data as largely unstructured and text-based, and lists ways that computers can assist the qualitative research process, including transcription, coding, retrieval, linking, and reporting of data. It notes advantages such as efficiency and support for larger sample sizes or teams, as well as limitations such as time constraints and the potential to hinder creativity.
This document provides an introduction to statistics, defining key concepts and uses. It discusses how statistics is the science of collecting, organizing, analyzing, and interpreting numerical data. Various types of data are described including quantitative, qualitative, discrete, continuous, and different scales of measurement. Common statistical analyses like descriptive statistics, inferential statistics, and different ways of presenting data through tables and graphs are also outlined.
This document discusses various statistical software packages. It provides information on:
- Open source packages like R and SciPy which are free to use.
- Public domain packages such as CSPro and Epi Info which are developed by government organizations for use in fields like epidemiology.
- Freeware packages like WinBUGS and Winpepi that can be downloaded and used at no cost.
- Proprietary packages including SAS, SPSS, and MATLAB that usually require purchasing a license but provide comprehensive statistical functionality.
Commonly used statistical software packages in pharmacy include SAS, SPSS, GraphPad InStat, and GraphPad Prism. SPSS allows for a range of descriptive and bivariate analyses.
This document discusses descriptive and inferential statistics used in nursing research. It defines key statistical concepts like levels of measurement, measures of central tendency, descriptive versus inferential statistics, and commonly used statistical tests. Nominal, ordinal, interval and ratio are the four levels of measurement, with ratio allowing the most data manipulation. Descriptive statistics describe sample data while inferential statistics allow estimating population parameters and testing hypotheses. Common descriptive statistics include mean, median and mode, while common inferential tests are t-tests, ANOVA, chi-square and correlation. A Type I error occurs when a true null hypothesis is incorrectly rejected.
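To make the t-test idea concrete, here is a minimal sketch of the two-sample t statistic in its Welch (unequal-variances) form, written in plain Python. The sample data and function name are invented for illustration; a real analysis would use a package such as SPSS or R, which also reports degrees of freedom and a p-value:

```python
# Sketch of the Welch two-sample t statistic: how far apart two sample
# means are, relative to the variability of the samples.
import math
import statistics

def welch_t(sample_a, sample_b):
    """Two-sample t statistic, unequal-variances (Welch) form."""
    m1, m2 = statistics.mean(sample_a), statistics.mean(sample_b)
    v1, v2 = statistics.variance(sample_a), statistics.variance(sample_b)
    n1, n2 = len(sample_a), len(sample_b)
    return (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)

# Hypothetical control and treatment measurements.
control   = [5.1, 4.9, 5.3, 5.0, 5.2]
treatment = [5.8, 6.0, 5.7, 6.1, 5.9]
print(round(welch_t(treatment, control), 2))
```

The larger the absolute t value, the less plausible it is that the two groups differ only by chance, which is exactly the question a Type I error rate (alpha) guards.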
This document discusses various methods and instruments for collecting data in research studies. It begins by defining data and explaining why data collection is important. It then covers primary and secondary sources of data, as well as internal and external sources. The main methods of collecting primary data discussed are direct personal investigation through interviews, indirect oral investigation, case studies, measurements, and observation. Secondary data sources include published and unpublished sources. The document also discusses self-reported data collection methods like surveys, interviews, and questionnaires. Other methods covered include document review, focus groups, and observation. Mixed methods are also briefly discussed.
This document discusses different types of data classification. It defines classification as systematically grouping data based on common characteristics. There are several ways to classify data, including by the nature of the variable (e.g. quantitative, qualitative), source of collection (e.g. primary, secondary), presentation (e.g. grouped, ungrouped), and content (e.g. simple, manifold). Examples are provided for each type of classification.
Descriptive statistics offer nurse researchers valuable options for analysing and presenting large and complex sets of data, suggests Christine Hallett
Research process: quantitative and qualitative (EMERENSIA X)
The document outlines the steps in conducting qualitative research, including: 1) identifying a broad research problem area and objectives; 2) reviewing literature to gain preliminary information; 3) entering the research setting and contacting key informants; 4) selecting a small, qualitative sample and semi-structured data collection tools; 5) collecting data through interviews and observations while building rapport; 6) organizing and analyzing data through techniques like coding and thematic analysis; and 7) disseminating findings in publications or presentations.
The document discusses data analysis and provides details about:
- The steps involved in quantitative data analysis including data preparation, description, inference drawing, and interpretation.
- The different scales of measurement - nominal, ordinal, interval, and ratio - and their properties.
- Descriptive statistics used to organize and summarize data such as measures of central tendency, dispersion, and relationship.
- Methods for condensing data including frequency distribution tables, contingency tables, and graphs.
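The frequency distribution table mentioned above is simple to sketch in code; here is a minimal Python version using `collections.Counter`, with invented survey responses for illustration:

```python
# Condensing raw categorical responses into a frequency distribution
# table with counts and relative frequencies. Data are hypothetical.
from collections import Counter

responses = ["agree", "agree", "neutral", "disagree", "agree", "neutral"]
freq = Counter(responses)
n = len(responses)

for category, count in freq.most_common():
    print(f"{category:10s} {count:3d}  {count / n:6.1%}")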
This presentation covers basic concepts about data types and levels of measurement. It also explains which descriptive measures, graphs, and tests should be used for different types of data. A brief overview of pivot tables and charts is also included.
The document defines and describes the coefficient of correlation, which is a statistic that measures the strength and direction of the linear relationship between two variables on a scale of -1 to +1. It explains that a correlation of -1 is a perfect negative correlation, 0 is no correlation, and +1 is a perfect positive correlation. The document also differentiates between types of correlation coefficients and methods for computing correlation coefficients, including Karl Pearson's and Spearman's rank correlation coefficients.
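Pearson's coefficient follows directly from its definition (the covariance of the two variables divided by the product of their standard deviations); a minimal sketch in plain Python, with invented study-hours and test-score data for illustration:

```python
# Pearson correlation coefficient computed from its definition.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: hours studied vs. test score.
hours  = [1, 2, 3, 4, 5]
scores = [52, 58, 65, 71, 79]
r = pearson_r(hours, scores)
print(round(r, 3))
```

A perfectly linear increasing relationship yields exactly +1, a perfectly linear decreasing one exactly -1, matching the scale described above. Spearman's version applies the same formula to the ranks of the data rather than the raw values.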
The document provides an overview of statistics as used in nursing research. It defines statistics as the science of making effective use of numerical data through collection, analysis, and interpretation. There are two main types of statistics: descriptive statistics which organize and summarize sample data, and inferential statistics which help determine if study outcomes are due to planned factors or chance. Key concepts covered include frequency distributions, measures of central tendency, variability, correlation, hypothesis testing, estimation, t-tests, chi-square tests, and analysis of variance procedures.
A literature review is a search and evaluation of the available literature in your given subject or chosen topic area. It documents the state of the art with respect to the subject or topic you are writing about. It surveys the literature in your chosen area of study.
The document discusses statistical analysis packages including SPSS. It provides an overview of SPSS, describing it as a statistical analysis and data management software package that can perform various analyses and generate reports. The document outlines some key features of SPSS, such as its ease of use, data management and editing tools, and statistical and visualization capabilities. It also briefly describes the different windows in SPSS and how to define and manipulate data.
The document discusses vital statistics, which are numerical records of life events like births, deaths, marriages, and divorces that can be used to study public health trends. Vital statistics are collected through civil registration systems and sample surveys. They provide data to evaluate health programs, plan for disease control, inform legislation and policymaking, and allow comparisons between populations. Important vital statistics include crude death rate, age-specific death rate, infant mortality rate, neonatal mortality rate, post-neonatal mortality rate, and maternal mortality rate.
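The rates named above are conventionally expressed per 1,000; a sketch of two of them under that standard convention, with hypothetical figures for illustration:

```python
# Standard vital-statistics rate formulas, expressed per 1,000.
def crude_death_rate(deaths, mid_year_population):
    """Deaths from all causes per 1,000 population in a year."""
    return deaths / mid_year_population * 1000

def infant_mortality_rate(infant_deaths, live_births):
    """Deaths under one year of age per 1,000 live births in a year."""
    return infant_deaths / live_births * 1000

# Hypothetical figures for illustration only.
print(crude_death_rate(7_500, 1_000_000))
print(infant_mortality_rate(120, 25_000))
```

Expressing rates per 1,000 (rather than as raw counts) is what makes comparisons between populations of different sizes meaningful.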
Data collection, tabulation, processing and analysis (RobinsonRaja1)
This document discusses data collection, tabulation, processing, and analysis. It begins by outlining the need for data collection to support scientific research and problem solving. It then describes various methods of data collection including warranty cards, audits, and mechanical devices. The document emphasizes the importance of processing and analyzing raw data to make it meaningful and test hypotheses. It outlines steps in processing like editing, coding, classification, and tabulation. Finally, it discusses various statistical analysis techniques including measures of central tendency, frequency distributions, correlation, regression, and parametric and non-parametric tests.
This document provides an overview of how to search and use key features of the Cumulative Index of Nursing and Allied Health Literature database. It describes how to perform keyword and subject heading searches, place articles in folders, email citations, and set up a MyEBSCOhost personal account to save searches and set up alerts. Key features covered include searching by keyword, choosing subject headings, viewing and narrowing search results, adding items to folders, output and sharing options, and creating a personal account to save searches and set up automated alerts.
This document discusses different types of statistical software. It describes open-source statistical packages like R and SciPy that are free to use. Public domain packages like CSPro and Dataplot that have no ownership restrictions are also mentioned. Freeware packages like WinBUGS that are free but have restrictions are covered. Popular proprietary packages like SAS, SPSS, and MATLAB that must be paid for are listed. The document concludes by noting Excel add-ons for statistics and commonly used software in pharmaceutical education like SAS, SPSS, and GraphPad.
This document defines literature review and outlines its importance and purpose. A literature review aims to critically review knowledge on a research topic. It provides a guide for professionals to stay up-to-date in their field. Literature reviews help identify research problems, gaps in knowledge, and inform the methodology. Sources include primary research articles and secondary sources that summarize others' findings. The document describes the steps of literature review including searching databases and other sources, analyzing sources, and writing an introduction, body, and conclusion. It also outlines strategies like using references and searching forward and backward to identify relevant literature.
The document discusses the definition of research, research problems, and the steps in the research process including the conceptual phase, design and planning phase, and empirical phase. It specifically focuses on pilot studies, which are preliminary small-scale trials that help identify problems in research design and procedures before the major study. The purposes of a pilot study are to refine instruments, identify issues with sampling or data collection, and examine reliability and validity. Conducting a pilot study can help reduce errors and save time and resources for the full research project.
This document outlines the key steps in the data preparation process:
1. Check questionnaires for completeness and logical responses
2. Edit data to ensure consistency, correct errors, and fill in missing values
3. Code data by assigning numerical values to question responses
4. Clean data by identifying outliers and inconsistencies to improve data quality
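The editing, coding, and cleaning steps above can be sketched in miniature; the field names, sentinel value, and coding scheme below are invented purely for illustration:

```python
# Toy sketch of data preparation: edit a missing value, code a text
# response numerically, and flag implausible values as outliers.
RESPONSE_CODES = {"no": 0, "yes": 1}  # hypothetical coding scheme

def prepare(record):
    age = record.get("age")
    age = age if age is not None else -1  # edit: sentinel for missing value
    # code: map the text response to a numeric value (-1 = uncodable)
    smoker = RESPONSE_CODES.get(str(record.get("smoker", "")).lower(), -1)
    outlier = not (0 <= age <= 120)       # clean: flag implausible ages
    return {"age": age, "smoker": smoker, "outlier": outlier}

print(prepare({"age": 34, "smoker": "Yes"}))
print(prepare({"smoker": "no"}))          # missing age gets flagged
```

In practice these steps are done inside a package such as SPSS or Epi Info, but the logic is the same: every value is checked, standardized, and made machine-readable before analysis.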
This presentation contains:
1. Introduction of research
2. Meaning of research
3. Definition of research
4. Need of nursing research
5. Methods of acquiring knowledge
6. Problem solving method
7. Scientific method
8. Steps of scientific methods
9. Characteristics of good research
10. Qualities of a good researcher
11. Ethics in nursing research
12. Informed consent
13. Types of research
14. Quantitative research
15. Qualitative research
16. Mixed method of research
17. Research based on purpose
18. Purpose based research
19. Applied research
20. Research process
21. Steps of quantitative research process
22. Conceptual framework
23. Formulating research problem
24. Determining study objectives
25. Review of literature
26. Developing conceptual framework
27. Formulating hypothesis
28. Design and planning phase
29. Research approach or research design
30. Specify population
31. Sampling
32. Developing tool for data collection
33. Establishing ethical consideration
34. Conducting the pilot study
35. Pilot study
36. Empirical phase
37. Sample selection
38. Data collection
39. Preparing for data analysis
40. Analytic phase
41. Dissemination phase
42. Steps in qualitative research process
43. Role of nurse in research
This document discusses various computer software packages that are used for data analysis. It describes how researchers can use computer-assisted data analysis and mentions fields that commonly use these tools like sociology, psychology, and medicine. Examples of specific software packages are given like SPSS, Excel, Minitab, Stata, and Origin which can assist with tasks like data management, statistical analysis, data visualization, and modeling. Details are provided on the uses, features, and examples of how to operate some of these programs.
This document provides information on planning for data processing and statistical analysis. It discusses the importance of collecting only necessary standardized data and ensuring all needed information is gathered. It then describes the process of data processing, including data entry, coding, cleaning, and transferring data for analysis. Finally, it outlines different statistical analysis techniques that can be used, including descriptive statistics, comparisons, correlations, regressions, and more. Examples of software packages for these tasks like EpiInfo, Excel, SPSS and R are also mentioned.
This document provides an overview of software skills needed for research and discusses various data analysis software options. It notes that researchers today need skills in software evaluation, selection, use and data management. Examples of required qualifications from job postings emphasize experience with visualization, spatial data, programming languages and operating systems. The document categorizes types of data analysis software and provides details on qualitative analysis software like Atlas.ti, quantitative software like SPSS and SAS, data visualization software like Excel and MATLAB, and programming languages like R. It emphasizes selecting the right tools for tasks and considering open source and free options.
This presentation introduces SPSS as a statistical analysis software tool. SPSS can be used for tasks like processing questionnaires, reporting data in tables and graphs, and analyzing data through various tests like means, chi-square, and regression. It is commonly used in fields like telecommunications, banking, finance, insurance, healthcare, manufacturing, retail, education, and market research. Key features of SPSS include its ease of use, robust data management and editing tools, in-depth statistical capabilities, and reporting features. The presentation demonstrates how to get data into SPSS, enter and edit variables and data, and describes the basic steps of data analysis using various statistical procedures available in SPSS.
This document discusses various software used in pharmacoepidemiology research. It describes statistical analysis packages like SAS and SPSS that are commonly used to analyze collected data. It also mentions more specialized software like WinBUGS, EpiInfo, and STATA designed for large dataset analysis. Decision-making software like CLEO and TREEAGE are explained. Spreadsheet programs like MS Excel are also used. The benefits of software include reduced analysis time and ability to identify patterns in large data. Potential issues include cost and complexity of software. Training is needed to ensure proper use.
Statistical software tools like MS Excel, SPSS, and MiniTab can be used for statistical analysis.
MS Excel is commonly used due to its convenience and low cost, but it requires statistical knowledge. It provides functions for descriptive statistics. SPSS is commonly used in the social sciences for tasks like frequencies, cross-tabulation, and regression without coding. MiniTab provides statistical analysis tools and graphical visualization for processes like Six Sigma. Each tool has advantages, such as ease of use and analysis capabilities, as well as limitations, such as learning curves, file-size limits, and cost.
Application of Excel and SPSS software for statistical analysis - Biostatistic... (Himanshu Sharma)
This slide contains B.Pharm Biostatistics and Research methodology 8th Sem. Unit-3 L2 topic- "Statistical Analysis using Software"
It contains topics:
1. MS Excel
2. SPSS
3. MiniTab
#StatisticalAnalysisusingMSExcel
#StatisticalAnalysisusingMiniTab
#StatisticalAnalysisusingSPSS
This document discusses statistical packages used for data analysis. It describes the Statistical Package for the Social Sciences (SPSS) in detail. SPSS is a widely used statistical software package that can perform complex data analysis, including regression, ANOVA, and multivariate analysis. It has advantages such as easy data import, reliable results, and useful graphs. However, it also has limitations such as difficulty interpreting error logs and incomplete menus. The document also briefly mentions other packages like Microsoft Excel, SAS, Minitab, and Stata.
What is SPSS (Theory)?
Data Editor
Variable definition
Briefing for SPSS
How to subscribe SPSS?
Recommended books on SPSS
Uses for SPSS (Data Management)
Uses for SPSS (Data Analysis)
Introduction to Computational Statistics (Setia Pramana)
This document outlines courses in computational statistics that utilize various statistical software packages like R, SPSS, and Excel. The courses cover topics ranging from data preparation and visualization to statistical modeling techniques like linear regression, resampling methods, and hypothesis testing. They emphasize hands-on practice over theory, involve group projects, and provide reference materials for further learning.
This presentation helps the viewer understand SAS software. SAS is a statistical software suite developed by SAS Institute for data management. SAS was developed at North Carolina State University from 1966 until 1976, when SAS Institute was incorporated. SAS was further developed in the 1980s and 1990s with the addition of new statistical procedures, additional components, and the introduction of JMP. A point-and-click interface was added in version 9 in 2004, and a social media analytics product was added in 2010.
• Portion explained:
• Components of SAS Software
• Origins of SAS Software
• Development of SAS Software
• Recent History of SAS Software
• Software products of SAS Software
• Adoption of SAS Software
• Application of SAS Software
A very brief introduction to R software, presented at UNISZA. No R code and no statistical content; it is basically for those who have just heard about R software for the first time.
This document provides an overview of data science tools, techniques, and applications. It begins by defining data science and explaining why it is an important and in-demand field. Examples of applications in healthcare, marketing, and logistics are given. Common computational tools for data science like RapidMiner, WEKA, R, Python, and Rattle are described. Techniques like regression, classification, clustering, recommendation, association rules, outlier detection, and prediction are explained along with examples of how they are used. The advantages of using computational tools to analyze data are highlighted.
3. STATISTICAL SOFTWARE
• Specialized programs for complex statistical analysis
• Supports the organization, collection, interpretation, and analysis of data,
as well as calculations and the presentation of results
• A vital tool for research analysis, data validation, and findings
• Provides statistical solutions for statistical analysis
• Capabilities support analysis methodologies such as:
regression analysis
predictive analysis
statistical modelling
4. • Used by data scientists and mathematicians
• Offers industry-specific features
• Helps avoid routine mathematical mistakes and produce accurate figures
• Features can be tailored to scientific research, cost modelling, or health
• Qualities:
Packages statistical analysis capabilities, equations, and models
Facilitates data importing, preparation, and modelling
Performs complex statistical analysis
Allows comparison of statistical analysis software
Improves the quality of research
5. COMMON FEATURES OF STATISTICAL SOFTWARE
• Common characteristics make these packages reliable and suitable for data analysis:
• A data editor arranged in rows and columns, making it very easy to enter numeric data
• A menu bar comprising drop-down menus and quick analysis options,
as well as a brief user manual
• The statistical level of measurement is taken into consideration during data entry
• Getting your data ready to enter into the software:
• Defining and labelling variables
• Entering data appropriately, with each row containing a case and
each column a variable
6. • Data checking and cleaning are possible
• All data should be numeric
• Data exploration can be done to check for errors and other accuracy issues
• The statistical criterion for rejecting the null hypothesis (H0) is met
when the p-value is less than 0.05
• Time and cost effective
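The p < 0.05 decision rule above can be illustrated with a short, self-contained Python sketch. The sample values, population parameters, and function name here are hypothetical, chosen only for illustration; the calculation is a one-sample z-test using the normal approximation:

```python
import math

def z_test_p_value(sample_mean, pop_mean, pop_sd, n):
    """Two-sided p-value for a one-sample z-test (normal approximation)."""
    z = (sample_mean - pop_mean) / (pop_sd / math.sqrt(n))
    # P(|Z| > |z|) for a standard normal Z, via the complementary error function
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical sample: mean 103 from n = 50, against a population mean of 100 (sd 10)
p = z_test_p_value(sample_mean=103.0, pop_mean=100.0, pop_sd=10.0, n=50)
reject_h0 = p < 0.05  # the conventional significance threshold from the slide
print(round(p, 4), reject_h0)  # 0.0339 True -> H0 is rejected at the 0.05 level
```

Statistical packages report such p-values automatically; the point here is only that the software compares the computed p-value against the 0.05 threshold before declaring significance.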
7. ADVANTAGES OF STATISTICAL SOFTWARE
• Accuracy and speed
• Versatility
• Validity
• Graphics
• Flexibility
• New variables
• Volume of data
• Easy transfer of data
• Easy compilation, tabulation, and diagrammatic presentation
• Averages, coefficients of variation, standard deviation, standard error, and percentiles
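The summary measures named in the last bullet (averages, coefficients of variation, standard deviation, standard error, and percentiles) can all be computed with Python's standard `statistics` module; the data values below are made up purely for illustration:

```python
import statistics

data = [12.1, 13.4, 11.8, 14.2, 12.9, 13.7, 12.4, 13.1]  # hypothetical measurements

mean = statistics.mean(data)
sd = statistics.stdev(data)                  # sample standard deviation
cv = sd / mean * 100                         # coefficient of variation, in percent
sem = sd / len(data) ** 0.5                  # standard error of the mean
quartiles = statistics.quantiles(data, n=4)  # 25th, 50th, and 75th percentiles

print(f"mean={mean:.2f} sd={sd:.2f} cv={cv:.1f}% sem={sem:.2f}")
print("quartiles:", [round(q, 2) for q in quartiles])
```

A dedicated statistical package produces the same quantities through menus or a single descriptive-statistics command; the formulas underneath are identical.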
9. TYPES OF STATISTICAL SOFTWARE PACKAGES
• Open source
• Public domain
• Freeware
• Proprietary
10. OPEN SOURCE STATISTICAL SOFTWARE PACKAGES
• ADMB: non-linear statistical modelling in C++
• DAP: free replacement for SAS
• Fityk: non-linear regression
• OpenEpi: web-based, open-source, independent package for epidemiology
and statistics
• SciPy: regression, plotting, ANOVA
• PSPP: free alternative to IBM SPSS
• R: a free implementation of S
11. PUBLIC DOMAIN STATISTICAL SOFTWARE
PACKAGES
• CSPro:
Developed by the US Census Bureau and ICF International
Used for entering, editing, tabulating, mapping, and disseminating census and survey data
• Epi Info:
Epidemiology
Developed by the Centers for Disease Control and Prevention in Atlanta, Georgia (USA)
Used for electronic survey creation, data entry, and analysis (t-test and ANOVA)
• X-12-ARIMA:
Developed by the US Census Bureau
Used for adjusting time series for seasonal variations
12. FREEWARE STATISTICAL SOFTWARE PACKAGES
• WinBUGS:
Bayesian analysis
Uses Markov chain Monte Carlo methods
• WinPepi:
Epidemiology
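The Markov chain Monte Carlo idea behind WinBUGS can be sketched in miniature. The toy Metropolis sampler below targets a standard normal density rather than a real Bayesian posterior, and the function and parameter names are illustrative, not WinBUGS syntax:

```python
import math
import random

def metropolis_normal(n_samples, proposal_sd=1.0, seed=42):
    """Toy Metropolis sampler whose target is the standard normal density."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, proposal_sd)
        # Log of the target-density ratio; for N(0, 1) the log density is -x**2 / 2
        log_ratio = (x * x - proposal * proposal) / 2.0
        # Accept the proposal with probability min(1, exp(log_ratio))
        if rng.random() < math.exp(min(0.0, log_ratio)):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis_normal(20000)
mean = sum(draws) / len(draws)
print(round(mean, 2))  # close to 0, the mean of the target distribution
```

Real MCMC software samples from far more complex posterior distributions and adds convergence diagnostics, but the accept/reject core shown here is the same.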
13. PROPRIETARY STATISTICAL SOFTWARE PACKAGES
• GraphPad InStat: very simple, with lots of guidance and explanation
• GraphPad Prism: biostatistics, non-linear regression, and explanations
• IBM SPSS Statistics: comprehensive statistical package
• IBM SPSS Modeler: comprehensive data mining and text analysis
• MATLAB: programming language with statistical features
• SAS: comprehensive statistical package
• SPSS: social science
• StatsDirect: biomedical, public health, and general health science
14. MICROSOFT ADD-ON STATISTICAL SOFTWARE
PACKAGES
• Analyse-it: analysis
• NumXL: general statistics and economics
• RegressIt: multivariate data analysis and linear regression (freeware)
• SigmaXL: statistical and graphical analysis
• SPC XL: general statistics
• Stats Helper: descriptive statistics and Six Sigma
15. MOST COMMON STATISTICAL SOFTWARE
PACKAGES IN SOCIAL SCIENCE
• MS-EXCEL
• SPSS
• GRAPHPAD INSTAT
• GRAPHPAD PRISM
• STATISTIX
16. MICROSOFT EXCEL
• Part of the Microsoft Office suite of programs
• Excel version 1.0 was first released in 1985
• The latest version is Excel 2016
• One of the most popular software applications worldwide
• Good points:
Extremely easy to use and interchanges nicely with other Microsoft products
Excel can be used to analyze data in, for example, accounts, budgets, billing, and many other areas
Excel spreadsheets can be read by many other statistical packages
An add-on module included with Excel supports basic statistical analyses
Can produce very nice graphs
17. • Bad points:
Good only for general statistics; poor in regression analysis, logistic
regression, survival, variance, factor, and multivariate analysis
Excel is designed for financial calculations, although it is possible to
use it for many other things
Cannot undertake more sophisticated statistical analyses without the
purchase of expensive commercial add-ons
• Availability:
Microsoft software is usually already installed
For blue-plated (UniSA) computers, contact the IT Help Desk to install
the latest Microsoft Office software
For your own computer, you can always purchase Microsoft Office
from a retail store
21. SPSS
• Statistical Package for the Social Sciences
• Version 1 was released in 1968, well before the advent of desktop computers
• It is now on Version 23
• Data editor, output viewer, syntax editor, and script window
• Good points:
Very easy to learn and use
Can be used either with menus or with syntax files
Quite good graphics
Excels at descriptive statistics, hypothesis testing, correlation, basic regression analysis,
analysis of variance, and some newer techniques such as Classification and Regression Trees
(CART)
Has its own structural equation modelling software, AMOS, which dovetails with SPSS
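The correlation and basic regression that SPSS handles through its menus can be reproduced for a tiny dataset in plain Python. This sketch implements the textbook Pearson correlation and ordinary least-squares formulas directly; the data and function names are made up for the example:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient from its textbook formula."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def ols_line(x, y):
    """Ordinary least-squares intercept and slope for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]    # hypothetical predictor values
y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]  # hypothetical responses

r = pearson_r(x, y)
intercept, slope = ols_line(x, y)
print(f"r={r:.3f}  y = {intercept:.2f} + {slope:.2f}x")
```

A package like SPSS adds standard errors, p-values, and diagnostics on top of these same estimates, which is why dedicated software is preferred once analyses grow beyond toy examples.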
22. • Bad points:
The focus is on statistical methods mainly used in the social sciences,
market research, and psychology
Has advanced regression modelling procedures such as LMM and
GEE, but they are awful to use, with very obscure syntax
Has few of the more powerful techniques required in epidemiological
analysis, such as competing-risk analysis or standardised rates
• Availability:
SPSS is available on blue-plated (UniSA) computers;
contact the IT Help Desk to install it
26. SAS
• Statistical Analysis System
• Developed at North Carolina State University, beginning in 1966
• Contemporary with SPSS
• Good points:
Can be used either with menus or with syntax files
Much more powerful than SPSS
"Power users" like it because of its power and programmability
Commonly used for data management in clinical trials
• Bad points:
Harder and slower to learn and use than SPSS
The number of records is generally limited to the size of your hard disk
• Availability:
To organise installation, contact the IT Help Desk
28. STATA
• A more recent statistical package, with Version 1 released in 1985
• Popular in the areas of epidemiology and economics
• It is now on Version 14
• Available for Windows, Unix, and Mac computers
• Good points:
Can be used either with menus or with syntax files
Much more powerful than SPSS – probably equivalent to SAS
Excels at advanced regression modelling
Has its own built-in structural equation modelling
Has a good suite of epidemiological procedures
Researchers around the world write their own procedures in Stata
29. • A powerful statistical package with smart data-management facilities
• An excellent system for producing publication-quality graphs
• A wide array of up-to-date statistical techniques
• Bad points:
Harder to learn and use than SPSS
Performs most general statistical analyses (regression, logistic regression, survival analysis,
analysis of variance, factor analysis, multivariate analysis, and time series analysis), but
does not yet have some specialised techniques such as CART or Partial Least Squares
regression
• Availability:
Stata can be downloaded onto blue-plated computers by contacting the IT Help Desk
Students can purchase a full copy with a perpetual license from the Australian
distributors (Survey Design and Analysis) for about $200
31. R
• S-plus is a statistical programming language developed in Seattle in 1988
• R is a free version of S-plus, first released in 1996
• it is both a programming language and an environment
• one of the richest statistical systems available, with an impressive number of libraries that grows every day
• Good points
Very powerful – easily matches or even surpasses many of the models found in SAS
or Stata
Researchers around the world write their own procedures in R
• Bad points
Much harder to learn and use than SAS or Stata
Even general statistical analyses must be carried out by writing code
• Availability
http://cran.csiro.au/
32. MINITAB
• used by educators, students, scientists, business associates and researchers in a
multitude of areas
• first developed in 1972, making it one of the oldest statistical software programs available
• compatible with PC, Macintosh and Linux
• GOOD POINTS:
one of the easiest statistical software programs to use
a popular choice for those new to statistical software,
with drop-down menus and dialog boxes describing how and what to do next
persists as a popular choice for teaching students about statistics and data analysis
its user base is primarily educators using the program to show students research
methods and analysis
33. • BAD POINTS :
performs most general statistical analyses (regression, logistic regression,
survival analysis, analysis of variance, factor analysis), but
has weaknesses in the general linear model (GLM) and multilevel regression
41. GRAPHPAD PRISM
• written by Harvey Motulsky in 1989
• 2D graphics, curve fitting & statistical software for Windows
• Good Points :
Non-linear regression & removal of outliers
comparisons of models & curves, interpolation of standard curves
automatic updating of results and graphs
functionality for displaying error bars
Built-in formulae, batch processing and standardisation features,
along with automated analysis and data validation, make GraphPad
Prism popular amongst users
44. STATISTIX
• Statistix is a powerful statistical analysis program you can use to quickly analyze your
data
• Easy to Learn and Use
• Completely menu-driven: procedures are specified using concise Windows-style
dialog boxes.
• Reliable
• Developed in 1985
• Comprehensive
• Statistix performs all the basic and advanced statistics needed by most users.
• "Statistix gives the user easy access to all the common tools of data analysis
• Fast Computes lightening fast. No time consuming disk access needed
• Data are memory resident.
• "Statistix is fast, very fast.
49. APPLICATIONS
• quantitative research cannot be done effectively without statistical software
• it helps professionals interact with data, paving the way for creativity
and innovation
• user-friendly interface with drop-down tips
• allows experts to produce results in the twinkling of an eye, where
analyses previously took a long time to finish
• some analyses, such as post-hoc tests and complex time series,
regression and variance analyses, cannot be calculated effectively
by hand without statistical software packages
• statistical software has contributed immensely to social research, especially
in the areas of demography and data analysis
50. • Statistical software packages have been found to help academic
staff in higher institutions improve their research expertise through
training in the use of statistical packages.
• Statistical packages make research work more robust and faster.
• It was found that 81% of staff efficiency with statistical software is
determined by years of experience in its use and by area of
specialisation.
• The main reasons for using statistical software are its ease of use and its
suitability for many statistical analyses
• Reasons for non-use range from a lack of attention to learning it, to
difficulty of use and the cost of licensing
• Statistical software is neither expensive nor too difficult to use;
people simply do not give attention to learning it
51. • To provide the magnitude of any health problem in the community
• To find out the basic factors underlying ill-health
• To calculate sample size from a large population
• To calculate survival rates for various diseases
• To determine the association between two variables
• To study the prevalence & incidence of disease
• To find out the odds ratio, relative risk & attributable risk in case-control &
cohort studies
• To find out the normal distribution of disease
• To test the usefulness of both sera & vaccines
• To assess the role of causative factors in disease
• To introduce & promote health legislation
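The case-control and cohort measures listed above (odds ratio, relative risk, attributable risk) all come straight from a 2×2 table. A small Python sketch with hypothetical counts:

```python
# Hypothetical 2x2 table: exposure (e.g. smoking) vs. disease
# (e.g. coronary artery disease)
#                 diseased   healthy
#   exposed        a = 40     b = 60
#   unexposed      c = 10     d = 90
a, b, c, d = 40, 60, 10, 90

odds_ratio        = (a * d) / (b * c)                # case-control measure
relative_risk     = (a / (a + b)) / (c / (c + d))    # cohort measure
attributable_risk = a / (a + b) - c / (c + d)        # risk difference
print(odds_ratio, relative_risk, attributable_risk)
```

Every package discussed above computes these from tabulated data; the formulas themselves are this simple.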
52. • To evaluate the activity of a drug
• To explore whether changes produced by a drug are due to the drug's
action or to chance
• To compare the actions of two or more different drugs
• To find out the association between a disease & a risk factor, such as
coronary artery disease & smoking
• Population genetics, in order to find out variation in genotype &
phenotype
• Genomics & Proteomics
• Demography
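Comparing the actions of two drugs typically means comparing group means. A minimal sketch of Welch's t-statistic on hypothetical data (the statistic only; a package would also supply the p-value):

```python
import math
import statistics as st

# Hypothetical outcome: reduction in blood pressure (mmHg) under two drugs
drug_a = [12, 15, 11, 14, 13, 16]
drug_b = [8, 10, 9, 11, 7, 10]

m1, m2 = st.mean(drug_a), st.mean(drug_b)
v1, v2 = st.variance(drug_a), st.variance(drug_b)
n1, n2 = len(drug_a), len(drug_b)

# Welch's t-statistic: difference in means divided by its standard error
t = (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)
print(f"mean A = {m1:.1f}, mean B = {m2:.1f}, t = {t:.2f}")
```

A large |t| suggests the difference between the drugs is unlikely to be due to chance alone.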
53. • Education
• Government
• Marketing organizations
• NGOs
• Telecommunication
• Banking
• Insurance
• Healthcare
• Manufacturing
• Social science
• Health science
• Pharmacy
• Economics