Research methodology

Design and methodology of a study<br />PG guide: Dr S R Suryawanshi; Student: Dr Saiprasad Bhavsar<br />
Research is a topic that many find intimidating.<br />This issue of Research Methodology will look at the concept of research, as well as the related concepts of evaluation and quality improvement, study design and analysis — activities that many health workers are already familiar with through their current work.<br />
Basic classification<br />The issues in research methodology can be conveniently divided into six steps for a better understanding of the process:<br />1. Defining what is research<br />2. Defining what is evaluation<br />3. How to formulate a research question<br />4. Reviewing the literature<br />5. Study design<br />6. Quantitative data analysis<br />
What is research?<br />A broad definition of research is any activity undertaken to increase knowledge.<br />In the health field, research can be defined as the systematic investigation of a problem, issue or question which increases knowledge and understanding of health and the provision of care.<br />
Evaluation: people who work in the health area are involved in the implementation of projects which aim to improve health or health service delivery.<br />Evaluation is an integral part of this implementation.<br />It can be defined as the process of judging the value of an intervention by systematically gathering information to make more informed decisions.<br />
Quality improvement (QI): QI provides a framework for monitoring and improving performance by systematically reviewing the care provided, or the outcomes achieved, against explicit criteria.<br />In short:<br />Research questions aim to discover new knowledge;<br />evaluation questions aim to judge the worth of an intervention;<br />QI questions aim to examine how well something is done and to improve performance.<br />
What is evaluation?<br />Evaluation is the process of judging the worth of something.<br />It can determine if an intervention worked, help decide if it should continue, and provide evidence of effectiveness to obtain additional funding.<br />
Some basic principles of evaluation<br />Why evaluate?<br />A good program evaluation will tell you and others:<br /><ul><li>what your program has done;</li><li>how well it has contributed to the goal, met the objectives and undertaken the strategies;</li><li>what worked well and what didn’t, and why;</li><li>whether there were any unintended outcomes; and</li><li>what can be learnt from the program to improve practice and inform other programs.</li></ul>When should evaluation occur?<br />Evaluation should run in parallel to the planning and implementation of your program.<br />It should become part of a continual development process by providing feedback about progress, encouraging reflection about outcomes, and providing a basis for considering future strategies.<br />
What steps are involved in evaluating a program?<br />There are three broad tasks involved in evaluating a program:<br /><ul><li>developing an evaluation plan;</li><li>assessing the results;</li><li>communicating the results and recommendations.</li></ul>What should be in an evaluation plan?<br />Developing an evaluation plan involves:<br /><ul><li>clarifying the purpose of your evaluation;</li><li>selecting the scale and scope;</li><li>determining the methodology;</li><li>organising how the evaluation will be conducted.</li></ul>
What types of evaluation are there?<br /><ul><li>Formative and summative</li></ul>1) Formative evaluation:<br />- conducted early in the implementation of a program;<br />- aims to identify problems that arise during development and allows modification.<br />2) Summative evaluation:<br />- conducted at the end;<br />- looks at effects or impact;<br />- helps to decide what to do next.<br />
Process, impact and outcome evaluation<br />1) Process evaluation:<br />- focuses on how the program has been implemented;<br />- assesses whether activities were conducted as planned.<br />2) Impact evaluation:<br />- focuses on the immediate effects of the program;<br />- judges how well the objectives were met.<br />3) Outcome evaluation:<br />- focuses on the longer-term effects of the program;<br />- judges how well the goal has been achieved.<br />
How to formulate a research question<br />Research originates with an idea about some general problem or question.<br />This problem or question is narrowed down to a more specific research question, which then represents the central issue being addressed.<br />First, it is important to distinguish between descriptive and analytical studies.<br />
Descriptive studies ask simpler questions about what is going on.<br />For example: “How many, or what proportion of, patients admitted to hospital have a fractured neck of femur?”<br />Analytical studies compare one or more interventions or exposures.<br />For example: “Is it more effective to educate GPs about depression guidelines with group education sessions or practice visits?” or “Is lung cancer associated with cigarette smoking?”<br />
First steps in formulating your question<br /><ul><li>Focusing analytical questions</li></ul>Well-built clinical questions usually contain four elements.<br />- Patient or problem: starting with your patient, ask yourself, “How would I describe a group of patients similar to mine?”<br />- Intervention or exposure: ask, “Which main intervention am I considering?”<br />- Comparison intervention: ask, “What is the main alternative to compare with the intervention?”<br />- Outcomes: ask, “What can I hope to accomplish?” or “What could this intervention really affect?”<br />
Criteria for a good question<br />A good research question is described by the acronym FINER:<br /><ul><li>Feasible (adequate subjects, technical expertise, time and money, and scope)</li><li>Interesting to the investigator</li><li>Novel (confirms or refutes previous findings, provides new findings)</li><li>Ethical</li><li>Relevant (to scientific knowledge, clinical and health policy, future research directions)</li></ul>Reviewing the literature<br />Why do a literature review?<br /><ul><li>Determines to what extent the issue or research question has been previously researched</li><li>Identifies past relevant studies as well as the methods used</li><li>Assists in refining your research question</li><li>Puts the project and methodology into a relevant context</li><li>Adds valuable background to the study or formal report</li><li>Suggests areas requiring further investigation</li><li>Required for funding applications</li></ul>The two main components are conducting the search and critically appraising the results of your search, the published papers.<br />There are a number of courses on how to do literature searches.<br />These are available through Area Health Services (e.g. through the Clinical Information Access Program - CIAP) and university libraries.<br />Free databases are available at PubMed, the Cochrane Library and PEDro.<br />
Critical appraisal skills<br />Critical appraisal skills are essential for helping to decide if published research is of sufficiently high quality. Critical appraisal checklists have been produced by many authors.<br />CASP (2000) identified three broad issues that need to be considered when appraising research:<br /><ul><li>Are the results of the study valid?</li><li>What are the results?</li><li>Will the results help locally?</li></ul>STUDY DESIGN<br />Various designs are available depending on the kind of question/s being asked.<br />For example, descriptive questions do not require a randomised controlled trial (RCT), but studies to evaluate a new treatment would benefit from an RCT.<br />The following study designs are used depending upon the conditions appropriate to the study.<br />
Descriptive and analytical studies<br />One way of classifying study designs is to divide them into descriptive studies, which describe a situation, and analytical studies, which try to explain a situation by formulating and testing hypotheses.<br />Non-experimental and experimental studies<br />Studies can also be categorised as non-experimental (or observational), with no intervention, or experimental, where the researcher intervenes, e.g. by introducing a new treatment.<br />
Main study designs<br /><ul><li>Randomised controlled trials (RCT)</li><li>Intervention trials</li><li>Longitudinal studies</li><li>Cross-sectional studies</li><li>Cohort studies</li><li>Case-control studies</li><li>Case study or series</li></ul>RANDOMIZED CONTROLLED TRIAL<br /><ul><li>In the true sense, this is an experimental epidemiological study.</li><li>The heart of this trial is randomization.</li><li>Randomization is the statistical procedure by which participants are allocated to groups, usually ‘study’ and ‘control’ groups, to receive or not receive the intervention.</li><li>It ensures that the investigator has no control over the allocation of participants, thus eliminating what is known as “selection bias”.</li><li>Thus every individual gets an EQUAL opportunity to be selected into any of the trial groups.</li><li>Bias from participants, investigators or analysts knowing the allocation is minimized by blinding, of which there are three types:</li></ul>Single blinding<br />Double blinding<br />Triple blinding<br />
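The allocation procedure described above can be sketched in a few lines. This is a minimal illustration of simple randomization, written in Python with made-up participant IDs; a real trial would follow a formal randomization protocol.

```python
import random

def randomise(participants, seed=None):
    """Randomly allocate participants to 'study' and 'control' groups.

    Shuffling the list and splitting it in half gives every participant
    an equal chance of ending up in either group, so the investigator
    has no control over allocation (eliminating selection bias).
    """
    rng = random.Random(seed)  # seeded for reproducibility
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"study": shuffled[:half], "control": shuffled[half:]}

# hypothetical trial with 20 participants numbered 1..20
groups = randomise(list(range(1, 21)), seed=42)
print(len(groups["study"]), len(groups["control"]))  # 10 10
```

Note that blinding is a separate safeguard: randomization decides who gets the intervention, while blinding hides that decision from participants and assessors.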
Types of randomized trials are as follows:<br />a) Clinical trials: e.g. the evaluation of beta-blockers in reducing cardiovascular mortality in patients surviving the acute phase of myocardial infarction.<br />b) Preventive trials: e.g. vaccination trials.<br />c) Risk factor trials: e.g. four main interventions to reduce CHD are reduction in blood cholesterol, cessation of smoking, control of hypertension, and promotion of regular physical exercise.<br />d) Cessation experiments: e.g. decreased incidence of lung cancer in the group who have given up smoking.<br />e) Trials of an etiological agent: e.g. retrolental fibroplasia as a cause of blindness in premature babies exposed to a continuous supply of high-concentration oxygen, compared with a control group given oxygen only on an emergency basis.<br />f) Evaluation of a health service: an excellent example is the chemotherapy of tuberculosis in India, which showed that ‘domiciliary treatment’ of TB was as effective as much costlier hospital or sanatorium treatment.<br />
COHORT STUDY<br /><ul><li>Another type of analytical (observational) study.</li><li>Also known as a prospective study, longitudinal study, incidence study, or forward-looking study.</li><li>Distinguishing features are:</li></ul>Cohorts are identified prior to the appearance of the disease.<br />The study groups so defined are observed over a period of time to determine the frequency of the disease.<br />The study proceeds forwards from cause to effect.<br />Advantages are:<br />Incidence can be calculated<br />Several possible outcomes can be studied<br />Direct estimation of relative risk and attributable risk<br />Dose-response relationships can be assessed<br />Bias such as misclassification of groups into diseased and non-diseased is minimized<br />
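The direct estimation of relative risk and attributable risk mentioned among these advantages can be illustrated from a 2x2 cohort table. The counts below are hypothetical, chosen only to show the arithmetic.

```python
def cohort_measures(a, b, c, d):
    """Relative risk and attributable risk (%) from a cohort 2x2 table.

    a: exposed, diseased      b: exposed, not diseased
    c: unexposed, diseased    d: unexposed, not diseased
    """
    incidence_exposed = a / (a + b)
    incidence_unexposed = c / (c + d)
    # relative risk: how many times more likely disease is in the exposed
    relative_risk = incidence_exposed / incidence_unexposed
    # attributable risk (%): share of disease among the exposed
    # attributable to the exposure
    attributable_risk = (incidence_exposed - incidence_unexposed) / incidence_exposed * 100
    return relative_risk, attributable_risk

# hypothetical cohort: 30/100 exposed and 10/100 unexposed develop disease
rr, ar = cohort_measures(30, 70, 10, 90)
print(round(rr, 1), round(ar, 1))  # 3.0 66.7
```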
CASE-CONTROL STUDY<br />Often called a “retrospective study”, this is a common first approach to testing a causal hypothesis.<br />A case-control study has these distinct features:<br />a) Both exposure and outcome have occurred prior to the start of the study.<br />b) The study proceeds backwards from effect to cause.<br />c) It uses a control or comparison group to support or refute the hypothesis.<br />d) It is basically a comparison study.<br />e) Estimation of the odds ratio (as an approximation of the relative risk) is the main distinguishing feature of a case-control study.<br />
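The odds ratio is computed as the cross-product ratio of the 2x2 table. A small sketch, using invented counts purely for illustration:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a case-control 2x2 table.

    a: cases exposed      b: controls exposed
    c: cases unexposed    d: controls unexposed
    OR = (a * d) / (b * c), the cross-product ratio: the odds of
    exposure among cases divided by the odds among controls.
    """
    return (a * d) / (b * c)

# hypothetical: 33 of 35 cases exposed vs 55 of 82 controls exposed
print(round(odds_ratio(33, 55, 2, 27), 1))  # 8.1
```

When the disease is rare, this odds ratio is a good approximation to the relative risk, which a case-control study cannot estimate directly.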
Quantitative data analysis<br />Types of variables<br />An item of data that can be observed or measured is called a variable. There are two main types of variables.<br />1) Numerical variables can be:<br />- Discrete variables: values that are separate and distinct, e.g. number of GP visits; or<br />- Continuous variables: when all values are possible, e.g. blood pressure, weight.<br />2) Categorical variables represent membership of a particular category. They can be:<br />- Ordinal variables: several categories where order is relevant, e.g. physical activity measured as minimal, moderate or vigorous;<br />- Nominal variables: no natural order, e.g. area of residence; or<br />- Dichotomous variables: only two responses, e.g. yes/no.<br />
In summarising numerical data, some common measures are:<br /><ul><li>Mean: sum of all individual counts or measures divided by the number of individuals;</li><li>Mode: most frequently occurring count or measure across a group of individuals;</li><li>Median: middle observation in a sample of individuals;</li><li>Range: difference between the maximum and minimum observations; and</li><li>Standard deviation: a measure of how much the individual data tend to deviate from the mean.</li></ul>To describe the relationship between numerical variables, a common measure is the correlation coefficient, r. Correlations range between -1 and +1 and describe the nature and degree of association between two variables.<br />It is important to remember that correlations are not concerned with causality. An additional factor may underlie both variables.<br />
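These summary measures, and the correlation coefficient r, can be computed with Python's standard `statistics` module. The blood-pressure readings below are invented for illustration.

```python
import statistics

# hypothetical systolic blood pressure readings (a numerical variable)
data = [120, 125, 118, 130, 125, 140, 122]

mean = statistics.mean(data)
median = statistics.median(data)
mode = statistics.mode(data)
data_range = max(data) - min(data)   # maximum minus minimum
sd = statistics.stdev(data)          # sample standard deviation

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two numerical variables.

    r ranges from -1 to +1 and describes association, not causality.
    """
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

print(mode, data_range)  # 125 22
```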
In summarising categorical data, counts are used. They can also be expressed as proportions or percentages by dividing the count by the total number of individuals.<br />While categorical variables may be coded using numbers, it is important not to summarise them as numerical data, e.g. by averaging the coded numbers of a Likert scale running from “very dissatisfied” to “very satisfied”.<br />
The relationship between categorical variables is usually presented in a contingency table and tested using the chi-square test.<br />Statistical tests: there are many statistical tests available. Two of the most common are the t-test for numerical data and the chi-square test for categorical data.<br />
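For a 2x2 contingency table, the chi-square statistic compares the observed counts with the counts expected under independence. A stdlib-only sketch with hypothetical data; the resulting statistic would be compared against the critical value 3.84 (one degree of freedom, 0.05 level):

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 contingency table [[a, b], [c, d]].

    Compares observed counts with the counts expected if the row and
    column variables were independent: chi2 = sum((O - E)^2 / E).
    """
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# hypothetical table: 30/50 improved on treatment vs 20/50 on control
chi2 = chi_square_2x2(30, 20, 20, 30)
print(round(chi2, 2))  # 4.0 -> exceeds 3.84, so significant at p < 0.05
```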
Inferential statistics<br />In inferential statistics you are trying to reach conclusions about a population, based on a sample of individuals from that population.<br />In hypothesis testing, statistical methods are used to determine the probability of obtaining the observed effect by chance. The p-value of your chosen statistical test is compared to the level of significance (usually set at 0.05 or 0.01).<br />For example, a t-test with a p-value of 0.03 (and level of significance of 0.05) indicates that the results are unlikely to be due to chance and are statistically significant.<br />
Confidence intervals can also be used. Based on a sample, they are estimates of a range of values (the confidence interval) within which the population parameter is likely to lie.<br />With a confidence level of 95%, you can say that, based on your sample, you are 95% confident that the population value lies within your confidence interval.<br />
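A normal-approximation 95% confidence interval for a mean can be sketched as mean ± 1.96 × standard error. The sample values are hypothetical, and the 1.96 multiplier is the normal-distribution value, reasonable for fairly large samples:

```python
import statistics

def ci_95_mean(sample):
    """Approximate 95% confidence interval for a population mean.

    Uses mean +/- 1.96 * standard error, where the standard error is
    the sample standard deviation divided by the square root of n.
    """
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / len(sample) ** 0.5
    return m - 1.96 * se, m + 1.96 * se

# hypothetical measurements with mean 5.0
sample = [4.0, 5.0, 6.0, 5.0, 4.5, 5.5, 5.0, 5.0]
low, high = ci_95_mean(sample)
```

The interval is centred on the sample mean; a larger sample shrinks the standard error and therefore narrows the interval.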
REMEMBER<br />There are many steps, as explained earlier, in planning, implementing and completing research.<br />Among these, it is essential to choose the appropriate study design.<br />Plan carefully before implementing your study.<br />
The choice of a good study design is very important because<br />a badly designed study can NEVER be RETRIEVED…<br />
THAT’S ALL FOR THE DAY…<br />THANK YOU VERY MUCH…<br />
