
Retrospective Study Design


Retrospective Study Design by Dr Ashok S Gavaskar, Assistant Editor, Indian Journal of Orthopaedics



  1. Retrospective study design: how to set it up? Ashok S Gavaskar, Asst. Editor, Indian Journal of Orthopaedics. Research workshop, IOACON ’16
  2. Retrospective study design • Most common form of analysis (data originally collected for other reasons) • Quick • Inexpensive • Suited to rare outcomes and conditions with a long latent period • Generates hypotheses • Drawbacks: prone to bias; cannot on its own provide valid conclusions
  3. Retrospective design: key points • Outcome: a measurable parameter of clinical interest that “has already occurred”
  4. Retrospective design: key points • Exposure: the ‘factor of interest’ • Interventional studies (can only be prospective): you control the factor of interest • Observational studies (prospective or retrospective): “you just observe”
  5. Retrospective design: key points • Observational (retrospective) designs: cross-sectional, case-control and cohort
  6. Cross-sectional design • No direction; a single point in time (e.g. a survey) • Different groups compared at ONE time • Descriptive purposes (states the problem) • Poor inference
  7. Case-control design • Start with cases (with the disease) and controls (without it), then review records to classify each group as exposed or unexposed • Rare outcomes (only one outcome) • Multiple exposures • Moderate inference
  8. Cohort design • The study begins by identifying exposed and unexposed groups from a records review, then ascertains outcomes (disease / no disease) in each • Common outcomes (multiple outcomes) • Multiple exposures • Strongest observational design
  9. Doing a good retrospective study: the research question • Description: what is going on? (incidence/prevalence research); reported as proportions, percentages, central tendency and variability • Relationship: how are phenomena related? Reported as correlation coefficients • Comparison: differences in the variable of interest among groups; reported as central tendency
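As a quick illustration of the statistics each question type calls for, here is a minimal Python sketch (standard library only; statistics.correlation needs Python 3.10+). The clinical values are invented purely for illustration.

```python
# Descriptive vs relationship statistics for hypothetical chart-review variables.
from statistics import mean, median, stdev, correlation

blood_loss = [250, 300, 275, 320, 290, 310]   # hypothetical outcome of interest (ml)
op_time = [60, 75, 70, 85, 72, 80]            # hypothetical exposure (minutes)

# Description: central tendency and variability
print(mean(blood_loss), median(blood_loss), stdev(blood_loss))

# Relationship: correlation coefficient between the two variables
print(correlation(op_time, blood_loss))
```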
  10. Literature review • An essential prerequisite • Systematic review (study’s area of focus, demographics, criteria) • Search multiple databases • Provides background on key concepts and variables
  11. Study proposal • Abstract • Introduction • Research question • Literature review • Methodology (sample, design, variables, instruments) • Significance • Limitations • Budget • References
  12. Key elements: sampling issues • Sample size • Sampling strategy • A key element in any research proposal
  13. Sample size • Power analysis (power: the probability of rejecting the null hypothesis when a true effect exists) determines the required sample size; a common rule of thumb is 10 cases per variable • Tools: textbooks, journal articles, downloadable software programs (e.g. G*Power 3.0)
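To show the kind of calculation a tool such as G*Power performs, the sketch below uses Python's statsmodels package (an assumption; the slide names only G*Power) to estimate the per-group sample size for a two-sample t-test. The effect size, alpha and power are illustrative values only.

```python
# Sample-size estimate for a two-sample t-test (illustrative values only).
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,  # assumed medium effect (Cohen's d)
    alpha=0.05,       # two-sided significance level
    power=0.80,       # desired power (probability of detecting the effect if it exists)
)
print(f"Required sample size per group: {n_per_group:.0f}")
```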
  14. Sampling strategy: convenience sampling • Uses what is available at your disposal (e.g. cases within a particular time frame) • Suited to rare cases and outcomes • Small sample sizes
  15. Sampling strategy: random sampling • The gold standard (every record has an equal chance of selection) • Suitable for multi-centre trials • Common disorders
  16. Sampling strategy: systematic sampling • Every nth case is selected (not truly random) • Requires access to a large number of records
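The difference between random and systematic selection of charts can be sketched in a few lines of Python; the record IDs and sample size below are hypothetical.

```python
# Random vs systematic sampling of chart record IDs (hypothetical data).
import random

record_ids = list(range(1, 1001))  # e.g. 1,000 available chart numbers
sample_size = 50

# Random sampling: every record has an equal chance of selection.
random_sample = random.sample(record_ids, sample_size)

# Systematic sampling: every nth record after a random start (not truly random).
step = len(record_ids) // sample_size
start = random.randrange(step)
systematic_sample = record_ids[start::step][:sample_size]

print(len(random_sample), len(systematic_sample))
```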
  17. Study proposal: variables (within the methodology section) • Define each variable and operationalise it (guided by the literature review): translating a construct into its measurable manifestation • Keep future prospective studies in mind
  18. Study proposal: design (within the methodology section) • Map the flow of information • Go through a few charts first • Involve on-site clinicians (multi-centre studies)
  19. Methodology: instruments (paper or digital) • Paper: cost-effective; pre-printed forms avoid the coder’s interpretation of data • Drawbacks: handwriting, storage, maintenance
  20. Methodology: instruments, digital • Suited to large retrospective chart reviews (RCRs) • Centralised data storage • Prone to entry and transcription errors • Forms can be generated from software packages
  21. Data abstraction: inclusion/exclusion criteria • Exclude records with insufficient variables recorded • Exclude records with excessive or confounding comorbidities • Exclude confounding factors that can degrade the validity of the data • Constantly review the excluded data
  22. Data abstraction: coding/procedure manual • Ensures accuracy, reliability and consistency of data • Clear definitions • Protocols • Steps for data extraction
  23. Data abstraction: data abstractors • Selection and training • Blinding (to limit reviewer bias) • Intra- and inter-rater reliability
  24. Data abstraction: intra- and inter-rater reliability (statistical estimates that report consistency in coding) • Inter-rater: Cohen’s kappa (extent of agreement, -1 to +1; for an RCR, aim for 0.6) • Intra-rater: intraclass correlation coefficient (ICC)
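Inter-rater agreement can be checked directly once two abstractors have coded the same charts; the sketch below uses scikit-learn's cohen_kappa_score (an assumption, since the slides do not specify software) on made-up ratings.

```python
# Cohen's kappa for two abstractors coding the same 10 charts (made-up ratings).
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")  # aim for roughly 0.6 or more in an RCR
```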
  25. Data management • Use a software package (e.g. Microsoft Access, Medquest) for data input, statistics and reporting
  26. Pilot study • Very useful: helps assess the feasibility of the study design and evaluate the methodology • Around 10% of the target population
  27. Summary: for a good RCR… • Well-defined research questions • Sampling: size and strategy • Operationalise variables • Data abstraction process (the most important step) • Inclusion and exclusion criteria • Observer reliability • Pilot test
