Publishing in Public Health

This session (from the CORE Group Fall 2008 meeting) provides an overview of what to consider when seeking to publish an article in a public health journal. Topics discussed include developing a focus for your article, writing an abstract, working with field staff to gather data and information, dealing with space limitations, and working with an editorial review board.

  1. Publishing Roundtable: Publishing results of program evaluations
     Peter Winch, Houkje Ross, Jim Ricca
     CORE Fall Membership Meeting 2008
  2. Some recent history
     • Session on publishing at the 2008 CORE Spring Meeting in Atlanta: David Marsh & myself
     • Elluminate session "Documenting and Testing Innovations", 15 Aug 2008
     • Today: Publishing Roundtable, focus on how to publish program evaluations
  3. "Emergent" topic for today's roundtable
     • CORE members frequently have exciting results from final project evaluations
       ◦ Eagerness to disseminate the results beyond the narrow audience for the final report
       ◦ Final results may represent many years of effort; there is a desire to demonstrate what has been achieved
     • Can final project evaluations be turned into peer-reviewed articles?
  4. Challenges to publishing results of final evaluations
     • Estimation of effort required
     • Identification of key messages
     • Selection of journal
     • Presentation of quantitative data
     • Presentation of qualitative data
  5. Challenges to publishing evaluations: 1. Estimation of effort required
     • Implementing the project, carrying out the final evaluation, and writing the final evaluation report take a great deal of work
     • When people aim to publish a paper from a final evaluation, there is often an assumption that most of the work is already done
     • It is also difficult to decide who will do the work, as project staff and the evaluation team may quickly move on to other things
  6. Don't Underestimate Level of Effort
     • Remember What's Involved in Article Development:
       ◦ Data and information collection
       ◦ Field staff and others involved in the project need to be available to answer any questions
       ◦ Dedicated person to write up the information
       ◦ Review team
  7. Challenges to publishing evaluations: 2. Identification of key messages
     • Final evaluation reports tend to be comprehensive and deal with all aspects of the project
     • One reason they are comprehensive is the need to be accountable to the donor for how funds were spent
     • No article can be comprehensive; you need to strategically select a few key messages and present data to support them
  8. Challenges to publishing evaluations: 2. Identification of key messages
     • One approach is "abstract first"
       ◦ Write a ~300-word abstract prior to working on the article
       ◦ Good to start with a structured abstract
       ◦ Show the abstract to some people not involved in the project: do they get excited about the article?
       ◦ Then outline the article, including only content directly related to the abstract
  9. Abstract example #1
     • Belachew T, Nekatibeb H. Assessment of outpatient therapeutic programme for severe acute malnutrition in three regions of Ethiopia. East African Medical Journal. 2007;84(12):577-588.
     • This is a structured abstract, and the structure is indicated by the labels.
     • A bit long: 358 words.
  10. OBJECTIVE: To document the experiences and lessons for rolling out of the OTP service at the wider scale with the aim of assessing the strengths and weaknesses of the project and suggest recommendations for future programming.
      DESIGN: Qualitative methods of data collection including focus group discussion, observation and in-depth interview of key informants were employed to get relevant data. Review of health facility reports and programme documents were done to capture further information.
      SETTING: Out Patient Programme (OTP) pilot programme implemented by CONCERN/VALID in three administrative regions of Ethiopia namely: South Nations and Nationalities Peoples Regions (SPNNR), Addis Ababa and Oromia regions. A total of thirteen health centres which had started OTP service from the three regions were included in the study.
      SUBJECTS: Thirty six key informants and 30 focus group discussants were involved in the study conducted from 16th to 25th November 2006.
      RESULTS: Out Patient Programme (OTP) has enhanced community's understanding of malnutrition as a health problem through an excellent entry point it created for behaviour change communication (BCC) on optimal infant and young child feeding (IYCF). It has also enhanced utilisation of the existing equipments of the respective health services to promote nutrition and increased mental satisfaction of the providers who observed rapid recovery of malnourished children taking the plumpy nut. It also resulted in increased awareness of the community about malnutrition and its treatment, which resulted in increased need-based demand for the OTP and self-referral of children to health facilities. Shift in the thinking of the providers on the fact that malnutrition can be treated without admitting the child and reduction in the burden of malnutrition and associated mortality are other positive findings of the study.
      CONCLUSION: While it was observed that the programme was very effective in treating case of severe acute malnutrition and is highly acceptable by planners, health care providers and beneficiaries, there were different operational issues that needed to be strengthened. The irregularity and incompleteness of supply availability, high attrition of trained human power, inadequate supportive supervision especially from local ministry of health, inadequate community mobilisations are some of the shortcomings identified. Based of these findings recommendations were forwarded.
  11. Abstract example #2
      • Sibley L, Buffington ST, Tedessa L Sr, McNatt K. Home-Based Life Saving Skills in Ethiopia: an update on the second phase of field testing. Journal of Midwifery & Women's Health. 2006;51(4):284-291.
      • Not a structured abstract, but has a structure.
      • Much shorter: 195 words.
  12. Home-Based Life Saving Skills (HBLSS) was integrated over 3 years into a district-level child survival project coordinated through the Ministry of Health and Save the Children Foundation/US in Liben Woreda, Guji Zone, Oromia Region, southern Ethiopia.
      During late 2004, the second phase of the program was reviewed for performance, home-based management, learning transfer, and program coverage.
      The immediate posttraining performance score for HBLSS guides for "First Actions" was 87% (a 78% increase over the pretraining baseline) and 79% at 1 year (a 9% decrease from the immediate posttraining score). The home-based management score of women attended by HBLSS guides for "First Actions" was 89%, compared to 32% for women assisted by other unskilled attendants. HBLSS guides teach women and families in the community as they were taught, by using pictorial Take Action Cards, role-play and demonstration, and a variety of venues. Estimates of HBLSS coverage suggest that HBLSS guides attended 24% to 26% of births, and 54% of women giving birth were exposed to HBLSS training.
      The HBLSS field tests demonstrate a promising program that increases access to basic care for poor, underserved, rural populations who carry the greatest burden of maternal and neonatal mortality.
  13. Challenges to publishing evaluations: 3. Selection of journal
      • The best-known journals rarely publish articles based on program evaluations: Lancet, British Medical Journal, New England Journal of Medicine, American Journal of Public Health
      • They see their niche as publishing results of randomized trials
  14. Challenges to publishing evaluations: 3. Selection of journal
      • Journals you might consider:
        ◦ Health Policy and Planning
        ◦ Implementation Science
        ◦ Tropical Medicine and International Health
        ◦ Journals on maternal and child health
          - International Journal of Gynecology and Obstetrics
          - Journal of Midwifery & Women's Health
          - Journal of Tropical Pediatrics
          - Annals of Tropical Paediatrics
        ◦ Local/regional journals
  15. Challenges to publishing evaluations: 4. Presentation of quantitative data
      • A focus on impact on mortality and morbidity tends to cause problems with reviewers unless there was a strong study design: comparison group, randomization of who got the intervention, pre- and post-intervention measurements
      • The claims you are making should match the level of evidence
      • Although you are most excited about the mortality impact, stressing it may lead to rejection of your manuscript
  16. What is your level of evidence?
      Higher:
      • Separate implementation & evaluation teams/staff
      • Prospective study
      • Have comparison group
      • Randomization of who gets intervention
      • Many communities/units randomized
      • Sample size calculation based on hypothesis
      • Quality control of data collection & entry
      Lower:
      • Implementation team/staff also evaluate
      • Retrospective or cross-sectional study
      • No randomization
      • No comparison area at all
      • Only one intervention and only one comparison area or district
      • No sample size calculation
      • Routine data, no or limited quality control
  17. Level of evidence
      Higher:
      • Causal attribution of change to program: Yes (or a qualified yes)
      • Statements in your paper about the links between the observed changes and the intervention or program:
        ◦ "The intervention was effective in increasing X"
        ◦ "The intervention had a significant impact"
        ◦ "The program produced a large drop in mortality"
        ◦ "A triumph for humanity"
        ◦ "Two thumbs up"
      Lower:
      • Causal attribution of change to program: No; doesn't control for other changes
      • Statements in your paper about the links between the observed changes and the intervention or program:
        ◦ "We observed higher coverage in the intervention area"
        ◦ "The intervention was associated with increases in X and Y"
        ◦ "We recorded fewer deaths in the intervention area, which may be due to X, Y or Z..."
  18. Challenges to publishing evaluations: 5. Presentation of qualitative data
      • Key data may be qualitative
      • Presenting quotes can use up much of your word count
      • Sometimes quotes can be presented in tables
      • Need to find other program evaluation papers to see how they present qualitative data
  19. Finding your focus
  20. "The difficulty of literature is not to write, but to write what you mean."
      --Robert Louis Stevenson
  21. YOUR PAPER NEEDS A FOCUS
  22. Consider Your Purpose & Audience
      • Who do you want to share the information with?
        ◦ Other CORE members?
        ◦ Outside audiences?
        ◦ A specific publication?
      • For what purposes?
        ◦ Sharing effective strategies
        ◦ Publicity
        ◦ Knowledge sharing among colleagues
  23. Some Things to Focus On…
      • A new method (i.e., no one has tried this before, you did, and you have promising results)
      • A particular aspect of your program or project that worked well and achieved a measurable result
      • A particular aspect of your program that changed something (like policy or behavior) in some way
  24. Too many good ideas? Finding a Topic
      • Take your time in developing a topic
      • Review what you know
      • Brainstorm:
        ◦ Allow yourself and/or your team time to consider the ideas
        ◦ Look for ideas that are manageable and doable
  25. Try Clustering
      [Cluster diagram example; nodes include: Senegal Health System, What are its components?, Changing Cultural Norms, Behavior Change Methods, Grandmother Solidarity Circles, Training and Supervision of Volunteers, M&E, Revitalized Health Huts, DOTS, Bamako Initiative 1987, TB Services at Community Level?, Cultural Views of Disease]
  26. Where are the Information Gaps?
      • After you cluster, your idea may become clear
      • Additional information gathering may be needed
        ◦ Interviews with project staff
        ◦ Requests for M&E data
        ◦ Review of other literature to make comparisons
  27. What Do You Mean?
      • Details are important. You don't want your reader to have to ask, "What does that mean?"
      • It's not enough to say: "This project was innovative."
      • You must tell the reader what you mean when you say "innovative."
  28. Examples for discussion
      • Feedback from The Lancet on a manuscript estimating lives saved by NGO programs
      • Evaluation of CCF programs in Senegal
      • Evaluating quality in PEPFAR programs: URC
  29. CSHGP Lives Saved Article Case Study
      Submitted to The Lancet, March 2008
      Publishing Roundtable, CORE Fall Meeting, October 2, 2008
  30. CSHGP Lives Saved article: Background
      • Preliminary work had been done over the last 2½ years, continually refining the initial analysis
      • Various stages of the work had been presented at CORE (three times), USAID (twice), APHA (twice), and GHC, as well as in the last three CSHGP Results Reports
      • There was a desire for a peer-reviewed publication to increase the credibility and visibility of the work done by PVOs generally and CSHGP grantees specifically
  31. CSHGP Lives Saved article: Key Issues
      • What would be the main theme?
        ◦ The method itself?
        ◦ The level of impact of projects? (If so, how to present results of "small" projects in a meaningful way?)
        ◦ The cost-effectiveness? (If so, what measure and what benchmark?)
      • Lancet wants RCTs. Was there a control that we could retrospectively try to use?
  32. CSHGP Lives Saved article: Abstract
      Methods
      The IMPACT tool developed by the Child Health Epidemiology Reference Group (CHERG) was used to convert project-generated outcome data into estimates of impact on reducing child mortality. Regional (if available) or national trend in under-five mortality was subtracted from the modeled project impact. The additional mortality decline was attributed to the project. Project cost data were also analyzed. The additional costs of the project-associated community-oriented activities were divided by the incremental impact estimated using the IMPACT tool to give the incremental cost effectiveness. Contextual analysis for explanatory and confounding factors was done through a review of project documentation and interviews with project staff.

      Findings
      These integrated community-oriented projects intervened on an average of 8·6 CHERG life-saving indicators (range 4 to 12). In the aggregate, they achieved statistically significant positive change on an average of 6·3 of these indicators (range 1 to 12). The interventions most responsible for the estimated mortality decline were preventive and delivered in community settings, although not exclusively so. These outcomes were modeled to result in a median estimated annual decline in under-five mortality of 5·6% (range 0·3 to 10·9%). Seventeen of the 22 projects are estimated to have an impact exceeding the comparison regional/national mortality trend, generating an estimated additional median annual mortality decline of 3·0% (range 0·0 to 9·1%). Among the top tercile of projects, the median estimated cost per project-attributable Disability Adjusted Life Year was $65 (range $22 to $119). A variety of specific community mobilization approaches were used by the projects in the top tercile, but all were strongly community-oriented, built on locally relevant community structures that could deliver a variety of interventions simultaneously, combined with behavior change strategies that focused heavily on peer support.
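      As a reading aid, here is a minimal sketch of the two calculations described in the Methods paragraph above (project-attributable mortality decline and incremental cost per DALY). All numbers and function names in this sketch are illustrative assumptions for this roundtable write-up; they are not CSHGP data and are not part of the CHERG IMPACT tool.

      # Minimal sketch of the arithmetic in the Methods paragraph above.
      # All values are hypothetical placeholders, not CSHGP results, and these
      # helper functions are illustrative only (they are not the IMPACT tool).

      def project_attributable_decline(modeled_decline_pct, background_trend_pct):
          """Subtract the regional/national under-five mortality trend from the
          modeled project impact; the remainder is attributed to the project."""
          return modeled_decline_pct - background_trend_pct

      def incremental_cost_per_daly(additional_costs_usd, dalys_averted):
          """Incremental cost-effectiveness: extra cost of the community-oriented
          activities divided by the incremental impact (here, DALYs averted)."""
          return additional_costs_usd / dalys_averted

      if __name__ == "__main__":
          # Hypothetical project: modeled 5.0% annual decline vs. a 2.0% background trend
          extra_decline = project_attributable_decline(5.0, 2.0)   # 3.0% per year
          # Hypothetical costs and impact: $150,000 extra spent, 2,500 DALYs averted
          icer = incremental_cost_per_daly(150_000, 2_500)         # $60 per DALY
          print(f"Project-attributable decline: {extra_decline:.1f}% per year")
          print(f"Incremental cost per DALY averted: ${icer:.0f}")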
  33. CSHGP Lives Saved article: Discussion
      Discussion (p. 20)
      As concerted efforts are made to accelerate the achievement of MDG 4 with a focus on equity and health for all, it is essential to include effective and cost-effective community-oriented, integrated approaches as part of national packages. Although this analysis points to the importance of strong community mobilization approaches to child mortality reduction in underserved populations, further systematic study of these approaches is necessary [23,24]. These data and lessons from practice contribute to the national and global evidence base that is critical to revitalizing a focus on primary health care, particularly at the district level. Such analysis generates lessons that can be used by global health partners to identify innovative and effective approaches that improve coverage, especially for vulnerable populations. Project- and program-level information from the CSHGP-funded NGO projects can be used as a starting point for the development of systematic studies, using more rigorous research designs, to develop and evaluate a mix of cost-effective delivery approaches. A shared agenda with NGOs that uses their contribution to program learning and improved monitoring will significantly contribute to further strengthening the partnerships necessary for achievement of MDG 4.
  34. Selected Reviewer Comments
      Reviewer #2:
      Would it not be relevant to consider maternal mortality as an outcome as well as under 5?
      Reviewer #3:
      The contribution of this analysis is the general finding that NGOs can contribute to going to scale, and in theory can provide a testing ground for innovative delivery strategies. Unfortunately, the use of pre-post designs with no comparison group for monitoring the effects of the NGO programs is too weak to support an analysis leading to any more definitive conclusions.
      Reviewer #4:
      p. 20, para 3: "Such analysis generates lessons…" Unfortunately this analysis does not generate many lessons. It does indicate that NGOs can reach significant populations of children, which is important and overlooked, and that simple adequacy evaluations suggest that NGO activities are associated with increases in coverage. That's the only lesson that is supported by the analysis, except for the important methodological point that NGO projects need to have stronger evaluation designs so that their important contributions can be documented and used to improve program and project design.
  35. Reviewer #4: I have some problems with Figure 3. The figure as it stands ONLY applies to these specific programmes and the specific areas and populations they covered. It is incorrect to interpret this as showing which interventions save the most lives, since not all interventions were included in all the programmes.
  36. Discussion Points
      • Was the choice of journal appropriate? Why or why not?
      • What do we think about the reviewer comments? How would we address them?
