My name is Julie. I am interested in new theories and directions in education and workforce training. My project surveys the decision-support practices and training needs of nonprofit organizations and outlines additional management training in the area of program evaluation.
What I wanted to do with this project is link past experience with management education to the development of additional, needed training for nonprofit managers.

Current nonprofit education was developed long ago and includes a broad and rich curriculum that is both relevant and essential, including organizational behavior, marketing, budget and finance, and law and ethics. But curricula are not static and must adapt to changing demands in the workplace. Now more than ever, to substantiate funding, government agencies and nonprofits must demonstrate organizational value to stakeholders demanding evidence of performance in achieving community-wide and often politically charged goals.

The situation is so pressing that there is a movement toward professionalizing the field of program evaluation. Indeed, a constellation of events has coalesced to make it so: there is a peer-reviewed journal and a respected national association, and there is considerable discussion among researchers and practitioners surrounding the development of a set of essential competencies for program evaluators. The Taxonomy of Essential Competencies is the latest effort to develop standard competencies to facilitate improved evaluation training, enhance reflective practice, and enumerate specific skills, knowledge, and dispositions. And, finally, universities are beginning to take note. In the spring of 2011, the University of Oregon began its nonprofit management degree program, which includes a course in program evaluation, but that remains rare so far.
(Briefly describe the basic logic model.) Nonprofits that can clearly articulate, with appropriate evidence, a chain of logic or reasoning demonstrating how their programs and services contribute to changes in client behavior or to community goals have evaluation designs with the greatest potential for programmatic, policy, and legislative influence. They can link project-level results to community-level outcomes.
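To make that chain concrete, here is a minimal sketch of a logic model as a linked series of claims and supporting evidence. The job-training scenario, step names, and figures are hypothetical illustrations, not drawn from the survey.

```python
# A minimal sketch of a program logic model as a chain of steps, using a
# hypothetical job-training nonprofit. All names and figures are illustrative.
from dataclasses import dataclass

@dataclass
class LogicStep:
    name: str       # stage of the model, e.g. "Outputs"
    claim: str      # what the program asserts at this stage
    evidence: str   # the data that would substantiate the claim

logic_model = [
    LogicStep("Inputs", "Funding and three trained staff", "budget and staffing records"),
    LogicStep("Activities", "Weekly job-skills workshops", "attendance logs"),
    LogicStep("Outputs", "120 clients complete training", "completion records"),
    LogicStep("Client outcomes", "Graduates gain employment", "six-month follow-up survey"),
    LogicStep("Community outcomes", "Lower local unemployment", "county labor statistics"),
]

# Articulating the chain: each step's evidence supports the link to the next,
# connecting project-level results to community-level goals.
for earlier, later in zip(logic_model, logic_model[1:]):
    print(f"{earlier.name} -> {later.name}: supported by {earlier.evidence}")
```

A nonprofit that can produce the evidence at every link, not just the first two or three, is the kind of organization the survey tries to identify.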
The survey of Oregon nonprofits was designed to describe research practices and needs AND to measure the extent to which evaluation practices comply with the systematic inquiry and situational analysis categories of the Taxonomy of Essential Competencies for Program Evaluators.
Demographics: useful for testing the reliability and validity of the data as well as for descriptive statistics, covering county of operation, respondent's role, activity code, and evaluation requirements.

Personnel Capacity: the number of volunteers and paid staff, total hours per week, and staff able to conduct specific decision-support activities such as trend analysis, population estimation, synthesis and interpretation, and basic descriptive statistics.

Evaluation Vision & Planning: looks at the overall vision and plan for evaluation. Do they include data from multiple sources? Synthesize and interpret research findings? Identify the data needs of providers of similar services, of advocates, or of the legislature? Is data collected in one step of the program used to inform other steps in the evaluation? That last question is a backdoor to logic model design.

Data Analysis and Decision-support: do they use relevant geographic data, and at what point in the evaluation process? Do they engage in specific decision-support practices: 1) conduct a survey, 2) estimate the potential client base, 3) analyze data for patterns, 4) analyze data for trends, 5) evaluate GIS data, 6) conduct population-based evaluations, 7) conduct population-based evaluations at multiple geographic levels?

Dissemination and Usage: looks at the ways data are translated into decisions about programs or policies. Who contributes to the analyses? Do they use their own data? Do outside advocates use their data in support of their agendas?
Arguably, one of the most important subcategories of the Essential Competencies is "Specifies program theory," also known as a logic model. Among the 57% of nonprofits with evaluation requirements, 51.4% use data collected in one step of the planning cycle to inform activities in other steps, an internal, step-wise use of data that indicates use of a logic model. But only 33.3% can be identified as using a population-based logic model that has a chance for programmatic, policy, or legislative influence, and only 14% do so at multiple geographic levels to pinpoint precisely their impact in the community. What this shows is that funder-imposed evaluation requirements do not guarantee program evaluation competence.
Research shows that the level of analysis makes a difference in studying and understanding effectiveness, and the level of analysis is one example of how my research is unique and contributes to evaluation research. One subcategory of the Essential Competencies, "Analyzes data," is nondescript, so I am demonstrating levels of analysis. Of nonprofits with evaluation requirements, 57% reported that the nonprofit analyzed client data for patterns, but only 25% reported analyzing time-series data for trends. The first finds outliers and anomalies; only the second identifies changes to the client base or impact on the community over time. An even deeper level of analysis, using time-series data to develop predictive models to support decision-making, likely has far fewer practitioners.
Essentially, in my study, the extent to which a nonprofit complies with the essential competencies breaks down into four groups with unique organizational behaviors and characteristics.
Nonprofits that consistently conduct population-based outcome evaluations are technically skilled evaluators in a better position to defend their missions. But they do not attend to issues of evaluation use; an even higher level of performance is displayed by influential nonprofits.
Nonprofits that consistently conduct population-based outcome evaluations show the highest compliance with the essential competencies of any study group, and distinct organizational behaviors and characteristics are evident: evaluations are linked to broader management strategy, such as a strategic plan or quality assurance efforts; reports are used internally by management and board members; they produce information useful to external audiences; they engage in voluminous data management and analysis activities (data collection, trend analysis, population estimation, surveys, synthesis and interpretation, literature reviews); they sustain a dynamic exchange of information with the legislature, advocates, and providers of similar services; and staff have a role in evaluation activities, empowered to respond to a survey, for example, and to give input into the meaning of the data.
The Decision-support Practices and Training Needs of Nonprofits
The Decision-support Practices and Training Needs of Nonprofits<br />A Proposal for Fine-tuning Nonprofit Management Education<br />Julie Grace Grey - June 2011<br />
The Situation<br />Past is Prologue<br />Current Nonprofit Management Education<br />Nonprofit Management in Practice<br />Movement to Professionalize Evaluation<br />Peer-reviewed Journal<br />National Association<br />Research and Practice <br />Taxonomy of Essential Competencies<br />University Offerings<br />
Program Logic Model & Planning Cycle<br />Planning Cycle:<br /> Surveillance –> Evaluation –> Analytic Studies –> Policy Development<br />
Survey of Oregon nonprofits<br />358 Respondents<br />252 Selected for Analysis<br />
Survey Tool<br />Demographics<br />Personnel Capacity<br />Evaluation Vision & Planning<br />Data Analysis and Decision-support<br />Dissemination and Usage<br />Top 5 Data Needs Wish List<br />
Specifies Program Theory<br />57% of nonprofits have evaluation requirements<br />51% use a logic model (step-wise use of data)<br />33% use population-based outcome evaluations<br />14% conduct population-based evaluations at multiple geographic levels (community impact)<br />
Level of Analysis<br />“Analyzes Data”<br />57% analyzed client data for patterns.<br />25% conducted trend analysis.<br />22% evaluated client data by geographic area.<br />An even deeper level of analysis would use time-series data to develop predictive models in support of planning and decision making.<br />
Study Groups<br />Nonprofits without evaluation requirements – 43%<br />Nonprofits with evaluation requirements – 57%<br />Nonprofits that consistently conduct population-based outcome evaluations – 27%<br />Nonprofits whose reports/analyses are consistently used by external groups – 13%<br />
Nonprofits that Consistently Conduct Population-based Outcome Evaluations<br />Have staff that can:<br />Respond to special requests for data<br />Perform basic descriptive statistics<br />Analyze data by geographic levels<br />Synthesize and interpret research data<br />Analyze time-series data<br />Estimate potential population coverage<br />Etc…<br />But they do not attend to evaluation use…<br />
Influential Nonprofits<br />Evaluations linked to broader management plan<br />Reports used by management/board<br />Reports useful to external audiences<br />Voluminous data management activities<br />Dynamic communication between stakeholders<br />Staff role in evaluation activities<br />
Top 5 Items on Data Needs Wish List<br />Grant Writing and Fundraising<br />Collaboration w/ Local Universities & Each Other<br />Program Evaluation & Outcome Assessment<br />Technical Assistance – computer soft/hardware<br />Community-wide Database<br />
Recommendations<br />Management Programs with Nonprofit Certification<br />Program Evaluation Theory & Methods<br />Quantitative Data Analysis for Managers<br />Management Programs without NPM Certification<br />Concentration in Program Evaluation including<br />Program Evaluation Theory & Methods<br />Quantitative Data Analysis for Managers<br />Qualitative Methods, Survey Design, & Report Writing<br />Mediation & Conflict Resolution<br />Organizational Leadership<br />Grant Writing<br />Project Management<br />Vertical Practicum Series x 3<br />