SHADAC: Overview and Evaluation

Presentation by Lynn Blewett at the SHAP Grantee Meeting in Arlington, VA, Jan 14 2010.

  1. SHADAC Overview and Evaluation
     Lynn Blewett, PhD
     State Health Access Data Assistance Center
     University of Minnesota, Minneapolis, MN
     SHAP Grantee Meeting, January 14, 2010
     Funded by a grant from the Robert Wood Johnson Foundation
  2. Overview
     About SHADAC
     Measuring Health Insurance
     Survey Assistance
     Data Center
     Evaluation Support
     HRSA Benchmark Areas
     Strategic Considerations
  3. About SHADAC
  4. The SHADAC Vision
     Bridging the Gap Between Research and Policy
  5. What is SHADAC?
     Independent research center located at the University of Minnesota School of Public Health
     Led by an interdisciplinary team of tenured faculty and supported by research fellows and graduate research assistants
     Primary funding from the Robert Wood Johnson Foundation
     Additional project-specific funding from CDC, ASPE, CMS, state-specific contracts, etc.
     New funding from HRSA to provide technical assistance to SHAP grantees
  6. SHADAC Objectives
     Support states in their data, survey, policy, and evaluation activities
     Help states monitor rates of insurance coverage and understand factors associated with uninsurance
     Provide assistance to states on policy development, program evaluation, and assessment
     Provide support to federal agencies related to conducting health insurance surveys
     Disseminate research findings in a manner that is meaningful to state and national policy-makers
  7. SHADAC's Core Activities
     Technical consultation with states
       - State survey design and implementation
       - Clarify variation between state estimates from different surveys
     Targeted policy and evaluation work
       - Evaluation design and implementation
       - Policy analysis of coverage options
     Best practices for surveys on the uninsured
       - Best way of asking insurance, income, and race/ethnicity questions
       - Reviewing and improving estimates from national surveys
     Promoting use of sound data and methods
       - Production of issue briefs, webinars, presentations, and web content
       - Translating research into useful information
  8. State Health Access Reform Evaluation (SHARE)
     National program of RWJF
     Supports evaluation of state health reform initiatives
     16 single- and multi-study projects covering more than 25 states
     Wide variety of topics, including insurance market reforms, outreach and enrollment initiatives, and Medicaid/CHIP expansions
     Aims to translate this research to inform other states and the national reform debate
  9. SHAP Technical Assistance
     Review grantee evaluation plans
     Provide advice on outcome indicators, data sources, data availability, and evaluation methods
     Help states identify data sources for benchmarks
     Provide technical assistance to grantees in:
       - Selecting appropriate metrics to allow measurement of progress toward objectives
       - Identifying the types and sources of available data
       - Assisting in the use of longitudinal data where feasible
     Survey assistance (as previously described)
     Assessing differences between state and federal survey data
     Resources can be found at www.shadac.org/shap
 10. Measuring Health Insurance Coverage
 11. Measuring Health Insurance Coverage
     Current Population Survey (CPS)
     American Community Survey (ACS)
     State-Specific Household Surveys
 12. Current Population Survey (CPS)
     Currently the most commonly used survey for estimating uninsurance rates at the state and federal level
     Nationally representative household-based survey
     Large enough sample for state-level estimates
     Added an insurance verification question in 2000, which improved accuracy
     Used in the SCHIP funding formula; this may be changing soon
 13. American Community Survey (ACS)
     New source of data for health insurance coverage (2008 is the first year)
     Eventually replacing the decennial census long form
     Mail survey with telephone and in-person follow-up
     Large enough sample for state-level and sub-state estimates
       - Cities, counties, political districts, and census tracts
 14. ACS - Benefits
     Large sample size
       - 1.94 million households per year in the ACS vs. 75,477 households in the CPS
     Ability to drill down to geographic areas
       - Geographic areas with at least 100,000 people in the public use file
       - Counties with populations over 65,000 in the restricted Census file (smaller counties added later with multi-year averages)
     More precision in estimating subpopulations by state
       - e.g., low-income uninsured children
     Point-in-time health insurance question
 15. ACS - Initial Concerns
     Impact of using mail surveys in addition to telephone and in-person interviews
     Only one health insurance question
       - None on health status or access
       - Disability-related questions only
     Does not include state-specific names for Medicaid and SCHIP
     No verification question for health insurance coverage
 16. ACS Question: Is this person CURRENTLY covered by any of the following types of health insurance or health coverage plans?
     a. Insurance through a current or former employer or union
     b. Insurance purchased directly from an insurance company
     c. Medicare, for people 65 and older
     d. Medicaid, Medical Assistance, or any kind of government-assistance plan for those with low incomes or a disability
     e. VA; f. TRICARE; g. Indian Health Service
 17. ACS - Different Data Source, Different Data
     U.S. Census FactFinder
       - Limited age categories (0-17)
       - More variables, including counties over 65,000
       - http://www.census.gov/acs/www/index.html
     SHADAC Data Center, which uses the Public Use Microdata Sample
       - Uninsurance characteristics for ages 0-17 or 0-18
       - State-level estimates only
       - Easy to access, but limited variables
       - http://www.shadac.org/datacenter
 18. ACS - Public Use Microdata Sample (PUMS)
     The public use microdata sample (PUMS) covers 1% of the U.S. population
     Single-year file for geographic areas with a population of 100,000 or more
     Counties with populations of 65,000 and over are included in FactFinder
     PUMS uses a different geographic unit, the PUMA (Public Use Microdata Area)
     http://www.census.gov/acs/www/Products/users_guide/index.htm
     (See the sketch below.)
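To make the PUMS discussion concrete, the sketch below shows one way an analyst could tabulate a person-weighted uninsurance rate from an ACS person-level PUMS extract. It is a minimal illustration, not SHADAC's method: the file name ss08pwi.csv is a hypothetical 2008 Wisconsin person file, and the coverage recode HICOV (1 = covered, 2 = not covered) and person weight PWGTP are assumed variable names from the PUMS data dictionary; if the recode is unavailable, the individual coverage items would need to be combined instead.

```python
# Minimal sketch (not production code): person-weighted uninsurance rate
# from an ACS person-level PUMS extract. File name and variables are assumptions.
import pandas as pd

cols = ["HICOV", "PWGTP", "PUMA", "AGEP", "POVPIP"]   # assumed PUMS variable names
pums = pd.read_csv("ss08pwi.csv", usecols=cols)       # hypothetical 2008 WI person file

pums["uninsured"] = (pums["HICOV"] == 2).astype(int)  # 2 = no coverage (assumed coding)
state_rate = (pums["uninsured"] * pums["PWGTP"]).sum() / pums["PWGTP"].sum()
print(f"Person-weighted uninsurance rate: {state_rate:.1%}")
```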
 19. Households - ACS vs. CPS Sample Size Comparison
     Source: U.S. Census Bureau, Current Population Survey Annual Social and Economic Supplement, 2008; and 2007 American Community Survey. Sample counts do not include group quarters or vacant housing units.
 20. ACS vs. CPS - Uninsurance Rates for Adults and Adults <200% FPL
     Source: U.S. Census Bureau 2008 American Community Survey Public Use Microdata Sample and CPS-ASEC 2009
     Significance test for the difference between ACS and CPS: * p<.05, ** p<.01, *** p<.001
 21. ACS vs. CPS - Uninsurance Rates for Kids and Kids <200% FPL
     Source: U.S. Census Bureau 2008 American Community Survey Public Use Microdata Sample and CPS-ASEC 2009
     Significance test for the difference between ACS and CPS: * p<.05, ** p<.01, *** p<.001 (children = 0 to 18 years of age)
     (See the sketch below.)
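The significance stars on these two slides flag differences between the ACS and CPS point estimates. The slides do not spell out the test, but a common approach for two independent survey estimates is a two-sided z-test on the difference using the published standard errors; the sketch below illustrates that calculation with placeholder numbers, not values from the slides.

```python
# Sketch of a two-sided z-test for the difference between two independent survey
# estimates (e.g., an ACS vs. a CPS uninsurance rate). Inputs are placeholders.
from math import sqrt
from scipy.stats import norm

def diff_test(p1, se1, p2, se2):
    """Return the z statistic and two-sided p-value for the difference p1 - p2."""
    z = (p1 - p2) / sqrt(se1 ** 2 + se2 ** 2)
    return z, 2 * norm.sf(abs(z))

z, p = diff_test(0.172, 0.003, 0.186, 0.004)  # hypothetical ACS and CPS rates with SEs
stars = "***" if p < 0.001 else "**" if p < 0.01 else "*" if p < 0.05 else ""
print(f"z = {z:.2f}, p = {p:.4f} {stars}")
```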
 22. What's a PUMA?
     Unique geographic areas
     Required to have a minimum population of 100,000
     All PUMAs exceed the established population threshold (65,000), ensuring that single-year ACS data will be published for them each year
     PUMAs provide more geographic coverage within a state but may be new to many users
 23. ACS - Wisconsin Uninsurance Estimates by County for Children 0-17*
     * Summary tables from American FactFinder contain only fixed age categories.
 24. ACS - Wisconsin Uninsurance Estimates by PUMA for Children 0-18*, 2008
     * Analysis using ACS public use microdata allows user-defined age categories.
 25. ACS - Wisconsin Uninsurance by PUMA for Children 0-18 Under 200% FPL, 2008
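Continuing the hypothetical PUMS sketch after slide 18, estimates like the one on this slide could be approximated by restricting the file to children ages 0-18 with family income below 200% FPL and grouping by PUMA. AGEP (age) and POVPIP (income-to-poverty ratio, in percent) are assumed variable names, and these simple weighted rates ignore the replicate weights a published estimate would use for standard errors.

```python
# Sketch: PUMA-level uninsurance rates for children 0-18 below 200% FPL from a
# hypothetical 2008 Wisconsin PUMS extract. Variable names are assumptions.
import pandas as pd

cols = ["HICOV", "PWGTP", "PUMA", "AGEP", "POVPIP"]
pums = pd.read_csv("ss08pwi.csv", usecols=cols)

kids = pums[(pums["AGEP"] <= 18) & (pums["POVPIP"] < 200)].copy()
kids["uninsured"] = (kids["HICOV"] == 2).astype(int)  # 2 = no coverage (assumed coding)

by_puma = kids.groupby("PUMA").apply(
    lambda g: (g["uninsured"] * g["PWGTP"]).sum() / g["PWGTP"].sum()
)
print(by_puma.sort_values(ascending=False).round(3))
```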
 26. Survey Assistance
 27. Survey Assistance
     State survey design and implementation
     Clarify variation between state estimates from different surveys
     Best way of asking insurance, income, and race/ethnicity questions
     Assistance to states using SHADAC's Coordinated State Coverage Survey (CSCS), a survey tool for estimating insurance coverage rates in states: http://www.shadac.org/content/coordinated-state-coverage-survey-cscs
     Online library of state survey tools: http://www.shadac.org/content/state-survey-research-activity
 28. Survey Assistance - State Surveys in SHAP States
 29. Survey Assistance - Strengths of State Survey Data
     Typically a larger sample than national data
     Flexibility in adding policy-relevant questions
     Ability to over-sample and drill down to subpopulations
       - Children, geographic units, race/ethnicity
     Analysts have the data in hand
       - Ability to do analysis in-house
       - Quick turnaround
     Policy development: simulation of policy options
     Program design and development, marketing, and outreach
 30. Survey Assistance - Weaknesses of State Survey Data
     Lack of comparability across states
     Variability in the timing of surveys
     Most are telephone surveys, which raises coverage issues as more households rely on cell phones only
     Inconsistency in data documentation
     Cost concerns limit the number of variables
     Discrepancies with other data sources (survey and administrative data)
 31. Data Center
 32. Data Center
     Online table and chart generator
       - Designed to help health policy analysts build policy-relevant tables of health insurance coverage estimates
       - Easy to access and easy to use
     Estimates available from three sources
       - CPS, as published by the Census Bureau
       - CPS, enhanced by SHADAC to account for historical changes in methodology
       - ACS, as published by the Census Bureau (coming soon)
     Trended data
       - CPS estimates from survey years 1988 to the present
     Easy to export
 33. Data Center - Available Estimates
     Health insurance coverage
       - Uninsured; insured (private, government, and military)
       - Counts, percents, standard errors
     Table options
       - Race/ethnicity
       - Age
       - Poverty
       - Household income
       - Sex
       - Marital status (individual and family)
       - Children in household
       - Work status (individual and family)
       - Education (individual and family)
       - Health status (CPS only)
       - Citizenship (ACS only)
 34. Data Center - Getting to the Data Center
     Go to www.shadac.org and click on "Data Center"
 35. Evaluation Support
 36. Evaluation Support - Resources
     Assistance with developing interview guides, focus group protocols, and survey instruments
     Review of qualitative data analysis strategies
     Review of logic models
     Stakeholder analysis
     Information about available evaluation resources
 37. Evaluation Support - Plan Review
     Recommend additions or revisions
     Provide advice on outcome indicators
     Work with grantees to identify data sources
     Help determine data availability and data-sharing agreement requirements
     Advise on evaluation methods
 38. Evaluation Support - Assist with Evaluation Design
     Assist states in developing an evaluation plan to meet benchmark and reporting requirements
     Review the evaluation plan in relation to policy objectives and access initiatives
     Identify areas that are not aligned and other gaps in the evaluation plan, including identifying existing data and data still needing to be collected
 39. HRSA Benchmark Areas
 40. Selection Criteria
     Responsive to the needs of HRSA
     Common data are available for all SHAP states, allowing a fairly standard comparison of outcomes
     Measures are consistent with existing grantee evaluation plans and do not require additional resources
 41. Benchmarks
     Rates of health insurance coverage for target populations
       - Generated from the American Community Survey (ACS) and CPS
     Program enrollment of target populations, and previous insurance status if possible
     Program costs
       - Illustration of funding from all sources
     SHADAC will work with states on these benchmarks and other evaluation needs
 42. Strategic Considerations
 43. Data Acquisition
     Successful evaluations depend on good data
     Data acquisition within and across agencies and between entities can be difficult and time-consuming
     HIPAA and IRB processes can stall progress
     Include data agreements in contracts and legislation, and discuss them early in the process
     Seek legal review and input on state-specific data-sharing and data-privacy laws
 44. Evaluation Timing
     Many important evaluation measures will only become available in years 2 and 3 of the grant
     Think of ways to track early progress through process measures or interim outcome measures
     Determine implementation milestones and document them to show progress
 45. Mid-Course Changes
     Program implementation and evaluation results should be interconnected
     Share evaluation results across the project team
     Use results from the evaluation to inform ongoing project changes and improvements
 46. Discussion Questions
     Describe your top 3 SHAP plan objectives
     Define your target populations
     Discuss 2-3 components of your evaluation (ideally related to #1 and #2)
     Identify challenges, both those you have encountered and those you anticipate
 47. Contact Information
     Minnesota SHAP Project Team:
       Lynn Blewett, PhD
       Kelli Johnson, MBA
       Elizabeth Lukanen, MPH (primary contact: elukanen@umn.edu, 612-626-1537)
     Website: www.shadac.org/shap
     State Health Access Data Assistance Center
     University of Minnesota, Minneapolis, MN
     www.shadac.org
     ©2002-2009 Regents of the University of Minnesota. All rights reserved. The University of Minnesota is an Equal Opportunity Employer.
