Evaluators who conduct studies of Federal government programs routinely encounter challenges navigating various data collection protocols and processes, such as Information Collection Requests (ICRs) required by the Paperwork Reduction Act (PRA). This presentation will provide context on why processes like the PRA and ICRs exist, how information collection burdens have changed over time, and the challenges routinely encountered in navigating such processes. The presentation will then discuss how evaluators can work with the ICR process to achieve target production timeframes, work within existing agency processes, and address and integrate public comments, while improving the overall quality of an evaluation. Data from original interviews related to Environmental Protection Agency (EPA) activities will be presented to highlight evaluation challenges and the solutions used to address limitations.
Evaluation Amidst Paperwork Reduction Act Reviews: Overcoming Challenges from a Data Collection Process Evaluators Love to Hate
1. Evaluation Amidst Paperwork Reduction Act Reviews: Overcoming Challenges from a Data Collection Process Evaluators Love to Hate
Nicholas Hart, PhD
@NickRHart
October 27, 2016
AEA 2016
#Eval2016
Atlanta, GA
THE GEORGE WASHINGTON UNIVERSITY
WASHINGTON, DC
2. What is the PRA?
• In 1942, the Federal Reports Act (FRA) established early requirements that agencies obtain approval from the Bureau of the Budget (now OMB) before collecting information, with the goal of reducing reporting burdens
• In 1980, the Paperwork Reduction Act (PRA) developed a
process for reducing burden on citizens and improving
information quality; co-sponsored by Democrats and
Republicans
– passed by unanimous consent in the Senate and 328-13 in the House
• In 1995, PRA as we know it today was enacted to include
public comment provisions, independent reviews, and
direction that OMB develop processes for approval
@NICKRHART Evaluation Amidst PRA Reviews
3. Why is PRA beneficial?
[Chart: Annual per capita burden in hours from information collections, 1997-2010. X-axis: year (1997-2010); y-axis: burden hours (0-35); data labels include 26.0, 32.0, and 28.3 hours.]
Source: Calculated from US Census Bureau population estimates and Shapiro (2013) burden estimates
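The per-capita figures in the chart are derived by dividing total annual paperwork burden hours by the US population. A minimal sketch of that calculation, using illustrative placeholder numbers rather than the actual Shapiro (2013) or Census figures:

```python
# Illustrative sketch of how per-capita burden is computed.
# The inputs below are placeholders, NOT Shapiro (2013) or Census data.

def per_capita_burden(total_burden_hours: float, population: float) -> float:
    """Annual per-capita paperwork burden, in hours per person."""
    return total_burden_hours / population

# e.g., roughly 8.8 billion burden hours spread across ~280 million people
print(round(per_capita_burden(8.8e9, 280e6), 1))  # -> 31.4
```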
• Burden is a part of the PRA, but not the whole story
4. Why is PRA beneficial?
• Improve the quality and practical utility of information
collected by the Federal government
• According to Shapiro (2013), PRA reviews can:
– Improve quality of information collection with better questions and
stronger methods; and
– Encourage public participation in data collection (15% of new
collections receive comments)
• What does PRA have to do with evaluation?
– Evaluation may include new data collection, but not always
– Are existing survey or administrative data sufficient? If not, why?
5. How is PRA implemented?
• Federal agencies (or contractors) develop clearance
packages that are submitted to agency approvers, OMB,
and the public
– Tracked at RegInfo.gov
• Include formal “Information Collection Requests” or
ICRs that describe the information collected, provide a
reason/need, and estimate the time/cost (burden) for
responding
• Applies to any collection with 10+ respondents (a
threshold adopted from FRA)
6. Evidence of Challenges, Successes, and Opportunities
Implementing PRA:
Examples from EPA
• Note that the above estimates of review timeframes do not convey any information about the quality of submitted packages, agency requests to prioritize actions, or distinctions among rulemaking priorities
7. Examples from EPA
• Hart (2016) identified PRA as both a barrier and a facilitator for evaluation at EPA, though interviewees far more often described it as a barrier:
– “Project resources were not sufficient for obtaining an ICR”
– “no one had time to engage in formal ICR process”
– “have deliberately surveyed federal facilities to avoid PRA”
– “cost of ICRs was 30-40k in contractor funds, not including EPA time”
– “a real budget issue”
– “a major challenge”
– “PRA is a killer” for evaluation
– “ICRs are a barrier to everything”
• “While data collection itself was challenging, the most frequently
cited problem regarded the clearance process for the Paperwork
Reduction Act (PRA) from OMB. PRA issues were most frequently
cited in interviews and completed evaluations described in the
RCRA case study.” (Hart 2016, p. 388)
8. Examples from EPA
• Some programs also identified alternative methods to
avoid PRA reviews altogether:
• “collection of new information was limited to EPA employees, since
collecting data from nonfederal employees” was limiting
• From the WasteWise evaluation: “Due to constraints under the [PRA] EPA
could not survey most program participants without undertaking an ICR
process. However, EPA can conduct surveys within the federal family.”
• We looked at federal facilities “because didn’t want to pursue ICR to get
questions…hard to get ICRs through OMB”
9. Examples from EPA
• But there is usually more than one side to the story. As
some interviewees acknowledged:
• “OIRA had a very small staff [but] they worked harder than we did”
• From an interviewed OMB staffer: “[PRA] is absolutely an issue…our
side of the story is that if want to do a good rigorous analysis, we
never want to stand in way of getting data to do that…when we give
a hard time, usually because the approach lacks rigor and approach
may appear biased toward making [the agency] look good”
• Importantly from an OMB perspective: “we have to get agencies to
do rigorous and unbiased analysis”
10. Examples from EPA
• And many programs did obtain ICR approvals on a timeline appropriate for evaluation:
• E.g., measuring autobody compliance (ICR 2344.01)
• E.g., ethanol compliance evaluation (ICR 1711.05) after minor survey
revisions
• E.g., WasteWise, though the clearance included prescriptive terms
• And in other cases, industry groups have formed to fill a gap in
providing data to EPA “to ensure they had good information”
11. Examples from EPA
• At two separate points in EPA’s history with
evaluation in the 1990s and 2000s, the agency held
a “generic clearance” for the purpose of conducting
program evaluation
– (e.g., ICR 11912-2010-001 and 199806-2020-001)
• No evidence of use
12. What Can Be Done to Support Evaluation?
• #1: Maintain Relationships – Engage in early outreach and
discussions with reviewing staff in agencies, and articulate
priorities to OMB and agency reviewers. OMB may also be able to
provide example packages from similar programs
• #2: Develop Quality Packages – strive to minimize burden and
maximize quality of questions in development of requests
• #3: Utilize Existing Clearances or Pursue Generic Clearances –
identify where existing program data collection can suffice, or
utilize approaches tailored for evaluation in consultation with
OMB analysts (see OMB guidance on generic clearances and M-11-07)
• #4: Consider Whether Alternatives May Be Appropriate – some
past efforts at EPA modified samples based on alternative
approaches
13. Nick Hart
George Washington University
Washington, D.C.
nick.r.hart@gmail.com
@NickRHart
Thank You!
14. Resources
• Hart, N. 2016. Evaluation at EPA: Determinants of the U.S. Environmental
Protection Agency’s Capacity to Supply Program Evaluation. Dissertation.
Washington, D.C.: George Washington University.
• Shapiro, S. 2013. The Paperwork Reduction Act: Benefits, costs, and directions
for reform. Government Information Quarterly 30: 204-210.
• OMB. 2016. RegInfo.gov. Washington, D.C.: Office of Management and Budget.
Editor's Notes
The PRA attracts both strong support and strong opposition. This presentation suggests there is merit to many of the arguments on each side, and that evaluators have practical solutions for continuing to move forward.