OECD workshop on measuring the link between public procurement, R&D and innovation. "Administrative Records Project: Review of U.S. Federal-wide Data Systems for R&D Statistics"
1. Administrative Records Project:
Review of U.S. Federal-wide Data Systems for
R&D Statistics
Christopher Pece and John Jankowski
(cpece@nsf.gov and jjankows@nsf.gov)
OECD Workshop on Measuring the Link between
Public Procurement, R&D and Innovation
December 6, 2013
National Science Foundation
National Center for Science and Engineering Statistics
www.nsf.gov/statistics/
2. 2010 Committee on National Statistics
(CNSTAT) Report
• Several short-term recommendations
• Three medium/long-term recommendations
– Incorporate R&D descriptors (tags) into administrative databases
to better enable identification of the R&D components of agency
or program budgets (CNSTAT, 2010, Rec. 4-1)
– Use administrative data to test new classification schemata
through direct access to intramural spending information from
agency databases (CNSTAT, 2010, Rec. 4-2)
– Develop several demonstration projects to test for the best
method of moving to a system based at least partly on
administrative records (CNSTAT, 2010, Rec. 4-3)
3. ARP Program of Study
• Three main approaches to test feasibility & quality
– 1st Approach: Data tagging project
– 2nd Approach: Crosswalk project (cloned files)
– 3rd Approach: Supplemental sources project
• A subset of federal agencies from the Federal Funds
Survey (FFS) and Federal Support Survey (FSS) pilots
different methods of using administrative records
• Results from the three approaches are compared against
data collected via the traditional survey method
4. Approach 3: Supplemental Sources Project
• What is it?
– “NSF should develop the capacity for mining the standard and
newly enriched government-wide contracts and awards databases
to extract comprehensive information on R&D spending.”
– Can we construct survey data from publicly available Federal-wide
data sources?
• What do we need to do?
– Review of multiple U.S. data systems
• Federal Assistance Awards Data System (FAADS)
• USAspending.gov
• Federal Procurement Data System (FPDS)
• Recovery.gov
5. The Federal Procurement Data System
(FPDS)
“The FPDS data system aims to identify who buys what, from
whom, and for how much, when, and where,…but appears to
have little relevance to NSF because its reports and data focus on
procurement actions, a category of federal spending mostly
distinct from R&D.” (pg. 82)
Examined the data on R&D within the Federal Procurement Data
System (FPDS) to identify the following necessary variables:
• Character of Work
• Field of Science and Engineering
• Place of Performance
• Type of Performer
6. Review of FPDS (1)
No direct connection between reporting in FPDS and the Federal
Funds Survey, because:
• Data in FPDS are provided by the Contracting Officer,
• not by the subject-matter expert or the Program Officer,
• nor by the finance and accounting staff.
Does not have information on:
• Grants,
• Federal intramural R&D
• Field of Science and Engineering
7. Review of FPDS (2)
Does have potential information on:
• Place of Performance
• Type of Performer – (External only)
• Character of Work
Place of Performance – location information would be
consistent with existing survey reporting practices.
Type of Performer – Provides designations for academic
organizations, for-profit businesses, non-profits, FFRDCs, and the
like.
8. Review of FPDS (3)
Product and Services Code (PSC) provides designation for:
• Basic Research
• Applied Research
• Advanced Development
• Engineering Development
• Operational Systems Development
As well as descriptors for the type of R&D activity, such as
Agriculture, Defense Systems, Energy, Medical, etc.
• These could be used as a proxy for the NSF Field of Science and
Engineering
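The PSC-as-proxy idea above can be sketched in code. This is an illustrative assumption only: the four-character code layout ("A" plus a two-character program area plus a stage digit) and every mapping entry below are hypothetical stand-ins, not NSF's or GSA's actual PSC tables:

```python
# Sketch: using a Product and Service Code (PSC) as a proxy for
# character of work and NSF Field of Science and Engineering.
# ASSUMPTION: codes look like "A" + two-char area + stage digit;
# all codes and mappings below are hypothetical illustrations.

STAGE = {
    "1": "Basic Research",
    "2": "Applied Research",
    "3": "Advanced Development",
    "4": "Engineering Development",
    "5": "Operational Systems Development",
}

# Hypothetical area-to-field proxy; a real crosswalk would need review.
AREA_TO_FIELD = {
    "AG": "Agricultural sciences",  # Agriculture
    "DF": "Engineering",            # Defense Systems
    "EN": "Physical sciences",      # Energy
    "MD": "Life sciences",          # Medical
}

def classify_psc(psc: str):
    """Return (character of work, proxy field) for an R&D PSC, else None."""
    if len(psc) != 4 or psc[0] != "A" or psc[3] not in STAGE:
        return None  # not an R&D service code under this assumed scheme
    return STAGE[psc[3]], AREA_TO_FIELD.get(psc[1:3], "Unknown")
```

As the later slides note, even with such a lookup the PSC identifies the project outcome, not the field of research, so the proxy is coarse at best.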
9. Review of FPDS (4)
Given the size of the database and the activities of many agencies,
the initial focus was on a few manageable queries to examine the
data and the identifiers for:
• Defense Contracts Management Agency
• Defense Threat Reduction Agency
• Defense Advanced Research Projects Agency
• Defense Logistics Agency
• National Science Foundation
• National Institutes of Health
Compared agency-level FPDS summary data to data reported to
the Federal Funds Survey
10. Observations from FPDS Queries (1)
The initial query to identify Character of Work raised concerns
about how reliable these data are overall.
• DTRA reports showed large differences in Basic, Applied and
Development totals between FFS and FPDS
• DLA reports no Basic or Applied Research to FFS, yet FPDS
shows hundreds of millions in Applied Research and
Development.
• NSF reports no Development to FFS, but has several contract
actions for Engineering Development in FPDS.
11. Observations from FPDS Queries (2)
                                  FPDS (2010)     FFS (2010)
Defense Threat Reduction Agency
  Basic                           $48.2 Mil       $41.6 Mil
  Applied                         $251.2 Mil      $226.7 Mil
  Development                     $119.9 Mil      $258.8 Mil
Defense Logistics Agency
  Basic                           $22.7 Mil       $0
  Applied                         $31.0 Mil       $0
  Development                     $0.8 Mil        $176.2 Mil
National Institutes of Health
  Basic                           $887.5 Mil      $16,115.4 Mil
  Applied                         $435.8 Mil      $13,921.0 Mil
  Development                     $78.0 Mil       $0
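The discrepancies in the table above can be recomputed directly; a minimal sketch using only the figures shown (amounts in $ millions):

```python
# Compare character-of-work totals ($ millions, FY 2010) reported in
# FPDS against the Federal Funds Survey (FFS), per the table above.
# Each entry is (FPDS amount, FFS amount).
totals = {
    "DTRA": {"Basic": (48.2, 41.6), "Applied": (251.2, 226.7),
             "Development": (119.9, 258.8)},
    "DLA":  {"Basic": (22.7, 0.0), "Applied": (31.0, 0.0),
             "Development": (0.8, 176.2)},
    "NIH":  {"Basic": (887.5, 16115.4), "Applied": (435.8, 13921.0),
             "Development": (78.0, 0.0)},
}

for agency, rows in totals.items():
    for work, (fpds, ffs) in rows.items():
        diff = round(fpds - ffs, 1)
        print(f"{agency:4} {work:11} FPDS ${fpds:>9,.1f}M  "
              f"FFS ${ffs:>9,.1f}M  diff ${diff:>+10,.1f}M")
```

Even this small recomputation makes the pattern visible: DLA and NIH show not just different magnitudes but entirely different classifications across the two systems.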
12. Observations from FPDS Queries (3)
The initial query to identify Type of Performer also raised
concerns about how reliable these data are overall.
• In some instances major universities were not identified as
academic institutions.
• In other cases, for-profit companies were not identified
as for-profit companies.
• In still other cases, the same vendor had inconsistent identifiers
within the same query
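The inconsistent-identifier problem in the last bullet lends itself to a mechanical check. A sketch, assuming contract actions can be reduced to (vendor, performer-type) pairs; the sample records are hypothetical:

```python
from collections import defaultdict

def inconsistent_vendors(records):
    """Flag vendors carrying more than one performer-type identifier
    within the same query result. records: iterable of
    (vendor, performer_type) pairs."""
    types_by_vendor = defaultdict(set)
    for vendor, performer_type in records:
        types_by_vendor[vendor].add(performer_type)
    return {v: sorted(t) for v, t in types_by_vendor.items() if len(t) > 1}

# Hypothetical sample illustrating the problem described above.
sample = [
    ("Acme University", "Higher Education"),
    ("Acme University", "For-Profit"),  # same vendor, different identifier
    ("Widget Corp", "For-Profit"),
]
```

A check like this flags "Acme University" as inconsistently coded while leaving "Widget Corp" alone; it cannot say which identifier is correct, only that the two records disagree.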
13. Observations from FPDS Queries (4)
• Project description information needed to identify the Field
of Science and Engineering is either weak, inconsistent, or
not available
• Product and Services Code (PSC) descriptions concern the
nature of the contracted services; they are more analogous to
project outcomes (e.g., Defense Aircraft or Coal) than to
general fields of science.
– Although the PSC may identify Coal as the project, there is no
information about the kind of research done with coal (e.g., it could be
related to research in the geological sciences or atmospheric science)
14. Observations from FPDS Queries (5)
• A small effort compared FPDS tabulations of contracts to
Defense contractors with company reports from our Business
R&D and Innovation Survey (BRDIS)
– In the aggregate there is little correlation between company-specific
totals reported in FPDS and the amounts reported by
companies in BRDIS
– Indeed, there are order-of-magnitude differences between FPDS
totals and the R&D totals reported by major defense contractors
                      FPDS (2011)    BRDIS (2012)
Company/location A    $3.0 Billion   $1.0 Billion
Company/location B    $3.3 Billion   $0.6 Billion
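The gap between the two systems can be expressed as a simple ratio, using the two anonymized rows above (amounts in $ billions):

```python
# FPDS contract totals vs. company-reported R&D in BRDIS ($ billions),
# for the two anonymized companies in the table above.
pairs = {
    "Company/location A": (3.0, 1.0),
    "Company/location B": (3.3, 0.6),
}
for name, (fpds, brdis) in pairs.items():
    print(f"{name}: FPDS/BRDIS ratio = {fpds / brdis:.1f}x")
```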
15. Summary of Findings (1)
• Some data elements offer promise for providing insight into
Federal R&D procurement actions
• But there are a number of systematic weaknesses with the
identifiers
– Inconsistencies in classification of performer types
– Incorrect classification of basic research, applied research, and
development
– Insufficient project descriptions to best classify contracts into the Field
of Science and Engineering.
16. Summary of Findings (2)
• Combining FPDS with other Federal-wide data sources to fill
in the gaps is also problematic
– The PSC designations applied to contracts in FPDS are not provided for
grants within USAspending.gov
– None of these systems have data on intramural R&D
– At an inter-agency meeting to discuss the use of administrative
records data, agency representatives questioned the quality of
these systems with regard to the level of detail NSF provides
• The use of administrative records systems alone for gathering
information on federal R&D activities in the U.S. is not
commensurate with the level of detail produced from the
Federal Funds Survey.
17. Summary of Findings (3)
• Before administrative records can be widely used to provide
the same level of detailed statistics on R&D, there needs to be
an effort to establish and enforce data standardization across
federal agencies, specifically with regard to tracking R&D
– Accounting standards and guidelines
– Financial classification taxonomy (data tags)
– Data system integration mitigation strategy to facilitate reporting
18. Next Steps (1)
• NSF will conduct further queries for some of the larger R&D
agencies and compare them to the results reported to the Federal
Funds Survey
– U.S. Air Force
– NASA
• Pursue a proof of concept to test the feasibility of creating
data tags with administrative records data from the
agencies with the best systems
– NIH, NSF, and NIST
– DARPA and ONR
19. Next Steps (2)
• Develop plans and relationships to work more closely with
staff at the Office of Management and Budget to integrate a
system of standardization
– Develop federal accounting audit standards specifically for
R&D with OMB and the Federal Accounting Standards
Advisory Board
– Develop comprehensive standard data tags for federal
agency classification of R&D activities with the Office of
Federal Financial Management
– Develop processes to facilitate data reporting across
different data system platforms