
OECD Blue Sky 3 Summary Presentation



Published in: Data & Analytics


  3. WHAT BIG QUESTIONS ARE BEGGING FOR AN ANSWER? KEY MESSAGES ON
      Human-centred policy design
      Role of participatory processes
  4. We need human-centred policy design – Minister Manuel Heitor
     • Need to routinely collect data on the placement outcomes of highly trained individuals
     • Need to collect systematic cross-country data on the mobility of scientists and engineers
     • Need new indicators on the migration of refugees, especially student refugees and refugees trained in science and engineering
     • Improve the measurement of scientific knowledge flows and their impact by learning directly from scientists and engineers through surveys and case studies
     • Need data on collaborative mechanisms for innovation, such as de jure standards
     Debate 1: What big questions are begging for an answer?
  5. We need participatory processes
     • We need to better characterise participatory processes of R&D agenda setting, to help engage scientific institutions and actors with civil society
     • We need collaboration with scientists, engineers and users to understand the knowledge production process and its impact
     • We need to increase citizen participation in science and public support for science: the role of storytelling
     Debate 1: What big questions are begging for an answer?
  6. SCOPE AND LIMITS OF INDICATOR USE BY STI POLICY – KEY MESSAGES ON
      The problem of research evaluation
      Advantages and pitfalls of composite indicators
  7. The problem of evaluation (Scope and limits of indicator use by STI policy)
     In the quest for serviceable metrics, it is important to keep reminding ourselves about the limits of our imagination and what people really value – Stephen Curry
     For far too long we have focused on placements in academe and used bibliometric measures to assess outcomes of education and funding – Paula Stephan
     Oversimplification – e.g. rankings, impact factors – can create perverse effects
  8. The problem of composite indicators (Scope and limits of indicator use by STI policy)
     In the quest for serviceable metrics, it is important to keep reminding ourselves about the limits of our imagination and what people really value – Stephen Curry
     If applied insensitively, they result in indicator-driven policy, which is certainly NOT tantamount to evidence-based policy – Wolfgang Polt
  9. TOWARDS MORE INCLUSIVE SCIENCE AND INNOVATION – KEY MESSAGES ON
      What and who need to be included
      What new metrics in this space
  10. What and who need to be included?
      • Include new performers of STI activities: e.g. user/consumer innovators, free/open innovation
      • Include geographical and cognitive peripheries: locally relevant R&D, invisible science, agriculture, social sciences and humanities
      • Include new sources of information: e.g. big data, social media, altmetrics
      • Include new stakeholders in the community of practice of STI indicators: e.g. citizens (public engagement)
  11. The promise of altmetrics
      Altmetrics have not been the panacea that we hoped for, and they do not measure social impacts – Cassidy Sugimoto
      THE FREE INNOVATION PARADIGM – Eric von Hippel
       Change the definition of innovation in the Oslo Manual so that it ALSO applies to household-sector innovation and to other economic sectors
       Measure household-sector innovation
      “Inclusive” in terms of who uses the indicators – Ismael Rafols
       Develop toolkits that allow exploration of choices in landscapes and allow users’ participation in decision making
  12. MONDAY 19 – PARALLEL SESSIONS: KEY MESSAGES
       Data analytics for science and innovation
       Technology diffusion and breakthroughs
       Developing novel indicators from scientometrics
       Capturing innovation in firms: do we get it right?
       Leveraging the potential of administrative data for science and innovation policy
  13. DATA ANALYTICS FOR SCIENCE AND INNOVATION
      • Text mining tools promise to alleviate some of the common challenges facing STI statistics, e.g. survey fatigue and unfit-for-purpose classification systems that are applied differently by human coders (e.g. patent assessors using USPC).
      • Theory-driven text mining offers new opportunities for generating STI indicators, e.g. through near real-time monitoring and online media monitoring for sentiment analysis.
      • Text mining depends on vocabularies, ontologies and other linguistic techniques. These can be defined manually or automatically, deductively or inductively (e.g. through topic modelling or other machine-learning algorithms), or through a combination of these approaches.
      MODERATOR: Katy Börner, Indiana University
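The kind of term-weighting that underpins such text-mining indicators can be sketched minimally. The corpus and tokenisation below are purely illustrative toy assumptions, not anything from the session; real pipelines would use proper tokenisers, stop-word lists and far larger corpora.

```python
# Illustrative sketch: TF-IDF weighting over a toy corpus of "abstracts",
# showing how distinctive terms can be surfaced without a manually
# defined classification system. All documents here are invented examples.
import math
from collections import Counter

def tfidf(docs):
    """Return one {term: weight} dict per document (whitespace tokens)."""
    tokenised = [doc.lower().split() for doc in docs]
    n = len(tokenised)
    df = Counter()                      # document frequency per term
    for tokens in tokenised:
        df.update(set(tokens))
    weights = []
    for tokens in tokenised:
        tf = Counter(tokens)
        weights.append({t: (tf[t] / len(tokens)) * math.log(n / df[t])
                        for t in tf})
    return weights

docs = [
    "patent citation networks reveal knowledge flows",
    "survey fatigue limits innovation survey response rates",
    "patent classification differs across human coders",
]
w = tfidf(docs)
top = max(w[1], key=w[1].get)  # most distinctive term in the second doc
```

Terms concentrated in one document get high weight, while terms spread across the corpus are discounted, which is why `top` picks out "survey" for the second toy document.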
  14. DEVELOPING NOVEL INDICATORS FROM SCIENTOMETRICS
      • Traditional bibliometric indicators should be reviewed to add meaning and international comparability. Oversimplification – e.g. rankings, impact factors – can have negative implications.
      • A quality dimension covering the validation process (indicators of peer review) should be integrated. Adding more dimensions could capture real author contributions as well as novelty.
      • We should give a more informed role to the users of bibliometric information.
      MODERATOR: Laura Cruz, Institute of Public Goods and Policies, Spain
  15. CAPTURING INNOVATION IN FIRMS: DO WE GET IT RIGHT?
      • Overall… NO! We are capturing something, but improvement is needed.
      • Survey design, question design, content and implementation matter for data quality and international comparability.
      • Respondent characteristics also have significant impacts (e.g. their expertise in innovation at the individual and firm level, micro firms, whether or not they buy in their major innovations, translation/cultural aspects, etc.).
      MODERATOR: Louise Earl, Statistics Canada
  16. LEVERAGING THE POTENTIAL OF ADMINISTRATIVE DATA FOR SCIENCE AND INNOVATION POLICY
      • Metadata for research projects are inherently complicated; data access alone does not solve the problem; need to consistently identify and measure R&D projects
      • Tremendous potential in using machine-learning techniques to organise large, unstructured data and make it amenable to analysis
      • Tremendous interest in networks and linkages, which raises difficult disambiguation problems; potentially interesting work going forward on this
      MODERATOR: Adam Jaffe, MOTU, New Zealand
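The disambiguation problem mentioned above can be illustrated with a deliberately simple baseline: grouping author-name variants by string similarity. The names, threshold and greedy clustering below are toy assumptions for illustration; production systems use supervised models over many more features (affiliations, co-authors, topics).

```python
# Illustrative sketch: greedy single-link clustering of name variants by
# string similarity, a toy stand-in for the machine-learning
# disambiguation approaches the session discussed.
from difflib import SequenceMatcher

def similar(a, b, threshold):
    """True when two names look like variants of the same person."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def cluster_names(names, threshold=0.75):
    """Assign each name to the first cluster containing a similar member."""
    clusters = []
    for name in names:
        for cluster in clusters:
            if any(similar(name, member, threshold) for member in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

names = ["J. Smith", "John Smith", "Jon Smith", "A. Jaffe"]
clusters = cluster_names(names)
```

The three Smith variants fall into one cluster and "A. Jaffe" stands alone; the threshold is a tunable assumption, and the example also shows the pitfall, since "John Smith" and "Jon Smith" may well be different people.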
  17. TECHNOLOGY DIFFUSION AND BREAKTHROUGHS
      • Appropriate reference frames / reference data sets / benchmarks are important requisites for the assessment of technology diffusion.
      • A long-term (funding/analytical/strategic) focus is beneficial in the assessment of technology diffusion, as it allows the recognition of long-term dynamics and changes within the field.
      • In advanced assessments of technology diffusion it is of great value to allow for an agile/dynamic approach to data collection, as opposed to dependence upon a static data repository.
      MODERATOR: Mosahid Khan, WIPO
  18. SCIENCE AND INNOVATION POLICY-MAKING IN AN ERA OF BIG DATA – KEY MESSAGES ON
       “Big”, “promising” or “uncomfortable” data?
       Potential for science and innovation policy making
      We want informed storytelling that captures the essence of the underlying data
  19. BLUE SKY KEYNOTE LECTURE – Scott Stern
      • We are here to create data and metrics to gain shared understanding, evaluate policy alternatives and identify gaps
      • Leverage digitisation to deliver new metrics
      • Need to develop a granular capability to capture the dynamics of innovation
      • We don’t know where the next data will come from!
  20. STI POLICY MAKING IN THE ERA OF BIG DATA
       New data – big data, web data and open data – plus data combinations and interactive mapping and reporting tools: exciting opportunities or “uncomfortable data”?
       Exciting, yes, BUT:
        • What is the right amount of data?
        • Need for complementary investment in the capabilities to deal with the data
        • Need to think of the human in the loop: how do we present the results? Can we take big data and create a narrative?
        • Remove uncertainty through experiments
        • Look at these methods/data as “toolkits” rather than “silver bullet” answers for policy makers
       Challenges for the use of “big data” in companies: trust; availability of platforms; technical skills; access to complementary data (who owns the data?)
  21. NEW MODELS AND TOOLS FOR MEASURING SCIENCE AND INNOVATION IMPACTS – KEY MESSAGES ON
       Build on a wide range of available tools
       Embed measurement and evaluation into all of our work
  22. NEW MODELS AND TOOLS
      • Now available: high-quality, high-coverage, interlinked data; cost-effective storage and computation; validated, scalable algorithms; visualisation and animation capabilities – but are we using them?
      • Some old ideas have not sunk in yet – need to build understanding and a culture of evaluating everything
      • Quality of data is key – we need to understand what the data tell us
      • Measuring the role of innovation in economic performance and productivity is like measuring the contribution of butter to the cake
      • OECD role: microdata analysis, building and sharing understanding, developing standards, …
  24. BIG DATA: OPPORTUNITIES AND CHALLENGES FOR OFFICIAL STATISTICIANS
       Surveys and administrative data are complementary methods that ensure the representativeness of the population and can be used to measure new phenomena (e.g. survey data to analyse the disruptive impact of digital platform services)
       Big data techniques are used by NSOs for analytical/statistical purposes (e.g. geospatial, hydrographic and weather data to forecast agricultural yields, scanner data to replace price collection, web scraping to improve survey frames, crowdsourcing of information to improve the design of policies), but also for operational ones (to cut the cost of processing data)
       Opportunities relate to timeliness, higher granularity, better accuracy and the reduction of respondent burden
       Institutional challenges: access to privately owned data and privacy issues; content stability; replicability
       Technical challenges: need to invest in infrastructure, software and capabilities/expertise
       In the era of the Internet of Things there is a need for “smart statistics”, partnerships with the private sector and incentives for data sharing
       What can the OECD and the international community do? Common data standards; algorithmic transparency and accountability (dealing with automation); a trusted third party for certification and “labelling” activities; clearing houses for data programmes; dealing with the international comparability of the new data sources; collecting initiatives on the use of big data to develop indicators; sharing best practices
  25. TUESDAY 20 – PARALLEL SESSIONS
       Innovation and IP: what data gaps limit policy discussion?
       Researchers on the move
       Interaction and impacts of STI policies
       Capturing hidden innovators
       STI actors: the potential of direct surveys
  26. INNOVATION AND IP: WHAT DATA GAPS LIMIT POLICY DISCUSSION?
      • IPRs beyond patents: need for a holistic view (for instance, exploiting data on other IP – trademarks, utility models…). More information is needed about trade secrets in particular.
      • Better understanding of the use of IP by end users in products. Here we need better data, for instance product-patent pairs (de Rassenfosse). Licensing data would be particularly useful; so far we have been limited to just a few sectors, like pharma.
      • Better understanding of the mechanisms of knowledge flows. Again, better data are needed, for instance on diffusion from the scientific literature to practitioners (via the "enlightenment literature", Hicks).
      MODERATOR: Alan Marco, U.S. Patent and Trademark Office, USA
  27. RESEARCHERS ON THE MOVE
      • Bibliometric data can provide a wealth of information on mobility, at levels of aggregation from the country to the region, institution and individual.
      • Combining different sources of data can provide larger opportunities on a global scale; however, linking challenges need to be resolved.
      • Technology now provides new tools, such as Natural Language Processing (NLP), to scrape and mine the Internet (e.g. CVs).
      MODERATOR: Emilda B. Rivers, National Science Foundation, USA
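The CV-mining idea can be made concrete with a toy example: extracting an affiliation-country sequence from CV-style lines and counting international moves. The CV text, line format and regular expression below are invented assumptions for illustration; real CV mining needs full NLP rather than a fixed line pattern.

```python
# Illustrative sketch: deriving a researcher's mobility sequence from
# invented CV-style affiliation lines, a toy stand-in for the NLP-based
# CV-mining tools the session mentioned.
import re

CV = """
2008-2012  PhD, University of Amsterdam, Netherlands
2012-2015  Postdoc, MIT, United States
2015-2020  Lecturer, University of Manchester, United Kingdom
"""

def mobility_sequence(cv_text):
    """Return (start_year, country) tuples ordered by start year.

    Assumes lines shaped like 'YYYY-YYYY  role, institution, Country'.
    """
    pattern = re.compile(r"(\d{4})-\d{4}\s+.*,\s*([A-Za-z ]+)$", re.MULTILINE)
    moves = [(int(year), country.strip())
             for year, country in pattern.findall(cv_text)]
    return sorted(moves)

seq = mobility_sequence(CV)
countries = [c for _, c in seq]
# An international move is any change of country between consecutive posts.
n_moves = sum(1 for a, b in zip(countries, countries[1:]) if a != b)
```

For this toy CV the sequence is Netherlands, United States, United Kingdom, i.e. two international moves, the sort of event-level record that mobility indicators aggregate.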
  28. INTERACTION AND IMPACTS OF STI POLICIES
      • Programme evaluation: significant progress in both techniques and the availability of linked datasets since Blue Sky 2
      • More to be done to assess the efficiency of programmes and the joint impact of policies (but this requires a unique identifier for each firm using government support programmes, and complete information on each support received by the firm)
      • STI system evaluation: complex, with no appropriate model currently available. An operational definition of the STI system and internationally comparable proxies of policy levers are needed
      MODERATOR: Pierre Therrien, Innovation, Science and Economic Development, Canada
  29. CAPTURING HIDDEN INNOVATORS
      • Go beyond the definition of the formal private sector/market.
      • Need to extend the definition of innovation to cover households and the public sector, but also the informal business sector, especially as the geography of innovation is changing. Social innovation is more problematic at this stage: the definition is still confusing.
      • Need to investigate further the methodologies to capture innovation beyond the formal private sector. The survey methodology needs to differ from that used for the private sector, as the characteristics are quite different.
      • Even the public sector is a controversial definition: is it only public administration? Does it include universities? Hospitals? More research is needed to see whether all public sector bodies innovate in the same way or whether there are substantial differences.
      MODERATOR: Vladimir Lopez-Bassols, S&T policy consultant, USA
  30. STI ACTORS: THE POTENTIAL OF DIRECT SURVEYS
      • Bibliographic information is not sufficient to explain research and innovation processes. Surveys are useful and necessary to understand the motivations driving research and research orientations.
      • Surveys are necessary and useful to measure actors' perceptions and opinions regarding the development of the STI system, how the institutional setting affects their behaviour, and the impact of institutional reforms.
      • The OECD should focus on global issues but still work with local researchers to increase the quality of the data.
      • The OECD should consider implementing direct surveys to address policy gaps and where existing data are not sufficient to answer key policy questions.
      MODERATOR: Fernando Galindo-Rueda, OECD
  31. LOOKING FORWARD: WHAT DATA INFRASTRUCTURES AND PARTNERSHIPS? KEY MESSAGES ON
       Research data “infrastructures” that are reusable and sharable
  32. The need for granular and interoperable data
      • Share the data so that it is reusable
      • Need to directly involve researchers in collecting the data about researchers
      • Create standards for persistent identifiers in datasets
      • Communities should come together to develop the common infrastructure
      • Ensure policy continuity in this area
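Persistent identifiers of the kind the slide calls for are already standardised for researchers: ORCID iDs carry an ISO 7064 MOD 11-2 check digit, so malformed identifiers can be caught before records are linked. The validator below is a minimal sketch of that published checksum algorithm, not an official ORCID library.

```python
# Sketch: validating the ISO 7064 MOD 11-2 check digit of an ORCID iD,
# the kind of persistent-identifier hygiene that interoperable research
# data infrastructures depend on.
def orcid_checksum_ok(orcid):
    """Return True if the last character is the correct check digit."""
    chars = orcid.replace("-", "")
    total = 0
    for ch in chars[:-1]:           # all digits except the check character
        total = (total + int(ch)) * 2
    remainder = total % 11
    expected = (12 - remainder) % 11
    check = "X" if expected == 10 else str(expected)
    return chars[-1] == check

# ORCID's well-known example iD (Josiah Carberry) passes the check;
# perturbing one digit fails it.
valid = orcid_checksum_ok("0000-0002-1825-0097")
invalid = orcid_checksum_ok("0000-0002-1825-0098")
```

Embedding such checks at data-entry time is cheap, and it is one concrete way "standards for persistent identifiers" pay off when datasets from different communities are merged.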
  33. WEDNESDAY 21 – PARALLEL SESSIONS
       Beyond indicators: the innovation and productivity nexus
       Towards standards for a common research infrastructure
       Trust, culture and citizens' engagement in science and innovation
       Developing novel approaches to measure human capital and innovation
       Surveying innovation in different contexts
  34. BEYOND INDICATORS: THE INNOVATION AND PRODUCTIVITY NEXUS
      • Micro level: production functions are a useful tool and provide a conceptual framework for estimating rates of return on investments. However, it is relevant for public policy to estimate the existence of complementarities (or substitution effects).
      • Macro level: governments (and society) need to know the rates of return from different public investments and to measure spillovers from all intangible investments (education, training and R&D), including by the public sector.
      • Improving the productivity-innovation nexus needs a better macro-micro nexus:
        – Need for more micro-level measures to better understand aggregate dynamics and determinants
        – Encourage linking across different datasets (macro- and micro-level datasets including firm, bibliometric and patent data, etc.)
      MODERATOR: Mariagrazia Squicciarini, OECD
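The micro-level production-function framework can be sketched with a toy regression. Everything below is synthetic and illustrative: the "elasticities", firm data and noiseless outcomes are assumptions chosen so that ordinary least squares recovers them exactly, which would not happen with real, noisy microdata.

```python
# Illustrative sketch: estimating a log-linear (Cobb-Douglas style)
# production function  y = a + b_k*k + b_r*r  on synthetic firm data,
# the kind of rate-of-return estimation the session discussed.
# All coefficients and data are invented for the example.
import random

random.seed(1)
B_K, B_R, A = 0.3, 0.15, 1.0   # assumed "true" elasticities and intercept

# Synthetic log capital (k) and log R&D stock (r) for 200 firms
data = [(random.uniform(2, 6), random.uniform(0, 4)) for _ in range(200)]
ys = [A + B_K * k + B_R * r for k, r in data]   # noiseless for clarity

def ols3(xs, ys):
    """OLS for columns [1, k, r]: solve the 3x3 normal equations."""
    S = [[0.0] * 3 for _ in range(3)]   # X'X
    t = [0.0] * 3                       # X'y
    for (k, r), y in zip(xs, ys):
        row = (1.0, k, r)
        for i in range(3):
            t[i] += row[i] * y
            for j in range(3):
                S[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda i: abs(S[i][col]))
        S[col], S[piv] = S[piv], S[col]
        t[col], t[piv] = t[piv], t[col]
        for i in range(col + 1, 3):
            f = S[i][col] / S[col][col]
            for j in range(col, 3):
                S[i][j] -= f * S[col][j]
            t[i] -= f * t[col]
    beta = [0.0] * 3                    # back-substitution
    for i in reversed(range(3)):
        beta[i] = (t[i] - sum(S[i][j] * beta[j]
                              for j in range(i + 1, 3))) / S[i][i]
    return beta  # [intercept, elasticity of k, elasticity of r]

a_hat, bk_hat, br_hat = ols3(data, ys)
```

On linked micro datasets the same regression, with controls and proper identification, is what turns raw firm records into rate-of-return estimates; the macro-micro point in the slide is that such coefficients only inform aggregates if the micro data are representative and linkable.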
  35. TOWARDS STANDARDS FOR A COMMON RESEARCH INFRASTRUCTURE
      • Big potential in linking data on researchers (inputs and outputs of research, affiliations, geographical information, etc.) for a better understanding of their behaviour and for better-informed policy making
      • Advances are being made towards data integration, but many concepts remain black boxes. More dialogue is needed between different communities to promote mutual understanding
      • Models and experiments to monitor open science are emerging. What are the metrics for open science, bearing in mind that open science is more than open access and open data? What role for the OECD?
      MODERATOR: Cecilia Cabello, Spanish Foundation for S&T, Spain
  36. TRUST, CULTURE AND CITIZENS' ENGAGEMENT IN SCIENCE AND INNOVATION
      • Although science is global, ‘science culture’ remains local; innovation is a collective process and depends on social, spatial and historical contexts
      • Develop metrics to account for culture in public understanding of, and attitudes to, science and innovation: not country rankings, but cluster analysis across a set of variables
      • Policy making could be helped by considering different approaches to segmenting populations in surveys. Disengaged people have different, but valid, attitudes
      • Scientists often don’t communicate what the public wants to know
      • Could the OECD become the curator of existing subjective databases around the world? Develop a “Frascati manual” on public attitudes to science and innovation – a “Ghent Manual”?
      MODERATOR: Carthage Smith, OECD
  37. DEVELOPING NOVEL APPROACHES TO MEASURE HUMAN CAPITAL AND INNOVATION
      • An R&D sample survey in Germany shows that gender, education and nationality diversity can make a difference in research teams and is positively related to innovative capacity. More historical data are needed to determine causality.
      • Mobility across research fields leads to less valuable inventions (loss of specialisation) but more novel inventions (cross-fertilisation of ideas). Collaboration and access to scientific publications can help offset the shortcomings of mobility.
      • The Oslo Manual provides clear guidelines on how to collect data but overlooks issues related to human capital, impact on outcomes and regional innovation. Linking data from different sources could give new insights without running new surveys.
      MODERATOR: John Gawalt, NSF, USA
  38. SURVEYING INNOVATION IN DIFFERENT CONTEXTS
      • More comprehensive and different indicators of innovation are needed to capture innovation practices in non-traditional sectors and in developing economies. => These need to better capture incremental and non-technological innovations, the sourcing of external knowledge, and sectoral specificities
      • There is a bias towards manufacturing in much of the analysis of innovation. Information on innovation in rural areas and in mining, utilities and agriculture needs to be collected more comprehensively.
      • Surveys have to aim for more objective, comparable information on innovation, to capture those innovating incrementally. The framing of surveys matters for responses.
      MODERATOR: Tomohiro Ijichi, NISTEP, Japan