This document discusses research assessment exercises and metrics for measuring research productivity and impact. It summarizes findings from a study on researchers' publication and citation behaviors. Key findings include that journal articles are the dominant output across most disciplines, while other outputs vary significantly by field. The document also discusses issues with different bibliometric data sources and measures, and the implications for how research institutions assess and support research strategy.
Beyond the Factor: Talking about Research Impact (Claire Stewart)
The document discusses the increasing interest in research metrics and impact from funders, publishers, and institutions for purposes such as hiring, promotion, and evaluating proposals, but notes there are significant limitations to current metrics like journal impact factors which vary widely between disciplines and do not capture the full breadth of research outputs and impacts. It advocates for using quantitative metrics to support, not replace, expert review and evaluation of research and capturing a richer array of data on outputs like publications, presentations, and other influences on knowledge and society to more fully understand a researcher's impact.
The document describes the process for conducting a content analysis and developing taxonomies from academic articles. It involves identifying topics and keywords, searching relevant databases and journals with limits, downloading abstracts, analyzing content, identifying themes, and consolidating the analysis into a taxonomy or classification system. An example taxonomy is provided that categorizes 295 analyzed articles based on various themes, such as unit of analysis, area of study, theories used, methodology, and origin of study. The purpose is to systematically classify and organize knowledge within a research domain.
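The classification step described above can be sketched as a simple keyword-to-theme lookup over abstracts. This is an illustrative sketch only; the theme names and keywords below are invented, not taken from the example taxonomy:

```python
# Assign each abstract to taxonomy themes by keyword matching.
# TAXONOMY here is a toy example, not the 295-article taxonomy itself.
TAXONOMY = {
    "methodology": {"survey", "case study", "experiment"},
    "unit of analysis": {"individual", "team", "organization"},
}

def classify(abstract):
    """Return, per theme, the keywords found in the abstract text."""
    text = abstract.lower()
    return {theme: sorted(kw for kw in keywords if kw in text)
            for theme, keywords in TAXONOMY.items()}

hits = classify("A survey of individual researchers in one organization.")
print(hits)
# {'methodology': ['survey'], 'unit of analysis': ['individual', 'organization']}
```

A real content analysis would refine these matches by hand, but the sketch shows how a consolidated keyword list turns into a systematic classification.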
The document discusses the importance of using profiles instead of single metrics to evaluate research. It provides 4 examples of alternative visualizations that provide richer information than typical metrics like h-index or journal impact factors. These profiles look at the distribution of article citations, normalized citation percentiles for individual researchers, and compare outputs and impacts of research units. The profiles stimulate questions about the data to support better research management and decision making than relying solely on simplified metrics.
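As a minimal illustration of why a profile is richer than a single metric, the h-index collapses very different citation distributions to the same value. The citation counts below are invented for illustration:

```python
def h_index(citations):
    """h-index: largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(ranked, start=1) if c >= i)

# Two hypothetical researchers: same h-index, very different profiles.
a = [50, 40, 6, 5, 4, 0, 0, 0]   # a few highly cited papers
b = [6, 6, 5, 5, 4, 4, 4, 4]     # uniformly modest citations

print(h_index(a), h_index(b))  # 4 4
print(sum(a), sum(b))          # 105 38
```

Looking at the full distribution, as the profiles in the document do, exposes the difference that the single number hides.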
The document describes how the CDC's Science Impact Framework can be used to measure the impact of scientific work beyond just citation data. It provides three case studies that will illustrate how the framework can be applied. The framework uses a combination of quantitative and qualitative indicators to measure outcomes across five levels of influence: disseminating science, creating awareness, catalyzing action, effecting change, and shaping the future. The case studies will demonstrate how scientific work can have a complex path of impact that does not necessarily follow a linear progression through these levels of influence.
Identifying Structures in Social Conversations in NSCLC Patients through the ... (IJERA Editor)
The exploration of social conversations to address patients' needs is an important analytical task, and many scholarly publications are contributing to filling the knowledge gap in this area. The main difficulty remains the inability to turn such contributions into pragmatic processes the pharmaceutical industry can leverage to generate insight from social media data, which can be considered one of the most challenging sources of information available today due to its sheer volume and noise. This study builds on the work of Scott Spangler and Jeffrey Kreulen and applies it to identify structure in social media through the extraction of a topical taxonomy able to capture the latent knowledge in social conversations on health-related sites. A mechanism for automatically identifying and generating a taxonomy from social conversations is developed and pressure-tested using public data from media sites focused on the needs of cancer patients and their families. Moreover, a novel method for generating category labels and determining an optimal number of categories is presented, which extends Spangler and Kreulen's research in a meaningful way. We assume the reader is familiar with taxonomies, what they are, and how they are used.
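One way to picture the category-labelling idea is to pick, for each group of conversations, the frequent term that is most distinctive of that group. This is a hypothetical sketch with invented example data, not the method from the paper:

```python
from collections import Counter

# Toy stop-word list; a real pipeline would use a proper one.
STOP = {"the", "a", "and", "of", "my", "is", "for"}

def label(cluster_docs, other_docs):
    """Label a cluster with its most distinctive frequent term:
    score = frequency inside the cluster minus frequency elsewhere."""
    own = Counter(w for d in cluster_docs for w in d.lower().split()
                  if w not in STOP)
    other = Counter(w for d in other_docs for w in d.lower().split())
    return max(own, key=lambda w: own[w] - other[w])

side_effects = ["fatigue and nausea after chemo",
                "nausea is the worst side effect"]
support = ["support group for caregivers",
           "caregivers need support and rest"]

print(label(side_effects, support))  # nausea
print(label(support, side_effects))  # support
```

The sketch captures the intuition only; the paper's actual labelling method and the choice of an optimal category count involve more than term counting.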
This document summarizes a library workshop on conducting literature searches. It outlines the session agenda, which included an overview of inquiry-based learning and a definition of literature searches compared with comprehensive reviews. The document discusses research databases, peer-reviewed journals, and e-journals. It also provides examples of developing an effective search strategy and formatting references in APA style.
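A typical database search strategy of the kind taught in such workshops combines synonyms with OR inside each concept and joins the concepts with AND. A minimal sketch (the function name and example terms are invented for illustration):

```python
def build_query(concepts):
    """Build a Boolean search string.

    concepts: list of synonym lists, one list per concept,
    e.g. [["teenagers", "adolescents"], ["smoking", "tobacco use"]].
    """
    groups = ["(" + " OR ".join(f'"{t}"' for t in terms) + ")"
              for terms in concepts]
    return " AND ".join(groups)

q = build_query([["teenagers", "adolescents"], ["smoking", "tobacco use"]])
print(q)
# ("teenagers" OR "adolescents") AND ("smoking" OR "tobacco use")
```

Quoting multi-word terms keeps them as phrases; most research databases accept this general Boolean syntax, though field codes and truncation operators vary by platform.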
This is a presentation on library assessment at the Pitt University Library System, delivered to iSchool Academic Librarianship graduate students, December 2015.
Reputation, impact, and the role of libraries in the world of open science (Keith Webster)
An overview of the relationship between open science, research assessment, university rankings, and the role of librarians in advancing the research university
This document summarizes the Advanced Systems and Concepts Office's (ASCO) social sciences research agenda from 2002-2011. ASCO serves as the Defense Threat Reduction Agency's (DTRA) internal strategic studies office and identifies and develops social sciences research to anticipate weapons of mass destruction threats. The summary discusses ASCO's research focus areas over time, including initial modeling projects from 2002-2006, a shift in 2006 to assessing challenges in using social models and their application to WMD threats, and current focus areas on computational social modeling challenges and assessing model goodness. Lessons learned include the importance of interdisciplinary expertise, evolving research focus, and not assuming more funding or technology will lead to advances.
Day 1 - Quisumbing and Davis - Moving Beyond the Qual-Quant Divide (Ag4HealthNutrition)
This document discusses the benefits and challenges of integrating qualitative and quantitative research methods. It argues that keeping qualitative and quantitative research separate unnecessarily limits understanding of the social world. Both methods have strengths, and using them together can overcome their individual weaknesses. The document outlines differences in qualitative and quantitative research and provides an example study that combined the methods sequentially and concurrently to better understand long-term poverty impacts in Bangladesh.
The document summarizes a webinar on altmetrics presented by altmetric.com, Science-Metrix, and Elsevier Labs. Altmetrics measure the broader impact of scholarly works through social media mentions, downloads, views, and saves. They provide new perspectives on the societal and academic impact of research beyond traditional metrics like citations. However, altmetrics are still at an early stage and have limitations such as disciplinary biases, since some fields are more active on social media than others. More research is needed to better understand what altmetrics represent and how they can most usefully complement traditional metrics.
This document outlines some common mistakes in comparative research methods. It discusses issues like causality, case selection, coding observations, subjectivity, and challenges in data analysis when making comparisons between multiple countries. Selecting too many countries can make the research lengthy and prone to errors. Coding data from different places can be difficult if variables are defined differently. Subjectivity is also a potential issue since qualitative data from case studies is involved. Accessing comparable data can pose problems if some countries have limited information sources.
EDUC 8102-6 - MD7Assgn5: Research Application Paper #2 (eckchela)
The document summarizes two research articles. The first article by Fetherston and Kelly used grounded theory to evaluate the effectiveness of a conflict resolution course. It found the revised course better prepared students for careers through transformative learning. The second article by Hsia and Spruijt-Metz used a correlation design to determine relationships between smoking habits and meanings for Asian-American students. It found gender-related relationships and proposed gender-specific smoking prevention programs.
Sugimoto - Social media metrics as indicators of broader impact (innovationoecd)
This document discusses whether social media metrics can reveal the broader impacts of scholarly work beyond citations. It acknowledges that altmetrics were intended to measure impacts in informal communication channels but finds that altmetrics do not necessarily measure broader impacts or dissemination. Key findings include that altmetrics indicators are not more diverse than citations, do not broaden geographic dissemination, and do not necessarily benefit underrepresented groups like women or global south researchers. The document recommends avoiding goal displacement with indicators and calls for more research on measuring true social impacts.
This document describes MELODEM, an initiative to harmonize analytic approaches across longitudinal dementia studies. It discusses organizing working groups around topics like selection bias, measurement, and time scales. The goals are to conceptually and empirically compare methods, reach consensus on preferred methods, and address barriers to adoption. Working groups meet monthly by phone and annually in person. Papers have been submitted on reporting standards and the sensitivity of findings to practice effects specifications, with others in progress on survival bias and high-dimensional data methods.
Data Science: Origins, Methods, Challenges and the future? (Cagatay Turkay)
Slides for my talk at City Unrulyversity on 18.03.15 in London. The talk discusses the term Data Science, touches on its origins and the types of data scientist, and gives a longer discussion of the Data Science process and the challenges analysts face.
And here is the abstract of the talk:
Data Science ... the term is everywhere now: in the news, on recruitment sites, on technology boards. "Data scientist" has even been named the sexiest job title of the century. But what is it, really? Is it just hype, or a term that will be with us for some time?
This session will investigate where the term is originating from and how it relates to decades of research in established fields such as statistics, data mining, visualisation and machine learning. We will investigate how the field is evolving with the emergence of large, heterogeneous data resources. We will discuss the objectives, tools and challenges of data science as a practice, and look at examples from research and industrial applications.
Sociologists use quantitative and qualitative research methods like surveys, field research, and secondary data analysis to study groups. Quantitative research uses numerical data and statistics while qualitative research uses narrative data. Surveys are used to study large groups and require defining a population, sampling technique, and questionnaire design. Field research observes aspects of social life through case studies and participant observation. Data is analyzed using measures like mean, median, and mode to identify correlations and test hypotheses about causal relationships while avoiding biases or harm to participants.
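The descriptive measures mentioned above are available in Python's standard library. A small sketch with invented survey data (hypothetical weekly volunteering hours for ten respondents):

```python
from statistics import mean, median, mode

# Invented responses from a hypothetical survey of ten respondents.
hours = [0, 2, 2, 3, 4, 4, 4, 5, 7, 9]

print("mean:", mean(hours))      # sum / count
print("median:", median(hours))  # middle value of the sorted data
print("mode:", mode(hours))      # most frequent value
```

With skewed social data the three measures often disagree, which is why analysts report more than one before testing hypotheses about correlations.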
Understanding impact through alternative metrics: developing library-based as... (Kristi Holmes)
This document discusses metrics and impact assessment for translational science research. It provides background on translational science, the role of Clinical and Translational Science Awards (CTSAs), and the mission of the Northwestern University Clinical and Translational Sciences (NUCATS) Institute to speed research discoveries to patients. The document outlines sample output and impact metrics that could be used for assessment and lists principles to guide an evaluation and continuous improvement program. It also discusses the role of libraries in providing metrics and impact services and outlines the services provided by the Galter Library Metrics and Impact Core at Northwestern University.
The ECSA Characteristics of Citizen Science (Margaret Gold)
An overview of the work and outcomes on the ECSA Characteristics of Citizen Science - full notes on https://zenodo.org/communities/citscicharacteristics
Overview and Exemplar Components of the Research Methodology on the Research ... (YogeshIJTSRD)
The research methodology outlines how the research was performed, describes any unusual technique, and states whether the researcher introduced a new method or substantially modified an existing one. This article takes an initiative in research education to help students develop and retain research skills in planning, preparing, and writing research methods as one of the K-12 learning skills in senior high school research courses. The research methodology comprises basic components including the design, sampling, tools, collection procedures, analysis, and ethical considerations. There are three types of study design: qualitative, quantitative, and mixed methods. In the sampling section, one can explain the sampling procedure, the sample size, the subjects tested, and the location of the study. When describing the research instrument, a researcher must mention the technical materials used in the study, and the validity and reliability of the instruments should be tested before they are used. Since it involves gathering the information needed to resolve the research problem posed, data collection is considered the most important step of the research process. Data can be analyzed using a number of techniques, quantitative or qualitative. In addition, the study report should indicate whether and to what extent the study complies with ethical standards. As a general rule, the research methods should be robust enough to reproduce the results. In this light, these overviews and exemplars help students demonstrate research writing skills and present research methodology throughout the research writing process.
Almighty C. Tabuena | Yvon Mae C. Hilario | Mhelmafa P. Buenaflor, "Overview and Exemplar Components of the Research Methodology on the Research Writing Process for Senior High School Students", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5, Issue-3, April 2021. URL: https://www.ijtsrd.com/papers/ijtsrd38693.pdf Paper URL: https://www.ijtsrd.com/humanities-and-the-arts/education/38693/overview-and-exemplar-components-of-the-research-methodology-on-the-research-writing-process-for-senior-high-school-students/almighty-c-tabuena
Cochrane Health Promotion Antony Morgan Explor Meet (Sonia Groisman)
This document discusses NICE's role in providing public health guidance in the UK and some issues related to evaluating evidence on health inequalities. It describes NICE's process for developing guidance, which involves scoping topics, reviewing evidence, and making recommendations. However, it notes some limitations, such as a lack of evidence on effective interventions to reduce health inequalities and conceptual gaps in understanding the causes of inequalities. It argues NICE needs to improve its methods for evaluating evidence on inequalities, including getting the right review questions, considering different types of evidence, and better conceptual frameworks for analyzing causes of inequalities.
Societal Impact
Nicolas Robinson Garcia, INGENIO (UPV-CSIC), Universitat Politècnica de València, Spain / Daniel Torres-Salinas, Universidad de Navarra and Universidad de Granada (EC3metrics & Medialab UGR), Spain
Recently there has been increasing pressure to develop indicators and methodologies that can offer evidence of the societal impact of researchers' activity. This presentation offers a comprehensive overview of the definition of societal impact, types of impact, and the attribution problem when searching for potential indicators. Special attention is given to altmetric indicators and their potential role in tracing social engagement and its relation to societal impact. Examples of potential uses and current lines of work are presented.
***************************
Scientometric procedures are increasingly used to analyse developments and trends in science and technology. The decisions to be taken often have severe implications. Consequently, data handling, indicator construction, and interpretation require competent expert knowledge, which is currently available only to a limited extent for all stakeholders in Central Europe, not least due to a lack of training opportunities. Responding to the lack of a pertinent scientometrics education (especially in German-speaking countries) and to the increasing demand (particularly from research quality managers), the University of Vienna (A), the German Centre for Higher Education Research and Science Studies - DZHW (D), and the Katholieke Universiteit Leuven (B) joined forces to found the European Summer School for Scientometrics (esss) in 2010.
Journals’ Editorial Policies – An Analysis of the Instructions for Authors of... (Rudjer Boskovic Institute)
This document summarizes a study analyzing the instructions for authors of Croatian open access journals. The study examined 283 journal instructions to determine how thoroughly they addressed ethical issues, information about the journal, and instructions for manuscript preparation. It found that ethical issues were addressed least comprehensively compared to the other categories. Specifically, biomedical journals most frequently mentioned ethical topics like authorship, conflicts of interest, and research integrity, while social sciences journals provided the least information on ethics. The instructions primarily focused on formatting requirements for manuscript submissions. Overall, the study revealed that Croatian open access journals would benefit from more robustly addressing ethical guidelines and policies.
Sample of slides for Statistics for Geography and Environmental Science (Rich Harris)
A sample of the slides available to support the teaching of the textbook Statistics for Geography and Environmental Science by Harris & Jarvis (2011). For further information see www.social-statistics.org
This document provides a 24-item checklist for reporting qualitative studies submitted to the Canada Communicable Disease Report (CCDR). The checklist outlines key elements that should be included in a qualitative study such as the objective, methods, findings, and conclusion. It also provides examples of what should be described such as the study setting, sampling strategies, data collection tools, data analysis process, and key findings. Following the checklist guidelines will help ensure qualitative studies submitted to the CCDR clearly describe the research context and process to aid readers' assessment and understanding of the findings.
Towards indicators for 'opening up' science and technology policy (ORCID, Inc)
This document discusses approaches to broadening and opening up science and technology policy appraisal using indicators. It argues that conventional indicators often have perverse effects by reinforcing existing power structures and reducing diversity. The document presents conceptual frameworks for broadening appraisal inputs and making indicator outputs more open and plural rather than justifying specific decisions. Examples show how indicators can preserve multiple dimensions, represent different perspectives on concepts like excellence and interdisciplinarity, and explore directions of research portfolios. The goal is to use indicators to open up debate rather than provide unitary and prescriptive advice.
Rafols - Towards more inclusive STI indicators (innovationoecd)
This document discusses the need for more inclusive science, technology, and innovation (STI) indicators that better capture diverse types of research and innovation.
Current STI indicators are biased towards certain types of mainstream science and may suppress or exclude valuable creative research in other fields like agriculture. This can threaten diversity in research. Indicators are also needed that make other contributions visible, like action research or co-creation.
While STI indicators can help with decisions, they do not necessarily lead to the "right" decisions if they do not reflect the full range of social and economic functions of science. Expanding indicator data and developing new indicator types may help broaden coverage of societal problems and peripheral areas of research.
This is presentation on library assessment at Pitt University Library System delivered to iSchool Academic Librarianship Graduate students. December 2015.
Reputation, impact, and the role of libraries in the world of open scienceKeith Webster
An overview of the relationship between open science, research assessment, university rankings, and the role of librarians in advancing the research university
This document summarizes the Advanced Systems and Concepts Office's (ASCO) social sciences research agenda from 2002-2011. ASCO serves as the Defense Threat Reduction Agency's (DTRA) internal strategic studies office and identifies and develops social sciences research to anticipate weapons of mass destruction threats. The summary discusses ASCO's research focus areas over time, including initial modeling projects from 2002-2006, a shift in 2006 to assessing challenges in using social models and their application to WMD threats, and current focus areas on computational social modeling challenges and assessing model goodness. Lessons learned include the importance of interdisciplinary expertise, evolving research focus, and not assuming more funding or technology will lead to advances.
Day 1 - Quisumbing and Davis - Moving Beyond the Qual-Quant DivideAg4HealthNutrition
This document discusses the benefits and challenges of integrating qualitative and quantitative research methods. It argues that keeping qualitative and quantitative research separate unnecessarily limits understanding of the social world. Both methods have strengths, and using them together can overcome their individual weaknesses. The document outlines differences in qualitative and quantitative research and provides an example study that combined the methods sequentially and concurrently to better understand long-term poverty impacts in Bangladesh.
The document summarizes a webinar on altmetrics presented by altmetric.com, Science-Metrix, and Elsevier Labs. Altmetrics measure the broader impact of scholarly works through social media mentions, downloads, views, and saves. They provide new perspectives on the societal and academic impact of research beyond traditional metrics like citations. However, altmetrics are still in early stages and have limitations like disciplinary biases since some fields are more active on social media than others. More research is needed to better understand what altmetrics represent and how they can most useful complement traditional metrics.
This document outlines some common mistakes in comparative research methods. It discusses issues like causality, case selection, coding observations, subjectivity, and challenges in data analysis when making comparisons between multiple countries. Selecting too many countries can make the research lengthy and prone to errors. Coding data from different places can be difficult if variables are defined differently. Subjectivity is also a potential issue since qualitative data from case studies is involved. Accessing comparable data can pose problems if some countries have limited information sources.
EDUC 8102-6 - MD7Assgn5: Research Application Paper #2. eckchela
The document summarizes two research articles. The first article by Fetherston and Kelly used grounded theory to evaluate the effectiveness of a conflict resolution course. It found the revised course better prepared students for careers through transformative learning. The second article by Hsia and Spruijt-Metz used a correlation design to determine relationships between smoking habits and meanings for Asian-American students. It found gender-related relationships and proposed gender-specific smoking prevention programs.
Sugimoto - Social media metrics as indicators of broader impactinnovationoecd
This document discusses whether social media metrics can reveal the broader impacts of scholarly work beyond citations. It acknowledges that altmetrics were intended to measure impacts in informal communication channels but finds that altmetrics do not necessarily measure broader impacts or dissemination. Key findings include that altmetrics indicators are not more diverse than citations, do not broaden geographic dissemination, and do not necessarily benefit underrepresented groups like women or global south researchers. The document recommends avoiding goal displacement with indicators and calls for more research on measuring true social impacts.
This document describes MELODEM, an initiative to harmonize analytic approaches across longitudinal dementia studies. It discusses organizing working groups around topics like selection bias, measurement, and time scales. The goals are to conceptually and empirically compare methods, reach consensus on preferred methods, and address barriers to adoption. Working groups meet monthly by phone and annually in person. Papers have been submitted on reporting standards and the sensitivity of findings to practice effects specifications, with others in progress on survival bias and high-dimensional data methods.
Data Science: Origins, Methods, Challenges and the future?Cagatay Turkay
Slides for my talk at City Unrulyversity on 18.03.15 in London. Discuss the term Data Science, touch upon the origins and the data scientist types. A longer discussion on the Data Science process and challenges analysts face.
And here is the abstract of the talk:
Data Science ... the term is everywhere now, on the news, recruitment sites, technology boards. "Data scientist" is even named to be sexiest job title of the century. But what is it, really? Is it just a hype or a term that will be with us for some time?
This session will investigate where the term is originating from and how it relates to decades of research in established fields such as statistics, data mining, visualisation and machine learning. We will investigate how the field is evolving with the emergence of large, heterogeneous data resources. We will discuss the objectives, tools and challenges of data science as a practice, and look at examples from research and industrial applications.
Sociologists use quantitative and qualitative research methods like surveys, field research, and secondary data analysis to study groups. Quantitative research uses numerical data and statistics while qualitative research uses narrative data. Surveys are used to study large groups and require defining a population, sampling technique, and questionnaire design. Field research observes aspects of social life through case studies and participant observation. Data is analyzed using measures like mean, median, and mode to identify correlations and test hypotheses about causal relationships while avoiding biases or harm to participants.
Understanding impact through alternative metrics: developing library-based as... – Kristi Holmes
This document discusses metrics and impact assessment for translational science research. It provides background on translational science, the role of Clinical and Translational Science Awards (CTSAs), and the mission of the Northwestern University Clinical and Translational Sciences (NUCATS) Institute to speed research discoveries to patients. The document outlines sample output and impact metrics that could be used for assessment and lists principles to guide an evaluation and continuous improvement program. It also discusses the role of libraries in providing metrics and impact services and outlines the services provided by the Galter Library Metrics and Impact Core at Northwestern University.
The ECSA Characteristics of Citizen Science – Margaret Gold
An overview of the work and outcomes on the ECSA Characteristics of Citizen Science - full notes on https://zenodo.org/communities/citscicharacteristics
Overview and Exemplar Components of the Research Methodology on the Research ... – YogeshIJTSRD
The research methodology outlines how the research has been performed and includes a description of whether the researcher has introduced a new method or substantially modified an existing one. This article takes an initiative in research education to help students develop and retain research skills in planning, preparing, and writing research methods as one of the K-12 learning skills in senior high school research courses. The research methodology comprises basic components including the design, sampling, tools, collection procedures, analysis, and ethical considerations. There are three types of study design: qualitative, quantitative, and mixed methods. The sampling section explains the sampling procedure, sample size, subjects tested, and the location of the study. The researcher must also mention the technical materials used when describing the research instrument, and the validity and reliability of the instruments should be tested before use. Since it involves gathering the information needed to resolve the research problem posed, data collection is considered the most important step of the research process. Data can be analyzed using a number of techniques, quantitative or qualitative. In addition, the study report should indicate whether and to what extent the study complies with ethical standards. As a general rule, the research methods should be robust enough to reproduce the results. In this light, these overviews and exemplars help students demonstrate research writing skills and present the research methodology throughout the research writing process.
Almighty C. Tabuena, Yvon Mae C. Hilario and Mhelmafa P. Buenaflor, "Overview and Exemplar Components of the Research Methodology on the Research Writing Process for Senior High School Students", published in International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN 2456-6470, Volume 5, Issue 3, April 2021. URL: https://www.ijtsrd.com/papers/ijtsrd38693.pdf Paper URL: https://www.ijtsrd.com/humanities-and-the-arts/education/38693/overview-and-exemplar-components-of-the-research-methodology-on-the-research-writing-process-for-senior-high-school-students/almighty-c-tabuena
Cochrane Health Promotion Antony Morgan Explor Meet – Sonia Groisman
This document discusses NICE's role in providing public health guidance in the UK and some issues related to evaluating evidence on health inequalities. It describes NICE's process for developing guidance, which involves scoping topics, reviewing evidence, and making recommendations. However, it notes some limitations, such as a lack of evidence on effective interventions to reduce health inequalities and conceptual gaps in understanding the causes of inequalities. It argues NICE needs to improve its methods for evaluating evidence on inequalities, including getting the right review questions, considering different types of evidence, and better conceptual frameworks for analyzing causes of inequalities.
Societal Impact
Nicolas Robinson Garcia, INGENIO (UPV-CSIC), Universitat Politècnica de València, Spain / Daniel Torres-Salinas, Universidad de Navarra and Universidad de Granada (EC3metrics & Medialab UGR), Spain
There is increasing pressure to develop indicators and methodologies that can offer evidence of the societal impact of researchers' activity. This presentation offers a comprehensive overview of the definition of societal impact, the types of impact, and the attribution problem encountered when searching for potential indicators. Special attention is given to altmetric indicators and their potential role in tracing social engagement and its relation to societal impact. Examples of potential uses and current lines of work are presented.
***************************
Scientometric procedures are increasingly used to analyse developments and trends in science and technology. The decisions they inform often have severe implications. Consequently, data handling, indicator construction, and interpretation require competent expert knowledge, which is currently available only to a limited extent to stakeholders in Central Europe, not least because of a lack of training opportunities. Responding to the absence of a pertinent scientometrics education (especially in German-speaking countries) and to increasing demand (particularly from research quality managers), the University of Vienna (AT), the German Centre for Higher Education Research and Science Studies (DZHW, DE) and the Katholieke Universiteit Leuven (BE) jointly founded the European Summer School for Scientometrics (esss) in 2010.
Journals’ Editorial Policies – An Analysis of the Instructions for Authors of... – Rudjer Boskovic Institute
This document summarizes a study analyzing the instructions for authors of Croatian open access journals. The study examined 283 journal instructions to determine how thoroughly they addressed ethical issues, information about the journal, and instructions for manuscript preparation. It found that ethical issues were addressed least comprehensively compared to the other categories. Specifically, biomedical journals most frequently mentioned ethical topics like authorship, conflicts of interest, and research integrity, while social sciences journals provided the least information on ethics. The instructions primarily focused on formatting requirements for manuscript submissions. Overall, the study revealed that Croatian open access journals would benefit from more robustly addressing ethical guidelines and policies.
Sample of slides for Statistics for Geography and Environmental Science – Rich Harris
A sample of the slides available to support the teaching of the textbook Statistics for Geography and Environmental Science by Harris & Jarvis (2011). For further information see www.social-statistics.org
This document provides a 24-item checklist for reporting qualitative studies submitted to the Canada Communicable Disease Report (CCDR). The checklist outlines key elements that should be included in a qualitative study such as the objective, methods, findings, and conclusion. It also provides examples of what should be described such as the study setting, sampling strategies, data collection tools, data analysis process, and key findings. Following the checklist guidelines will help ensure qualitative studies submitted to the CCDR clearly describe the research context and process to aid readers' assessment and understanding of the findings.
Towards indicators for 'opening up' science and technology policy – ORCID, Inc
This document discusses approaches to broadening and opening up science and technology policy appraisal using indicators. It argues that conventional indicators often have perverse effects by reinforcing existing power structures and reducing diversity. The document presents conceptual frameworks for broadening appraisal inputs and making indicator outputs more open and plural rather than justifying specific decisions. Examples show how indicators can preserve multiple dimensions, represent different perspectives on concepts like excellence and interdisciplinarity, and explore directions of research portfolios. The goal is to use indicators to open up debate rather than provide unitary and prescriptive advice.
Rafols - Towards more inclusive STI indicators – innovationoecd
This document discusses the need for more inclusive science, technology, and innovation (STI) indicators that better capture diverse types of research and innovation.
Current STI indicators are biased towards certain types of mainstream science and may suppress or exclude valuable creative research in other fields like agriculture. This can threaten diversity in research. Indicators are also needed that make other contributions visible, like action research or co-creation.
While STI indicators can help with decisions, they do not necessarily lead to the "right" decisions if they do not reflect the full range of social and economic functions of science. Expanding indicator data and developing new indicator types may help broaden coverage of societal problems and peripheral areas of research.
Evolving and emerging scholarly communication services in libraries: public a... – Claire Stewart
This document provides an overview of a guest lecture about evolving scholarly communication services in libraries and their role in supporting public access compliance and assessing research impact. It discusses challenges libraries face in helping researchers comply with public access policies from funders. It also explores metrics and indicators used to measure research impact, noting limitations, and how libraries can help address this complex issue by leveraging their expertise in managing scholarly information and data.
Aligning scientific impact and societal relevance: The roles of academic enga... – Nicolas Robinson-Garcia
This document summarizes a study examining how academic engagement and interdisciplinary research relate to achieving both scientific impact and societal relevance. Negative binomial regressions on survey and publication data from Spanish scientists found that:
1) Academic engagement and interdisciplinary variety were positively associated with both societal relevance and scientific impact;
2) Interdisciplinary disparity was positively associated with societal relevance but its interaction with variety was positively associated with scientific impact;
3) Control variables like past impact, productivity and affiliations were also significant predictors.
The study provides empirical evidence that engagement and interdisciplinary research can enhance complementarities between scientific and societal impacts.
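The negative binomial model used in such regressions handles over-dispersed count outcomes (variance exceeding the mean), which is typical of citation data. As an illustration only, not the authors' code, here is a minimal pure-Python evaluation of the NB2 log-likelihood (variance = mu + alpha*mu^2), with invented counts and fitted means:

```python
import math

def nb2_loglik(y, mu, alpha):
    """Log-likelihood of the NB2 negative binomial model, the form
    typically used for over-dispersed counts such as citations."""
    r = 1.0 / alpha  # dispersion expressed as a "size" parameter
    ll = 0.0
    for yi, mi in zip(y, mu):
        ll += (math.lgamma(yi + r) - math.lgamma(r) - math.lgamma(yi + 1)
               + r * math.log(r / (r + mi))
               + yi * math.log(mi / (r + mi)))
    return ll

# Hypothetical citation counts and fitted means for five papers
y = [0, 2, 5, 1, 12]
mu = [1.0, 2.5, 4.0, 1.5, 8.0]
print(nb2_loglik(y, mu, alpha=0.8))
```

A fitting routine would maximise this quantity over the regression coefficients (which determine mu) and alpha; libraries such as statsmodels do this numerically.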
An overview of ethical research practices by Malcolm MacLean, Chair of UoG Research Ethics Committee.
Reader in the Culture & History of Sport, Faculty of Applied Sciences
This document discusses the mainstreaming of participatory research in health care. It provides an overview of participatory action research (PAR), including its origins in the 1940s-1960s with Lewin and its growth in Latin America. PAR is defined as a process where a group jointly diagnoses an issue, works to improve it, evaluates effectiveness, and critically reflects. The key is participation, with research conducted and changed by participants. Examples of PAR projects that address health disparities are provided, along with challenges of PAR such as building trust and balancing academic and community needs and timelines. The benefits of community-based participatory research for developing culturally-appropriate measures and establishing trust for quality data collection are also summarized.
The Importance Of Quantitative Research Designs – Nicole Savoie
The document discusses quantitative and qualitative research designs. It states that qualitative research aims to understand the reasons and motivations behind issues, while quantitative research focuses on measuring trends and generalizing results from samples to populations. As examples, it provides details about two studies: one using a qualitative design to understand family relationships and support for mothers, and another using a quantitative design, although it does not give details about that specific study. It also provides background information on the samples and methods used in the qualitative study.
The document discusses measuring societal impact of research. It defines societal impact as social, environmental, cultural or economic benefits from academic activities. Measuring societal impact is challenging due to attribution problems. The UK REF attempted to allocate funding based on non-academic impact using impact case studies. Altmetrics were discussed as a potential way to measure broader impacts, though they may be more related to scientific impact. A case study at the University of Granada used bibliometric and non-bibliometric indicators across nine dimensions, finding some research groups with clear societal impact orientation. Potential applications of altmetrics include analyzing social networks and identifying research communities. Both qualitative and quantitative assessments have limitations in measuring societal impact.
This document discusses the role and importance of statistics in scientific research. It begins by defining statistics as the science of learning from data and communicating uncertainty. Statistics are important for summarizing, analyzing, and drawing inferences from data in research studies. They also allow researchers to effectively present their findings and support their conclusions. The document then describes how statistics are used and are important in many fields of scientific research like biology, economics, physics, and more. It also provides examples of statistical terms commonly used in research studies and some common misuses of statistics.
ilovepdf_merged.pdf - about Media and communication – KonulAzizli
This document discusses key concepts in social science research methods. It defines research as a structured, systematic investigation aimed at increasing understanding through objective analysis of data. Research is guided by theory, which provides frameworks for interpreting findings. The document outlines various aspects of the research process, including developing research questions, collecting and analyzing data, and presenting findings. It emphasizes that social research should have practical implications and be influenced by ethical considerations and real-world contexts.
Kicking off the INCENTIVE project with an intro to the CS Principles and Char... – Margaret Gold
-The Citizen Science Lab at Leiden University
- The core concept of the INCENTIVE project
- The ECSA 10 Principles of Citizen Science
- The ECSA Characteristics of Citizen Science
Science Indicators & Mapping of Science by Aman Kr Kushwaha
This document discusses science indicators and mapping of science. It defines science indicators as statistics that measure quantifiable aspects of science creation, dissemination and application. It discusses different types of indicators such as input/output, quantitative/qualitative, and functional/instrumental. It also discusses different ways of mapping science cognitively, including journal citation maps, co-citation maps, co-word maps, and co-classification maps. Descriptive mapping aims to provide an overview of knowledge production levels.
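Of the mapping techniques listed, a co-word map is the simplest to sketch: it starts from counting how often keyword pairs co-occur in the same paper. A minimal illustration with hypothetical keyword lists (the papers and keywords are invented):

```python
from itertools import combinations
from collections import Counter

# Hypothetical author-keyword lists for four papers
papers = [
    ["bibliometrics", "citation analysis", "h-index"],
    ["bibliometrics", "co-word analysis"],
    ["citation analysis", "h-index", "impact factor"],
    ["co-word analysis", "bibliometrics", "citation analysis"],
]

# Count pairwise keyword co-occurrences within each paper
cooc = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        cooc[(a, b)] += 1

for pair, n in cooc.most_common(3):
    print(pair, n)
```

A full co-word map would then normalise these counts (e.g. by keyword frequency) and lay the keywords out as a network, with edge weights drawn from the matrix above.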
Data Management and Broader Impacts: a holistic approach – Megan O'Donnell
This document summarizes a presentation on taking a holistic approach to data management and broader impacts. It discusses the National Science Foundation's broader impacts criterion, which requires research to benefit society. It argues that examining data through a broader impacts lens highlights the benefits of good data management, data management plans, and the value of data information literacy skills. Taking this holistic approach can help researchers understand why data management plans are important, justify spending more time on data practices, and encourage embracing data sharing.
The document discusses various types of research methods including qualitative research, quantitative research, mixed research, basic research, applied research, correlation research, exploratory research, historical research, descriptive research, advocacy research, evaluation research, ethnographic research, phenomenological research, and experimental research. It provides definitions and examples of each type of research method.
Exploring 'Impact': new approaches for alternative scholarly metrics in Africa – Thomas King
This document discusses alternative approaches for measuring the impact of research in Africa beyond traditional citation metrics. It notes that scholarly communication has changed radically with the Internet and explores new ways of defining and tracking impact through social media mentions, blog posts, and usage data. The document advocates considering a variety of impact measures and highlighting research that makes knowledge available to broader communities in accessible formats beyond academic journals.
Research is defined as the systematic process of collecting and analyzing information to increase understanding of a topic. It involves three main stages: planning, data collection, and analysis. There are two main forms of research - basic research which aims to develop a general body of knowledge, and applied research which aims to provide knowledge to influence social policy. Social research uses scientific methods to study human social behavior and is conducted by social scientists. It involves collecting empirical data objectively and systematically to test theories about social phenomena.
Research is defined as the systematic process of collecting and analyzing information to increase understanding of a topic. It involves three main stages: planning, data collection, and analysis. There are two main forms of research - basic research which aims to develop a general body of knowledge, and applied research which aims to provide knowledge to influence policy. Social research specifically seeks to understand social processes and problems using scientific methods employed by social scientists. It involves collecting empirical data objectively and systematically using theories to explain observations.
Big data is prevalent in our daily life. Not surprisingly, big data has become a hot topic discussed by the commercial world, the media, magazines, the general public and elsewhere. From an academic point of view, is it a research area worth exploring, or just another hype? Are only computer or IS-related scholars suited to big data research due to its nature, or are scholars from other research areas also suited to this subject? This study aims to answer these questions through an informetrics approach applied to the SSCI journal database, leveraging informetrics' robust quantitative power to analyse information in any form against a representative data source. The research shows that big data research is in its growth phase, with an exponential growth pattern since 2012 and great potential for years to come. Perhaps surprisingly, computer and IS-related disciplines are not at the top of the five leading research areas in these results; in fact, the top five research disciplines are more diversified than expected: business economics (#1), government law (#2), information science/library science (#3), social science (#4) and computer science (#5). Scholars from US universities are the most productive on this subject, while Asian countries, including Taiwan, are also visible. This study also identifies that big data publications in the SSCI journal database during 2005-2015 fit Lotka's law. The study contributes to understanding current big data research trends and shows the way for researchers interested in conducting future research on big data, regardless of their research backgrounds.
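Lotka's law, which the study above tests, says the number of authors producing n publications falls off as 1/n^a; in the classic form a = 2, so about 61% of authors publish exactly one paper. A small sketch of the implied author-frequency distribution (an illustration of the law, not the study's own fit):

```python
# Lotka's law: authors with n publications occur in proportion to
# 1 / n**a; with a = 2 the fraction of one-paper authors is 6/pi**2.
def lotka_fraction(n, a=2.0):
    # Normalising constant from a long partial sum of the zeta series
    c = 1.0 / sum(1.0 / k**a for k in range(1, 10001))
    return c / n**a

print(round(lotka_fraction(1), 3))  # ~0.608: most authors publish once
print(round(lotka_fraction(2), 3))  # ~0.152: a quarter as many publish twice
```

An empirical test (as in the study) estimates a from the observed author counts and checks goodness of fit, typically with a Kolmogorov-Smirnov statistic.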
1. The document discusses tackling "wicked problems" in social services through using critical approaches and workplace information literacy within evidence-based practice.
2. It outlines challenges in connecting research and practice in social services due to competing factors in decision making and barriers to accessing and applying research evidence.
3. The Iriss Evidence Search and Summary Service aims to bridge this gap by producing summaries of evidence on topics relevant to the social services workforce in Scotland to inform practice and service development.
The document discusses open science and its key aspects. It notes there is widespread agreement that open science affects all stages of the research process through a global, systemic shift involving varied local implementations. It also discusses challenges and opportunities of open science, including the need for: training and skills development; addressing diversity in research cultures; resolving intellectual property issues; and overcoming biases towards well-resourced research. Overall, the document argues open science provides tools for improved research governance if supported through appropriate incentives, infrastructures and monitoring.
Similar to Seminari CRICC: Avaluació de la recerca
Transmedia literacy: a research project – cricc
This document describes a research project on transmedia literacy led by Maria-Jose Masanet of the Universitat de Barcelona. The project seeks to map the transmedia competencies of young people through surveys, workshops and interviews in 8 countries. It also seeks to identify young people's informal learning strategies with media and how that knowledge can be harnessed in the classroom. The objectives are to develop a map of transmedia competencies and to identify learning strategies.
The networks of reading: history of a research project – cricc
The document describes the research process behind a book on social reading and its multiple contexts. It began in 2015 with the idea of a publication that would allow a better understanding of the complex nature of social reading and its place amid the transformations affecting the act of reading. The book was published in 2016 and 2018, in Italian and Spanish respectively. The project took an interdisciplinary approach and was based on the analysis of user-generated data on platforms...
Ernest Abadal focused on presenting the actions that can be taken to detect fake news: the personal and institutional measures being taken to combat it. Regarding personal actions, one should check the reliability of the news source, verify the date, consult experts and, above all, not spread content one is not sure about. Regarding institutional actions, three avenues are open: penalisation, which Facebook and Google are pursuing through a series of measures aimed at detecting and penalising pages created for disinformation; verification portals, linked to associations of journalists and media outlets, which contrast and verify rumours and false information circulating on social networks; and, finally, education and training, an essential aspect that many countries have already recognised by incorporating information-evaluation competencies into primary and secondary school curricula, to ensure that young people will know how to assess the veracity of the information they find on the internet and social networks.
Sergio Villanueva explains the characteristics of fake news through three stories. The first recounts how, at the end of the 19th century, a false news story ignited the war between Spain and the United States; it shows that fake news always has a cognitive-ideological resonance with its historical moment. The second story, set in the present day, takes us to a fake news farm in a small municipality of North Macedonia from which a large share of the fake news for Trump's electoral campaign was launched; it teaches us that behind every fabrication of news there is an economic exchange. Finally, he recounts the false news story generated by accident in 2018 at our faculty. From this account, he sketches the user profile most likely to share a fake news story: male, over 65, and with a polarised ideological position.
Mariluz Congosto (2019). Social network analysis: una herramienta para desenm... – cricc
This document presents a three-sentence summary of social network analysis on Twitter for detecting falsehoods. It explains that social network analysis can be a tool for unmasking falsehoods on Twitter by studying how information propagates over time and the connections between users, and that analysing the relationships between users can classify profiles accurately, revealing internal structures.
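One basic ingredient of this kind of network analysis is degree centrality, which flags the accounts around which information propagates. A minimal pure-Python sketch on an invented retweet edge list (not Congosto's actual pipeline):

```python
from collections import defaultdict

# Hypothetical retweet ties between five accounts
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("d", "c"), ("e", "c")]

# Degree centrality: ties per account, normalised by the number of
# other accounts; high-degree hubs are where amplification happens
degree = defaultdict(int)
nodes = set()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
    nodes.update((u, v))

n = len(nodes)
centrality = {v: degree[v] / (n - 1) for v in nodes}
print(max(centrality, key=centrality.get))  # the most connected account
```

Real studies of propagation add the time dimension (when each retweet happened) and community detection on top of such centrality measures.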
The document describes the criteria for evaluating research projects under Spain's State R&D&I Plan. It explains that since the 1986 Science Act, the National Research Plan has been the main instrument for promoting scientific activity. Since 2013 the plan has been called the State Plan for Scientific and Technical Research and Innovation. Each project receives four expert evaluations and is rated on four levels. The evaluation criteria include the quality...
Codeless Generative AI Pipelines
(GenAI with Milvus)
https://ml.dssconf.pl/user.html#!/lecture/DSSML24-041a/rate
Discover the potential of real-time streaming in the context of GenAI as we delve into the intricacies of Apache NiFi and its capabilities. Learn how this tool can significantly simplify the data engineering workflow for GenAI applications, allowing you to focus on the creative aspects rather than the technical complexities. I will guide you through practical examples and use cases, showing the impact of automation on prompt building. From data ingestion to transformation and delivery, witness how Apache NiFi streamlines the entire pipeline, ensuring a smooth and hassle-free experience.
Timothy Spann
https://www.youtube.com/@FLaNK-Stack
https://medium.com/@tspann
https://www.datainmotion.dev/
milvus, unstructured data, vector database, zilliz, cloud, vectors, python, deep learning, generative ai, genai, nifi, kafka, flink, streaming, iot, edge
Beyond the Basics of A/B Tests: Highly Innovative Experimentation Tactics You... – Aggregage
This webinar will explore cutting-edge, less familiar but powerful experimentation methodologies which address well-known limitations of standard A/B Testing. Designed for data and product leaders, this session aims to inspire the embrace of innovative approaches and provide insights into the frontiers of experimentation!
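The standard A/B testing that the webinar takes as its baseline usually reduces to a two-proportion z-test on conversion rates. A minimal sketch with hypothetical conversion numbers (the counts are invented for illustration):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates,
    using the pooled standard error (the textbook A/B test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 120/1000 conversions (A) vs 150/1000 (B)
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2))  # about 1.96; |z| > 1.96 is significant at the 5% level
```

The "well-known limitations" the webinar refers to start here: this test assumes a fixed sample size decided in advance, so peeking at z repeatedly inflates the false-positive rate, which is what sequential and Bayesian methods address.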
Predictably Improve Your B2B Tech Company's Performance by Leveraging Data – Kiwi Creative
Harness the power of AI-backed reports, benchmarking and data analysis to predict trends and detect anomalies in your marketing efforts.
Peter Caputa, CEO at Databox, reveals how you can discover the strategies and tools to increase your growth rate (and margins!).
From metrics to track to data habits to pick up, enhance your reporting for powerful insights to improve your B2B tech company's marketing.
- - -
This is the webinar recording from the June 2024 HubSpot User Group (HUG) for B2B Technology USA.
Watch the video recording at https://youtu.be/5vjwGfPN9lw
Sign up for future HUG events at https://events.hubspot.com/b2b-technology-usa/
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers present related topics such as vector databases, LLMs, and managing data at scale. The intended audience includes machine learning engineers, data scientists, data engineers, software engineers, and PMs. This meetup was formerly the Milvus Meetup and is sponsored by Zilliz, maintainers of Milvus.
Global Situational Awareness of A.I. and where it's headed – vikram sood
You can see the future first in San Francisco.
Over the past year, the talk of the town has shifted from $10 billion compute clusters to $100 billion clusters to trillion-dollar clusters. Every six months another zero is added to the boardroom plans. Behind the scenes, there’s a fierce scramble to secure every power contract still available for the rest of the decade, every voltage transformer that can possibly be procured. American big business is gearing up to pour trillions of dollars into a long-unseen mobilization of American industrial might. By the end of the decade, American electricity production will have grown tens of percent; from the shale fields of Pennsylvania to the solar farms of Nevada, hundreds of millions of GPUs will hum.
The AGI race has begun. We are building machines that can think and reason. By 2025/26, these machines will outpace college graduates. By the end of the decade, they will be smarter than you or I; we will have superintelligence, in the true sense of the word. Along the way, national security forces not seen in half a century will be unleashed, and before long, The Project will be on. If we’re lucky, we’ll be in an all-out race with the CCP; if we’re unlucky, an all-out war.
Everyone is now talking about AI, but few have the faintest glimmer of what is about to hit them. Nvidia analysts still think 2024 might be close to the peak. Mainstream pundits are stuck on the wilful blindness of “it’s just predicting the next word”. They see only hype and business-as-usual; at most they entertain another internet-scale technological change.
Before long, the world will wake up. But right now, there are perhaps a few hundred people, most of them in San Francisco and the AI labs, that have situational awareness. Through whatever peculiar forces of fate, I have found myself amongst them. A few years ago, these people were derided as crazy—but they trusted the trendlines, which allowed them to correctly predict the AI advances of the past few years. Whether these people are also right about the next few years remains to be seen. But these are very smart people—the smartest people I have ever met—and they are the ones building this technology. Perhaps they will be an odd footnote in history, or perhaps they will go down in history like Szilard and Oppenheimer and Teller. If they are seeing the future even close to correctly, we are in for a wild ride.
Let me tell you what we see.
The Ipsos - AI - Monitor 2024 Report.pdf – Social Samosa
According to Ipsos' AI Monitor 2024 report, 65% of Indians said that products and services using AI have profoundly changed their daily life in the past 3-5 years.
1. S&T Indicators in the Wild: contextualisation and participation for responsible metrics
Ismael Ràfols
Ingenio (CSIC-UPV), Universitat Politècnica de València
CWTS (Centre for Science and Technology Studies), University of Leiden
SPRU (Science Policy Research Unit), University of Sussex
Universitat de Barcelona, Seminaris CRICC
Centre de Recerca en Informació, Comunicació i Cultura
Research Evaluation (2019), 28(1), 7-22.
2. The argument
• High policy demand for indicators in research assessments. However
current S&T indicators are highly problematic, possibly harmful.
• For quant methods to contribute to research impact assessment (RIA), we
need to go beyond scientometrics (or big data metrics) as it is now.
• Instead, we have to develop “indicators in the wild” (‘en plein air’) in
hybrid forums for engaging with contextual and diverse expertise
• This implies three moves in ’translation’:
1. broadening out the scope of data and expertise used,
2. using quantitative outputs for opening up, in processes that include
deliberation,
3. engaging with disparate communities in the framing of problems and
questions
4. Why indicators? Pressing demands of research
management and evaluation
• Increasing size of research endeavour
1.5 M papers per year in Web of Science alone (0.5 M 20 years ago)
Within a country: 3,000 postgraduate programmes are evaluated in 48
panels in Brazil
Globalisation. Many mid-income countries have multiplied their
publication output (China)
• Increasing competition in academia
• Governance of public sector influenced by New Public Management
(competition as key incentives)
• Increasing societal demands
Interactions with industry and social actors (NGOs)
Grand challenges (climate change, epidemics, water & food security)
Traditional qualitative techniques of management cannot cope.
Hope that use of indicators can help...
5. Uses of indicators: Pressing demands of research
management and evaluation --- Can indicators help?
Yes, indicators can help make decisions…
Reduce time and costs
Increase transparency and sense of objectivity
Reduce complexity, accessible to managers
but do they lead to the “right” decisions?
Evaluation gap (Wouters):
“discrepancy between evaluation criteria [implicit in indicators] and
the social and economic functions of science”
6. Sci Tech & Innovation can have unexpected undesirable
effects while indicators of STI can remain “positive”
• Poor housing
• Asbestos
• Climate change
• Cultural and ethnic suppression
• Casino capitalism in financial innovation
Why did we get here?
7. Misalignment between research and societal needs
Source: Daniel Sarewitz – Saving Science – New Atlantis August, 2016
• Perceived mismatch between discourses (or expectations) of research and
actual outcomes: energy, environment, health, the digital economy.
• More research does not mean better societal outcomes.
• Monitoring tools and incentives (bibliometric indicators!) are part of the
problem.
8. How can Quantitative Studies contribute
to Research Assessment?
1. Uncertainty and value-laden views in assessment
2. Scientometrics as a secluded science
3. Towards indicators in the wild
9. Unspoken assumptions in policy use of S&T indicators
S&T indicator work in policy (enlightenment):
• Knowledge from S&T leads to well-being
• State (e.g. univ. admin) is benevolent
• Expertise (e.g. scientometrics) serves the public good
However – instances of assumptions breaking down
• no agreement on benefits of research (highly contested)
• Focus of health research in pharma therapeutics
• the state/admin can favour particular interests
• Nuclear energy?
• Pharma?
• experts’ views can be aligned with state/particular interests
• Impact indicators (e.g. patents) favour therapeutics over prevention
11. Translations in science
(Callon, Lascoumes & Barthe, 2001, Acting in an uncertain world)
[Diagram of translations in secluded research: framing and reduction;
research collective and manipulation; insights from research]
12. Scientometrics (data analytics) as a ‘science’
[Diagram: state framing (S&T promotion) → research collective (databases &
manipulation) → generalisation of indicators → indicators ‘acting’ in the world]
Cf. forest management, which creates ‘a legible natural terrain that
facilitates manipulation.’ Scott (1998)
13. Data analytics use in policy: dependence and isolation
• Framing based on hegemonic discourses: biotech / nanotechnologies.
Ignoring alternative ontologies (complementary & alternative medicine).
• Closed research collective: dominant databases; poor coverage of languages,
topics, regions, countries.
• Data in few labs. Gaps: insights inconsistent with the world’s complexity
(agriculture, medicine).
• Institutions (state, university management, industry) as patrons and users.
“The evaluation gap is the phenomenon (…) that the criteria in assessments do
not match the character or goals of the research under evaluation or the role
that the researcher aims to play in society.” (Wouters, 2014)
15. Recap
• Science policy is shifting towards societal goals, and this raises
demands for new indicators or alternative types of quantitative evidence
• Where there is uncertainty and lack of value agreement, advice for policy
should not separate technical knowledge from decision-making
• Scientometrics fits with the notion of ‘secluded research’, i.e. research that
is carried out without much interaction with stakeholders and the contexts in
which it will be used.
Proposition:
For responsible metrics, and to respond to new policy demands of addressing
societal goals, STI indicators should follow the practices of what Callon
called ‘research in the wild’: contextualisation and participation.
16. Research in the wild (Callon et al. 2001)
Secluded research:
carried out under controlled
conditions, with standardised
objects, allowing comparability
and reproducibility.
Research in the wild:
conducted out of the lab,
under diverse, uncertain
conditions and local
contexts.
17. Collaboration between secluded res. and res. in the wild
• Participation in problem framing and ontologies
• Extension of the research collective (broadening out)
• Processes of interpretation of insights into contexts (opening up)
18. Collaboration between secluded res. and res. in the wild
• Participation in problem framing and ontologies
• 1. Extension of the research collective (broadening out): happening via
BIG DATA
• Processes of interpretation of insights into contexts (opening up)
19. 1. Broadening out Assessment:
Expanding the research collective
This is about pluralising inputs beyond the bibliometric database
In terms of relevant data
• Media: analysis of news, policy discourse
• Social media data: altmetrics
• Health data: global disease burden, healthcare data
• Economic data: consumption, exports, etc. (Ciarli on rice)
In terms of expertise (not only digital traces).
• Stakeholders (e.g. consultation to experts for ‘validation’)
• Mixed-methods (Diversity Approach – SPRU/Ingenio)
• Case studies (ASIRPA)
• PIPA (Participatory methods)
Taking inputs from outside the lab – the wild
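The "broadening out" step above can be sketched in code. This is a minimal, hypothetical illustration (the DOIs, field names, and data values are all invented; a real pipeline would pull from actual bibliometric and altmetric sources): instead of reducing an output to a citation count, merge heterogeneous traces of the same output into one profile.

```python
# Illustrative sketch: pluralising inputs beyond a single bibliometric database.
# All records, DOIs, and field names here are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class OutputProfile:
    """Collects heterogeneous traces of one research output."""
    doi: str
    citations: int = 0                  # traditional bibliometric signal
    news_mentions: int = 0              # media analysis
    policy_mentions: int = 0            # policy-document mentions (altmetrics)
    stakeholder_notes: list = field(default_factory=list)  # qualitative input

def merge_sources(bibliometric, altmetric, consultations):
    """Join per-DOI signals from several sources into one profile each."""
    profiles = {}
    for doi, cites in bibliometric.items():
        profiles[doi] = OutputProfile(doi=doi, citations=cites)
    for doi, counts in altmetric.items():
        p = profiles.setdefault(doi, OutputProfile(doi=doi))
        p.news_mentions = counts.get("news", 0)
        p.policy_mentions = counts.get("policy", 0)
    for doi, note in consultations:
        profiles.setdefault(doi, OutputProfile(doi=doi)).stakeholder_notes.append(note)
    return profiles

# Hypothetical data: a well-cited paper (a) and a policy-relevant one (b)
bib = {"10.1000/a": 42, "10.1000/b": 3}
alt = {"10.1000/b": {"news": 12, "policy": 2}}
notes = [("10.1000/b", "used in municipal water-safety guidance")]

profiles = merge_sources(bib, alt, notes)
```

The point of the sketch is that output "b" looks negligible on citations alone but carries policy and stakeholder traces that a single-database indicator would never surface.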
20. Collaboration between secluded research and research
in the wild:
• Participation in problem framing and ontologies
• Extension of the research collective (broadening out)
• 2. Processes of interpretation of insights into contexts (opening up):
the major challenge of Leiden Manifesto principles 1-3
21. 2. S&T indicators as tools in policy deliberation
• ‘Conventional’ use of indicators (‘Science Arbiter’--Pielke)
Purely analytical character (i.e. free of normative assumptions)
Seeking convergence (partial converging indicators, Martin and Irvine, 1983)
Aimed at justifying ‘best-choices’ (e.g. excellence)
Unitary and prescriptive advice
• ‘Opening up’ indicators (‘Honest broker’ --Pielke)
Aimed at locating the actors in their context and dynamics
Not predictive, or explanatory, but exploratory
Construction of indicators is based on choice of perspectives
Make explicit the possible choices on what matters
Supporting debate
Making science policy more ‘socially robust’
Plural and conditional advice
Barré (2001, 2004, 2010), Stirling (2008)
22. Closing down: Unique and prescriptive
Proposing “best choices”
Rankings: a ranked list of preferences
how much? how fast? who’s ahead?
Quantitative evidence for opening up:
Allowing for flexibility in interpretation
23. Model 1: Unique and prescriptive
Proposing “best choices”
Rankings: a ranked list of preferences
how much? how fast? who’s ahead?
Model 2: Plural and conditional
Exploring complementary choices
Facilitating options/choices in landscapes
which way? what alternatives? why?
Quantitative outputs:
Allowing for flexibility in interpretation
24. Opening up the debate with indicators
Help users to:
1. dig into the underlying data and algorithms to see what is behind the
numbers (e.g. disaggregating categories),
2. explore the robustness of descriptions (e.g. showing uncertainties),
3. see contrasting dimensions and options (e.g. via science maps),
4. reflect on the relation between options and their own values and interests
(e.g. by highlighting in a science map the options chosen by stakeholders
with explicit interests).
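Showing uncertainties and disaggregating categories can be illustrated with a small sketch: rather than reporting one mean citation score for a unit, split by field and attach a bootstrap interval so users can see how fragile the headline number is. All data values here are invented, and the function is a generic percentile bootstrap, not any specific scientometric standard.

```python
# Illustrative sketch of "opening up" a citation indicator: disaggregate by
# field and report a bootstrap interval instead of a single mean.
# Citation counts below are invented for illustration.
import random
import statistics

def bootstrap_interval(values, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    means = sorted(
        statistics.mean(rng.choices(values, k=len(values)))
        for _ in range(n_boot)
    )
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical citation counts for one research unit, tagged by field
citations = {
    "ecology":    [0, 2, 3, 5, 8, 40],   # skewed: one highly cited outlier
    "statistics": [1, 1, 2, 2, 3, 3],    # low but consistent
}

for fld, vals in citations.items():
    lo, hi = bootstrap_interval(vals)
    print(f"{fld}: mean={statistics.mean(vals):.1f}, 95% CI in [{lo:.1f}, {hi:.1f}]")
```

The skewed field gets a wide interval driven by one outlier, which is exactly the kind of fact a single aggregate score hides from deliberation.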
25. 3. Participation in framing of problems and questions
[Diagram: extension of the research collective (broadening out); processes of
interpretation of insights into contexts (opening up); participation in
problem framing]
26. Indicator use for informing decisions in hybrid forums
Hybrid forums are collaborations between secluded research and
research in the wild.
“In hybrid forums, in which (…) [indicators] are discussed,
uncertainties predominate, and everyone contributes information and
knowledge that enrich the discussion.” Callon et al. 2001
But how should hybrid forums be organised, so as to incorporate
quantitative evidence? (Yes, it is our problem as well)
Challenge: To develop processes with ‘responsible’, ‘inclusive’, ‘opening
up’ use of quantitative evidence in S&T.
Fochler and De Rijcke (2017): evaluative inquiry
27. Summary: Towards transdisciplinary collaborations
Transdisciplinary teams needed for:
• Participation of relevant stakeholders
• Experts on the sector under study to use and interpret data
• STS – interviews, ethnography, participation
• Experts on ‘technologies of participation’ (Rip, Doug Robinson)
Complementarities between Scientometrics and STS
(a gross simplification of traits):
Scientometrics / data analytics vs STS
• Secluded research vs research in the wild
• Positivist vs interpretative
• Value-free vs value-laden
• Technocratic state vs civil society
• Quantitative vs qualitative
• Top-down vs bottom-up
• Expert-based vs participatory
• Closing down vs opening up
29. The argument
• Policy demand for indicators in research assessments.
• However current S&T indicators are highly problematic, possibly harmful.
• For quant methods to contribute to research impact assessment (RIA), we
need to go beyond scientometrics (or big data metrics) as it is now.
• Instead, we have to develop “indicators in the wild” (‘en plein air’) in
hybrid forums for engaging with contextual and diverse expertise
• This implies three moves in ’translation’:
1. broadening out the scope of data and expertise used,
2. using quantitative outputs for opening up, in processes that include
deliberation,
3. engaging with disparate communities in the framing of problems and
questions
31. Summary of the argument
1. Research (including scientometrics) can have contested effects.
• S&T is deeply involved in some of the worst problems
2. Indicators as secluded research are part of an institutional
configuration that fosters S&T with little discussion on goals
• Focusing on ‘indicator of impact’ avoids questioning what type of
contribution we want – blind support for status quo.
• We are part of the problem.
3. Quantitative studies of science can play a different role in policy
• Democratization of S&T advice needs a pluralisation of indicators
• This implies leaving the lab and doing research in the wild
There can be NO general indicators of societal impact –
Only indicators useful for supporting impact assessment
in certain contexts – in the wild. (Cf. Molas-Gallart et al., 2003)
32. An agenda for indicators in the wild
• Broadening out the inputs
Expand the research collective
– Representation of fields, languages, countries, ‘traces’ that count.
– Reaching out to other expertise (e.g. including conceptual frameworks)
• Opening up the outputs
STI indicators as tools for deliberation (Barré)
– Develop outputs that allow exploration of choices.
• Embed indicators in social appraisal processes
Develop new processes on design, creation and use of indicators
– Collaborations with experts on qualitative and participatory methods and
beyond
33. Why should WE engage with indicators in the wild?
Rationales for pluralisation and participation (Stirling, 2004)
1. Substantive: IndWild produce more socially robust knowledge
More thorough scanning of knowledge. Inclusion of plural
perspectives.
2. Normative: Under a democratic view, pluralisation is good on its own
From a tool to project ‘the perspectives’ of incumbent institutions,
towards becoming an ‘honest broker’, facilitating deliberation.
3. Instrumental/Strategic: IndWild provide credibility and legitimacy.
Indicators for research impact assessment as a window of opportunity
to reposition quantitative studies of science.
Big companies’ services are taking over consultancy on indicators.
In the face of a simplistic delivery of indicators of RIA (e.g. Altmetrics)…
academia can offer socially responsible research assessment.
35. Criteria for expert advice to policy
‘Degree of values consensus on a particular issue.
Sharply contested issues raise the political stakes and introduce
dynamics quite different from issues which are less controversial.
Degree of uncertainty present in a decision context.
The greater the uncertainty – both scientific and political – the more
important it is for science to focus on policy options rather than
simply scientific results.’
Roger Pielke (2007) The Honest Broker.
Under conditions of low consensus and high uncertainty…
…not possible to separate
knowledge formation & decision making.
36. Research assessment
High uncertainty
• Ex-ante – the long term value of research is unknown
• Ex-post – difficult to track influence of research to benefits
Low value consensus
• Technologies and innovations are often contested
• What about negative impact (Cambridge Analytica)?
• Research can influence innovation in different directions.
• The value of research depends on the valuation of the innovations that it
may influence.
More is not better. Innovation is not a scalar. It’s a vector about values.
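The scalar-versus-vector point can be made concrete with a toy calculation. The dimensions and numbers below are invented purely for illustration: two research portfolios with the same total output can align very differently with a stated value direction.

```python
# Toy sketch: "innovation is a vector about values, not a scalar".
# Portfolio dimensions and values are hypothetical.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def alignment(portfolio, values):
    """Cosine similarity between a portfolio and a value direction."""
    def norm(v):
        return sum(x * x for x in v) ** 0.5
    return dot(portfolio, values) / (norm(portfolio) * norm(values))

# Dimensions: (therapeutics, prevention, environmental remediation)
portfolio_a = (9, 1, 0)   # total "output" = 10
portfolio_b = (3, 4, 3)   # total "output" = 10, same scalar size

prevention_first = (0, 1, 1)  # a value direction favouring prevention

# Same scalar total, very different alignment with the stated values:
print(alignment(portfolio_a, prevention_first))  # low
print(alignment(portfolio_b, prevention_first))  # much higher
```

A scalar indicator ("10 units of output") treats both portfolios as identical; only the vector view shows which one points where society said it wanted to go.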
37-41. Uncertainty and values consensus in impact assessment
Roger Pielke (2007), The Honest Broker
[Quadrant diagram, built up over five slides:]
• Separation between knowledge formation and decision making: advisors
will find the way to knowledge!
• General indicators of impact (patents, tweets, co-publications):
separation between knowledge formation and decision making, i.e.
technocratic advice
• Knowledge formation and decision making entangled: advice as ‘activism’
• Knowledge formation and decision making entangled: plural and
conditional advice
• Impact assessment (what type of indicators?): plural and conditional
advice
42. According to Callon, the most difficult move is extending research into
the identification and framing of problems.
• Most often the problem is a given, set by state institutions
• Problematisation / enrichment by other stakeholders
• Qualitative techniques – interviews, focus groups, etc.
• Importance in delineation and ontology building.
3. Participation in framing of problems and questions