This report presents an independent field research guide, detailing the phases and steps to undertake during the research process.
2. About the Guide
Who are the intended users?
Professionals, with no or minimal background in research, working for Societal Platform missions.
What is the guide for?
To understand user adoption of routine changes.
How does the guide help?
It gives step-by-step and phase-level guidance on the research activities.
7. Pre-visit
Pre-visit phase
5. Develop a research plan
6. Secondary research
7. Present and revise the research plan
8. Develop content for primary research