BenchmarkQA Software Quality Forum on Retrospectives, March 2011 (BenchmarkQA)
This presentation was delivered at BenchmarkQA's March 2011 Software Quality Forum by Betty Schaar, senior consultant and training practice lead for BenchmarkQA.
Your program serves families, but does it really? Are your programs aimed at children while the parents mostly watch – or
vice versa? What if your program could truly engage family members of all ages in learning together?
This program highlights what the research tells us about the characteristics of strong families, family leisure behavior, and family learning in nonformal settings, and provides strategies for strengthening your approach to whole-family learning.
Learn how the 2009 revision of the Minnesota Science Standards strengthens and focuses learning for students, explore the connections to new environmental and engineering concepts, and discover ways they can be implemented in classrooms and informal settings. New environmental initiatives at the Department of Education are also discussed.
Insights on Using Developmental Evaluation for Innovating: A Case Study on th... (Chi Yan Lam)
Presentation given at #CESToronto2013. Abstract: Developmental evaluation (DE) supports social innovation and program development by guiding program adaptation to emergent and dynamic social realities (Patton, 2011; Preskill & Beer, 2012). This presentation examines a case study of the pre-formative development of an innovative educational program. It contributes to research on DE by examining the capacity and contribution of DE for innovating. This case provides evidence supporting the utility of DE in developing innovative programs, but challenges our current understanding of DE on two fronts: 1) the necessary move from data-based to data-informed decision-making within a context of innovating, and 2) the use of DE for program co-creation as an outcome to the demands of social innovation. Analysis reveals the pervasiveness of uncertainty throughout development and how the rendering of evaluative data helps to propel development forward. DE enabled a nonlinear, co-evolutionary development process centering on six foci of development (definition, delineation, collaboration, prototyping, illumination, and evaluation) that characterize the innovation process.
JISC RSC London Workshop - Learner analytics (James Ballard)
Introduction to learning analytics and approaches to learner engagement, to raise awareness and set the scene for upcoming projects and advice for supported learning providers.
Presentation given at #CESToronto2013. Abstract: Developmental evaluation (DE) supports social innovation and program development by guiding program adaptation to emergent and dynamic social realities (Patton, 2011; Preskill & Beer, 2012). To that end, the developmental evaluator is expected to draw on a multitude of skills and to take on multiple roles depending on the situational demands of the DE. Beyond that, little guidance is provided in the literature for evaluation practice amidst the decidedly complex and turbulent space of social innovation.
This presentation explores the contextual demands made of the evaluator and the emergence of the different roles taken on by the developmental evaluator to enable social innovation in a case of DE. The ‘preformative development’ of this social innovation saw the integration of microblogging into teacher education. Analysis revealed four roles assumed by the developmental evaluator when innovating: a) evaluator, b) facilitator of learning, c) project manager, and d) innovation thinker.
A seminar drawn from two projects that explored a range of assessment practices, and examined how they are implemented by establishing and comparing attitudes to assessment amongst tutors and students within three ODL environments: University of London International Programmes, King’s College London (ODL programmes) and the Open University.
Topic 5 DQ 2 Nov 3-7, 2022 Imagine once again that you are an au.docx (jolleybendicty)
Topic 5 DQ 2
Nov 3-7, 2022
Imagine once again that you are an automobile manufacturing executive tasked with increasing sales in your state. You wish to do a qualitative study to obtain the perspective of sales personnel regarding an incentive program you implemented at a few dealerships that quantitatively proved to be successful. The three sources of data for your case study are individual semi-structured interviews, archival documents, and field observations. What are the most significant strengths and weaknesses of the methods for collecting data from these data sources? Why are these significant? What skills are needed to collect the data effectively? Explain. What concerns do you have about the feasibility of implementing these methods of data collection for this study? Explain.
One student already answered the below answer
Use this student's answer below to make sure your answer sounds similar.
STUDENT ANSWER- In the scenario for the automobile manufacturing company and the perspective of the sales personnel, using the semi-structured interview, field observations, and archival documents would have positives and negatives. The semi-structured interview provides an opportunity for in-depth, open-ended questions (Steffes & Jacobs, 2021). The researcher can stray from the established questions, which allows for follow-up questions. A weakness might be the time it takes to interview all the participants, conduct the interview, and the researcher's bias. Field observations cannot be recorded as they occur and rely on the researcher's memory. The researcher needs to record the evidence quickly after the event (Steffes & Jacobs, 2021). Finally, archival documents need approval before they are used. For our scenario, looking at companies implementing an incentive program would benefit the research. The data will provide a rich description of the participants. The researcher must be familiar with the research but not provide bias when conducting the research. Some considerations need to be made when collecting data. Modality, time of the interview, whether it is recorded, and the purpose of the interview must be disclosed to those participating in the research. Concerns about the feasibility would be money, time, and the number of participants.
Resources:
Steffes, D., & Jacobs, J. (2021). Introduction to sampling, data collection, and data analysis. In Grand Canyon University (Ed.), GCU doctoral research: Foundations and theories.
https://lc.gcumedia.com/webbooks/gcu-doctoral-research-introduction-to-sampling-data-collection-and-data-analysis/v1.1/#/chapter/6
Student responses
Original Question-
Topic 5 DQ 1
Think again of the study on the influence of high school principals’ leadership styles and academic achievement in their schools in your state. The sources of data must be aligned with the research questions and study design, and they must be feasible for administration of the study. Identify five different qu.
References to AR/CI Validity Procedures 1/12/14, Don Stone, Ph.D. .docx (sodhi3)
References to AR/CI Validity Procedures 1/12/14
Don Stone, Ph.D. Saint Mary’s College, 1 of 1
Don Stone’s Shortened Version of References to AR/CI Validity Procedures
Validity procedures help assure the quality of the research, especially taking into account potential bias in the way the data/research findings are generated, interpreted, and reported. Incorporating validity procedures as part of the research design and implementation helps generate quality data. They bring awareness to the interactions among the participants/collaborators, and increase the credibility of the findings/learning. Validity procedures are a required component of the Leadership Project Proposal. It may be helpful to think of validity procedures also as “quality procedures.”
From the Leadership Project Guide (April, 2011; December, 2013, p. 8)
Validity procedures take into account potential bias in the way the data/research findings are generated, interpreted, and reported. Since any observation is seen through a particular lens, good research design usually means conducting the inquiry in a way that incorporates multiple perspectives, so that no one vantage point is relied on to describe the reality of what is being investigated. Action research practitioners have developed validity procedures to guard against the bias of the inquirers, especially unconscious bias and self-deception.
There are three main areas where applying validity procedures improves the quality of the inquiry: the data-gathering process, interactions between participants, and reporting the findings.
Examples of procedures used while gathering data include multiple cycles of action and reflection, multiple data sources (triangulation), and diversity of participants to ensure multiple perspectives. These are generally considered defining elements for action research projects. Multiple data sources might include interviews, observations, journal notes, and surveys (Kuhne & Quigley, 1997).
Examples of procedures that support interactions between participants include managing distress, authentic collaboration, balancing action with reflection, balance between chaos and order. Skills that support participants in engaging in the inquiry include being present and open, bracketing and reframing, and guarding against defensive routines and group think by challenging consensus collusion through the role of “devil’s advocate.” These validity procedures and skills raise the awareness of the inquirers during the inquiry to free them from “the distortion of uncritical subjectivity” (Heron & Reason, 2001, p. 150) to create what Heron and Reason (2001) call “critical subjectivity” (p. 149). In using the term “critical subjectivity” they posit that pure “objectivity” is not possible and the need to be openly aware and critical about the inevitably subjective and biased observations and interpretations of researchers—be they outside experts in a traditional research design or participants in a Collabo ...
Have you ever struggled with how to guide and mentor interns?
Shayna Sellars, Audubon Center of the North Woods, is in her first year mentoring interns, and Joe Walewski, Wolf Ridge, has been mentoring 16-20 graduate students for 11 years and is still asking the same questions. Come hear their strategies and techniques, then stick around to join in the discussion.
Check out this Lesson Plan!
This lesson plan encourages students to explore our surroundings through our senses. What can you see, hear, smell,
and touch on a walk around your school forest or neighborhood park?
This lesson provides strategies for K-5 teachers and youth leaders who wish to take their students outside to learn about the world around them.
Within the next 50–100 years, the warming climate will have major effects on boreal and northern hardwood forests situated near the prairie–forest border of central North America.
This biome boundary shifted to the northeast
during past episodes of global warming, and is expected to do so again. The climate of the future will likely lead to higher mortality among mature trees, due to the greater frequency of droughts, fires, forest-leveling windstorms, and outbreaks of native and exotic insect pests and diseases. In addition, increasing populations of native deer and European earthworm invasions will inhibit the establishment of tree seedlings. The expected net impact of these factors will be a “savannification” of the forest, due to loss of adult trees at a rate faster than that at which
they can be replaced. This will cause a greater magnitude and more rapid northeastward shift of the prairie–forest
border, as compared with a shift solely attributable to the direct effects of temperature change.
Lee Frelich's "Climate Change & Forests" Presentation
Empowering youth to be evaluators: Involving Young People in Evaluating Informal Education Programs Presentation
1. 3/23/10
Empowering Youth To Be Evaluators:
Involving young people in evaluating
informal education programs
Amy Grack Nelson, Evaluation & Research Associate
Science Museum of Minnesota
Overview
Overview of participatory evaluation
Participatory evaluation examples
Sampling of interactive techniques
What is participatory evaluation?
2.
It’s All About Utility
Utility - one of the four essential features of all
evaluations
(Joint Committee on Standards for Educational Evaluation, 1994)
A way to help ensure use is to increase the primary
intended users’ level of participation in the
evaluation. (Cousins & Earl, 1995; Patton, 2008)
Participatory Evaluation
“Applied social research that involves trained evaluation
personnel and practice-based decision makers working
in partnership.” (Cousins & Earl, 1995, pg. 8)
Core purpose: increasing use
Characteristics of Participatory Evaluation
Balanced control of evaluation process
Involvement of primary users
Extensive participation throughout the evaluation
(Cousins & Earl, 1995; Cousins & Whitmore, 1998)
Interactive Evaluation Practice Continuum
(King & Stevahn, 2002)
3.
Benefits of Participatory Evaluation
Increases use of evaluation results by:
Enhancing relevance of the evaluation
Increasing understanding of the data
Increasing ownership of the findings
(Cousins & Whitmore, 1998; King & Stevahn, 2002; Patton, 2008)
Evaluation capacity building
Develop analytic and evaluative skills
Stakeholders develop a more “critical eye”
(Cousins & Earl, 1992, 1995)
Participatory Evaluation Examples
Science Museum of Minnesota’s
Kitty Andersen Youth Science Center
Kitty Andersen Youth Science Center’s Park Crew
Facilitate earth science and environmental education
activities in the Big Back Yard and on outreaches
4.
Example 1: Summative Evaluation
Youth will…
Learn about water related earth surface processes
Develop teaching skills
Learn about related science, technology, engineering, and
math (STEM) careers
Evaluation design
Observations and interviews of youth staff at the
beginning and end of summer
Evaluation workshop to engage youth in results
Reviewing the Work
Keep/Change
Discussion
5.
Incorporating Evaluation Data
[Chart: How often youth talked about why something is considered a pollutant (n = 27)]
Keep/Change Discussion
Benefits:
Engages users with data to think about successes and areas of improvement
Can be used to generate recommendations
Limitations:
Takes a considerable amount of time
6.
Youth Benefits
Meaningful involvement in evaluation can help youth
develop higher order thinking skills, specifically
analytic and evaluative skills. (London et al., 2003)
Youth became more reflective of their work.
Youth comments reflected increased knowledge of the
activities and confidence in sharing that knowledge with
visitors.
Youth had a stronger sense of ownership and control.
Adult Staff Benefits
Provided important feedback about the crew’s work.
Gained deeper understanding of the participants’
experience and could proactively identify and respond
to their needs.
Increased understanding of evaluation and ability to
interpret data and generate recommendations.
Developed capacity to include participatory evaluation
in future work.
7.
Outcomes of the Process
Participants experience a sense of empowerment and
pride when they have an influence on the way
programs are run and see their ideas acted upon.
(Checkoway et al., 2003; Horsch et al., 2002; London et al., 2003)
Youth used suggestions to develop their own training.
They created a visitor survey and collected data.
Youth shared their ideas with a museum operations staff
member.
Example 2:
Formative Evaluation of Outreaches
Evaluation Process
Identify daily objectives → Craft survey questions → Pilot surveys → Discuss pilot data and revise surveys → Administer surveys → Enter and code data → Analyze and discuss data → Generate recommendations → Improve outreach activities
8.
Organizational Requirements for PE
Evaluation must be valued
Sufficient time and resources
Commitment to organizational learning as a means to
improvement
Motivated individuals
Interest and ability to learn evaluative skills
(Cousins & Earl, 1992)
Evaluator Requirements for PE
Sufficient technical and facilitator skills
Accessible for participatory activities and support
Necessary resources and time
Serve an instructional role
Motivation and commitment to participate
A tolerance for imperfection
Flexibility
(Burke, 1998; Cousins & Earl, 1992; King, 1998)
10.
Interactive Graphs
Benefits:
Quick data collection
Everyone can see the process and results
Can be used as a starting point for deeper conversations
Can see data by various characteristics
Limitations:
People may be influenced by others
People may be hesitant to place a rating where no one else has
(King, 2009)
Carousel Sheets
Benefits:
Alternative to traditional brainstorming
Lots of information in a short timeframe
Quick way to see patterns
Promotes high involvement
Involves users in analysis
Limitations:
Participants may influence each others' responses
Tend to get first responses and gut reactions; not deep and thoughtful
Responses may be too brief
May need to reanalyze some of the data
(King, 2009)