1. Working towards a research model
Outcomes of the DETA Summit at ELI
Web reference: http://uwm.edu/deta/summit/ and
http://uwm.edu/deta/research-model/
8. Goal Attainment of Summit
[Bar chart: agreement (Strongly Agree / Agree / Neither Agree nor Disagree) with the following statements]
The DETA Summit engaged stakeholders from throughout the U.S. in the field of distance education. Key issues that will impact the DETA research agenda arose and were documented.
The DETA Summit will assist in determining next steps for research conducted by the Center and its partners in distance education and technological advancements.
10. Research questions
[Bar chart: agreement (Strongly Agree / Agree / Neither Agree nor Disagree) with the following statements]
Attendees of the DETA Summit prioritized research questions to drive future research of the grant.
The DETA Summit helped develop research questions of interest in the field of distance education.
11. Variables
[Bar chart: agreement (Strongly Agree / Agree / Neither Agree nor Disagree) with the following statements]
The DETA Summit helped identify variables to answer the proposed research questions.
The work from the DETA Summit will help develop a framework of inquiry for distance education.
Attendees of the DETA Summit prioritized variables to be included in the framework of inquiry to drive future research.
12. Supplemental Sessions
[Bar chart: agreement (Strongly Agree / Agree / Neither Agree nor Disagree)]
The DETA Summit supplemental sessions were useful in exploring specialty topics and the research surrounding them, including competency-based education, accessibility, and distance education support.
13. Satisfaction in Logistics Min Value Max Value Mean Std. Dev.
Group member composition 6.00 10.00 9.00 1.53
Participants invited 5.00 10.00 8.86 1.86
RSVP'ing Process 6.00 10.00 8.71 1.38
DETA website 7.00 10.00 8.71 1.11
Table Facilitators 4.00 10.00 8.57 2.07
Time Allocated for Voting 7.00 10.00 8.43 1.27
Presentation 5.00 10.00 8.29 1.70
Collaborative Technology 5.00 10.00 8.14 2.19
The Summit Agenda 5.00 10.00 8.00 1.83
Location of the Summit 3.00 10.00 7.86 2.41
Room Used for Summit 5.00 10.00 7.71 2.06
The Layout of the Room 3.00 10.00 7.57 2.64
Time Allocated for Discussion 5.00 10.00 7.43 2.23
Food Provided 0.00 10.00 7.38 3.46
Summit Satisfaction Ratings
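The Min/Max/Mean/Std. Dev. columns above can be reproduced from raw 0-10 ratings; a minimal sketch using Python's standard library (the rating list below is hypothetical, not the actual survey responses):

```python
import statistics

def summarize(ratings):
    """Return (min, max, mean, sample std dev) for a list of 0-10 ratings."""
    return (min(ratings), max(ratings),
            round(statistics.mean(ratings), 2),
            round(statistics.stdev(ratings), 2))

# Hypothetical ratings for one logistics item (n = 7 respondents)
ratings = [10, 10, 9, 9, 10, 9, 6]
print(summarize(ratings))  # → (6, 10, 9.0, 1.41)
```

Note that `statistics.stdev` computes the sample standard deviation (n - 1 denominator); whether the survey tool used the sample or population formula is not stated in the report.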
14. Summit Satisfaction Ratings (table repeated from the previous slide)
Qualitative themes noted on the chart: diversity in participants and group membership; not enough time to discuss.
15.
1. Key stakeholders were engaged
2. Key issues discussed and well-documented
3. Summit was organized
4. Summit accomplished goals
18.
How can we define and measure student success beyond traditional outcomes (e.g., academic learning)?
What are the small number of definable, measurable characteristics that make research rich for student success in distance education?
Who is the DE student?
What does competency mean in education?
National Standard Definitions: Competency; Distance Education
19.
What is the difference between online learning and distance education?
What interventions are particularly successful?
Are there certain behaviors or patterns in online learning?
What strategies support student success at the individual learner, class, program, and institutional levels?
National Standard Definitions: Online Learning; Student Preparedness and Success
27. Research questions (vote totals)
What are the definitions of success from students’ perspective? (33)
What are the different design components (content, interactivity, assessments) that impact student learning? (29)
What patterns of behaviors lead to increased student learning for different populations? (26)
How can we define and measure student success beyond traditional outcomes? (25)
What support structures are critical to providing quality access to online instruction? (22)
What is the currency of student learning beyond the existing credit hours? (22)
What are the key components that promote a sustainable and an effective teaching and learning ecosystem? (21)
28. Research questions
Guiding research:
1. What are the different design components (content, interactivity,
assessments) that impact student learning?
2. What patterns of behaviors lead to increased student learning for different
populations?
3. What support structures are critical to providing quality access to online
instruction?
4. What are the key components that promote a sustainable and an effective
teaching and learning ecosystem?
Clarifying measures:
5. What are the definitions of success from students’ perspective?
6. How can we define and measure student success beyond traditional
outcomes?
7. What is the currency of student learning beyond the existing credit hours?
32. Input → Throughput → Output
Input: Student or Learner Characteristics; Instructor Characteristics; Course Characteristics; Program and Institutional Characteristics; External
Throughput: Student Behaviors; Instructor Behaviors; Program and Institutional Behaviors
Output: Student Outcomes; Instructor Outcomes; Course Outcomes; Program and Institutional Outcomes
33. Input → Throughput → Output
Input: Student or Learner Characteristics; Instructor Characteristics; Course Characteristics; Program and Institutional Characteristics; External
Throughput: Student Behaviors; Instructor Behaviors; Program and Institutional Service Behaviors
Output: Student Outcomes; Instructor Outcomes; Course Outcomes; Program and Institutional Outcomes
(Each category includes variables specific to DE.)
35. Input → Throughput → Output
Input: Student or Learner Characteristics; Instructor Characteristics; Course Characteristics; Program and Institutional Characteristics
Throughput: Student Behaviors; Instructor Behaviors; Program and Institutional Service Behaviors
Output: Student Outcomes; Course Outcomes; Program and Institutional Outcomes
Desired outcomes mapped onto the model: Learning Effectiveness; Satisfaction; Access; Instructional Effectiveness
36. Research toolkits
Shared measures – Student performance is based on the numerical representation of the grade received in the course, on assessments and as an overall grade, converted to a 4.0 scale.
RQs – What are the different design components (content, interactivity, assessments) that impact student learning?
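The shared measure described here relies on converting course grades to a 4.0 scale. A minimal sketch of such a conversion; the letter-grade mapping below is a common U.S. convention, assumed for illustration, not necessarily the mapping DETA adopted:

```python
# Common letter-grade to 4.0-scale mapping (assumed; institutions vary)
GRADE_POINTS = {
    "A": 4.0, "A-": 3.7,
    "B+": 3.3, "B": 3.0, "B-": 2.7,
    "C+": 2.3, "C": 2.0, "C-": 1.7,
    "D+": 1.3, "D": 1.0, "F": 0.0,
}

def to_four_point(letter_grade):
    """Convert a letter grade to its numerical 4.0-scale representation."""
    return GRADE_POINTS[letter_grade.strip().upper()]

print(to_four_point("B+"))  # → 3.3
```

A shared conversion like this is what lets student performance be compared across partner institutions that report grades differently.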
37. Next steps
1. Streamline the FOI (framework of inquiry) for research that will take place as part of the grant
2. Gather feedback from Summit attendees
3. Develop measures, including clarifying measures as identified in the Summit (defining student success & currency beyond the credit hour)
3a. Develop measures specific to our audiences
4. Develop standards for data acquisition to inform data storage
Editor's Notes
Rachel will discuss:
Development of survey
Alignment with goals of summit as outlined in proposal
Quantitative scales
Qualitative questions
Refer folks to the PDF report online
Walk through the bar charts, supporting with qualitative comments when appropriate
Rachel will discuss that a common theme was people complimenting us on the participants that attended and how the groups were composed.
Lindsey can discuss thematic analysis of qualitative questions in survey. She can confirm that the qualitative data support the quantitative data.
Lindsey will discuss her qualitative method/approach to analyzing the data.
Lindsey will discuss the thematic analysis resulting in the need for national standard definitions.
Lindsey will discuss some key themes. In particular, she will note that these were taken into account when we developed the framework of inquiry.
Tanya will discuss
Tanya will note change on retention, learning vs course/program
Desired Outcomes
Access
All learners who wish to learn online can access learning in a wide array of programs and courses,1 particularly underrepresented groups, minorities, and those with disabilities.2 An essential component in distance education is a comprehensive infrastructure for learning that provides all individuals with the resources they need when and where they are needed. The underlying principle is that infrastructure includes people, instructional resources, processes, learning resources, policies, broadband, hardware, and software. It brings state-of-the-art technology into learning to enable, motivate, and inspire all students, regardless of background, languages, or disabilities, to achieve.4
Data can be collected by examining administrative and technical infrastructure, which provides access to all prospective and enrolled learners. Access quality metrics are used for information dissemination, learning resource delivery, and tutoring services.1 Other possibilities include data gathered from student information systems, from student perception surveys, or objective accessibility ratings of online courses and programs.
Learning effectiveness
Learning effectiveness indicates a demonstration that learning outcomes were met or exceeded standards.1 This includes areas of study with research outcomes focusing on student success in achieving learning outcomes2 and other potential indicators of achievement (success, failure, achievement gains, academic achievement, improvement).3 Moreover, learning effectiveness could also include topics of retention (of content) and retention in a course (sometimes called attrition) or program (degree completion).
Typically data are gathered through direct assessment of student learning (e.g., overall grades, exam grades, or other assessments), faculty perception surveys, faculty interviews comparing learning effectiveness across delivery modes, and student focus groups or interviews measuring learning gains.1 Additionally, requests for new and better ways to measure what matters include concurrent data collection. Here, focusing on diagnosing strengths and weaknesses during the course of learning provides the opportunity for more immediate improvement in student performance. Furthermore, these technology-based assessments provide the opportunity to let data drive decisions on the basis of what is best for each and every student, based on their unique attributes and interactivity in class.4 Other possibilities include data gathered from student information systems or from student perception surveys.
Satisfaction
Faculty are pleased with teaching online, citing appreciation and happiness. Students are pleased with their experiences in learning online, including interaction with instructors and peers, learning outcomes that match expectations, services, and orientation.1
Faculty and student surveys can indicate equal or growing satisfaction to traditional forms of learning. Other metrics can include repeat teaching of online courses by individual faculty and increase in percentage of faculty teaching online showing growing endorsement. Qualitative methods can include interviews, focus groups, testimonials with faculty, staff (including advisors and tutors), and/or students.1
Instructional effectiveness
Instructional effectiveness indicates the quality of education meets program, institutional, and national standards.1 The focus is on what and how we teach to match what people need to know, how they learn, where and when they will learn, and who needs to learn.4 The areas of study might include instructional improvement, program effectiveness, administrator effectiveness, curriculum evaluation, educational quality, outcomes of education programs, and instructional media.3 Additionally, instructional effectiveness is not limited to instruction provided inside the classroom, but extends itself to instructional support or supplemental instruction and guidance provided through institutional services or through staff and individuals outside of the classroom.
Traditionally, as in face-to-face delivered courses, student ratings of instructional effectiveness are collected. However, in distance education and online learning these standards are typically communicated in a course or program rubric (e.g., UC Chico, QM), which is administered through an objective rating of a course or program in addition to traditional methods. Recent work looks to gather data on student perceptions of instructional effectiveness via course and program rubrics converted to student surveys. Other possibilities include objective ratings of online course and program design and instructional delivery.
References:
1. Online Learning Consortium, 5 Pillars
2. U.S. Department of Education, Application for Grants
3. What Works Clearinghouse
4. National Ed Tech Plan, U.S. Department of Education
Tanya will discuss Ashley’s and Rachel’s efforts to digitize the flip charts. Furthermore, Tanya will discuss Ashley’s efforts to analyze the RQ and Variable data to help us further develop the research model.
Tanya -- This is the research questions by table as posted in the Outcomes category on the Summit site
Tanya – These are the RQs more than one standard deviation above the mean. They can be found on the research model page. Some of these research questions are really questions needed to clarify definitions and measures. Other research questions can be used to actually guide empirical research in year 2.
Tanya – Specifically, we have 4 questions that can guide research at our partner institutions and through sub-grant awards. The other three questions are ones that potentially need to be clarified prior to the fall by the DETA team and its partners. I will address these further at the end when we discuss next steps.
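The cut described above (keeping the research questions whose vote counts fall more than one standard deviation above the mean) can be sketched as follows; the vote totals and question labels here are hypothetical, not the actual Summit tallies:

```python
import statistics

def top_questions(vote_counts):
    """Keep questions whose vote count exceeds mean + one sample std dev."""
    counts = list(vote_counts.values())
    threshold = statistics.mean(counts) + statistics.stdev(counts)
    return {q: v for q, v in vote_counts.items() if v > threshold}

# Hypothetical vote totals across all tables (not the actual Summit data)
votes = {"RQ-A": 33, "RQ-B": 29, "RQ-C": 12, "RQ-D": 8, "RQ-E": 5, "RQ-F": 4}
print(top_questions(votes))  # → {'RQ-A': 33, 'RQ-B': 29}
```

Applied to the full set of prioritized questions, a threshold like this is one defensible way to separate the guiding research questions from the long tail of lower-priority ones.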
Tanya
Tanya will discuss that this is on the site in outcomes
Tanya will discuss Ashley’s efforts to convert to the FOI
Gathering of data from PAR, UC Chico Rubric, QM, etc.
The result of our efforts is a beautiful yet expansive list of potential variables that could be examined, encompassing the complete process of DE. To simplify a bit, let’s look at the overall model.
Provide Justification for year 1 goal
Discuss each activity for year 1 goal
1. Host national summit
2. Determine desired outcomes
3. Establish framework of inquiry
4. Formulate measures
5. Establish research instrumentation
Transition: Now, let’s talk more specifically about our first activity as we are launching this grant.
Tanya – It may be useful to have a virtual summit or use other conferences to which we have been invited to flesh out some of these clarifying questions.