The Geoscience Concept Inventory WebCenter provides new means for student assessment

Emily M. Geraghty Ward
Research Associate, Department of Geological Sciences, and member of the Geocognition Research Lab, Michigan State University, USA

Julie C. Libarkin
Associate Professor, Department of Geological Sciences, and Director, Geocognition Research Lab, Michigan State University, USA

Stuart Raeburn
Instructional Technology Researcher/Systems Developer, Michigan State University, USA

Gerd Kortemeyer
Assistant Professor of Physics Education and Director, LON-CAPA Project, Michigan State University, USA


Summary

Faculty adopt information and communication technologies (ICT) with the assumption that they
enhance student learning. In the geosciences, new curricula employ tools such as Google
Earth to aid in the interpretation of three-dimensional landscapes and the processes that create
them. In many cases, the evaluation of learning that occurs with this technology use is neither
explicit nor necessarily matched with the overarching curricular goals of ICT. Arguably,
assessment should be embedded in curriculum design according to the Backward Design
model (Wiggins & McTighe, 2005) for effective instruction. We propose embedded assessment
appropriate to ICT, specifically online assessment that takes advantage of automated scoring
and feedback mechanisms through the Geoscience Concept Inventory (GCI) WebCenter.

As an instructional tool, the WebCenter contains concept inventory questions that are carefully
designed to ascertain a student’s conceptual understanding in a range of geology subtopics.
The WebCenter’s customized LON-CAPA platform facilitates the inclusion of digital images
created by ICT technologies to assess student learning. The WebCenter’s online venue encourages community participation in assessment development by allowing faculty to review
existing questions and submit their own. Furthermore, the WebCenter’s testing function
provides an authentic online assessment experience that aligns with ICT practice and takes
advantage of its technological capabilities to provide immediate feedback and capture fine-grained data such as time on task.

Currently, user activity in the portal is limited to viewing and small-scale student evaluation, with only a small fraction of users participating in the development of new concept inventory questions. Thus, on-site teacher training workshops may be needed to help initiate collaborations and use of the technology. However, the WebCenter has already made an impact through its online, open-source nature, encouraging participation from around the globe, as evidenced by the number of users (n=130) and the range of institutions using the GCI. Statistics collected via online testing with a variety of student populations will allow for powerful comparative analyses of student learning across institutions.

Keywords: evaluation, learning metadata, mobile learning, research, ICT, education
technologies

Introduction
With technological advancement has come the infusion of information and communication
technologies (ICT) into the classroom. Often, faculty members in higher education adopt these
technologies with the assumption that they enhance student learning. In the geosciences in
particular, new curricula employ tools such as Google Earth, Virtual Globes, and Geographic
Information Systems (GIS) software to aid in the interpretation of three-dimensional landscapes
and the understanding of the processes that create them. In many cases, the evaluation of
learning that occurs with this technology use is neither explicit nor necessarily matched with the
overarching curricular goals of ICT. Arguably, assessment should be embedded in curriculum
design according to the Backward Design model (Wiggins & McTighe, 2005) for effective
instruction in order to promote appropriate use of ICT as a learning tool. We propose use of
embedded assessment appropriate to ICT, specifically online assessment that takes advantage
of automated scoring and feedback mechanisms. We use the Geoscience Concept Inventory
(GCI) WebCenter (http://gci.lite.msu.edu/), an online platform currently in use for the
development of concept inventory questions and online student assessment, as an example.
WebCenters for assessment provide avenues for investigating student learning that are
targeted to the goals of ICT curricula; we encourage community development of assessment
via this or similar online venues.

This paper will introduce the GCI WebCenter as both an instructional tool and a virtual
community of practice. GCI questions are carefully designed to ascertain students’ conceptual
understanding in a range of geology subtopics. As curriculum goals are established for ICT-
infused classroom activities, assessment should be designed to measure whether those goals
are met. Because the GCI WebCenter is an online assessment tool, it is uniquely qualified for
authentic assessment of ICT activities, and can readily incorporate appropriate technology,
such as digital images, to assess student learning. Furthermore, this online venue encourages
community participation in assessment development by allowing faculty to review existing GCI
questions through discussion threads, and to submit questions of their own for review,
validation and eventual inclusion.


Background
Backward Design in curriculum development is a well-regarded and often used strategy for
creating effective instruction (Wiggins & McTighe, 2005). Backward Design requires
intentionality in instructional practice; curricula follow from goals, rather than vice versa. At its
most basic, Backward Design suggests that identification of instructional goals and
determination of goal-oriented assessments be followed by curriculum development geared
specifically for these pre-set goals and assessments. Because learning occurs within individual
and social contexts, context must also be considered in curriculum development. Viewed
through the lens of this instructional design theory, appropriate assessment should reflect the
nature and context of curricular materials and approaches. In fact, “assessment” is itself a piece
of the curriculum development process, and should emerge from within the curriculum, rather
than exist as an isolated entity.

An analysis of community efforts in digital innovations in geoscience education (represented by
the 20 abstracts from the 2009 GSA poster session From Virtual Globes to Geoblogs: Digital
Innovations in Geoscience Research, Education, and Outreach) illustrates the need for explicit
consideration of assessment in online venues (Table 1). Analysis of the abstracts reveals a
disconcerting disconnect between the “Understanding by Design” model and assessment
practice. Eighty percent of the abstracts made no mention of assessment, suggesting that assessment is not necessarily a key component of curriculum development. Of the 20% (n=4) that did mention assessment, the majority described assessments whose nature did not appear to match the nature of the classroom activity (Table 1). Even
though the abstracts introduce new and exciting ideas for use of digital technologies in the classroom, the reported assessment appeared not to utilize these same digital technologies.
Only two abstracts explicitly mention using online assessment as part of their proposed
curriculum. While we recognize that abstracts cannot completely portray the depth and scope
of the curriculum and instruction they represent, and while 20 abstracts only represent a
fraction of the ongoing curriculum development efforts, these data suggest that: (1)
Assessment receives minimal attention in curriculum development and (2) Authentic, ICT-
based assessment is not being adequately paired with ICT pedagogies.


Digital Innovations in Geoscience Research, Education and Outreach posters (n=20)

                         None    Anecdotal Assessment    Formal Assessment
     Number of posters    16              2                      2
     Percentage           80%            10%                    10%

      Table 1: Prevalence of assessment as an essential component of the “Understanding
      by Design” model (Wiggins & McTighe, 2005) in abstracts presented at the “From
      Virtual Globes to Geoblogs: Digital Innovations in Geoscience Research, Education
      and Outreach” poster session held at the 2009 Geological Society of America annual
      meeting. Abstracts were coded for the presence of anecdotal and formal assessment.
      Three of the four abstracts that mention assessment allude to using online
      assessment (http://gsa.confex.com/gsa/2009AM/finalprogram/session_25205.htm).

Online Assessment
The platform on which the GCI WebCenter runs is a customized version of LON-CAPA (The
LearningOnline Network with CAPA), which provides faculty with the means to share and
review concept inventory questions and administer online tests to their students. In 1992,
CAPA (a Computer-Assisted Personalized Approach) was started to provide randomized
homework for an introductory physics course at Michigan State University (LON-CAPA;
http://www.lon-capa.org/history.html; Kashy et al. 1993; 1995). The system provided a way to
offer relevant practice problems and feedback to the students in spite of limited availability of
teaching assistants. Different students were assigned different versions (for example, different
numbers, graphs, formulas, images, and options) of the same problems, so that they could
discuss problems with each other, but not simply exchange solutions. When CAPA was first
introduced, students received paper printouts of their problems, and had to enter their solutions
through a Telnet terminal, where they received immediate feedback on the correctness of their
responses. Students typically had a limited number of allowed attempts (“tries”) to arrive at the
correct solution. In later years, as the web became more widely available, a web interface for
answer input was introduced. Eventually, the system gained learning content management
functionality to put whole curricula online, including both content and assessment resources, as
well as course management functionality (participation, grading, communication, group work
and enrollment are all handled by one system). Today, LON-CAPA is used at more than 100
institutions in addition to MSU, in settings ranging from middle school classrooms to
graduate level courses. Participating disciplines include astronomy, biology, business,
chemistry, civil engineering, computer science, family and child ecology, geology, human food
and nutrition, human medicine, mathematics, medical technology, physics, and psychology.
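
To make the randomization idea concrete, the sketch below shows one minimal way such per-student problem variants could be generated. The seeding scheme, function names, and problem text are illustrative assumptions, not LON-CAPA's actual implementation.

```python
import hashlib
import random

def variant_for(student_id: str, problem_id: str, n_variants: int) -> int:
    """Deterministically map a (student, problem) pair to one of
    n_variants versions, so a student always sees the same variant
    while classmates generally see different ones."""
    seed = hashlib.sha256(f"{student_id}:{problem_id}".encode()).hexdigest()
    return random.Random(seed).randrange(n_variants)

def render_problem(student_id: str) -> str:
    """Illustrative physics item with randomized parameters: the same
    concept is tested, but the numbers differ per student, so students
    can discuss the problem without simply exchanging final answers."""
    rng = random.Random(hashlib.sha256(student_id.encode()).hexdigest())
    mass = rng.choice([2.0, 3.0, 4.0, 5.0])   # kg
    accel = rng.choice([1.5, 2.5, 3.5])       # m/s^2
    return (f"A {mass} kg block accelerates at {accel} m/s^2. "
            f"What is the net force on it?")

print(variant_for("student_042", "prob_07", 4))
print(render_problem("student_042"))
print(render_problem("student_117"))
```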



The LON-CAPA feedback tools help faculty identify the source of student difficulties with a
topic. Faculty can view assessment data for individual students (Figure 1) and problems (Figure
2) as well as generate graphs of overall class performance (Figure 3). Furthermore, the system
records “time on task” data for each student, allowing faculty to see how long students spend
answering each question and to gauge question difficulty. Time on task data will be discussed
later in the paper with regard to the research potential of the GCI WebCenter.

One of the major strengths of online systems like LON-CAPA is the embedded and automated
use of simple statistics and user tracking. Faculty can use embedded statistics to review
performance of specific users (Figure 1); this functionality can be anonymized for research
projects that fall under standard rules for human subjects research. Responses for an entire
course can also be aggregated (Figure 2), giving a sense of the prevalence of specific
alternative conceptions within a course population. Since each of the GCI response options is
based on a specific alternative conception, the automated faculty feedback provides immediate
opportunities to diagnose student ideas prior to instruction. Similar post-instruction evaluation is
also available, producing a general sense of changes, or entrenchment, of specific student
ideas in response to instruction.
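
As a sketch of this kind of aggregation, the following toy example tallies response options and maps each distractor to the alternative conception it was written to capture. The record layout and option-to-conception mapping are hypothetical, not the WebCenter's actual schema.

```python
from collections import Counter

# Hypothetical anonymized response records: (question_id, option_chosen).
responses = [
    ("earth_volume", "A"), ("earth_volume", "C"), ("earth_volume", "A"),
    ("earth_volume", "B"), ("earth_volume", "A"), ("earth_volume", "D"),
]

# Hypothetical mapping of each response option to the alternative
# conception (or correct idea) it represents.
conception_of = {
    "A": "Earth shrinks in volume over time",
    "B": "Earth grows in volume over time",
    "C": "correct: volume stays roughly constant",
    "D": "there is no way of knowing",
}

counts = Counter(option for _, option in responses)
total = sum(counts.values())
for option, n in counts.most_common():
    print(f"{conception_of[option]}: {n}/{total} ({100 * n / total:.0f}%)")
```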

In addition to looking at student response data on a question-by-question basis, total scores for
all completed GCI questions can also be displayed (Figure 3). This tends to be the most
common metric used by concept inventory users in science (Libarkin, 2008); most faculty and
researchers are looking for measures of overall change in student performance. Although not
as fine-grained as question-by-question analyses, this approach can provide interesting insight
into the impact of instruction on student conceptual understanding. Total scores also allow for
calculation of effect size or gain (cf. Black & Wiliam, 1998); the former is the metric for estimating the size of change within a population most commonly accepted in educational psychology, while the latter is the metric commonly reported by disciplinary science educators (e.g., in physics; Hake, 2002).
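
The two metrics can be illustrated with a short worked example. The pre/post scores below are invented, and the pooling convention for Cohen's d is one common choice among several.

```python
import statistics

# Hypothetical pre- and post-instruction GCI percentage scores.
pre = [35, 40, 45, 50, 38, 42, 48, 44]
post = [55, 60, 58, 70, 52, 61, 66, 59]

# Effect size (Cohen's d): mean change divided by a pooled standard
# deviation of the two score distributions.
s_pre, s_post = statistics.stdev(pre), statistics.stdev(post)
pooled_sd = ((s_pre ** 2 + s_post ** 2) / 2) ** 0.5
d = (statistics.mean(post) - statistics.mean(pre)) / pooled_sd

# Normalized gain (Hake): the fraction of the possible improvement
# actually achieved, computed here from class means on a 0-100 scale.
g = (statistics.mean(post) - statistics.mean(pre)) / (100 - statistics.mean(pre))

print(f"effect size d = {d:.2f}, normalized gain <g> = {g:.2f}")
```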




        Figure 1: LON-CAPA statistics functions allow faculty to review individual student
        performance. In this example, the student selected the first response option,
        indicating a belief that over time the Earth would shrink in volume. Correct answers
        are provided in the test statistics, as well as the date and time when the student
        submitted the answer.


           Figure 2: Answer distribution of a particular problem (same as in Figure 1)
           across the whole course. While most students answered correctly, equal
           numbers of students assumed either that Earth shrinks or that there is simply
           no way of knowing.




           Figure 3: Score distribution from a 16-student course for all GCI questions prior
           to instruction. Out of the maximum of 29 available points, students scored a
           minimum of 7 and a maximum of 17 points. Within this interval, scores were
           fairly evenly distributed, suggesting a range of ability levels within the course.



GCI WebCenter
The Geoscience Concept Inventory (GCI) is a valid and reliable multiple-choice concept
inventory, designed, tested and validated with a national population of entry-level college
students (Figure 4; Libarkin & Anderson, 2005; 2007; Libarkin, 2008). As a general measure of
geoscience conceptual understanding, the GCI has proven useful in evaluating learning in a
number of instructional contexts (Elkins & Elkins, 2007; Petcovic & Ruhf, 2008; Kortz et al.,
2008). In addition, the GCI was developed with specific grounding in student experiences and ideas (Libarkin & Anderson, 2007), and was designed for flexibility within the context of
standardized assessment (Libarkin, 2008). Development of concept inventories by individual researchers, currently the norm, can be expanded to include the broader community of faculty when ICT is utilized. This provides for assessments that are authentic to multiple instructional settings and diverse assessment needs.

Community development of concept inventories begins with community members identifying
alternative conceptions held by students through analysis of open-ended exam questions,
student interviews, and/or review of the literature. Concept inventory questions are generated
according to the “best practices” of assessment design (cf. Haladyna & Downing, 1989b;
Frey et al., 2005; Libarkin, 2008), following guidelines emerging from survey design and related
fields, and requiring community participation in order to diversify question content and validate
new and existing questions. Geoscientists, science educators, educational psychologists, and
psychometricians are all invited to provide expert review of GCI questions to ensure content, construct, and communication validity, and, where appropriate, cultural validity. The reliability and
additional validity of GCI questions are further evaluated by the GCI WebCenter team once the
questions have been tested with different student populations. GCI questions may undergo
many revise - re-pilot - re-analyze cycles in order to generate the highest quality assessment
questions possible.




[Figure 4 flowchart: Community (identify alternative conceptions; generate test questions) → external review by scientists and educators → pilot testing, including “think aloud” interviews → GCI Team (standard factor analysis; item response theory) → together: revise, re-pilot, re-analyze → GCI]

         Figure 4: The GCI development process as conceptualized for the GCI
         WebCenter. Developing questions for the Geoscience Concept Inventory requires
         community participation in order to diversify question content and validate existing
         questions. This iterative process ensures that GCI questions are both valid and
         reliable (https://www.msu.edu/~libarkin/GCI_DEVELOPMENT.html).




To help facilitate community-driven, collaborative assessment design, the GCI WebCenter was
launched in 2009 to provide faculty with online access to GCI questions (Figure 5). This online
accessibility was conceptualized as a portal for three community-based activities. First, we
envisioned a mechanism through which the community of faculty could comment on existing
GCI questions through discussion threads. These comments provide opportunities to correct errors, to discuss the reasoning behind question structure, and for the community to learn about the significant research effort that must underlie each concept inventory question. Second, the GCI was originally created to measure only a very narrow
range of concepts typically taught in entry-level college geo- or Earth science courses.
Recognizing the need to expand the GCI, the WebCenter is an invitation for the community to
participate as co-authors on the GCI. This extension of co-authorship allows experts in diverse
content areas to propose questions, thus expanding the usefulness of the GCI as a measure of
conceptual understanding. Submitted questions go through the same cycle of review and
revision as original GCI questions, ensuring high quality overall (Libarkin and Ward, in press).
Finally, the WebCenter serves as an authentic online assessment tool, providing ease-of-use
for students, autogenerated feedback for faculty, and, eventually, banking of anonymous
student response data. The online assessment satisfies needs for rapid feedback and authentic
assessment of ICT pedagogies, while the student data bank will enhance research potential for
the entire community.




 Figure 5: Faculty can enroll in the GCI WebCenter to access all available GCI questions.
 WebCenter functions include question review, question submission, and online testing. Also
 available to faculty is the GCI Workbook to help with question writing and review. The workbook
 provides information regarding “best practice” in writing multiple choice questions as well as the
 importance of question validity.


Given its online platform, the WebCenter is well suited to act as a virtual community of practice
for a diverse set of users. The WebCenter’s capabilities allow for inclusion of assessments that
target specific classroom activities and utilize interesting digital innovations in GIS, Google
Earth, and ICT technologies. Questions can be developed with these technologically-enhanced
materials in order to ascertain conceptual understanding in geoscience, and learning that
occurs in response to ICT. Furthermore, the WebCenter disseminates questions developed by
the community and collects performance data from a range of student populations.

Question Review and Validation

Faculty can browse GCI questions based on subtopic (e.g. volcanoes, glaciers, mountains,
etc.) and comment on individual questions in a discussion thread (Figure 6). Besides
commenting on questions, users are able to provide expert answers, offering an additional check on question validity.

Performance data from the expert answers inform question validity and also provide an interesting comparison between expert responses and answers provided by more novice student populations (e.g. non-science majors).

Upon entering the GCI WebCenter, faculty can select the “Review Questions” tab to access all
GCI questions. Questions are organized in folders, allowing faculty to view GCI questions
according to a subtopic of interest. Each question is given a title based on the question content
(e.g. Location of glaciers, see figure below) in order to facilitate question browsing for faculty.
Faculty can select individual questions in order to view them and may submit answers of their
own. Faculty can use the green arrows to move to other questions contained within the
selected subtopic folder.




 Figure 6: The review questions function of the GCI WebCenter allows faculty to view the GCI
 questions grouped by subtopic. In this example, the user was able to view all GCI questions
 related to glaciers and made a comment that informs the communication validity of the question.
 The discussion thread function facilitates dialogue between WebCenter users (anonymously or
 not) and provides authors of questions with valuable feedback regarding question validity. These
 data are utilized by the WebCenter team in question revisions and to ensure validity.




Question Submission and Diversifying Content

Currently, the GCI is limited in content. Given the diversity of concepts that can be classified as
“geoscience”, community involvement in question development is absolutely necessary for the
instrument to satisfy the needs of the geoscience community at large (Figure 7). The GCI
currently contains 85 questions available for review within the WebCenter (14 of these are in
the pilot phase and need further testing with students). Many questions involve 2D images; although richer representations are feasible within the LON-CAPA platform, the WebCenter has yet to incorporate 3D or even 4D representations into the question bank. We envision 3D images and simulations as necessary components of assessments for certain concepts; for example, understanding of geologic time might best be measured through dynamic simulations. For this reason, in addition to the need for expanded concept coverage, the WebCenter has a built-in function for question submission, through which faculty can upload potential questions for expert review and
piloting. Because the GCI WebCenter is an online assessment instrument, it is well positioned
to include questions targeting curriculum activities that involve digital technologies inherent to
ICT.




   Figure 7: WebCenter users are able to submit potential GCI questions via the WebCenter to
   help diversify question content. The template provides authors with the required components
   for the potential question to be considered as part of the GCI. The template prompts authors to
   ground their questions in student data and provide a rationale for the inclusion of the question
   in the GCI.


Online Testing and Question Reliability

The most powerful feature (and currently most commonly used component) of the GCI
WebCenter is the online testing function (Figure 8). Faculty can create GCI tests to administer
to their students online, by either manually selecting questions from the GCI question bank or
by allowing the WebCenter to generate a test for them. Performance data are compiled by the WebCenter during testing, and the WebCenter automatically generates a statistical report for
the test creator once the testing period has ended. Since many faculty currently use the GCI to
diagnose student conceptual understanding and evaluate learning post-instruction, the
automated report is a rapid feedback mechanism ideally suited to instruction that seeks to be responsive to student needs.




   Figure 8: Questions can be selected from the GCI to create online tests for students.
   Anonymized student performance data are collected by the WebCenter, which then compiles
   test statistics to provide to faculty (see Figures 2 and 3 for examples).


Research Potential of the GCI WebCenter
In addition to providing faculty with a powerful online tool designed for assessing students’
conceptual understanding, the WebCenter also has potential for research both within and
across courses. Student performance data collected from a wide range of institutions are open
access in anonymous form and available to all WebCenter users. These data can be used to
investigate questions about curricular effectiveness, or for comparison of different student
groups. To facilitate potential research questions, simple demographic data, such as gender
and age, are collected from all test-takers. Furthermore, the WebCenter collects time on task
data while students take the online exams. Below we examine the student data collected via
the WebCenter through online exams administered by two faculty.

Two instructors used the GCI WebCenter to administer pre-tests in January 2010 to students
enrolled in six separate introductory courses. Most commonly, tests assembled by instructors
using the GCI WebCenter contain a minimum of 15 questions taken from the GCI v.2 bank of
71 validated questions. Tests comprise 4 mandatory questions and at least one question from each of eleven bins. The system also chooses two questions at random from a pool of 14 pilot questions and includes them automatically in each Concept Inventory sub-test. Each student enrolled in a course receives the same questions, although the order of the questions, and of the response options within each question, is randomized. For the Jan. 2010 cohort, a total of 1369 submissions were recorded for the 49 different questions implemented: 41 questions from the GCI v.2 and 8 "pilot" questions being evaluated for possible addition to the GCI. Tests varied in length from 18 questions (shortest) to 27 questions (longest).
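
A minimal sketch of this assembly logic appears below. The bin structure, question identifiers, and function names are placeholders, since the paper does not show the WebCenter's actual selection code.

```python
import random

# Hypothetical question bank: 4 mandatory items, eleven topic bins,
# and a pool of 14 pilot questions under evaluation.
mandatory = ["Q1", "Q2", "Q3", "Q4"]
bins = {f"bin_{i}": [f"B{i}_{j}" for j in range(1, 7)] for i in range(1, 12)}
pilot_pool = [f"P{i}" for i in range(1, 15)]

def assemble_test(rng: random.Random) -> list[str]:
    """Build one test: all mandatory questions, at least one question
    per bin, plus two randomly chosen pilot questions."""
    test = list(mandatory)
    test += [rng.choice(questions) for questions in bins.values()]
    test += rng.sample(pilot_pool, 2)
    return test

test = assemble_test(random.Random(2010))
print(len(test), "questions:", test)  # 4 + 11 + 2 = 17 questions here
```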




   Figure 9: Time on task data for the January 2010 cohort: A) time on task recorded by the
   GCI WebCenter for student submissions; B) mean time on task versus degree of difficulty
   for each of the 41 GCI v.2 questions included in the Concept Tests administered.


The system records both when each student first displays a particular question, and when each
student submits his/her answer. Consequently, an estimate can be made of time on task, an
important cognitive measure. Time on task is defined as the number of seconds that elapses
between question display and answer submission. For the 1369 submissions made by students
enrolled in the six Jan. 2010 courses, the mean time on task was 36 seconds, and the median
was 26 seconds (Figure 9A).
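
A sketch of how such an estimate could be computed from display and submission timestamps follows. The event-log layout is an illustrative assumption, not LON-CAPA's actual storage format.

```python
from datetime import datetime
from statistics import mean, median

# Hypothetical event log: (student, question, displayed_at, submitted_at).
events = [
    ("s1", "q1", "2010-01-12 10:00:05", "2010-01-12 10:00:31"),
    ("s1", "q2", "2010-01-12 10:00:31", "2010-01-12 10:01:42"),
    ("s2", "q1", "2010-01-12 10:03:10", "2010-01-12 10:03:28"),
]

fmt = "%Y-%m-%d %H:%M:%S"

# Time on task: seconds elapsed between display and submission.
times_on_task = [
    (datetime.strptime(submit, fmt) - datetime.strptime(show, fmt)).total_seconds()
    for _, _, show, submit in events
]

print(f"mean = {mean(times_on_task):.0f} s, median = {median(times_on_task):.0f} s")
```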

In addition to time on task, a simple degree of difficulty measure was also calculated for each
question, where degree of difficulty ranges from 0 (least difficult) to 1 (most difficult), and is
defined as:

        Deg.Diff = (Total Submissions - Total correct submissions)/(Total Submissions)

For the 41 questions from the GCI v.2, the mean Deg.Diff was 0.65 (Figure 9B). The best linear
fit to a plot of mean time on task per question versus degree of difficulty indicates a positive
correlation between the two. The Pearson product-moment correlation coefficient of r = 0.303 indicates that the positive correlation is significant (critical value = 0.26 for α = 0.05 in a one-tailed test with 39 d.f.). Comparison of time on task for questions from the GCI v.2 with time on task for pilot questions shows that on average students spent a few extra seconds answering a pilot question compared to an existing GCI v.2 question (Table 2). We hypothesize that this is related to the
wording and higher level geoscience content of the pilot questions, and anticipate a decrease in
pilot question time on task with revision; this hypothesis has yet to be tested.
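
Both computations can be sketched as follows, with invented per-question tallies; statistics.correlation (available in Python 3.10+) computes the Pearson coefficient.

```python
from statistics import correlation, mean

# Hypothetical per-question tallies:
# (total submissions, correct submissions, mean time on task in seconds).
questions = [
    (30, 12, 28.0), (30, 8, 41.5), (30, 20, 22.3),
    (30, 5, 52.1), (30, 15, 30.9), (30, 10, 38.4),
]

# Degree of difficulty: fraction of submissions that were incorrect,
# matching Deg.Diff as defined above.
deg_diff = [(total - correct) / total for total, correct, _ in questions]
times = [t for _, _, t in questions]

print(f"mean Deg.Diff = {mean(deg_diff):.2f}")
print(f"Pearson r(time on task, difficulty) = {correlation(times, deg_diff):.2f}")
```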


                                         Time on Task (seconds)
       Source             Submissions    Mean    Median    Std. Deviation
  GCI v.2 Questions          1327          35      26            32
    Pilot Questions           132          39      32            30

Table 2: Summary statistics for questions used in Concept Tests administered to the January 2010 cohort: 41 GCI v.2 questions and 8 pilot questions being tested for inclusion in the inventory.


Conclusions
The GCI WebCenter provides an authentic online assessment experience that aligns with ICT
practice and takes advantage of technological capabilities for immediate feedback and capture
of fine-grained data such as time on task. Although the WebCenter currently enrolls 130 users,
user activity is mostly limited to viewing and small-scale student evaluation. The WebCenter receives consistent requests for enrollment, although only a small fraction of these users participate in the collaborative development of new concept inventory questions (see discussion of barriers in Gannon-Leary & Fontainha, 2007). We do see increased user activity after specific interventions, typically after giving talks at national conferences about the functions and potential of the WebCenter and what it can provide to faculty. We plan to develop
on-site and virtual teacher training workshops that cover the details of assessment
development and encourage community participation in question writing. Given the user activity
on the WebCenter, it may be that in order for this virtual community of practice to be
successful, it must begin with face-to-face interaction.

That said, the GCI WebCenter already has made an impact in technology-based science
education. The online, open-source nature of the GCI WebCenter allows for greater
participation from users around the globe, as evidenced by the number of users and range of
institutions using the GCI. Furthermore, the statistics collected via online testing with a variety
of student populations allow for powerful comparative analyses of student learning across institutions. We encourage the community to participate in the expansion and diversification of the
GCI in order to bridge the gap between curriculum goals and instruction in the Backward
Design model. Assessment targeting curriculum that utilizes digital innovations provides faculty
with evidence of student learning and of the efficacy of these interventions. We also anticipate
expansion of the WebCenter approach to other domains outside of the geosciences.


Acknowledgments

We thank all students and faculty who have encouraged and participated in the original GCI
and GCI WebCenter projects. This work is funded by NSF through grant DUE-0717790. Any
opinions, findings, and conclusions or recommendations expressed in this manuscript are those
of the authors and do not necessarily reflect the views of the National Science Foundation.




References
Black, P. and Wiliam, D. (1998). Inside the Black Box: Raising Standards Through Classroom
Assessment. The Phi Delta Kappan, v. 80, n. 2, p. 139-144, 146-148.

Elkins, J. T., & Elkins, N. M. L. (2007). Teaching Geology in the Field: Significant Geoscience
Concept Gains in Entirely Field-based Introductory Geology Courses. Journal of
Geoscience Education, v. 55, n. 2, p. 126-132.

Frey, B.B., Petersen, S., Edwards, L.M., Teramoto Pedrotti, J., & Peyton, V. (2005). Item-writing rules: Collective wisdom. Teaching and Teacher Education, v. 21, p. 357–364.

Gannon-Leary, Patricia Margaret & Fontainha, Elsa (2007). Communities of Practice and
virtual learning communities: benefits, barriers and success factors. eLearning Papers, no. 5. ISSN 1887-
1542.

Geological Society of America. (2009). From Virtual Globes to Geoblogs: Digital Innovations in
Geoscience Research, Education and Outreach, retrieved May 2, 2010 from
http://gsa.confex.com/gsa/2009AM/finalprogram/session_25205.htm.

Hake, R. (2002). Lessons from the physics education reform effort. Conservation Ecology, v. 5, n. 2,
article 28 (online) URL: http://www.consecol.org/vol5/iss2/art28/.

Haladyna, T. M., and Downing, S. M. (1989b). Validity of taxonomy of multiple-choice item-writing rules.
Applied Measurement in Education, v. 2, p. 51-78.

Kashy, E., Sherrill, B.M., Tsai, Y., Thaler, D., Weinshank, D., Engelmann, M., and Morrissey, D.J. (1993). CAPA, an integrated computer-assisted personalized assignment system. American Journal of Physics, v. 61, p. 1124-1130.

Kashy, E., Gaff, S.J., Pawley, N., Stretch, W.L., Wolfe, S., Morrissey, D.J., and Tsai, Y. (1995). Conceptual questions in computer-assisted assignments. American Journal of Physics, v. 63, p. 1000-1005.

Kortz, K. M., Smay, J. J., & Murray, D. P. (2008). Increasing Learning in Introductory
Geoscience Courses Using Lecture Tutorials. Journal of Geoscience Education, v. 56, p. 280-
290.

LON-CAPA. (2010). The LearningOnline Network with CAPA History, retrieved May 2, 2010 from
http://www.lon-capa.org/history.html.

Libarkin, J. (2010). What is the Geoscience Concept Inventory (GCI)?, retrieved May 2, 2010 from
https://www.msu.edu/~libarkin/GCIinventory.html.

Libarkin, J.C. & Ward, E.M.G. (in press). The qualitative underpinnings of quantitative concept inventory
questions. In Feig, A.P. & Stokes, A. (Eds.). Qualitative research in geoscience education: Geological
Society of America Special Paper.

Libarkin, J.C., & Anderson, S.W. (2005). Assessment of Learning in Entry-Level Geoscience Courses:
Results from the Geoscience Concept Inventory. Journal of Geoscience Education, 53, 394-401.

Libarkin, J.C., & Anderson, S.W. (2007). Development of the Geoscience Concept Inventory, NSF
Conference Proceedings.

Libarkin, J.C. (2008). Concept Inventories in Higher Education Science. National Research Council.
http://www7.nationalacademies.org/bose/Libarkin_CommissionedPaper.pdf

Petcovic, H. L., & Ruhf, R. R. (2008). Geoscience Conceptual Knowledge of Preservice
Elementary Teachers: Results from the Geoscience Concept Inventory. Journal of
Geoscience Education, v. 56, p. 251-260.



Wiggins, G.P. & McTighe, J. (2005). Understanding by Design, 2nd edition. Association for Supervision & Curriculum Development: Alexandria, VA, 370 p.


Authors

Emily M. Geraghty Ward
Research Associate, Department of Geological Sciences, and member of the Geocognition Research Lab, Michigan State University, USA
wardem@msu.edu

Julie C. Libarkin
Associate Professor, Department of Geological Sciences, and Director, Geocognition Research Lab, Michigan State University, USA
libarkin@msu.edu

Stuart Raeburn
Instructional Technology Researcher/Systems Developer, Michigan State University, USA
raeburn@msu.edu

Gerd Kortemeyer
Assistant Professor of Physics Education and Director, LON-CAPA Project, Michigan State
University, USA
korte@lite.msu.edu




Copyrights

The texts published in this journal, unless otherwise indicated, are subject to a Creative Commons Attribution-Noncommercial-NoDerivativeWorks 3.0 Unported licence. They may be copied, distributed and broadcast provided that the author and
the e-journal that publishes them, eLearning Papers, are cited. Commercial use and derivative
works are not permitted. The full licence can be consulted on
http://creativecommons.org/licenses/by-nc-nd/3.0/



Edition and production

Name of the publication: eLearning Papers
ISSN: 1887-1542
Publisher: elearningeuropa.info
Edited by: P.A.U. Education, S.L.
Postal address: C/ Muntaner 262, 3º, 08021 Barcelona, Spain
Telephone: +34 933 670 400
Email: editorial@elearningeuropa.info
Internet: www.elearningpapers.eu




 
Innovating Teaching and Learning Practices: Key Elements for Developing Crea...
Innovating Teaching and Learning Practices:  Key Elements for Developing Crea...Innovating Teaching and Learning Practices:  Key Elements for Developing Crea...
Innovating Teaching and Learning Practices: Key Elements for Developing Crea...eLearning Papers
 
Website – A Partnership between Parents, Students and Schools
Website – A Partnership between Parents, Students and SchoolsWebsite – A Partnership between Parents, Students and Schools
Website – A Partnership between Parents, Students and SchoolseLearning Papers
 
Academic Staff Development in the Area of Technology Enhanced Learning in UK ...
Academic Staff Development in the Area of Technology Enhanced Learning in UK ...Academic Staff Development in the Area of Technology Enhanced Learning in UK ...
Academic Staff Development in the Area of Technology Enhanced Learning in UK ...eLearning Papers
 
The Ageing Brain: Neuroplasticity and Lifelong Learning
The Ageing Brain: Neuroplasticity and Lifelong LearningThe Ageing Brain: Neuroplasticity and Lifelong Learning
The Ageing Brain: Neuroplasticity and Lifelong LearningeLearning Papers
 
Checklist for a Didactically Sound Design of eLearning Content
Checklist for a Didactically Sound Design of eLearning ContentChecklist for a Didactically Sound Design of eLearning Content
Checklist for a Didactically Sound Design of eLearning ContenteLearning Papers
 
The International Student and the Challenges of Lifelong Learning
The International Student and the Challenges of Lifelong LearningThe International Student and the Challenges of Lifelong Learning
The International Student and the Challenges of Lifelong LearningeLearning Papers
 
Fostering Older People’s Digital Inclusion to Promote Active Ageing
Fostering Older People’s Digital Inclusion to Promote Active AgeingFostering Older People’s Digital Inclusion to Promote Active Ageing
Fostering Older People’s Digital Inclusion to Promote Active AgeingeLearning Papers
 
eLearning and Social Networking in Mentoring Processes to Support Active Ageing
eLearning and Social Networking in Mentoring Processes to Support Active AgeingeLearning and Social Networking in Mentoring Processes to Support Active Ageing
eLearning and Social Networking in Mentoring Processes to Support Active AgeingeLearning Papers
 

More from eLearning Papers (20)

OER in the Mobile Era: Content Repositories’ Features for Mobile Devices and ...
OER in the Mobile Era: Content Repositories’ Features for Mobile Devices and ...OER in the Mobile Era: Content Repositories’ Features for Mobile Devices and ...
OER in the Mobile Era: Content Repositories’ Features for Mobile Devices and ...
 
Designing and Developing Mobile Learning Applications in International Studen...
Designing and Developing Mobile Learning Applications in International Studen...Designing and Developing Mobile Learning Applications in International Studen...
Designing and Developing Mobile Learning Applications in International Studen...
 
From E-learning to M-learning
From E-learning to M-learningFrom E-learning to M-learning
From E-learning to M-learning
 
Standing at the Crossroads: Mobile Learning and Cloud Computing at Estonian S...
Standing at the Crossroads: Mobile Learning and Cloud Computing at Estonian S...Standing at the Crossroads: Mobile Learning and Cloud Computing at Estonian S...
Standing at the Crossroads: Mobile Learning and Cloud Computing at Estonian S...
 
M-portfolios: Using Mobile Technology to Document Learning in Student Teacher...
M-portfolios: Using Mobile Technology to Document Learning in Student Teacher...M-portfolios: Using Mobile Technology to Document Learning in Student Teacher...
M-portfolios: Using Mobile Technology to Document Learning in Student Teacher...
 
GGULIVRR: Touching Mobile and Contextual Learning
GGULIVRR: Touching Mobile and Contextual LearningGGULIVRR: Touching Mobile and Contextual Learning
GGULIVRR: Touching Mobile and Contextual Learning
 
Reaching Out with OER: The New Role of Public-Facing Open Scholar
Reaching Out with OER: The New Role of Public-Facing Open ScholarReaching Out with OER: The New Role of Public-Facing Open Scholar
Reaching Out with OER: The New Role of Public-Facing Open Scholar
 
Managing Training Concepts in Multicultural Business Environments
Managing Training Concepts in Multicultural Business EnvironmentsManaging Training Concepts in Multicultural Business Environments
Managing Training Concepts in Multicultural Business Environments
 
Reflective Learning at Work – MIRROR Model, Apps and Serious Games
Reflective Learning at Work – MIRROR Model, Apps and Serious GamesReflective Learning at Work – MIRROR Model, Apps and Serious Games
Reflective Learning at Work – MIRROR Model, Apps and Serious Games
 
SKILL2E: Online Reflection for Intercultural Competence Gain
SKILL2E: Online Reflection for Intercultural Competence GainSKILL2E: Online Reflection for Intercultural Competence Gain
SKILL2E: Online Reflection for Intercultural Competence Gain
 
Experience Networking in the TVET System to Improve Occupational Competencies
Experience Networking in the TVET System to Improve Occupational CompetenciesExperience Networking in the TVET System to Improve Occupational Competencies
Experience Networking in the TVET System to Improve Occupational Competencies
 
Leveraging Trust to Support Online Learning Creativity – A Case Study
Leveraging Trust to Support Online Learning Creativity – A Case StudyLeveraging Trust to Support Online Learning Creativity – A Case Study
Leveraging Trust to Support Online Learning Creativity – A Case Study
 
Innovating Teaching and Learning Practices: Key Elements for Developing Crea...
Innovating Teaching and Learning Practices:  Key Elements for Developing Crea...Innovating Teaching and Learning Practices:  Key Elements for Developing Crea...
Innovating Teaching and Learning Practices: Key Elements for Developing Crea...
 
Website – A Partnership between Parents, Students and Schools
Website – A Partnership between Parents, Students and SchoolsWebsite – A Partnership between Parents, Students and Schools
Website – A Partnership between Parents, Students and Schools
 
Academic Staff Development in the Area of Technology Enhanced Learning in UK ...
Academic Staff Development in the Area of Technology Enhanced Learning in UK ...Academic Staff Development in the Area of Technology Enhanced Learning in UK ...
Academic Staff Development in the Area of Technology Enhanced Learning in UK ...
 
The Ageing Brain: Neuroplasticity and Lifelong Learning
The Ageing Brain: Neuroplasticity and Lifelong LearningThe Ageing Brain: Neuroplasticity and Lifelong Learning
The Ageing Brain: Neuroplasticity and Lifelong Learning
 
Checklist for a Didactically Sound Design of eLearning Content
Checklist for a Didactically Sound Design of eLearning ContentChecklist for a Didactically Sound Design of eLearning Content
Checklist for a Didactically Sound Design of eLearning Content
 
The International Student and the Challenges of Lifelong Learning
The International Student and the Challenges of Lifelong LearningThe International Student and the Challenges of Lifelong Learning
The International Student and the Challenges of Lifelong Learning
 
Fostering Older People’s Digital Inclusion to Promote Active Ageing
Fostering Older People’s Digital Inclusion to Promote Active AgeingFostering Older People’s Digital Inclusion to Promote Active Ageing
Fostering Older People’s Digital Inclusion to Promote Active Ageing
 
eLearning and Social Networking in Mentoring Processes to Support Active Ageing
eLearning and Social Networking in Mentoring Processes to Support Active AgeingeLearning and Social Networking in Mentoring Processes to Support Active Ageing
eLearning and Social Networking in Mentoring Processes to Support Active Ageing
 

Recently uploaded

Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingTechSoup
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application ) Sakshi Ghasle
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
Micromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersMicromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersChitralekhaTherkar
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxRoyAbrique
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfUmakantAnnand
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfchloefrazer622
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsanshu789521
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Celine George
 

Recently uploaded (20)

Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application )
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
Micromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of PowdersMicromeritics - Fundamental and Derived Properties of Powders
Micromeritics - Fundamental and Derived Properties of Powders
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha elections
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
 

The Geoscience Concept Inventory WebCenter provides new means for student assessment

  • 1. The Geoscience Concept Inventory WebCenter provides new means for student assessment Emily M. Geraghty Ward Research Associate Department of Geological Sciences and member of the Geocognition Research Lab, Michigan State University, USA Julie C. Libarkin Associate Professor Department of Geological Sciences Director - Geocognition Research Lab, Michigan State University, USA Stuart Raeburn Instructional Technology Researcher/Systems Developer, Michigan State University, USA Gerd Kortemeyer Assistant Professor of Physics Education and Director, LON-CAPA Project, Michigan State University, USA Summary Faculty adopt information and communication technologies (ICT) with the assumption that they enhance student learning. In the geosciences, new curricula employ tools such as Google Earth to aid in the interpretation of three-dimensional landscapes and the processes that create them. In many cases, the evaluation of learning that occurs with this technology use is neither explicit nor necessarily matched with the overarching curricular goals of ICT. Arguably, assessment should be embedded in curriculum design according to the Backward Design model (Wiggins & McTighe, 2005) for effective instruction. We propose embedded assessment appropriate to ICT, specifically online assessment that takes advantage of automated scoring and feedback mechanisms through the Geoscience Concept Inventory (GCI) WebCenter. As an instructional tool, the WebCenter contains concept inventory questions that are carefully designed to ascertain a student’s conceptual understanding in a range of geology subtopics. The WebCenter’s customized LON-CAPA platform facilitates the inclusion of digital images created by ICT technologies to assess student learning. The WebCenter’s online venue facilitates community participation in assessment development by allowing faculty to review existing questions and submit their own. Furthermore, the WebCenter’s testing function provides an authentic online assessment experience that aligns with ICT practice and takes advantage of its technological capabilities to provide immediate feedback and detect fine- grained data such as time on task. Currently, user activity in the portal is limited to viewing and student evaluation on a small scale, with only a small fraction participating in the development of new concept inventory questions. Thus, it may be that on-site teacher training workshops are needed to help initiate collaborations and use of the technology. However, the WebCenter has already made an impact with its online, open-source nature; encouraging participation from around the globe, as evidenced by the number of users (n=130) and range of institutions using the GCI. Statistics collected via online testing with a variety of student populations will allow for powerful comparative analyses of student learning across institutions. Keywords: evaluation, learning metadata, mobile learning, research, ICT, education technologies eLearning Papers • www.elearningpapers.eu • 1 Nº 20 • July 2010 • ISSN 1887-1542
technologies with the assumption that they enhance student learning. In the geosciences in particular, new curricula employ tools such as Google Earth, Virtual Globes, and Geographic Information Systems (GIS) software to aid in the interpretation of three-dimensional landscapes and the understanding of the processes that create them. In many cases, the evaluation of learning that occurs with this technology use is neither explicit nor necessarily matched with the overarching curricular goals of ICT. Arguably, assessment should be embedded in curriculum design according to the Backward Design model (Wiggins & McTighe, 2005) for effective instruction, in order to promote appropriate use of ICT as a learning tool. We propose the use of embedded assessment appropriate to ICT, specifically online assessment that takes advantage of automated scoring and feedback mechanisms. As an example, we use the Geoscience Concept Inventory (GCI) WebCenter (http://gci.lite.msu.edu/), an online platform currently in use for the development of concept inventory questions and online student assessment. WebCenters for assessment provide avenues for investigating student learning that are targeted to the goals of ICT curricula; we encourage community development of assessment via this or similar online venues.

This paper introduces the GCI WebCenter as both an instructional tool and a virtual community of practice. GCI questions are carefully designed to ascertain a student's conceptual understanding in a range of geology subtopics. As curriculum goals are established for ICT-infused classroom activities, assessment should be designed to measure whether those goals are met. Because the GCI WebCenter is an online assessment tool, it is uniquely qualified for authentic assessment of ICT activities, and can readily incorporate appropriate technology, such as digital images, to assess student learning. Furthermore, this online venue encourages community participation in assessment development by allowing faculty to review existing GCI questions through discussion threads, and to submit questions of their own for review, validation and eventual inclusion.

Background

Backward Design in curriculum development is a well-regarded and often-used strategy for creating effective instruction (Wiggins & McTighe, 2005). Backward Design requires intentionality in instructional practice; curricula follow from goals, rather than vice versa. At its most basic, Backward Design suggests that identification of instructional goals and determination of goal-oriented assessments be followed by curriculum development geared specifically for these pre-set goals and assessments. Because learning occurs within individual and social contexts, context must also be considered in curriculum development. Viewed through the lens of this instructional design theory, appropriate assessment should reflect the nature and context of curricular materials and approaches. In fact, "assessment" is itself a piece of the curriculum development process, and should emerge from within the curriculum, rather than exist as an isolated entity.
An analysis of community efforts in digital innovations in geoscience education (represented by the 20 abstracts from the 2009 GSA poster session From Virtual Globes to Geoblogs: Digital Innovations in Geoscience Research, Education, and Outreach) illustrates the need for explicit consideration of assessment in online venues (Table 1). Analysis of the abstracts reveals a disconcerting disconnect between the "Understanding by Design" model and assessment practice. Eighty percent of the abstracts made no mention of assessment, suggesting that assessment is not necessarily a key component of curriculum development. Only 20% (n=4) of the abstracts made mention of assessment, although for the majority of these, the nature of the assessment did not appear to match the nature of the classroom activity (Table 1). Even though the abstracts introduce new and exciting ideas for use of digital technologies in the
classroom, the reported assessment appeared not to utilize these same digital technologies. Only two abstracts explicitly mention using online assessment as part of their proposed curriculum. While we recognize that abstracts cannot completely portray the depth and scope of the curriculum and instruction they represent, and while 20 abstracts represent only a fraction of the ongoing curriculum development efforts, these data suggest that: (1) assessment receives minimal attention in curriculum development, and (2) authentic, ICT-based assessment is not being adequately paired with ICT pedagogies.

Digital Innovations in Geoscience Research, Education and Outreach posters (n=20)

  None        Anecdotal Assessment   Formal Assessment
  16 (80%)    2 (10%)                2 (10%)

Table 1: Prevalence of assessment as an essential component of the "Understanding by Design" model (Wiggins and McTighe, 2005) in abstracts presented at the "From Virtual Globes to Geoblogs: Digital Innovations in Geoscience Research, Education and Outreach" poster session held at the 2009 Geological Society of America annual meeting. Abstracts were coded for the presence of anecdotal and formal assessment. Three of the four abstracts that make mention of assessment allude to using online assessment (http://gsa.confex.com/gsa/2009AM/finalprogram/session_25205.htm).

Online Assessment

The platform on which the GCI WebCenter runs is a customized version of LON-CAPA (The LearningOnline Network with CAPA), which provides faculty with the means to share and review concept inventory questions and administer online tests to their students. In 1992, CAPA (a Computer-Assisted Personalized Approach) was started to provide randomized homework for an introductory physics course at Michigan State University (LON-CAPA, 2010; Kashy et al., 1993; 1995). The system provided a way to offer relevant practice problems and feedback to students in spite of limited availability of teaching assistants. Different students were assigned different versions (for example, different numbers, graphs, formulas, images, and options) of the same problems, so that they could discuss problems with each other, but not simply exchange solutions. When CAPA was first introduced, students received paper printouts of their problems and had to enter their solutions through a Telnet terminal, where they received immediate feedback on the correctness of their responses. Students typically had a limited number of allowed attempts ("tries") to arrive at the correct solution. In later years, as the web became more widely available, a web interface for answer input was introduced. Eventually, the system gained learning content management functionality to put whole curricula online, including both content and assessment resources, as well as course management functionality (participation, grading, communication, group work and enrollment are all handled by one system).

Today, LON-CAPA is used at more than 100 institutions in addition to MSU, in settings ranging from middle school classrooms to graduate-level courses. Participating disciplines include astronomy, biology, business, chemistry, civil engineering, computer science, family and child ecology, geology, human food and nutrition, human medicine, mathematics, medical technology, physics, and psychology.
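To make the per-student randomization concrete, the sketch below shows one common way such systems derive stable problem versions: seeding a random generator from the student and problem identifiers, so each student always sees the same version on every visit while different students see different numbers and option orders. This is a minimal illustration of the general technique only; the function and field names are hypothetical, and this is not LON-CAPA's actual implementation.

```python
import hashlib
import random

def student_version(problem_id, student_id, numeric_slots, options):
    """Build a deterministic per-student version of a problem.

    numeric_slots maps each randomized quantity to its candidate values;
    options is the list of response options to shuffle. (Hypothetical
    sketch of CAPA-style seeding, not LON-CAPA's actual code.)
    """
    # Derive a stable seed from the (student, problem) pair, so the same
    # student always gets the same version on every visit.
    digest = hashlib.sha256(f"{student_id}:{problem_id}".encode()).digest()
    rng = random.Random(int.from_bytes(digest[:8], "big"))

    # Choose one value for each randomized quantity in the problem text.
    values = {slot: rng.choice(candidates)
              for slot, candidates in numeric_slots.items()}

    # Shuffle the response options so students can discuss the problem
    # with each other but cannot simply exchange answer letters.
    shuffled = list(options)
    rng.shuffle(shuffled)
    return {"values": values, "options": shuffled}

# Two students receive different, but individually stable, versions.
slots = {"mass_kg": [2.0, 3.0, 5.0], "velocity_ms": [4.0, 6.0, 8.0]}
opts = ["8 J", "16 J", "24 J", "36 J"]
print(student_version("phys101-p7", "student-42", slots, opts))
print(student_version("phys101-p7", "student-43", slots, opts))
```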
The LON-CAPA feedback tools help faculty identify the source of student difficulties with a topic. Faculty can view assessment data for individual students (Figure 1) and problems (Figure 2), as well as generate graphs of overall class performance (Figure 3). Furthermore, the system records "time on task" data for each student, allowing faculty to see how long students spend answering each question and to gauge question difficulty. Time on task data will be discussed later in the paper with regard to the research potential of the GCI WebCenter.

One of the major strengths of online systems like LON-CAPA is the embedded and automated use of simple statistics and user tracking. Faculty can use embedded statistics to review the performance of specific users (Figure 1); this functionality can be anonymized for research projects that fall under standard rules for human subjects research. Responses for an entire course can also be aggregated (Figure 2), giving a sense of the prevalence of specific alternative conceptions within a course population. Since each of the GCI response options is based on a specific alternative conception, the automated faculty feedback provides immediate opportunities to diagnose student ideas prior to instruction. Similar post-instruction evaluation is also available, producing a general sense of changes in, or entrenchment of, specific student ideas in response to instruction.

In addition to looking at student response data on a question-by-question basis, total scores for all completed GCI questions can also be displayed (Figure 3). This tends to be the most common metric used by concept inventory users in science (Libarkin, 2008); most faculty and researchers are looking for measures of overall change in student performance. Although not as fine-grained as question-by-question analyses, this approach can provide interesting insight into the impact of instruction on student conceptual understanding. Total scores also allow for calculation of effect size or gain (cf. Black and Wiliam, 1998); the former is the metric for estimating the size of change within a population most commonly accepted in educational psychology, while the latter is the metric commonly reported by disciplinary science educators (e.g., physics; Hake, 2002).

Figure 1: LON-CAPA statistics functions allow faculty to review individual student performance. In this example, the student selected the first response option, indicating that over time, the Earth would shrink in volume. Correct answers are provided in the test statistics, as well as the date and time when the student submitted the answer.
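To illustrate the gain and effect-size calculations described above, the sketch below computes a Hake-style normalized gain and an effect size (Cohen's d with a pooled standard deviation) from matched pre- and post-test GCI totals. The scores are invented for illustration; only the formulas follow the cited literature.

```python
from statistics import mean, stdev

def normalized_gain(pre, post, max_score):
    """Hake-style gain: fraction of the possible improvement achieved."""
    return (post - pre) / (max_score - pre)

def cohens_d(pre_scores, post_scores):
    """Effect size: mean change divided by the pooled standard deviation."""
    pooled = ((stdev(pre_scores) ** 2 + stdev(post_scores) ** 2) / 2) ** 0.5
    return (mean(post_scores) - mean(pre_scores)) / pooled

# Invented pre/post totals for six students on a 29-point GCI test.
pre = [7, 9, 11, 12, 14, 17]
post = [12, 14, 15, 18, 19, 22]

gains = [normalized_gain(a, b, max_score=29) for a, b in zip(pre, post)]
print(f"mean normalized gain: {mean(gains):.2f}")
print(f"effect size (Cohen's d): {cohens_d(pre, post):.2f}")
```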
Figure 2: Answer distribution of a particular problem (same as in Figure 1) across the whole course. While most students answered correctly, an equal number of students assumed that Earth either shrinks or that there is simply no way of knowing.

Figure 3: Score distribution from a 16-student course for all GCI questions prior to instruction. Out of the maximum of 29 available points, students scored a minimum of 7 and a maximum of 17 points. Within this interval, scores were fairly evenly distributed, suggesting a range of ability levels within the course.
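The aggregate views in Figures 2 and 3 amount to two simple reductions over the stored response records: a per-option count for one question, and a per-student total across all questions. The sketch below shows both over an invented record format; the WebCenter's internal storage is not shown here.

```python
from collections import Counter

# Invented records: (student_id, question_id, chosen_option, is_correct).
responses = [
    ("s1", "q12", "A", False), ("s2", "q12", "C", True),
    ("s3", "q12", "C", True),  ("s1", "q07", "B", True),
    ("s2", "q07", "D", False), ("s3", "q07", "B", True),
]

def answer_distribution(records, question_id):
    """Count how often each response option was chosen (as in Figure 2).
    Since every GCI option maps to a specific alternative conception,
    these counts diagnose which student ideas are prevalent."""
    return Counter(opt for _, qid, opt, _ in records if qid == question_id)

def score_distribution(records):
    """Number of students at each total score (the basis of Figure 3)."""
    totals = Counter()
    for student, _, _, correct in records:
        totals[student] += int(correct)
    return Counter(totals.values())  # score -> number of students

print(answer_distribution(responses, "q12"))  # Counter({'C': 2, 'A': 1})
print(score_distribution(responses))          # Counter({1: 2, 2: 1})
```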
GCI WebCenter

The Geoscience Concept Inventory (GCI) is a valid and reliable multiple-choice concept inventory, designed, tested and validated with a national population of entry-level college students (Figure 4; Libarkin & Anderson, 2005; 2007; Libarkin, 2008). As a general measure of geoscience conceptual understanding, the GCI has proven useful in evaluating learning in a number of instructional contexts (Elkins & Elkins, 2007; Petcovic & Ruhf, 2008; Kortz et al., 2008). In addition, the GCI was developed with specific grounding in student experiences and ideas (Libarkin & Anderson, 2007), and was designed for flexibility within the context of standardized assessment (Libarkin, 2008).

Development of concept inventories by individual researchers, the universal norm, can be expanded to include the community of faculty when ICT is utilized. This provides for development of assessments that are uniquely authentic to multiple instructional settings and diverse assessment needs. Community development of concept inventories begins with community members identifying alternative conceptions held by students through analysis of open-ended exam questions, student interviews, and/or review of the literature. Concept inventory questions are generated according to the "best practices" of assessment design (cf. Haladyna and Downing, 1989b; Frey et al., 2005; Libarkin, 2008), following guidelines emerging from survey design and related fields, and requiring community participation in order to diversify question content and validate new and existing questions. Geoscientists, science educators, educational psychologists, and psychometricians are all invited to provide expert review of GCI questions to ensure content, construct and communication validity, and, where appropriate, cultural validity. The reliability and additional validity of GCI questions are further evaluated by the GCI WebCenter team once the questions have been tested with different student populations. GCI questions may undergo many revise - re-pilot - re-analyze cycles in order to generate the highest quality assessment questions possible.

[Figure 4 is a flowchart titled "GCI Development Process": the community and the GCI team together identify alternative conceptions, generate test questions, obtain external review by scientists and educators, and pilot test questions (including "think aloud" interviews), with analysis via standard factor analysis and Item Response Theory, followed by revise - re-pilot - re-analyze cycles.]

Figure 4: The GCI development process as conceptualized for the GCI WebCenter. Developing questions for the Geoscience Concept Inventory requires community participation in order to diversify question content and validate existing questions. This iterative process ensures that GCI questions are both valid and reliable (https://www.msu.edu/~libarkin/GCI_DEVELOPMENT.html).
To help facilitate community-driven, collaborative assessment design, the GCI WebCenter was launched in 2009 to provide faculty with online access to GCI questions (Figure 5). This online accessibility was conceptualized as a portal for three community-based activities. First, we envisioned a mechanism through which the community of faculty could comment on existing GCI questions through discussion threads. These comments provide opportunities for correction of errors, discussion of the reasoning behind question structure, and opportunities for the community to learn about the significant research effort that needs to underlie each concept inventory question. Second, the GCI was originally created to measure only a very narrow range of concepts typically taught in entry-level college geo- or Earth science courses. Recognizing the need to expand the GCI, the WebCenter is an invitation for the community to participate as co-authors on the GCI. This extension of co-authorship allows experts in diverse content areas to propose questions, thus expanding the usefulness of the GCI as a measure of conceptual understanding. Submitted questions go through the same cycle of review and revision as original GCI questions, ensuring high quality overall (Libarkin and Ward, in press). Finally, the WebCenter serves as an authentic online assessment tool, providing ease of use for students, autogenerated feedback for faculty, and, eventually, banking of anonymous student response data. The online assessment satisfies needs for rapid feedback and authentic assessment of ICT pedagogies, while the student data bank will enhance research potential for the entire community.

Figure 5: Faculty can enroll in the GCI WebCenter to access all available GCI questions. WebCenter functions include question review, question submission, and online testing. Also available to faculty is the GCI Workbook to help with question writing and review. The workbook provides information regarding "best practice" in writing multiple-choice questions as well as the importance of question validity.

Given its online platform, the WebCenter is well suited to act as a virtual community of practice for a diverse set of users. The WebCenter's capabilities allow for inclusion of assessments that target specific classroom activities and utilize interesting digital innovations in GIS, Google Earth, and ICT technologies. Questions can be developed with these technologically enhanced materials in order to ascertain conceptual understanding in geoscience, and learning that occurs in response to ICT. Furthermore, the WebCenter disseminates questions developed by the community and collects performance data from a range of student populations.

Question Review and Validation

Faculty can browse GCI questions based on subtopic (e.g. volcanoes, glaciers, mountains, etc.) and comment on individual questions in a discussion thread (Figure 6). Besides commenting on questions, users are able to provide expert answers, thereby providing additional control on question validity.
Performance data from the expert answers inform question validity, but also provide for an interesting comparison between expert responses and answers provided by more novice student populations (e.g. non-science majors).

Upon entering the GCI WebCenter, faculty can select the "Review Questions" tab to access all GCI questions. Questions are organized in folders, allowing faculty to view GCI questions according to a subtopic of interest. Each question is given a title based on the question content (e.g. "Location of glaciers"; see Figure 6) in order to facilitate question browsing for faculty. Faculty can select individual questions in order to view them and may submit answers of their own. Faculty can use the green arrows to move to other questions contained within the selected subtopic folder.

Figure 6: The review questions function of the GCI WebCenter allows faculty to view the GCI questions grouped by subtopic. In this example, the user was able to view all GCI questions related to glaciers and made a comment that informs the communication validity of the question. The discussion thread function facilitates dialogue between WebCenter users (anonymously or not) and provides authors of questions with valuable feedback regarding question validity. These data are utilized by the WebCenter team in question revisions and to ensure validity.
Question Submission and Diversifying Content

Currently, the GCI is limited in content. Given the diversity of concepts that can be classified as "geoscience", community involvement in question development is absolutely necessary for the instrument to satisfy the needs of the geoscience community at large (Figure 7). The GCI currently contains 85 questions available for review within the WebCenter (14 of these are in the pilot phase and need further testing with students). Many questions involve 2D images; although feasible within the LON-CAPA platform, the WebCenter has yet to incorporate 3D or even 4D representations as part of the question bank. We envision 3D images and simulations as necessary components of assessments for certain concepts; for example, understanding of geologic time might best be measured through use of dynamic simulations. For this reason, in addition to the need for expanded concept coverage, the WebCenter has a built-in function for question submission where faculty are able to upload potential questions for expert review and piloting. Because the GCI WebCenter is an online assessment instrument, it is well positioned to include questions targeting curriculum activities that involve digital technologies inherent to ICT.

Figure 7: WebCenter users are able to submit potential GCI questions via the WebCenter to help diversify question content. The template provides authors with the required components for a potential question to be considered as part of the GCI. The template prompts authors to ground their questions in student data and provide a rationale for the inclusion of the question in the GCI.
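The submission template described above can be thought of as a structured record whose fields must all be filled before a question enters review. The sketch below models it as a small data class; the field names are illustrative guesses at the template's components, not the WebCenter's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class QuestionSubmission:
    """Illustrative model of a GCI question submission (hypothetical
    field names; not the WebCenter's actual data model)."""
    subtopic: str                  # e.g. "glaciers", "volcanoes"
    stem: str                      # the question text itself
    correct_option: str
    distractors: dict = field(default_factory=dict)  # option -> conception probed
    student_evidence: str = ""     # interview/exam/literature grounding
    rationale: str = ""            # why the GCI needs this question

    def ready_for_review(self):
        # Every distractor must be tied to a documented alternative
        # conception, and the grounding and rationale must be stated.
        return bool(self.stem and self.correct_option and self.distractors
                    and self.student_evidence and self.rationale)

submission = QuestionSubmission(
    subtopic="geologic time",
    stem="Which statement best describes how the age of the Earth is known?",
    correct_option="Radiometric dating of rocks and meteorites",
    distractors={"Counting fossil layers": "fossil succession equals absolute age"},
    student_evidence="Open-ended exam responses, n=40 (illustrative)",
    rationale="Extends GCI coverage of deep time (illustrative)",
)
print(submission.ready_for_review())  # True
```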
Online Testing and Question Reliability

The most powerful feature (and currently the most commonly used component) of the GCI WebCenter is the online testing function (Figure 8). Faculty can create GCI tests to administer to their students online, either by manually selecting questions from the GCI question bank or by allowing the WebCenter to generate a test for them. Performance data are compiled by the WebCenter during testing, and the WebCenter automatically generates a statistical report for the test creator once the testing period has ended. Since many faculty currently use the GCI to diagnose student conceptual understanding and evaluate learning post-instruction, the autoreport is a rapid feedback mechanism that is ideally suited for instruction that seeks to be responsive to student needs.

Figure 8: Questions can be selected from the GCI to create online tests for students. Anonymized student performance data are collected by the WebCenter, which then compiles test statistics to provide to faculty (see Figures 2 and 3 for examples).

Research Potential of the GCI WebCenter

In addition to providing faculty with a powerful online tool designed for assessing students' conceptual understanding, the WebCenter also has potential for research both within and across courses. Student performance data collected from a wide range of institutions are open access in anonymous form and available to all WebCenter users. These data can be used to investigate questions about curricular effectiveness, or for comparison of different student groups. To facilitate potential research questions, simple demographic data, such as gender and age, are collected from all test-takers. Furthermore, the WebCenter collects time on task data while students take the online exams. Below we examine the student data collected via the WebCenter through online exams administered by two faculty.

Two instructors used the GCI WebCenter to administer pre-tests in January 2010 to students enrolled in six separate introductory courses. Most commonly, tests assembled by instructors using the GCI WebCenter contain a minimum of 15 questions taken from the GCI v.2 bank of 71 validated questions. Tests comprise 4 mandatory questions and at least one question from each of eleven bins. The system also chooses two questions at random from a pool of 14 pilot questions and includes them automatically in each Concept Inventory sub-test. Each student enrolled in a course receives the same questions, although the order of questions, and of response options within each question, is randomized.
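The assembly rules reported above (four mandatory questions, at least one from each of eleven subtopic bins, plus two randomly drawn pilot questions) translate directly into a short selection routine. The sketch below implements that stated policy over illustrative question IDs; it is a reconstruction of the rules, not the WebCenter's actual generator.

```python
import random

def assemble_gci_test(mandatory, bins, pilot_pool, rng=None):
    """Assemble one test: all mandatory questions, one random question
    per subtopic bin, and two random pilot questions. (A reconstruction
    of the stated policy, not the WebCenter's actual code.)"""
    rng = rng or random.Random()
    test = list(mandatory)                           # 4 mandatory questions
    test += [rng.choice(bin_qs) for bin_qs in bins]  # >= 1 per bin
    test += rng.sample(pilot_pool, 2)                # 2 of the 14 pilot questions
    return test

# Illustrative question IDs only.
mandatory = ["gci-01", "gci-02", "gci-03", "gci-04"]
bins = [[f"bin{i:02d}-q{j}" for j in range(1, 6)] for i in range(1, 12)]
pilot_pool = [f"pilot-{k:02d}" for k in range(1, 15)]

test = assemble_gci_test(mandatory, bins, pilot_pool, rng=random.Random(0))
print(len(test))  # 17: the 15-question GCI v.2 minimum plus 2 pilot questions
```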
For the Jan. 2010 cohort, a total of 1369 submissions were recorded for the 49 different questions implemented: 41 questions from the GCI v.2 and 8 "pilot" questions being evaluated for possible addition to the GCI. Tests varied in length from 27 questions (longest) to 18 questions (shortest).

Figure 9: Time on task data for the January 2010 cohort: A) Plot of time on task recorded by the GCI WebCenter for student submissions. B) Plot of mean time on task versus degree of difficulty for each of the 41 GCI v.2 questions included in the Concept Tests administered.

The system records both when each student first displays a particular question and when each student submits his/her answer. Consequently, an estimate can be made of time on task, an important cognitive measure. Time on task is defined as the number of seconds that elapses between question display and answer submission. For the 1369 submissions made by students enrolled in the six Jan. 2010 courses, the mean time on task was 36 seconds, and the median was 26 seconds (Figure 9A).

In addition to time on task, a simple degree of difficulty measure was also calculated for each question, where degree of difficulty ranges from 0 (least difficult) to 1 (most difficult) and is defined as:

Deg.Diff = (Total submissions - Total correct submissions) / (Total submissions)

For the 41 questions from the GCI v.2, the mean Deg.Diff was 0.65 (Figure 9B). The best linear fit to a plot of mean time on task per question versus degree of difficulty indicates a positive correlation between the two. The Pearson product-moment correlation coefficient of r = 0.303 indicates that the positive correlation is significant (critical value = 0.26 for α = 0.05 in a one-tailed test with 39 d.f.).
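The three quantities used in this analysis (time on task, degree of difficulty, and their Pearson correlation) can be computed directly from the submission log. The sketch below does so over an invented log format; the numbers are illustrative, not the January 2010 data.

```python
from statistics import mean
from collections import defaultdict

# Invented log entries: (question_id, display_time_s, submit_time_s, is_correct).
log = [
    ("q03", 100.0, 128.0, True),  ("q03", 210.0, 271.0, False),
    ("q03", 305.0, 352.0, False), ("q11", 400.0, 418.0, True),
    ("q11", 500.0, 531.0, True),  ("q11", 610.0, 646.0, False),
]

def degree_of_difficulty(records):
    """Deg.Diff = (total submissions - correct submissions) / total."""
    total = len(records)
    correct = sum(1 for r in records if r[3])
    return (total - correct) / total

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Group submissions by question, then compute per-question means.
by_question = defaultdict(list)
for qid, shown, submitted, correct in log:
    by_question[qid].append((qid, shown, submitted, correct))

# Time on task: seconds between question display and answer submission.
mean_tot = [mean(t - s for _, s, t, _ in recs) for recs in by_question.values()]
difficulty = [degree_of_difficulty(recs) for recs in by_question.values()]
print(pearson_r(mean_tot, difficulty))  # positive if harder questions take longer
```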
Comparison of time on task for questions from the GCI v.2 with time on task for pilot questions shows that, on average, students spent a few extra seconds answering a pilot question compared to an existing GCI v.2 question (Table 2). We hypothesize that this is related to the wording and higher-level geoscience content of the pilot questions, and anticipate a decrease in pilot question time on task with revision; this hypothesis has yet to be tested.

                               Time on Task (seconds)
Source              Submissions   Mean   Median   Std. Deviation
GCI v.2 Questions   1327          35     26       32
Pilot Questions     132           39     32       30

Table 2: Summary statistics for questions used in Concept Tests administered to the January 2010 cohort: 41 GCI v.2 questions and 8 pilot questions being tested for inclusion in the inventory.

Conclusions

The GCI WebCenter provides an authentic online assessment experience that aligns with ICT practice and takes advantage of technological capabilities for immediate feedback and capture of fine-grained data such as time on task. Although the WebCenter currently enrolls 130 users, user activity on the WebCenter is mostly limited to viewing and student evaluation on a small scale. The WebCenter receives consistent requests for enrollment, although only a small fraction of these users participate in the collaborative development of new concept inventory questions (see discussion of barriers in Gannon-Leary & Fontainha, 2007). We do see increased user activity after specific interventions, typically after giving talks at national conferences about the functions and potential of the WebCenter and what it can provide to faculty. We plan to develop on-site and virtual teacher training workshops that cover the details of assessment development and encourage community participation in question writing. Given the user activity on the WebCenter, it may be that in order for this virtual community of practice to be successful, it must begin with face-to-face interaction.

That said, the GCI WebCenter has already made an impact in technology-based science education. The online, open-source nature of the GCI WebCenter allows for greater participation from users around the globe, as evidenced by the number of users and range of institutions using the GCI. Furthermore, the statistics collected via online testing with a variety of student populations allow for powerful comparative analyses of student learning across institutions. We encourage the community to participate in the expansion and diversification of the GCI in order to bridge the gap between curriculum goals and instruction in the Backward Design model. Assessment targeting curriculum that utilizes digital innovations provides faculty with evidence of student learning and of the efficacy of these interventions. We also anticipate expansion of the WebCenter approach to other domains outside of the geosciences.

Acknowledgments

We thank all students and faculty who have encouraged and participated in the original GCI and GCI WebCenter projects. This work is funded by NSF through grant DUE-0717790. Any opinions, findings, and conclusions or recommendations expressed in this manuscript are those of the authors and do not necessarily reflect the views of the National Science Foundation.
References

Black, P. and Wiliam, D. (1998). Inside the Black Box: Raising Standards Through Classroom Assessment. The Phi Delta Kappan, v. 80, n. 2, p. 139-144, 146-148.

Elkins, J. T., & Elkins, N. M. L. (2007). Teaching Geology in the Field: Significant Geoscience Concept Gains in Entirely Field-based Introductory Geology Courses. Journal of Geoscience Education, v. 55, n. 2, p. 126-132.

Frey, B. B., Petersen, S., Edwards, L. M., Teramoto Pedrotti, J., & Peyton, V. (2005). Item-writing rules: Collective wisdom. Teaching and Teacher Education, v. 21, p. 357-364.

Gannon-Leary, P. M., & Fontainha, E. (2007). Communities of Practice and virtual learning communities: benefits, barriers and success factors. eLearning Papers, no. 5. ISSN 1887-1542.

Geological Society of America. (2009). From Virtual Globes to Geoblogs: Digital Innovations in Geoscience Research, Education and Outreach, retrieved May 2, 2010 from http://gsa.confex.com/gsa/2009AM/finalprogram/session_25205.htm.

Hake, R. (2002). Lessons from the physics education reform effort. Conservation Ecology, v. 5, n. 2, article 28 (online). URL: http://www.consecol.org/vol5/iss2/art28/.

Haladyna, T. M., and Downing, S. M. (1989b). Validity of a taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, v. 2, p. 51-78.

Kashy, E., Sherrill, B. M., Tsai, Y., Thaler, D., Weinshank, D., Engelmann, M., and Morrissey, D. J. (1993). CAPA, an integrated computer-assisted personalized assignment system. American Journal of Physics, v. 61, p. 1124-1130.

Kashy, E., Gaff, S. J., Pawley, N., Stretch, W. L., Wolfe, S., Morrissey, D. J., and Tsai, Y. (1995). Conceptual questions in computer-assisted assignments. American Journal of Physics, v. 63, p. 1000-1005.

Kortz, K. M., Smay, J. J., & Murray, D. P. (2008). Increasing Learning in Introductory Geoscience Courses Using Lecture Tutorials. Journal of Geoscience Education, v. 56, p. 280-290.

LON-CAPA. (2010). The LearningOnline Network with CAPA History, retrieved May 2, 2010 from http://www.lon-capa.org/history.html.

Libarkin, J. (2010). What is the Geoscience Concept Inventory (GCI)?, retrieved May 2, 2010 from https://www.msu.edu/~libarkin/GCIinventory.html.

Libarkin, J. C. & Ward, E. M. G. (in press). The qualitative underpinnings of quantitative concept inventory questions. In Feig, A. P. & Stokes, A. (Eds.), Qualitative research in geoscience education: Geological Society of America Special Paper.

Libarkin, J. C., & Anderson, S. W. (2005). Assessment of Learning in Entry-Level Geoscience Courses: Results from the Geoscience Concept Inventory. Journal of Geoscience Education, v. 53, p. 394-401.

Libarkin, J. C., & Anderson, S. W. (2007). Development of the Geoscience Concept Inventory. NSF Conference Proceedings.

Libarkin, J. C. (2008). Concept Inventories in Higher Education Science. National Research Council. http://www7.nationalacademies.org/bose/Libarkin_CommissionedPaper.pdf

Petcovic, H. L., & Ruhf, R. R. (2008). Geoscience Conceptual Knowledge of Preservice Elementary Teachers: Results from the Geoscience Concept Inventory. Journal of Geoscience Education, v. 56, p. 251-260.
Wiggins, G. P. & McTighe, J. (2005). Understanding by Design, 2nd edition. Association for Supervision & Curriculum Development: Alexandria, VA, 370 p.

Authors

Emily M. Geraghty Ward
Research Associate, Department of Geological Sciences, and member of the Geocognition Research Lab, Michigan State University, USA
wardem@msu.edu

Julie C. Libarkin
Associate Professor, Department of Geological Sciences; Director, Geocognition Research Lab, Michigan State University, USA
libarkin@msu.edu

Stuart Raeburn
Instructional Technology Researcher/Systems Developer, Michigan State University, USA
raeburn@msu.edu

Gerd Kortemeyer
Assistant Professor of Physics Education and Director, LON-CAPA Project, Michigan State University, USA
korte@lite.msu.edu

Copyrights

The texts published in this journal, unless otherwise indicated, are subject to a Creative Commons Attribution-Noncommercial-NoDerivativeWorks 3.0 Unported licence. They may be copied, distributed and broadcast provided that the author and the e-journal that publishes them, eLearning Papers, are cited. Commercial use and derivative works are not permitted. The full licence can be consulted on http://creativecommons.org/licenses/by-nc-nd/3.0/

Edition and production

Name of the publication: eLearning Papers
ISSN: 1887-1542
Publisher: elearningeuropa.info
Edited by: P.A.U. Education, S.L.
Postal address: C/ Muntaner 262, 3º, 08021 Barcelona, Spain
Telephone: +34 933 670 400
Email: editorial@elearningeuropa.info
Internet: www.elearningpapers.eu