The AP-CAT project aims to develop an adaptive computerized test for AP Statistics that assesses student mastery of specific statistical concepts and provides formative feedback. Researchers are investigating whether use of the adaptive test promotes student engagement and improves learning outcomes. A survey found positive correlations between use of the AP-CAT system and different types of student engagement. The project is in its third year of a five-year study, and future work will explore causality and expand validation and diversity of the student sample.
An Adaptive Evaluation System to Test Student Caliber using Item Response Theory (Editor IJMTER)
Computational creativity research has produced many computational systems that are described as creative [1]. A comprehensive literature survey reveals that although such systems are labelled as creative, there is a distinct lack of evaluation of the creativity of creative systems [1]. A number of online testing websites exist today, but their drawback is that every student who takes a particular test is always given the same set of questions irrespective of their caliber. Thus, a student with a very high Intelligence Quotient (IQ) may be forced to answer basic-level questions, while weaker students may be asked very challenging questions which they cannot answer. This method of testing wastes the time of high-IQ students and can be quite frustrating for weaker students, and it never helps a teacher understand a particular student's caliber in the subject under consideration. Each learner has a different learning status, and therefore different test items should be used in their evaluation. This paper proposes an Adaptive Evaluation System based on Item Response Theory, built for mobile end users so that students have the flexibility to attempt the test from anywhere. The application not only dynamically customizes questions for each student based on the previous question he or she has answered, but also, by adjusting the difficulty of test questions to student ability, allows a teacher to acquire a valid and reliable measurement of a student's competency.
Data Mining Techniques for School Failure and Dropout System (Kumar Goud)
Abstract: Data mining techniques are applied to predict school failure and dropout. The method uses real data on middle-school students to predict failure and dropout, and implements white-box classification strategies such as induction rules and decision trees. A decision tree is a decision-support tool that uses a tree-like graph to model decisions and their possible consequences. It is a flowchart-like structure in which each internal node represents a "test" on an attribute (the attributes are real information about students collected in middle or secondary school), each branch represents an outcome of the test, and each leaf node represents a class label. The paths from root to leaf represent classification rules, and the tree consists of three kinds of nodes: decision nodes, chance nodes, and end nodes. It is widely used in decision analysis. The technique's accuracy in predicting which students might fail or drop out is improved by first using all the available attributes and then selecting the most effective ones; attribute selection is done using the WEKA tool.
Keywords: dataset, classification, clustering.
This lecture recaps the previous lecture on exploratory factor analysis and introduces psychometrics and the measurement of (fuzzy) concepts, including operationalisation, reliability (particularly the internal consistency of multi-item measures), validity, and the creation of composite scores. See also https://en.wikiversity.org/wiki/Survey_research_and_design_in_psychology/Lectures/Psychometric_instrument_development
Correlations of Students’ Academic Achievement in Mathematics with Their Erro... (paperpublications3)
Abstract: While much research has been done on error analysis in mathematics, little is known about the analysis of error detection abilities. Mathematics educators need to understand students' error detection abilities so that appropriate instruction and materials can be given to learners. In particular, this study investigates the error detection abilities of private school students in mathematics and their correlations with written test results. A total of 41 private candidates reading mathematics sat for three written tests: an arithmetic test assessing the students' understanding of number operations, an algebra test, and an error detection test. The results from these three tests were consolidated and analyzed using Excel spreadsheets and a TI-84 Plus graphing calculator. Statistical tools such as the regression line, the Pearson product-moment correlation coefficient, and the reliability coefficient were used to analyze the data and evaluate the error detection instrument. The study found that students' academic performance in mathematics is highly correlated with their error detection abilities, and that the designed error detection test is a reliable instrument suitable, with some limitations, for predicting a student's future success in mathematics. This study is part of a growing body of research on error analysis in mathematics. By drawing on the largely untapped population of students studying at a local private education institution, this project will contribute to future research on similar topics.
A Structural Equation Modelling of Entrepreneurial Education and Entrepreneu... (inventionjournals)
International Journal of Business and Management Invention (IJBMI) is an international journal intended for professionals and researchers in all fields of business and management. IJBMI publishes research articles and reviews across the whole field of business and management, including new teaching methods, assessment, validation, and the impact of new technologies, and it will continue to provide information on the latest trends and developments in this ever-expanding subject. Papers are selected through double peer review to ensure originality, relevance, and readability. The articles published in the journal can be accessed online.
Association rule discovery for student performance prediction using metaheuri... (csandit)
With the increasing use of data mining techniques to improve the operation of educational systems, Educational Data Mining has been introduced as a new and fast-growing research area. Educational Data Mining aims to analyze data from educational environments in order to solve educational research problems. In this paper a new associative classification technique is proposed to predict students' final performance. Unlike several machine learning approaches such as ANNs and SVMs, associative classifiers maintain interpretability along with high accuracy. In this work, Honeybee Colony Optimization and Particle Swarm Optimization are employed to extract association rules for student performance prediction, framed as a multi-objective classification problem. Results indicate that the proposed swarm-based algorithm outperforms well-known classification techniques on the student performance prediction problem.
Correlation based feature selection (cfs) technique to predict student perfro... (IJCNCJournal)
Educational data mining is an emerging stream which helps in mining academic data to solve various types of problems. One such problem is the selection of a proper academic track: the admission of a student to an engineering college depends on many factors. In this paper we implement a classification technique to assist students in predicting their success in admission to an engineering stream. We analyzed a data set containing information on students' academic and socio-demographic variables, with attributes such as family pressure, interest, gender, XII marks, and CET rank in entrance examinations, along with historical data on previous batches of students. Feature selection is a process for removing irrelevant and redundant features, which helps improve the predictive accuracy of classifiers. We first used the feature selection algorithms Chi-square, InfoGain, and GainRatio to identify the relevant features, and then applied a fast correlation-based filter to them. Classification was then performed using NBTree, MultilayerPerceptron, NaiveBayes, and instance-based K-nearest neighbor. Results showed a reduction in computational cost and time and an increase in predictive accuracy for the student model.
Research Inventy: International Journal of Engineering and Science (inventy)
Research Inventy: International Journal of Engineering and Science is published by a group of young academic and industrial researchers, with 12 issues per year. It is an open-access journal, available online and in print, that provides rapid (monthly) publication of articles in all areas of the subject, such as civil, mechanical, chemical, electronic, and computer engineering, as well as production and information technology. The journal welcomes the submission of manuscripts that meet the general criteria of significance and scientific excellence. Papers are published by a rapid process within 20 days of acceptance, and the peer review process takes only 7 days. All articles published in Research Inventy are peer-reviewed.
A Study on Learning Factor Analysis – An Educational Data Mining Technique fo... (iosrjce)
IOSR Journal of Computer Engineering (IOSR-JCE) is a double-blind peer-reviewed international journal that provides rapid publication (within a month) of articles in all areas of computer engineering and its applications. The journal welcomes high-quality papers on theoretical developments and practical applications in computer technology. Original research papers, state-of-the-art reviews, and high-quality technical notes are invited for publication.
Educational Data Mining is used to find interesting patterns in data taken from educational settings in order to improve teaching and learning. Our literature review identified no prior work assessing students' ability and performance with EDM methods in an e-learning environment for school-level mathematics education in India. Our method is a novel approach to providing quality mathematics education, with assessments indicating a student's knowledge level in each lesson. This paper illustrates how the Learning Curve, an EDM visualization method, is used to compare rural and urban students' progress in learning mathematics in an e-learning environment. The experiment was conducted in two different schools in Tamil Nadu, India. After practicing the problems, the students took the test; their interaction data were collected and their performance analyzed in several aspects: knowledge component level, time taken to solve a problem, and error rate. This work studies student actions to identify learning progress. The results show that the learning curve method is very helpful to teachers for visualizing students' performance at a granular level, which is not possible manually. It also helps students know their skill level when they complete each unit.
A Survey on Research work in Educational Data Mining (iosrjce)
3D Project Based Learning Basics for the New Generation Science Standards (rekharajaseran)
This presentation is part of a workshop presented at the Griffin RESA Drive-In STEM Conference on September 28, 2016. It provides an introduction to the basics of three-dimensional project-based learning for STEM education and the New Generation Science Standards.
Clustering Students of Computer in Terms of Level of Programming (Editor IJCATR)
Educational data mining (EDM) is one of the applications of data mining. In educational data mining there are two key domains, the student domain and the faculty domain, and different types of research have been done in both.
In the existing system, faculty performance is calculated on the basis of two parameters: student feedback and student results in the subject. The existing system defines two approaches for the relative evaluation of faculty performance using data mining techniques, a multiple-classifier approach and a single-classifier approach, and compares them. In the multiple-classifier approach, K-nearest neighbor (KNN) is used in the first step and rule-based classification in the second step, while in the single-classifier approach only KNN is used in both steps.
In the proposed system, faculty performance will be analysed using four parameters: student complaints about faculty, student review feedback for faculty, student feedback, and student results. The proposed system will use opinion mining to analyze faculty performance and calculate a score for each faculty member.
In this study, the effect of combining variables from different data sources for student academic performance prediction was examined using three state-of-the-art classifiers: Decision Tree (DT), Artificial Neural Network (ANN), and Support Vector Machine (SVM). The study examined the use of heterogeneous multi-model ensemble techniques to predict student academic performance based on the combination of these classifiers and three different data sources. A quantitative approach was used to develop the various base classifier models, while the ensemble models were developed using the stacked generalisation ensemble method in order to overcome the individual weaknesses of the different models. Variables were extracted from the institution's Student Record System and Learning Management System (Moodle) and from a structured student questionnaire. At present, negligible work has been done using this integrated approach and ensemble techniques, especially with aggregated learner data, in performance prediction in HE. The empirical results obtained show that the ensemble models…
International Journal of Engineering and Science Invention (IJESI) is an international journal intended for professionals and researchers in all fields of computer science and electronics. IJESI publishes research articles and reviews across the whole field of engineering, science, and technology, including new teaching methods, assessment, validation, and the impact of new technologies, and it will continue to provide information on the latest trends and developments in this ever-expanding subject. Papers are selected through double peer review to ensure originality, relevance, and readability. The articles published in the journal can be accessed online.
ABSTRACT
The Cognitive Diagnostic Computerized Adaptive Testing (CD-CAT) for AP Statistics, also known as the AP-CAT project, is a five-year project funded by the NSF. This project utilizes computerized adaptive testing, IRT modeling, and cognitive diagnostic modeling to help assess, educate, and engage students in AP statistics courses. Supported by the NSF REU program at the University of Notre Dame, I had the opportunity to work with Dr. Cheng and her team on the AP-CAT project, particularly on student engagement.
What are the goals of the AP-CAT?
1. Develop a CD-CAT system for AP statistics teachers and students.
2. Determine whether the CD-CAT system is effective in improving student engagement:
   a. Does use of the system promote engagement?
   b. Does engagement improve learning?
3. Encourage STEM careers among underrepresented populations.
THE AP-CAT PROJECT
Most literature on student engagement suggests a similar three-factor model: affective/emotional, behavioral, and cognitive engagement (Fredericks & McColskey, 2012):
• Affective engagement focuses on a student's attitude and motivation (e.g., "I am very interested in learning statistics").
• Behavioral engagement encompasses a student's participation with course material (e.g., "I make sure to study on a regular basis").
• Cognitive engagement covers a student's cognitive investment and higher-order thinking (e.g., "I combine ideas from different courses to help me complete assignments").
[Diagram: Student Engagement comprises Affective Engagement, Behavioral Engagement, and Cognitive Engagement.]
STUDENT ENGAGEMENT
• The system provides a large item bank and allows teachers to assemble assignments and/or exams in the form of either standard or adaptive assessments.
• Adaptive item selection:
  • Tailors each test to the student's ability level;
  • Does not give easy/boring questions to higher-ability students, or hard/discouraging questions to lower-ability students;
  • Shortens assessment length and saves time.
• Upon completion of an assessment, the student and teacher receive reports on performance and mastery of the topics covered. A CAT system can give two types of output:
  • A summative score (derived from item response theory, or IRT, modeling);
  • Formative feedback (a profile score derived from cognitive diagnostic modeling).
• We believe adaptive testing and cognitive diagnostic feedback will promote student engagement.
FEATURES OF AP-CAT AND ENGAGEMENT
ITEM RESPONSE THEORY (IRT): RASCH MODEL
ADAPTIVE TESTING BASED ON AN IRT MODEL
• In adaptive testing, the next question is selected to match one's ability level, which is estimated based on his or her responses to previous items.
• Testing continues until a specified number of items have been given, or until a specified level of precision in ability estimation is achieved.
• The Rasch model gives the probability of answering an item correctly given a person's latent ability θ and an item's difficulty δ. We can estimate θ from the item responses (correct or incorrect).
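As a concrete illustration of the Rasch model and the estimation of θ described above, here is a minimal sketch. The function names and the Newton-Raphson scheme are our own illustration, not the AP-CAT implementation:

```python
import math

def rasch_probability(theta, delta):
    """Rasch model: probability of a correct response given
    latent ability theta and item difficulty delta."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def estimate_ability(responses, difficulties, iterations=50):
    """Maximum-likelihood estimate of theta from 0/1 responses
    to items with known difficulties, via Newton-Raphson."""
    theta = 0.0
    for _ in range(iterations):
        probs = [rasch_probability(theta, d) for d in difficulties]
        score = sum(x - p for x, p in zip(responses, probs))  # dlogL/dtheta
        info = sum(p * (1.0 - p) for p in probs)              # test information
        theta += score / info
    return theta
```

With a correct answer on an easy item (δ = -1) and an incorrect answer on a hard one (δ = 1), the estimate lands between the two difficulties, as expected.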
Figure 1: Flow chart of an adaptive system. (Start → Select Item → Estimate Ability → Termination Criteria Met? If NO, return to Select Item; if YES, Estimate Final Ability → End.)
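The adaptive loop of Figure 1 can be sketched in a few lines. This is a toy version under stated assumptions: Rasch items identified only by their difficulty, maximum-information selection (which for the Rasch model means picking the item whose difficulty is closest to the current θ estimate), and a fixed-length or standard-error stopping rule. `run_cat` and `answer` are illustrative names, not part of the AP-CAT system:

```python
import math

def p_correct(theta, delta):
    # Rasch probability of a correct response
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def run_cat(item_bank, answer, max_items=10, se_target=0.4):
    """Adaptive loop: select the most informative item, record the
    response, re-estimate ability, and stop when either termination
    criterion is met. `answer(delta)` returns the 0/1 response."""
    theta, administered, responses = 0.0, [], []
    remaining = list(item_bank)
    while remaining and len(administered) < max_items:
        # Fisher information of a Rasch item peaks where p = 0.5,
        # i.e. at the item whose difficulty is closest to theta.
        delta = min(remaining, key=lambda d: abs(d - theta))
        remaining.remove(delta)
        administered.append(delta)
        responses.append(answer(delta))
        # One Newton step on the ability estimate
        probs = [p_correct(theta, d) for d in administered]
        info = sum(p * (1 - p) for p in probs)
        theta += sum(x - p for x, p in zip(responses, probs)) / info
        if 1.0 / math.sqrt(info) <= se_target:  # SE termination rule
            break
    return theta, administered
```

With a small bank the standard-error criterion is never reachable, so the loop simply exhausts the bank; with a large bank it stops early once the ability estimate is precise enough.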
• CDM helps provide detailed feedback about the skills or attributes a person has acquired in a particular domain, given his or her item responses.
• For example, 𝒂 = (1, 0, 0, 1, 1) represents the latent profile of someone who has mastered the first and the last two attributes.
• A Q-matrix summarizes the attributes assessed by a test. Illustrated below is a 3-item test assessing 5 attributes: Item 1 assesses attribute 1, Item 2 assesses attributes 1 and 3, and Item 3 assesses all 5 attributes.
• CDM models the probability of answering item j correctly given person i's latent profile of attribute mastery (𝒂ᵢ) and item j's row in the Q-matrix, which covers a total of K attributes (k = 1, 2, …, K). Below is the DINA model, which includes two item parameters: slipping (s) and guessing (g).
$$\eta_{ij} = \prod_{k=1}^{K} \alpha_{ik}^{\,q_{jk}}$$
$$P(X_{ij} = 1 \mid \boldsymbol{a}_i, s_j, g_j) = g_j^{\,1-\eta_{ij}}\,(1 - s_j)^{\eta_{ij}}$$
• The DINA model assumes that attributes are non-compensatory, meaning that a deficiency in one attribute cannot be compensated for by mastery of another. Other CDM models make different assumptions, such as the DINO model.
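The two DINA equations translate directly into code. A minimal sketch (illustrative function names, not the project's implementation), using the example profile 𝒂 = (1, 0, 0, 1, 1) from the text:

```python
def eta(alpha, q_row):
    """Conjunctive DINA term: eta = 1 only if the person has mastered
    every attribute the item requires (alpha_k = 1 wherever q_k = 1)."""
    prod = 1
    for a_k, q_k in zip(alpha, q_row):
        prod *= a_k ** q_k  # note 0**0 == 1, so unrequired attributes drop out
    return prod

def p_correct_dina(alpha, q_row, slip, guess):
    """DINA response probability: g^(1 - eta) * (1 - s)^eta."""
    e = eta(alpha, q_row)
    return guess ** (1 - e) * (1 - slip) ** e

# Example profile from the text: mastered attributes 1, 4, and 5.
alpha = [1, 0, 0, 1, 1]
# Item 2 of the example Q-matrix requires attributes 1 and 3, so this
# person lacks a required attribute and can only succeed by guessing.
```

When η = 1 the person answers correctly unless they slip (probability 1 - s); when η = 0 they answer correctly only by guessing (probability g).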
COGNITIVE DIAGNOSTIC MODELING (CDM)
COGNITIVE DIAGNOSTIC FEEDBACK
Through CDM, we can obtain an estimate of 𝒂, the estimated profile of a person's mastery status, together with the probability of mastering each attribute.
$$\boldsymbol{Q} = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 1 & 0 & 1 & 0 & 0 \\ 1 & 1 & 1 & 1 & 1 \end{pmatrix}$$
Example attributes assessed:
• Summarizing distributions of univariate data
• Calculating the mean
• Understanding/interpreting values within a boxplot
• Calculating the interquartile range (IQR)
• Interpreting percentiles
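One simple way to obtain the per-attribute mastery probabilities described above is to enumerate all 2^K latent profiles, score each by its DINA likelihood, and marginalize. The sketch below assumes a uniform prior over profiles and known slip/guess parameters (both simplifications; operational CD-CAT systems estimate these from data, and the function name is ours):

```python
from itertools import product

def attribute_mastery(responses, Q, slip, guess):
    """Posterior probability of mastering each attribute under the DINA
    model, with a uniform prior over all 2^K latent profiles."""
    K = len(Q[0])
    post = {}
    for alpha in product([0, 1], repeat=K):
        like = 1.0
        for x, q_row, s, g in zip(responses, Q, slip, guess):
            # eta = 1 iff every required attribute is mastered
            e = all(a >= r for a, r in zip(alpha, q_row))
            p = (1 - s) if e else g
            like *= p if x == 1 else (1 - p)
        post[alpha] = like
    total = sum(post.values())
    # Marginal mastery probability for each attribute k
    return [sum(w for a, w in post.items() if a[k]) / total
            for k in range(K)]
```

With the 3-item Q-matrix above and all three items answered correctly, attribute 1 (required by every item) gets a high mastery probability, while attributes tested only by item 3 remain more uncertain.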
MEASURING ENGAGEMENT
• The Survey of Student Engagement for Statistics (Whitney & Cheng, in preparation) is a three-factor measure of student engagement at the classroom level, specifically for statistics classes. A total of 178 high school students (54.50% female, 55.06% White) completed the survey.
• Positive correlations were found between self-reported frequency of use of the AP-CAT system and all three engagement factors (see Table 1).
• A linear regression model with the three aspects of engagement predicting statistics learning (measured by a standard assessment) was significant, F(3, 121) = 4.67, p = .004, R² = .104, with cognitive engagement the only significant predictor (see Table 1).
REFERENCES AND ACKNOWLEDGEMENTS
I would like to thank Dr. Ying Cheng, Brendan M. Whitney, Alex Broderson, and Dr. Cheng Liu for their guidance and mentorship during my summer research experience. Thank you to Kristie LeBeau, my lab partner, for being with me every step of the way. I would also like to extend my most sincere gratitude to the Center for Research Computing at Notre Dame, Dr. Paul Brenner, and Kallie O'Connell for their dedication and selfless contribution to the summer REU programs. A final thank you goes to the National Science Foundation for funding my research contributions this summer and investing in my future as a computational social science scholar.
Cheng, Y. (n.d.). AP-CAT Home. Retrieved July 13, 2016, from https://ap-cat.crc.nd.edu/
Cheng, Y. (2009). When cognitive diagnosis meets computerized adaptive testing: CD-CAT. Psychometrika, 74(4), 619-632. doi:10.1007/s11336-009-9123-2
Fredericks, J. A., & McColskey, W. (2012). The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 763-782). New York: Springer Science + Business Media.
Huebner, A., Wang, B., & Lee, S. (2009). Practical issues concerning the application of the DINA model to CAT data. In D. J. Weiss (Ed.), Proceedings of the 2009 GMAC Conference on Computerized Adaptive Testing.
Rupp, A. A., & Templin, J. L. (2007). Unique characteristics of cognitive diagnostic models. Paper presented at the Annual Meeting of the National Council on Measurement in Education, Chicago, IL.
Whitney, B. M., & Cheng, Y. (2016). The Survey of Student Engagement for Statistics: Initial development and validation. Manuscript in preparation.
Table 1: Analyses of Student Engagement Data

Correlation outcomes (r values)
Predictor                                    Affective   Behavioral   Cognitive
"I use the AP-CAT system
 multiple times a week."                        .14         .36**        .21*

Linear regression outcomes (standardized β)
Statistics learning                             .060        -.056        .30*

* p < .05; ** p < .001
FUTURE DIRECTIONS
• We are at the beginning of year three of a five-year project. In the next two years we will:
  • Investigate causality and possible mediation effects with future randomized controlled trials;
  • Collect and analyze data on behavioral indicators of engagement, e.g., system logs of actual use of the AP-CAT system;
  • Further validate the engagement survey and expand the diversity of the sample.
The AP-CAT Project: Integrating Cognitive Diagnostic Modeling with Computerized Adaptive Testing for Statistics Education
Katlynn G. Kennedy, Kristie LeBeau, Brendan M. Whitney, M.A., Alex Brodersen, Cheng Liu, Ph.D., Ying Cheng, Ph.D.
Department of Psychology, University of Notre Dame