Handbook of Research on Teacher Education in the Digital Age
Margaret L. Niess, Oregon State University, USA
Henry Gillow-Wiles, Oregon State University, USA
A volume in the Advances in Higher Education and Professional Development (AHEPD) Book Series
Exploring Technological Knowledge of Office Data Processing Teachers
INTRODUCTION
Helping beginner and veteran teachers develop technological knowledge and skills in how to use new technologies to teach is a critical component of teacher preparation in this digital age (National Council for Accreditation of Teacher Education [NCATE], 2010). Existing research indicates that a critical factor influencing beginner teachers' adoption of technology is the quantity and quality of technological knowledge and experiences included in their teacher education program (Agyei & Voogt, 2011; Tondeur, Van Braak, Sang, Fisser & Ottenbreit-Leftwich, 2012). Today's teachers should develop lessons that teach learners content knowledge and help them develop twenty-first century skills so that they can think effectively, actively solve problems, and be digitally literate. The preparation of teachers in the educational uses of technology in the current digital age appears to be a key component of almost every improvement plan for education and educational reform
program (Davis & Falba, 2002). According to Gess-Newsome and Lederman (2003), while some issues in education take on the flavor of social and historical context, others, such as how to prepare beginner and experienced teachers to integrate technology for effective teaching and learning in the current digital age, remain ill-defined. Most importantly, research evidence shows that in spite of the many efforts that researchers and educational institutions have invested over the years in preparing both beginner and experienced teachers in the educational uses of technology, pre-service (beginner in this study) and in-service (veteran in this study) teachers still lack the appropriate skills and knowledge needed to successfully use technology to teach (Uwameiye & Adegbenro, 2007). According to Meskill, Mossop, DiAngelo and Pasquale (2002), this is not necessarily the case; they found that new teachers appear to be affected by the existing culture of the teaching profession. While beginner teachers may be more conversant with technology in their daily lives than veteran teachers, they are not exposed to ideas about how to integrate technology in classroom settings.
Although some research reported that teachers' experience in teaching did not influence their use of information communication technology (ICT) in teaching (Niederhauser & Stoddart, 2001), more research showed that teaching experience influenced the successful use of ICT in classrooms (Williams, 2003; Gorder, 2008; Cubukcuoglu, 2013; Ndibalema, 2014). In particular, Gorder (2008) reported that teacher experience is significantly correlated with the actual use of technology. Lau and Sim (2008) conducted a study on the extent of ICT adoption among 250 secondary school teachers in Malaysia. Their findings revealed that experienced teachers use computer technology in the classroom more frequently than beginner teachers. This result implies that the relation of teachers' ICT knowledge and skills to the successful implementation of ICT as a pedagogical tool (Pierson, 2001) is complex, and teaching experience is not a clear predictor of ICT integration in teaching and learning. In addition, Kumar and Kumar (2003) argue that lack of adequate training and experience is one of the main factors why teachers do not use technology in their teaching.
The Further Education and Training (FET) colleges in South Africa currently have greater access to educational technologies than has been the case in the past. In addition, much investment has been made in the acquisition of ICT infrastructure in these colleges, where vocational and technology-based subjects are offered for the purpose of artisan skills development. However, very little is known about what forms of technological knowledge and skills are needed by Office Data Processing (ODP) teachers for effective teaching and learning. It is in this sense that in this study we explored the technological knowledge of ODP teachers at FET colleges in South Africa. We explored the technological knowledge of ODP teachers
in specific domains such as the Microsoft Word program, spreadsheet applications, audio typing, PowerPoint presentations, the Interactive Teaching Box (ITB), Web technology, and data projector applications. The teaching of these applications in FET colleges is in line with South Africa's new National Certificate Vocational (NCV) curriculum standard and the ICT White Paper Policy of the Department of Basic Education (2007). Moreover, ODP requires particular procedures and sets of tasks that should be executed within a given period of time.
This study promises to contribute towards a better understanding of the framework that can be used to unpack the technological knowledge of teachers. A framework for understanding and unpacking the knowledge of teachers who teach with the help of technology needs to have a technology dimension. The overarching research questions that we pursued in this study are as follows:
(a) What is the nature of the technological knowledge of beginner and veteran ODP teachers in the use of ICT as a pedagogical tool?
(b) Are there any significant differences in technological knowledge between beginner and veteran ODP teachers with respect to their teaching experience?
We proceed by giving the background of the study in the next section, where we examine the technological pedagogical content knowledge (TPACK) framework that undergirds this study, explain technological knowledge, and examine the Procedural Functional Pedagogical Content Knowledge (PrFPACK) framework. These literature review sections are followed by a discussion of the chosen methodology, and the presentation and discussion of findings.
BACKGROUND
The knowledge related to the effective use of educational technologies has become widely recognized as an important aspect of the knowledge base of educators in the digital age (ISTE [International Society for Technology in Education], 2008; NCATE, 2010; Partnership for 21st Century Skills, 2003). The concept of knowledge as applied to Science, Engineering and Technology (SET) covers declarative knowledge (know-that), functional knowledge (know-how) and procedural knowledge (skills) (Ferris, 2009; Nissen, 2006). Most of the literature discussing different types of knowledge of teaching and learning dwells extensively on the cognitive and affective domains of knowledge (Shulman, 1986; Biggs, 1999). These types of knowledge are often associated with surface learning concerning the representation of facts rather than the assimilation of the significance of the facts into a construct that guides an appropriate action (procedural knowledge). The essential types of knowledge that can enhance effective teaching and learning in an ICT-enhanced classroom have been identified in the literature to include pedagogical, declarative, functional and procedural knowledge as applied to SET (Nissen, 2006).
Pedagogical knowledge refers to the knowledge of methods and strategies employed by teachers in the process of teaching and learning. This knowledge includes the fundamental knowledge of classroom management, sequential lesson preparation, student motivation, assessment and evaluation. Declarative knowledge is defined by Ryle (1958) as "know-that", a form of knowledge associated with representations of facts rather than the assimilation of facts into constructs that guide effective actions. Some researchers also refer to this concept as content knowledge (Shulman, 1986; Shavelson, Ruiz-Primo, Li & Ayala, 2003).
Functional knowledge is defined by Ryle (1958) as "know-how", or having the ability to describe the steps and rules to perform a function, but not necessarily the ability to put what is known into practice effectively. Biggs (2003) describes this kind of knowledge as functional; it is also referred to by other researchers as technological knowledge (Koehler & Mishra, 2009; Mishra & Koehler, 2006). In an ICT-based classroom environment, technological knowledge is much more than just knowing about technology or having the orientation to use technology.
Procedural knowledge, according to Biggs (1999), is the ability or skill of the knower to choose and perform actions in an appropriate and effective manner. Nissen (2006) argues that knowing how to ride a bicycle can only be demonstrated by mounting and actually riding the bicycle. He makes a clear distinction between having the ability to perform a function, which is functional knowledge, and actually performing the action by effectively applying skills in practical terms, which is procedural knowledge. Hiebert and Lefevre (1986) defined procedural knowledge as rules or procedures for solving problems, in that procedures are sequentially ordered, deterministic instructions for how to perform a given task effectively. They argue that procedural knowledge denotes the dynamic and successful utilization of particular rules or procedures, which requires not only knowledge of the object being utilized, but also knowledge of the format and the syntax of a representational system. Many researchers find that procedural knowledge confirms the mastery of content knowledge and that functional knowledge enables the development of content (Biggs, 1999; Hiebert & Lefevre, 1986).
For example, in the training of office administrators, there is a specific number of words per minute that an office administrator must be able to type accurately. The administrator is also expected to demonstrate the requisite skills in the effective management of modern office machines. This implies that the program has specific anticipated outcomes that must be measured before students can obtain the National Certificate (NC) in Office Data Processing and Office Management Technology. Moreover, an ODP instructor needs the ability to choose the appropriate ICT infrastructure with an understanding of the steps and rules that guide the effective use of ICT tools. This ability will help to enhance the students' comprehension of the tasks to perform and the skills that they are expected to acquire at the end of the lesson, which has to do with functional knowledge. Additionally, the teacher should be able to solve basic troubleshooting problems as the need arises during the course of instruction without disrupting the lesson; thus, he or she needs procedural knowledge in this case. In the next section we discuss the framework that undergirds the study.
EXAMINING TPACK
THEORETICAL FRAMEWORK
Mishra and Koehler's (2006) seven-construct TPACK framework has been used as a theoretical basis for developing the survey to understand teachers' TPACK. This framework comprises three basic knowledge sources and four others derived from the interactions among these three basic sources. These constructs can be briefly defined as: technology knowledge (TK), the knowledge of technology tools; pedagogical knowledge (PK), the knowledge of teaching methods; content knowledge (CK), the knowledge of subject matter; technological pedagogical knowledge (TPK), the knowledge of using technology to implement teaching methods; technological content knowledge (TCK), the knowledge of representing subject matter via technology; pedagogical content knowledge (PCK), the knowledge of the teaching method with respect to the subject matter content; and technological pedagogical content knowledge (TPACK), the knowledge of using technology to implement teaching methods for different types of subject matter.
The strength of technological pedagogical content knowledge is that it provides a framework for examining what knowledge teachers need in order to integrate technology into teaching and learning. The TPACK framework can be used to measure the impact of technology on teaching and learning (Friesen & Lock, 2010). If TPACK represents the ICT-related knowledge required of teachers in the digital age, then it is natural to ask how TPACK might be measured and assessed to ensure the effectiveness of teacher development. Although several published studies describe instruments for measuring TPACK, there is as yet no widely accepted instrument that explicitly addresses technological knowledge in an ICT-enhanced classroom.
The literature to date reveals that many researchers have attempted to extend or modify the TPACK framework (Albion, Jamieson-Proctor & Finger, 2010; Angeli & Valanides, 2009; Archambault & Crippen, 2009; Cox, 2009; Lee & Tsai, 2010; Schmidt, Baran, Thompson, Koehler, Mishra & Shin, 2009), but there are apparent limitations to its application in an ICT-enhanced classroom context. Cox (2009), in a contextual analysis of TPACK, identified the various complexities and the reasons why TPACK has proven difficult to measure. Firstly, Cox (2009) argues that the nature of all the knowledge involved in the framework makes it a very dense and multifaceted domain; new skills and understanding emerge when these facets of knowledge are combined. Secondly, TPACK has proven difficult to measure because technological knowledge must be exhibited in some context. The need for context makes measurement difficult because TPACK is in reality a "way of thinking" (Niess et al., 2009) and an "intuitive understanding" (Schmidt et al., 2009). Based on the contextual analysis of Cox (2009), it is imperative to make technological knowledge easier to understand and less complicated by explaining in clear terms what is being measured. It is in this sense that technological knowledge is explained in the next section.
EXPLANATION OF
TECHNOLOGICAL KNOWLEDGE
A broad definition of the concept of technology is important for understanding technological knowledge. Precise clarity in defining technology in the education domain will give a clearer meaning to the TPACK theoretical framework. This understanding will further enhance its quantitative and qualitative measurement as well as make it more widely applicable. Moreover, it is difficult to measure something that is not fully understood. As "technology has deeply impacted on education in many ways" (Herschbach, 1995, p. 32), the importance of technological knowledge has been identified as an area of concern. The epistemology of technological knowledge is by no means yet a fully developed area. On the other hand, "it makes little sense to talk about curricular strategies until the epistemological dimensions of technological knowledge are first determined" (Herschbach, 1995, p. 32). Having said all this, we note that technology is a very broad and ill-defined term. It can mean any application of human knowledge to solving practical societal problems.
According to Aggarwal (1999), the word technology is derived from the Greek "techne", meaning art or skill, and "logia", meaning science or study. This idea also lends credence to Gumbo (2003), who defines technology by adopting an etymological analysis. The term technology, from an etymological exploration, can be defined as the science or study of an art or skill. Price (1987) describes technology as the application of scientific knowledge to a practical problem. Petzer and Steenkamp (2004) write that technology, being man-made, is about developing practical solutions to problems, and that these solutions are ideas generated to meet the three basic human needs: food, clothing and shelter. For example, the vending machine was invented in order to solve the hunger problem; the technology here is the invention and effective use of vending machines to solve the hunger problem. Thus, no technology exists without the knowledge and skills to solve some specific problems. Romiszowski and Alexander (1980) posit that technology describes a process of something that people do in order to solve problems or to achieve goals, such as products, services, instruments, and tools that can be used to better solve human problems and to meet the needs of the community.
If technology is well understood from this perspective, then there will be a clear understanding of how to identify its knowledge and how to assess it. Anderson (2011) argues that technological knowledge involves the holistic development of three knowledge domains: cognitive, affective and psychomotor. The cognitive domain varies from the simple to the more complex, with the main focus on the brain processing information at different levels of thinking, understanding, remembering, and analysing the information received, which is content knowledge. The affective domain is described as the feelings and attitudes that one is expected to develop as a result of instruction (Pudi, 2011). This description was also indicated as functional knowledge, which Biggs (2003) identifies as what the learner is able to know and feel. Biggs (2003), along with Louw and Du Toit (2010), regards the psychomotor domain of knowledge as learning that depends on a set of manual skills and the ability to perform a specific task in a correct manner. Biggs (1999) refers to this knowledge as procedural knowledge.
However, it has been argued that the psychomotor domain cannot stand in isolation from the cognitive domain, as the three knowledge domains are interrelated (Louw & Du Toit, 2010). The psychomotor domain (procedural knowledge) is the correct performance of an act after describing the specific steps and guiding rules for choosing and using certain tools or equipment, which is the functional aspect of knowledge in ICT-based instruction. Applying the tool or technique to perform the action or task correctly will develop the needed skill to be acquired (Pudi, 2011). According to Zhao (2003), the technological knowledge of teachers is situated in contexts where technology is frequently used to solve a problem, and its usefulness lies only in its effective use, i.e. procedural knowledge. Technological knowledge is blurred because it covers procedural knowledge that relates to the activity and conceptual knowledge in the context of the body of the subject matter (Druger & Yung, 1995; Williams, 1992).
EXAMINING PrFPACK
THEORETICAL FRAMEWORK
The procedural functional pedagogical content knowledge framework is a theoretical framework proposed to holistically explore the technological knowledge of ODP teachers (Adegbenro, Olugbara & Mwakapenda, 2012; Adegbenro, Mwakapenda & Olugbara, 2013). This framework extends the classical TPACK framework by replacing technological knowledge with procedural functional knowledge to give the framework precise clarity. That is, the "T" (technological) in TPACK is replaced with "PrF" (procedural functional) to obtain an extended theoretical framework, given the acronym PrFPACK. Authors such as Yilmaz-Ozden, Mouza, Karchmer-Klein and Glutting (2013) have confirmed the need to provide more clarity about the TPACK framework and to revisit the measurement inventories built directly around the framework.
There are extended TPACK inventories in the literature to date that are quite relevant to this study (Archambault & Crippen, 2009; Graham, Nicolette, Larry, Pamela & Harris, 2009; Schmidt et al., 2009). These inventories are generally relevant to the development of a new inventory, but they have significant limitations. Schmidt et al. (2009) provide more measures for most of the sub-scales and include many well-worded and useful measures. However, the utility of the inventory is limited because it is for elementary preservice
teachers and looks at technology knowledge from a broad perspective in relation to many other subjects, which makes it very difficult to measure technological knowledge. The inventory provided by Archambault and Crippen (2009) is also limited in that it has only three to four items per scale. Moreover, it was designed for online teaching only and thus has limited utility for exploring the use of other types of educational technology. Graham et al. (2009) provide relevant measures that contribute to the assessment of a specific construct. Their measures have from five to ten items per sub-scale, with two open-ended measures, which makes it possible to adequately assess each sub-scale. Here, the measures are deeply focused on the scientific and technological knowledge domains, but they do not specify the forms of knowledge and skills that are to be explicitly measured in the technology-enhanced classroom. It is important to note that how a researcher understands technology determines the instrument he or she is going to use, and the items and data that come out of the instrument.
The difficulty of providing a precise definition of technological knowledge makes it hard to coherently address the challenges of teacher preparation and professional development in the current digital age. This also makes it difficult to create robust inventories or rubrics that can be used to conveniently measure technological knowledge in a variety of contexts (Albion, Jamieson-Proctor & Finger, 2010). It is necessary, therefore, to have a clear understanding of technological knowledge, particularly in the context of a technology-enhanced environment, in order to be able to assess teachers' technological knowledge and e-skills in the ICT-based classroom. This study aims to further examine this gap by exploring the construct validity of the PrFPACK inventory, which builds on the work of Foster, Dawson and Reid (2005), Graham et al. (2009) and Schmidt et al. (2009). The PrFPACK inventory consists of a set of 65 comprehensive measures organised into 13 sub-domains of knowledge (Adegbenro, Mwakapenda & Olugbara, 2013). In this framework, a measure was defined to be comprehensive if it is relatively unambiguous and directly measures what it intends to measure in clear terms. The knowledge sub-domains relate to specific theoretical PrFPACK constructs: PK, CK, FK (functional knowledge), PrK (procedural knowledge), PCK, PrPK (procedural pedagogical knowledge), PrFK (procedural functional knowledge), FCK (functional content knowledge), PrFCK (procedural functional content knowledge), PrFPK (procedural functional pedagogical knowledge), PrPCK (procedural pedagogical content knowledge), FPCK (functional pedagogical content knowledge), and PrFPCK (procedural functional pedagogical content knowledge). Figure 1 shows the PrFPACK theoretical framework used in this study to explore the factors that reflect the nature of the technological knowledge of ODP teachers in ICT-enhanced classrooms. In addition, Table 1 shows the PrFPACK knowledge domains and succinct descriptions of these knowledge domains.
In the next section we discuss the methods that we used in the study.
METHODOLOGY
The factor analytic method was the principal methodology of this study, which utilized a quantitative survey to examine measures of the technological knowledge of teachers. However, qualitative classroom observation and interviews were also employed to enrich the findings of the study. The choice of merging two or more methods and/or techniques in an explanatory research design was based on the premise that an approach based on the hybridization of methods and/or techniques has the added value of better data triangulation, expansion of findings, and reasonable interpretation (Creswell, 2009). The use of the hybrid method, in which quantitative data collection and data analysis were followed by a qualitative phase, provides a deep understanding of the technological knowledge of ODP teachers in the use of ICT tools in the digital age.
Data collection and analysis were performed in two phases. In the first phase of this study, the quantitative survey was administered to 130 ODP teachers, with 107 responding, from 11 FET government colleges in the Gauteng Province of South Africa. The 107 ODP teachers who responded to the set of questionnaires provided the teaching experience that allowed us to decide on the participants who were involved in the classroom observation and interview process. In addition, the observation checklist provided evidence of the availability of ICT tools in the two FET colleges that have well-equipped ICT-enhanced classrooms with all the necessary ICT infrastructure and internet connectivity. The questionnaires were distributed to ODP teachers with the help of college principals and heads of the Business Studies Department. In the second phase of this study, a purposeful sampling technique was used to select 3 beginner and 3 veteran ODP teachers from the 2 FET colleges, based on their teaching experience, to participate in the classroom observation and the interview.
Study Participants
A total of 107 ODP teachers, constituting 64 beginner and 43 veteran ODP teachers, participated in the study. Most of the participants were female (n=71; 66.36%) as compared to their male counterparts (n=36; 33.64%). This uneven gender sample reflects the dominance of the female population over the male population among South African teachers in this specialized vocational subject area. There were 64 (59.81%) participants with an experience of at most 5 years (beginners), with 40.19% (n=43) of the participants having at least 10 years of ODP teaching experience in the ICT-enhanced classroom. By involving both beginner and veteran teachers in the study, we aimed to relate the technological knowledge of the ODP teachers to their teaching experience.

Figure 1. The PrFPACK framework
Measurement Inventory
The need for relational analysis of the dataset led to the choice of a measurement inventory of 65 comprehensive measures to collect responses from the veteran and beginner ODP teachers in the FET colleges of South Africa. The ability of the ODP teachers to accurately respond to the measurement items at the established levels of competence was measured by having them use a 5-point Likert scale to respond to the measures. The high response that corresponded to a score of 5 indicated "highly competent", and the low response
Table 1. Description of PrFPACK knowledge domains

1. Procedural (PrK): A response that shows an effective demonstration of the ability to choose and use appropriate ICT tools to represent the subject matter, with dexterity in solving basic technical troubleshooting problems without disrupting the lesson (e-skill, application).
2. Functional (FK): A response that shows the ability to describe the steps and rules that govern the use of ICT tools and an understanding of how to apply rules and concepts.
3. Pedagogical (PK): A response that shows the use of different methods and strategies to teach the subject matter.
4. Content (CK): A response that shows an understanding of the concept of the subject matter.
5. Pedagogical Content (PCK): A response that shows the use of different methods to teach for understanding of the subject matter.
6. Pedagogical Functional (PFK): A response that shows the know-how of using different methods and strategies to teach the subject matter and describe the rules and steps that govern the use of ICT tools.
7. Procedural Content (PrCK): A response that shows an effective demonstration of the skill and understanding of how to use ICT tools to teach the subject matter, with the ability to solve basic troubleshooting problems without disrupting the lesson.
8. Procedural Functional (PrFK): A response that shows an effective demonstration of the know-how to choose and use ICT tools, describe the right steps and rules to follow, and solve troubleshooting problems without disrupting the lesson.
9. Procedural Functional Content (PrFCK): A response that shows an effective demonstration of the understanding and know-how to choose and use ICT tools and solve troubleshooting problems in the subject matter.
10. Procedural Functional Pedagogical (PrFPK): A response that shows an effective demonstration of the know-how of using different methods and following the right steps to choose and use ICT tools to teach the subject matter and solve basic technical problems.
11. Procedural Pedagogical Content (PrPCK): A response that shows an effective demonstration of the use of different methods to use ICT tools to teach the subject matter and troubleshoot basic technical problems.
12. Functional Pedagogical Content (FPCK): A response that shows the know-how of using different methods to use ICT tools to teach the subject matter and describing the rules and steps to follow correctly.
13. Procedural Functional Pedagogical Content (PrFPCK): A response that shows effective know-how to choose and use ICT tools with the best methods and follow the right steps and rules to teach the subject matter and solve basic technical problems as the need arises.
that corresponded to a score of 1 indicated "highly incompetent." Intermediate responses corresponding to a score of 2 indicated "incompetent," a score of 3 indicated "fairly competent," and a score of 4 indicated "competent."
The initial draft of the measurement inventory was designed to explore the technological knowledge of the ODP teachers and was refined through a series of iterative revisions based on the measurement principles of the existing inventories mentioned earlier. The revision process involved an expert review committee in the ICT Faculty of the university where this study was conducted. The measurement inventory was pilot tested for content validity to improve the internal consistency of the measures. All 65 items, addressing the 13 knowledge constructs, proved reliable, as evidenced by a Cronbach's alpha greater than 0.7.
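The reliability check described above can be sketched in code. The following Python snippet is an illustrative sketch only, not the authors' actual analysis: the item scores are invented, and `cronbach_alpha` is a hypothetical helper implementing the standard formula alpha = (k/(k-1))(1 - sum of item variances / variance of total score).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses from 6 teachers to a 4-item sub-scale (1-5 Likert).
scores = np.array([
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 4, 4],
    [3, 2, 3, 3],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 3))  # values above 0.7 would indicate acceptable internal consistency
```

For these made-up scores the items co-vary strongly, so the computed alpha clears the 0.7 threshold that the study used as its reliability criterion.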
Data Analysis
The 107 completed responses were collected and tabulated using Microsoft Excel spreadsheet software. The factor analysis method, based on principal component analysis and exploratory factor analysis techniques, was used to analyse the response dataset. Specifically, univariate descriptive statistics with the aid of Microsoft Excel, and multivariate exploratory factor analysis with the help of Matlab tools, were applied in this study for data analysis. Descriptive statistics using mathematical quantities such as the mean and standard deviation were employed to summarise and interpret some properties of the sample dataset.
Factor analysis is a multivariate data analysis method suitable for examining the relationships between measured variables, which in this context are the measures of technological knowledge. The factor analysis method can be applied either as an exploratory analysis tool, to identify a set of factors underlying the dataset of measured variables, or as a confirmatory analysis tool, to test whether the dataset fits a hypothesized model. In this study, we applied exploratory factor analysis as a tool to identify factors measuring the technological knowledge of ODP teachers, because we did not have prior hypotheses about the measured knowledge factors (Shinas, Yilmaz-Ozden, Mouza, Karchmer-Klein & Glutting, 2013).
In applying the factor analysis method, we
first standardised the raw dataset to increase
the effect of variables whose variance is small
and to reduce the effect of variables with large
variance. The usual practice of z-score
standardisation was adhered to in this study. The
factor analysis method takes as input the dataset
contained in a correlation or a covariance matrix
and organises it to better explain the structure of
the underlying system that produced the dataset.
This implies that the correlation or covariance
matrix measures how well the variance of each
element can be explained by the relationships
between the variables of the model.
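The standardisation step described above can be sketched as follows; `raw` is an illustrative stand-in for the 107-by-65 response matrix, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
raw = rng.integers(1, 6, size=(107, 65)).astype(float)  # placeholder Likert data

# z-score standardisation: each column gets mean 0 and unit sample variance,
# so small-variance and large-variance items contribute equally.
z = (raw - raw.mean(axis=0)) / raw.std(axis=0, ddof=1)

# Correlation matrix of the standardised data: the input to factor analysis.
R = np.corrcoef(z, rowvar=False)
```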
Secondly, the eigenvalues and the corresponding
eigenvectors of the correlation matrix were
calculated to select the number of factors to use in
the next stage of the analysis. In this study,
principal component analysis was used for the
purpose of selecting factors to carry over to the
next stage of analysis. Although principal
component analysis may not be the best method for
factor selection when compared to other methods
such as parallel analysis (Horn, 1965) and the
minimum average partial test (Velicer, 1976), it
is still one of the most widely used methods for
dimension reduction and for selecting the number
of factors in an exploratory factor analysis (Alkali,
Ishaku, Yusuf & Aisha, 2013; Liu, Lin & Hou,
2003; Lu, Liu & Jang, 2012). This study retained
eleven factors whose eigenvalues surpassed one,
according to the criterion proposed by Kaiser
(1960). Factor eigenvalues are conceptually the
amount of variance accounted for by each factor
and are equal to the sum of the squared loadings
for a given factor.
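The eigen-decomposition and Kaiser's retain-if-greater-than-one rule can be sketched as follows; the data matrix here is a random placeholder with the study's dimensions (107 respondents, 65 measures):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=(107, 65))          # placeholder standardised responses
R = np.corrcoef(data, rowvar=False)

# Eigen-decomposition of the (symmetric) correlation matrix.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]          # sort factors by descending variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kaiser (1960) criterion: retain factors whose eigenvalue exceeds 1.
n_retained = int(np.sum(eigvals > 1.0))
explained = eigvals[:n_retained].sum() / eigvals.sum()  # share of total variance
```

Since the trace of a correlation matrix equals the number of variables, the eigenvalues of `R` sum to 65, which is why the cumulative eigenvalue column of Table 2 ends at 65.000.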
Thirdly, varimax rotated factor analysis was
performed based on the number of factors retained,
to produce factor loadings and factor scores.
Varimax rotation is one of the most popular
orthogonal factor rotation methods, often applied
to achieve a more pragmatically meaningful factor
solution. Moreover, the rotation of the factor axes
was executed to yield factors that were clearly
marked by high loadings for some variables and
low loadings for others. This approach facilitates
a better interpretation of results in terms of the
original variables (Liu et al., 2003). Factor
loadings are the correlation coefficients between
the model variables and the factors. They represent
the most important information on which the
interpretation of factors is based (Liu et al., 2003).
Factor scores are the projections of the data onto
the corresponding eigenvectors and are regarded
as the actual values of each observation on the
underlying factors. In this study, factor scores,
standardized using the z-score statistic, were
calculated for all 107 observations using the
Bartlett weighted least-squares estimate
(Bartlett, 1954).
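A minimal varimax rotation can be sketched in a few lines. This is the standard SVD-based iteration, offered as an illustration of the technique, not the authors' Matlab routine:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate a (variables x factors) loading matrix to maximise the
    varimax criterion, returning the rotated loadings."""
    p, k = loadings.shape
    R = np.eye(k)            # accumulated orthogonal rotation
    d_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # SVD of the gradient of the varimax criterion.
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag(np.sum(L**2, axis=0)))
        )
        R = u @ vt
        d_new = np.sum(s)
        if d_new < d_old * (1 + tol):   # converged: criterion stopped improving
            break
        d_old = d_new
    return loadings @ R
```

Because the rotation is orthogonal, each variable's communality (its row sum of squared loadings) is unchanged; only the distribution of loading magnitudes across factors becomes more clear-cut.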
RESULTS
The first important consideration in presenting
the results of this study is the reliability of the
response dataset. The reliability of a measurement
is the extent to which the measurement
consistently gives the same result on different
occasions. The significance of measurement
reliability is to obtain a standard index with which
to evaluate the validity of the measurement
(Olugbara et al., 2014). Data validity can be
understood from the literature to mean an
integrated judgment of the degree to which
empirical evidence and theoretical rationales
support the appropriateness of inferences,
propositions or conclusions (Gomez & Elliot,
2013). In harmony with the suggestion by authors
such as DiStefano, Zhu & Mindrila (2009), the
adequacy of the response data for applying factor
analysis was checked beforehand using the
following principles:
(a) The correlation matrix was calculated to yield
a mean correlation of 0.517, varying from
0.092 to 0.865 with a range of 0.773. This
result implies that the measures are sufficiently
correlated to allow for the direct application
of factor analysis.
(b) The result of the sphericity test proposed
by Bartlett (1954) indicated that the
correlation matrix did not arise by chance,
with computed test values of log-likelihood =
37.72, Chi-square statistic = 2872.70, degrees
of freedom = 1420 and p-value = 0.0. This
result indicates the significance of the response
data for the direct application of factor
analysis.
(c) The reliability of the response data was
checked by calculating the Cronbach alpha,
found to be 0.9853. This reliability is very
high, which gave us the foundation to proceed
with further analysis.
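Both adequacy checks can be sketched with numpy and scipy; the formulas below are the textbook Cronbach alpha and Bartlett sphericity statistics, not code from the study:

```python
import numpy as np
from scipy import stats

def cronbach_alpha(items):
    """Cronbach alpha for an (n_respondents x k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def bartlett_sphericity(data):
    """Bartlett's test that the correlation matrix is not an identity matrix."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    chi2_stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    p_value = stats.chi2.sf(chi2_stat, df)
    return chi2_stat, df, p_value
```

A small p-value from the sphericity test (as reported above) indicates enough shared correlation for factor analysis to be meaningful.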
The result in Table 2 shows the eigenvalues, the
percentage of variance, cumulative eigenvalue
and cumulative percentage of variance associated
with the model factors, as computed by the
principal component analysis. This result revealed
that the first 11 factors approximately explain
82.6% of the total variance. This identification
then led to the selection of the 11 factors whose
eigenvalues surpassed 1 for the execution of an
exploratory factor analysis routine on the original
response dataset.
The large values of the computed communalities
gave an indication of the success of the factor
analysis for exploring the technological knowledge
of teachers. The communality is the proportion
of the variance of a comprehensive measure that
is accounted for by the common factors. The
minimum communality value is 0.587 for the
measure "I know various concepts and applications
of advanced information presentation systems,
including Microsoft PowerPoint presentation".
The maximum communality value is 0.989 for
the measure "I can effectively describe the right
steps and rules to use a data projector and
Interactive Teaching Box (ITB) to teach and
motivate students with various strategies best
suited to teach Spreadsheet programs and solve
minor technical troubleshooting problems without
disrupting the lessons".
The factor analysis was conducted on the dataset
of responses collected with the 65 comprehensive
measures because the responses of the ODP
teachers were included as the subjects of the
analysis. The principal component analysis
enabled the extraction of 11 latent factors from
the response data.

Table 2 (continued)
Factor | Eigenvalue | Percent of Variance | Cumulative Eigenvalue | Cumulative Percent of Variance
38 | 0.125 | 0.192 | 63.837 | 98.211
39 | 0.110 | 0.170 | 63.948 | 98.381
40 | 0.101 | 0.155 | 64.049 | 98.536
41 | 0.096 | 0.148 | 64.145 | 98.684
42 | 0.086 | 0.132 | 64.231 | 98.817
43 | 0.078 | 0.120 | 64.309 | 98.937
44 | 0.073 | 0.112 | 64.382 | 99.049
45 | 0.066 | 0.101 | 64.448 | 99.150
46 | 0.062 | 0.096 | 64.510 | 99.246
47 | 0.059 | 0.090 | 64.569 | 99.336
48 | 0.051 | 0.078 | 64.619 | 99.414
49 | 0.044 | 0.067 | 64.663 | 99.482
50 | 0.042 | 0.065 | 64.705 | 99.547
51 | 0.039 | 0.060 | 64.744 | 99.607
52 | 0.037 | 0.058 | 64.782 | 99.664
53 | 0.032 | 0.049 | 64.814 | 99.713
54 | 0.031 | 0.047 | 64.844 | 99.760
55 | 0.028 | 0.043 | 64.873 | 99.804
56 | 0.026 | 0.039 | 64.898 | 99.843
57 | 0.019 | 0.029 | 64.917 | 99.872
58 | 0.018 | 0.027 | 64.935 | 99.899
59 | 0.015 | 0.024 | 64.950 | 99.923
60 | 0.014 | 0.021 | 64.964 | 99.944
61 | 0.011 | 0.016 | 64.975 | 99.961
62 | 0.009 | 0.014 | 64.983 | 99.974
63 | 0.008 | 0.012 | 64.991 | 99.987
64 | 0.006 | 0.009 | 64.997 | 99.995
65 | 0.003 | 0.005 | 65.000 | 100.000
However, after the varimax rotation was performed
in an exploratory factor analysis, 9 factors were
initially upheld because they showed distinctive
factor loadings that fulfilled the 0.4 requirement
(Kawashima & Shiomi, 2007). The last two
factors reflected measures with weak factor
loadings, and for this reason they were dropped
from further analysis. In addition, among the 9
promising factors identified for further analysis,
only 4 factors that enabled us to explore the
technological knowledge of ODP teachers were
retained. The retention policy follows the line of
thought of Anderson (2001), who argues that
technological knowledge involves a holistic
development of three knowledge domains:
cognitive (content knowledge), affective
(functional knowledge), and psychomotor
(procedural knowledge). This understanding
provided a sound basis upon which content
knowledge, functional knowledge, and procedural
knowledge were retained to explore the
technological knowledge of ODP teachers.
Moreover, the intersection of the three knowledge
constructs (procedural functional content
knowledge) was included in our analysis. The
reason for this decision is the argument that
procedural knowledge cannot stand in isolation
from the cognitive domain, as these three
knowledge domains are interrelated (Louw & Du
Toit, 2010).
Table 3. Factor 1 (Procedural functional content knowledge)
Comprehensive Measure | Factor Loading | Mean Score | Standard Deviation
Determine strategies best suited to teach and motivate students about various concepts of advanced database management systems. | 0.624* | 3.654 | 1.001
Know the concepts and applications of advanced database management systems. | 0.601* | 3.832 | 1.041
Describe the right steps and rules to use a data projector and ITBox for teaching various concepts of advanced database management systems. | 0.665* | 4.000 | 1.046
Effectively use a data projector to teach Microsoft PowerPoint presentation and address basic technical troubleshooting problems without disrupting the lessons. | 0.744 | 3.654 | 1.056
Determine strategies best suited to teach and motivate students to understand various concepts of advanced database management systems and solve basic technical problems. | 0.711* | 3.551 | 1.021
Effectively demonstrate the right steps to use a data projector and ITBox to teach and motivate students about the concepts of file management systems and solve basic technical problems without disrupting the lesson. | 0.636* | 3.598 | 1.063
Know how to use ITBox with a data projector for teaching various concepts of advanced database management systems and solve file management problems. | 0.755* | 3.710 | 1.046
Effectively use ITBox and a data projector for teaching various concepts of advanced database management systems and solve virus attack problems without disrupting the lessons. | 0.757** | 3.738 | 0.994
Effectively describe the steps and rules to use a data projector and ITBox to teach and motivate students with strategies best suited to understand Web technology and solve basic connectivity problems when the need arises without disrupting the lessons. | 0.487 | 3.822 | 0.856
Effectively describe the right steps and rules to use a data projector and ITBox to teach and motivate students with strategies best suited to understand Spreadsheet programs and solve basic troubleshooting problems without disrupting the lessons. | 0.789 | 3.729 | 1.033
Effectively use a data projector and determine strategies best suited to teach and motivate students to understand various concepts of advanced database management systems and solve basic toolbar troubleshooting problems without disrupting the lesson. | 0.715* | 3.720 | 0.989
Know how to describe the right steps and rules to use a data projector to teach and motivate students to understand the concepts of Excel programs. | 0.515 | 4.019 | 0.890
Eigenvalue = 11.005, percentage of variance = 16.005
The surrogate method was used to name the 4
latent factors for technological knowledge
exploration as discovered from the exploratory
factor analysis. According to this surrogate
method, the name of a latent factor corresponds
to the name of the single comprehensive measure
that loaded highest amongst all semantically
related comprehensive measures of the factor.
The loadings of semantically related
comprehensive measures are marked with an
asterisk (*), and the measure that loaded highest
amongst the semantically related comprehensive
measures is marked with a double asterisk (**).
The terms "weak", "moderate" and "strong" refer,
respectively, to factor loading values in the
intervals [0.3, 0.5], [0.5, 0.75] and [0.75, 1.0]
(Liu et al., 2003).
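The loading-strength bands quoted above can be encoded as a small helper. The half-open boundary handling is our own illustrative choice, since the quoted intervals share their endpoints:

```python
def loading_strength(loading: float) -> str:
    """Classify an absolute factor loading using the bands of Liu et al. (2003).

    Assigning 0.5 and 0.75 to the higher band is an assumption made here for
    illustration; the source intervals overlap at their endpoints.
    """
    value = abs(loading)
    if value >= 0.75:
        return "strong"
    if value >= 0.5:
        return "moderate"
    if value >= 0.3:
        return "weak"
    return "negligible"
```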
The most important factor, accounting for
16.00% of the variance, is factor 1, presented in
Table 3. There were 12 measures in this factor,
with loadings ranging from 0.487 to 0.757. The
mean scores of the measures of this factor ranged
from 3.551 to 4.019, meaning that the ODP
teachers generally responded well above the
middle point of 3. The majority of the measures
semantically referred to the use of technology to
teach database management systems. In particular,
the measure with the highest loading (0.757),
which belongs to the knowledge category of
procedural functional content knowledge, refers
to knowing how to effectively use ITBox and a
data projector to teach database management
systems. This factor was named procedural
functional content knowledge, according to the
surrogate naming convention discussed earlier.

Table 4. Factor 2 (Functional knowledge)
Comprehensive Measure | Factor Loading | Mean Score | Standard Deviation
Determine strategies best suited to motivate students to understand various concepts of spreadsheet programs. | 0.724* | 4.112 | 0.805
Know various concepts and applications of spreadsheet programs, including Microsoft Excel. | 0.636* | 4.224 | 0.793
Know how to describe the right steps and rules to use a data projector to teach the concepts of formatting worksheets. | 0.752** | 3.972 | 0.956
Effectively use a data projector and Interactive Teaching Box (ITBox) to teach Worksheets Printing and solve all basic printing problems when the need arises. | 0.511* | 3.850 | 0.909
Determine strategies best suited to teach students to understand various concepts of file management programs. | 0.552* | 3.841 | 0.943
Effectively demonstrate the right steps to use ITBox and a data projector to motivate students to understand the concepts of Mail merge and solve basic troubleshooting problems when the need arises. | 0.629* | 3.963 | 0.951
Know how to effectively describe the right steps and procedures to use a data projector to teach various concepts of Keyboard customization and solve basic technical problems when the need arises. | 0.629* | 3.841 | 1.029
Know how to describe the right steps to use ITBox with a data projector to teach various concepts of Keyboard customization. | 0.619* | 3.841 | 1.083
Know how to effectively use ITBox with a data projector to teach and demonstrate various concepts of Worksheet formatting and solve basic troubleshooting problems when the need arises without disrupting the lessons. | 0.743* | 3.841 | 0.992
Effectively use a data projector and determine strategies best suited to motivate students to understand various concepts of Keyboard customization and Toolbar and solve basic troubleshooting problems when the need arises without disrupting the lessons. | 0.634 | 4.009 | 0.906
Know how to describe the right steps and rules to use a data projector to teach and motivate students with strategies to understand Microsoft Word processing concepts and solve toolbar access troubleshooting problems without disrupting the lesson. | 0.675 | 3.925 | 0.929
Effectively describe the right steps and strategies best suited to use a data projector and ITBox to teach and motivate students to understand various concepts of spreadsheet programs and solve troubleshooting problems without disrupting the lesson. | 0.709* | 3.981 | 0.879
Eigenvalue = 8.433, percentage of variance = 12.974
The second most important factor, accounting
for 12.97% of the variance, is factor 2 (Table 4),
which was named functional knowledge. This
factor has 12 measures, with loadings ranging from
0.511 to 0.752. The mean scores of the measures
of this factor ranged from 3.841 to 4.224, meaning
that the ODP teachers generally responded
well above the middle point of 3. The majority
of the measures semantically referred to the use
of technology to teach spreadsheet programs. In
particular, the measure with the highest loading
(0.752) belongs to the knowledge category of
functional knowledge.
The third important factor, accounting for
12.94% of the variance, is factor 3 (Table 5),
which was named procedural knowledge. This
factor also has 12 measures, with loadings ranging
from 0.484 to 0.734. The mean scores of the
measures of this factor ranged from 3.907 to 4.159,
also meaning that the ODP teachers generally
responded well above the middle point of 3. The
majority of the measures semantically referred
to the use of technology to teach the Microsoft
Word program. In particular, the measure with the
highest loading (0.734) belongs to the knowledge
category of procedural knowledge.

Table 5. Factor 3 (Procedural knowledge)
Comprehensive Measure | Factor Loading | Mean Score | Standard Deviation
Effectively determine strategies best suited to teach students about various concepts of the Microsoft Word program. | 0.658* | 4.065 | 0.924
Know various concepts and applications of advanced information presentation systems, including ITBox and PowerPoint presentation. | 0.648 | 4.159 | 0.859
Effectively describe the right steps and rules to use a data projector to teach the Microsoft Word program and printing devices. | 0.712* | 4.000 | 1.046
Effectively use a data projector to teach Windows Media and the Advanced Transcription System with strategies to solve basic troubleshooting problems as the need arises without disrupting the lesson. | 0.734** | 3.981 | 0.981
Effectively determine strategies best suited to teach and motivate students on the concepts of Web technology, e.g. the Internet. | 0.721 | 3.907 | 0.976
Effectively demonstrate and follow the right steps to use a data projector to teach students about the concepts of the Microsoft Word program and Audio typing and solve basic troubleshooting problems when the need arises. | 0.609* | 3.991 | 1.005
Effectively describe the right steps and procedures to use a data projector and ITBox to teach various concepts of the Advanced Presentation program and solve basic troubleshooting problems without disrupting the lessons. | 0.656* | 3.953 | 1.004
Know how to use ITBox with a data projector to teach various concepts of Toolbar and Keyboard Customization and solve basic troubleshooting problems without disrupting the lesson. | 0.708* | 3.991 | 1.032
Effectively use ITBox with a data projector to teach various concepts of Audio Typing and solve basic troubleshooting problems without disrupting the lessons. | 0.526* | 3.963 | 0.868
Effectively describe the right steps and rules to use a data projector and ITBox to teach and motivate students with strategies best suited to teach the Advanced Data Transcription system and solve toolbar troubleshooting problems without disrupting the lessons. | 0.484* | 4.131 | 0.848
Effectively use a data projector to determine strategies best suited to motivate students to understand various concepts of Spreadsheet programs and solve troubleshooting problems without disrupting the class. | 0.547 | 4.065 | 0.861
Effectively describe the right steps and strategies best suited to use a data projector and ITBox to teach and motivate students to understand various concepts of the Microsoft Word program and solve toolbar troubleshooting problems without disrupting the lesson. | 0.551* | 3.944 | 0.930
Eigenvalue = 8.408, percentage of variance = 12.935
The fourth factor, accounting for 7.72% of the
variance, is factor 4 (Table 6), which was named
content knowledge. This factor has 8 measures,
with loadings ranging from 0.437 to 0.674.
The mean scores of the measures of this factor
ranged from 3.776 to 4.028, also meaning that the
ODP teachers generally responded well above the
middle point of 3. The majority of the measures
semantically referred to the use of technology to
teach web technology. In particular, the measure
with the highest loading (0.674) belongs to the
knowledge category of content knowledge.
To complete the analysis of the exploratory
factor analysis results of the study, the scores of
the ODP teachers were calculated with respect to
the latent factors identified. The scores were
created to represent the placement of an individual
ODP teacher on the latent factors identified using
the factor analytic methodology. The factor scores
are the actual values of each ODP teacher on the
underlying latent factors and were calculated for
all 107 ODP teachers. The distribution and the
residual variance of the factor scores were
expressed in standard deviation units and
calculated using the Bartlett (1954) approach. The
Bartlett approach was used over other methods
because it uses the maximum likelihood estimation
scheme to produce unbiased estimates of the true
factor scores (DiStefano, Zhu & Mindrila, 2009).
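The Bartlett weighted least-squares score estimate can be sketched as follows; in practice `loadings` and `uniquenesses` come from the fitted factor model, and the names used here are illustrative:

```python
import numpy as np

def bartlett_scores(z, loadings, uniquenesses):
    """Bartlett (1954) WLS factor scores.

    z            : (n_obs, n_vars) standardised data
    loadings     : (n_vars, n_factors) factor loading matrix
    uniquenesses : (n_vars,) unique (residual) variances
    Returns the (n_obs, n_factors) score matrix
    F = Z Psi^-1 L (L' Psi^-1 L)^-1.
    """
    psi_inv = np.diag(1.0 / uniquenesses)               # inverse unique variances
    middle = np.linalg.inv(loadings.T @ psi_inv @ loadings)
    return z @ psi_inv @ loadings @ middle
```

A useful property of this estimator: if the data are generated exactly by the factor model with no noise, the scores recover the true factor values regardless of the unique variances.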
Table 6. Factor 4 (Content knowledge)
Comprehensive Measure | Factor Loading | Mean Score | Standard Deviation
Determine strategies best suited to teach and motivate students about various concepts of web technology and Windows Media Player for audio typing. | 0.654* | 3.907 | 0.885
Understand various concepts and applications of web technology, including sending emails and web surfing. | 0.674** | 4.028 | 0.852
Describe the right steps and rules to use a data projector to teach web technology such as sending e-mail, surfing the web and establishing a network connection. | 0.643* | 3.944 | 0.920
Effectively use ITBox and a data projector to teach web technology and solve connectivity troubleshooting problems without disrupting the lesson. | 0.544* | 3.776 | 0.945
Determine strategies best suited to teach and motivate students to understand the concepts of the Microsoft PowerPoint presentation program. | 0.528 | 3.832 | 0.807
Effectively describe the right steps and procedures to use a data projector to teach the concepts of Web technology, e.g. the Internet and Windows Media. | 0.617* | 3.860 | 0.874
Know how to use ITBox with a data projector to teach the concepts of the Internet and Web technology. | 0.565* | 3.916 | 0.933
Know how to describe the right steps and rules to use a data projector to teach and motivate students with strategies to understand concepts of Web Technology. | 0.437* | 3.888 | 0.894
Eigenvalue = 5.020, percentage of variance = 7.723

The demographic factor of experience was chosen
to compare and contrast the technological
knowledge of ODP teachers because the study of
ICT integration for teaching based on experience
differentials is of great interest (Gorder, 2008;
Meskill et al., 2002; Wetzel et al., 2007; Williams,
2003). Table 7 shows the mean and standard
deviation results of the factor score calculation
based on teacher experience. Direct inspection of
these results revealed that veteran ODP teachers
scored higher on each of the technological
knowledge factors than the beginner teachers. The
veteran teachers scored 0.199 on average, with a
standard deviation of 0.834, for procedural
functional content knowledge, while beginner
teachers scored -0.133 with a standard deviation
of 1.127. A mean score of 0.025 with a standard
deviation of 0.906 was obtained for the veteran
teachers in functional knowledge, while beginner
teachers scored -0.017 with a standard deviation
of 1.121. Moreover, the veteran teachers scored
higher in procedural knowledge, with a mean score
of 0.234 and standard deviation of 0.529, while
beginner teachers scored -0.157 with a standard
deviation of 1.239. The veteran teachers obtained
a mean score of 0.106 with a standard deviation
of 1.049 in content knowledge, while beginner
teachers scored -0.071 with a standard deviation
of 1.058.
The positivity of a factor score implies that, on
average, teachers' ratings on a latent factor are
above the mean score. Similarly, the negativity
of a factor score implies that, on average, teachers'
ratings on a latent factor are below the mean score,
because all factor loadings are positive (Tables 3
to 6). These results imply that veteran teachers
have the empirical measure of their technological
knowledge slightly above the mean score on
every latent factor, while the empirical measure
of the technological knowledge of beginner
teachers is slightly below the mean score on every
latent factor. However, it is important to further
test whether the observed differences are
statistically significant.
These results in Table 7 were further probed by
performing a one-way analysis of variance on the
factor scores to determine whether the observed
differences in the technological knowledge of ODP
teachers on the basis of experience are statistically
significant, with Table 8 showing the results. To
establish whether to reject the null hypothesis of
no significant difference between the means of the
two population samples, we can inspect the
satisfiability of the constraint F-value >= F-critical,
or of the constraint P-value <= 0.05, in Table 8.
The application of either of these constraints
enabled us to arrive at the following findings:
(a) There is no significant difference between
theproceduralfunctionalcontentknowledge
of beginner and veteran ODP teachers.
Table 7. Factor scores identified by technological knowledge factors for ODP teachers, related to the experience demography
Technological Knowledge Factor | Beginner (N=64) | Veteran (N=43)
Procedural functional content knowledge | -0.133 (1.127)* | 0.199 (0.834)**
Functional knowledge | -0.017 (1.121)* | 0.025 (0.906)**
Procedural knowledge | -0.157 (1.239)* | 0.234 (0.529)**
Content knowledge | -0.071 (1.058)* | 0.106 (1.049)**
The minimum score = *; the maximum score = **

Table 8. Significance test for factor scores of ODP teachers based on the experience demography
Technological Knowledge Factor | F-value | P-value | F-critical
Procedural functional content knowledge | 2.728 | 0.102 | 3.932
Functional knowledge | 0.040 | 0.842 | 3.932
Procedural knowledge | 3.826 | 0.053 | 3.932
Content knowledge | 0.720 | 0.398 | 3.932
Significant difference = *; P = 0.05
(b) There is no significant difference between
the functional knowledge of beginner and
veteran ODP teachers.
(c) There is no significant difference between
the procedural knowledge of beginner and
veteran ODP teachers.
(d) There is no significant difference between the
content knowledge of beginner and veteran
ODP teachers.
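The group comparison above can be sketched with scipy's one-way ANOVA; the score vectors here are random placeholders drawn to match the reported group means and standard deviations, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
beginner_scores = rng.normal(loc=-0.133, scale=1.127, size=64)  # placeholder
veteran_scores = rng.normal(loc=0.199, scale=0.834, size=43)    # placeholder

# One-way ANOVA: reject the null of equal group means when the p-value is
# at most 0.05 (equivalently, when F exceeds the critical value).
f_stat, p_value = stats.f_oneway(beginner_scores, veteran_scores)
significant = p_value <= 0.05
```

With two groups, this F test is equivalent to an independent-samples t test (F = t squared), which is why a single F-critical value appears for every factor in Table 8.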
DISCUSSION
This study applied factor analysis to explore
the nature of beginner and veteran ODP teachers'
technological knowledge in technology-enhanced
classrooms at TVET colleges in South Africa. The
discussion in this section centres on the research
questions that were pursued in the study, while at
the same time mapping the results against the
theoretical section.
Research Question One
What is the nature of the technological knowledge
of beginner and veteran ODP teachers in the use
of ICT as a pedagogical tool? This study applied
the definition of Aggarwal (1999), that technology
is both an art (functional knowledge) and a
science (procedural knowledge), as well as the
works of Anderson (2011) and Louw and Du Toit
(2010), to explore the technological knowledge of
ODP teachers. In practice, there is an apparent
differential between procedural knowledge and
functional knowledge for all participants. For
example, there are many good football players
throughout the world who know the rules and
steps to play football fantastically (functional
knowledge). However, not many of these football
players can bring innovative excitement into the
game of football: consider, for instance, the scoring
acumen of King Pele of Brazil, the dribbling genius
of Maradona of Argentina and the back-heeling
invention of Ronaldo of Portugal. These instances
illustrate the practical distinction between
functional knowledge and procedural knowledge.
If teachers have a clear understanding of
technology and its purpose as a pedagogical tool
in an ICT-enhanced classroom, then using ICT
effectively as digital-age teachers becomes a
practice that produces great excitement and
stimulates learning amongst students in all
subjects.
The factor analytic methodology was used to
explore the latent factors underlying the
technological knowledge of ODP teachers,
following Aggarwal (1999) to guide the
exploration. The most important latent factor,
accounting for 16.00% of the variance in the
observed measures, is procedural functional
content knowledge. This factor was measured
by 12 comprehensive measures and includes
responses that show an effective demonstration
of the understanding and know-how to choose
and use ICT tools such as the ITBox and the data
projector to teach database management systems.
The next important latent factor, accounting for
12.97% of the variance, is functional knowledge,
which relates to responses that show the ability of
ODP teachers to describe the steps and rules that
govern the use of ICT tools and the understanding
of how to apply rules and concepts to teach
spreadsheet programs. Another latent factor,
accounting for 12.94% of the variance, is
procedural knowledge, which indicates responses
that show an effective demonstration of the ability
of ODP teachers to choose and use appropriate
ICT tools to teach programs such as Microsoft
Word. The last latent factor, accounting for 7.72%
of the variance, is content knowledge, which
indicates responses that show an understanding
of the concept of using technology to teach web
technology. This study therefore found the nature
of the technological knowledge of beginner and
veteran ODP teachers in the use of ICT as a
pedagogical tool to be procedural functional
content knowledge, functional knowledge, content
knowledge, and procedural knowledge. The
procedural functional content knowledge gave the
most important factor explaining the technological
knowledge of the ODP teachers sampled.
Research Question Two
Are there any significant differences in the
technological knowledge of beginner and veteran
ODP teachers with respect to their teaching
experience? Comparing and contrasting the
experience of beginner and veteran teachers
provides a framework for understanding the
dimensions of the technological knowledge of
teachers in the digital age (Wetzel, Zambo &
Ryan, 2007). The examination of the technological
knowledge of beginner and veteran ODP teachers
using factor analysis indicated slight differences
in their technological knowledge. The average
factor score of the veteran ODP teachers was
slightly above the overall average, while the
average factor score of beginner teachers was
slightly below it. However, the results of the
one-way analysis of variance on the factor scores
confirm that the observed differences in the
technological knowledge of ODP teachers are not
statistically significant on the basis of the
experience factor.
The observed slight differences, coupled with the
low average factor scores of ODP teachers on
technological knowledge (procedural, functional,
content), suggested that we probe further using
classroom observation and interviews, following
Graham et al. (2009) and Denzin (1978), to enrich
the results of this study. Data were collected
through classroom observation and interviews
from 3 veteran and 3 beginner ODP teachers.
These teachers were purposively selected by the
head of the Business Studies department in each
FET College based on their years of teaching
experience and their willingness to articulate their
craft. The three beginner teachers had 1, 2 and 4
years of ODP teaching experience, while the
veteran teachers had 14, 17 and 25 years of ODP
teaching experience. Each teacher was observed
twice, at intervals of 2 weeks, based on the
departmental timetable. Each lesson lasted about
45 minutes. All the study participants completed
a consent form and agreed to participate
voluntarily in the classroom observation and
interview sessions. In addition, the teachers
consented to digital voice and video recording.
All classroom observations and interviews were
therefore audio- and video-taped, and transcribed
using a data reduction technique (Miles &
Huberman, 1994) and the data analysis spiral
technique proposed by Denzin (1978). The
recorder was sensitive enough to capture low
voices and to eliminate background noise that
might impede the quality of the voices of study
participants. Field notes were used to maintain a
record of events as a back-up to both the digital
voice recorder during the interviews and the video
recording of classroom observations.
Data collected through direct observation and interviews were focused on the four latent factors discovered in the exploratory factor analysis phase. For example, we observed how veteran and beginner ODP teachers:
(a) Effectively used the Interactive Teaching Box (ITB) and a data projector to teach the concepts and skills in database management, towards enhancing skill acquisition by students.
(b) Applied strategies to solve file management and basic technical troubleshooting problems as the need arose, without disrupting the lesson.
(c) Demonstrated the steps and procedures followed in using a data projector with the ITB to teach various programs in ODP, for example, Microsoft Excel, word processing, database management, keyboard customization, formatting worksheets and printing.
The semi-structured interview questions were
constructed to address possible misconceptions
that came up during the observations. Examples
of questions in this regard include:
(a) I observed that you were able to quickly fix the monitor that suddenly turned blank without disrupting the lesson or calling the technologists. Were you trained in how to solve all these technical problems?
(b) I observed you were unable to help the student who was struggling with his printing device. Do you have reasons for this?
(c) Why did you use verbal instruction to explain the "mail merging" skill instead of demonstrating on the data projector available in your classroom?
A picture of beginner and veteran teachers' technological knowledge in their use of ICT infrastructure as a pedagogical tool begins to emerge from this study. We found that both veteran and beginner teachers used the ITB with a data projector to teach ODP in ICT-enhanced classrooms equipped with a Dictaphone and internet connectivity. The average factor score of veteran teachers on the factor that represents technological knowledge (Table 7) is slightly higher than that of beginner teachers. It might be that veteran teachers were able to solve all basic technical troubleshooting problems without calling on the available technologists for help. For example, the monitor of the system and the smart white board were observed to go blank suddenly as one veteran teacher was demonstrating to the students on a computer. The teacher quickly checked the power cable and refastened it firmly, and the monitor and smart white board screen were restored without disrupting the lesson. In explaining why she was able to fix the problem, the veteran teacher remarked:
Researcher – I observed you were able to solve various basic troubleshooting problems on the computer without disrupting your lessons. Were you taught how to solve all these problems in your professional teacher training?
VT3 – Oh! We were taught with obsolete typewriters during our professional teacher training then; unlike now, the students are just too fortunate with new technologies compared to our time. I learnt the skills to solve basic technical problems when the need arises through observation and technical mentoring by the technologists on the job, by asking them questions, and practicing what I have learnt from them instantly; then I became proficient over the years.
By contrast, it was observed that beginner teachers were often unable to solve some basic troubleshooting problems during the lessons. For example, a beginner teacher with 2 years of teaching experience suddenly switched from demonstration to verbal instruction in explaining the steps and procedures to follow in "mail merging". The beginner teacher remarked:
Researcher – I observed that you suddenly abandoned the ITB and used verbal instruction to explain "mail merging", and some of the students were confused at a point. Do you have reasons for this?

BT2 – I used verbal instruction because the keyboard was hanging and the ITB was not responding. I quickly abandoned the equipment and used verbal instruction to avoid embarrassment and disruption of the lesson, because I needed to call the technologists to fix the problem.
Researcher – I observed that you always struggle to complete your lesson, as some students encountered technical troubleshooting problems on their computers which you could not help them to address. How do you then achieve the curriculum objectives and enhance students' skills acquisition?

BT3 – Because we were not trained in how to use the ITB with a data projector as pedagogical tools, it posed a lot of challenges; there are numerous problems we encounter while teaching which always set us back, as the technologists are not responsive and it takes a lot of time before they can respond to technical problems. Some of the computers are not working properly, as you can see. These problems are a major hindrance to the completion of the curriculum most of the time.
The results of this study are in conformity with the findings of Meskill et al. (2002) that learning to use ICT in a preservice teacher education class is not as compelling as learning to use ICT as you teach in the classroom. Veteran and beginner teachers were able to use the ICT infrastructure to a certain degree in their ODP classrooms, and they demonstrated an understanding of some basic concepts, steps and principles that guide the use of ICT for the teaching of ODP. In general, veteran ODP teachers were better able to use all the available ICT tools effectively in their ICT-enhanced classrooms, and they exhibited technological knowledge with dexterity in solving basic technical troubleshooting problems.
This result also concurs with the findings of Ndibalema (2014), Cubukcuoglu (2013) and Kumar and Kumar (2003) that teachers have positive perceptions of the use of ICT in teaching, but that they do not use it pedagogically because of their lack of technical skills and sufficient training. The self-report factor analysis, however, showed slightly higher factor scores for the technological knowledge of veteran teachers than for that of beginner teachers. The cause of this slight differential could not be ascertained from the self-report questionnaires alone, but the subsequent classroom observations and interviews enriched the findings of this study and clearly indicated the small differences in the technological knowledge of veteran and beginner teachers.
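The group comparison of factor scores described above can be illustrated with a small numerical sketch. The snippet below computes regression-method factor scores (following the approach discussed by DiStefano et al., 2009) on simulated data, with hypothetical responses and loadings rather than the study's actual PrFPACK data, and compares group means:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical item responses: 40 teachers x 6 items, with a group label
# (0-19 beginner, 20-39 veteran). Simulated data, not the PrFPACK survey.
Z = rng.normal(size=(40, 6))
veteran = np.arange(40) >= 20
Z[veteran] += 0.5            # veterans respond somewhat higher on all items

# Regression-method factor scores (DiStefano et al., 2009):
# F = Z R^{-1} L, where R is the item correlation matrix and L holds the
# factor loadings (here an assumed uniform one-factor solution).
R = np.corrcoef(Z, rowvar=False)
L = np.full((6, 1), 0.7)                  # assumed loadings
scores = Z @ np.linalg.solve(R, L)

# Compare average factor scores between the two experience groups.
gap = scores[veteran].mean() - scores[~veteran].mean()
print(gap > 0)
```

Because the simulated veteran group responds higher on every item, its mean factor score exceeds the beginner group's, mirroring the direction (though not the magnitude) of the difference reported in Table 7.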
IMPLICATIONS OF THE STUDY
The implications of this study pinpoint the importance of the technological knowledge of teachers in the digital age. The study has established the rationale for differentiating between functional and procedural knowledge when considering the technological knowledge of teachers. It clearly points to the need for professional development of beginner and veteran teachers with regard to their technological knowledge, and for programmes that improve teaching and learning quality by emphasizing the development of that knowledge. In this way, teachers will be able to use the ICT infrastructure astutely, with strategies to solve intrinsic ICT troubleshooting problems when the need arises, so as to avoid disruption of lessons. Teachers are more likely to integrate ICT tools in their teaching as pedagogical tools if they have a clear understanding and knowledge of technology, and if it helps them to achieve their instructional objectives.
The results of this study suggest that teachers should be aware that functional, content and procedural knowledge are important for the implementation and integration of ICT systems in schools and colleges. Moreover, while teachers should work on improving their content, functional and procedural knowledge of technology, management in their institutions should provide support to develop these knowledge areas. Management could provide this support by encouraging teachers to develop these knowledge areas continuously, and should ensure that teachers receive constant training in them to contend with the changing nature of technology. This study adds to the definition of technology proposed by Aggarwal (1999) and describes technology as the application of knowledge and skills in the astute use of practical instruments to solve problems, meet needs and enhance effective performance.
This study has also established the necessity of having a comprehensive inventory that is suitable for measuring the technological knowledge of teachers. At present, many of the existing inventories are highly subjective and too generic. This implies that researchers should continue to search for better measurement models that could objectively measure the technological knowledge of teachers, and for novel ways to improve its development. This study adds to the work of Mishra and Koehler (2006), Cox (2008) and Graham et al. (2009) in developing a comprehensive instrument to measure the technological knowledge of teachers and, most importantly, a clear understanding of technology for the effective integration of ICT as a pedagogical tool. Furthermore, while the TK in TPACK represents technological knowledge, this study has divided TK into PrK (procedural knowledge) and FK (functional knowledge) to clarify the diversity involved in the set called TK.
The methodology of this study demonstrates that a combination of self-reported instruments (questionnaire, interview) and observation (classroom observation and field notes) is a useful approach that presents conclusive evidence for exploring the technological knowledge of teachers. The findings of this study have shown an interrelatedness between the data collected through the PrFPACK survey and those of the classroom observations and interviews. In the classroom observation results, the components of PrFPACK were clearly observed and ascertained in the technological knowledge of the ODP teachers. Similarly, the use of field notes was found expedient in noting down some important features of the technological knowledge of teachers. For example, how a veteran teacher was able to solve some basic technical problems without professional training could not be reported comprehensively in the self-reported survey. Field notes also provided a summary of the difficulties teachers were experiencing during lesson preparation and actual ICT-enhanced classroom instructional practices, thus providing the basis for improving the teachers' professional training in the use of ICT. The methodology of this study should therefore be considered in order to obtain more comprehensive information when studying the technological knowledge of teachers.
LIMITATIONS AND FUTURE WORK
The findings of this study were predominantly based on the ODP teachers from all 11 FET government colleges in the Gauteng Province of South Africa. The sampling procedure and the nature of the sample were restricted to ODP teachers. Although ODP teachers were the focus of this study, the framework is sufficiently universal and can readily be applied to other education domains and subject areas in a technology-enhanced classroom, because PrFPACK as a theoretical framework has clarified the diversity involved in technological knowledge. In addition, the self-reported perceptions of the sampled ODP teachers may not adequately represent their technological knowledge in general. Finally, this study has only carried out a cross-sectional survey, which implies that any change in the technological knowledge of the ODP teachers over time was neglected.
Future work should consider longitudinal surveys to account for continuing technological knowledge development amongst ODP teachers. In addition, it is possible to determine the level of technological knowledge of individual teachers in other subject areas and at different institutional levels in ICT-enhanced classrooms, and to identify potential areas needing intervention. We hope to further investigate this point using a partial credit probabilistic model based on item response theory.
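For reference, the partial credit model alluded to here is conventionally written as follows (a standard item response theory formulation; the notation is illustrative and is not defined in this study):

```latex
P(X_{ij} = x \mid \theta_j)
  = \frac{\exp \sum_{k=0}^{x} (\theta_j - \delta_{ik})}
         {\sum_{h=0}^{m_i} \exp \sum_{k=0}^{h} (\theta_j - \delta_{ik})},
  \qquad x = 0, 1, \ldots, m_i,
```

where \(\theta_j\) is teacher \(j\)'s latent technological knowledge, \(\delta_{ik}\) is the difficulty of step \(k\) of item \(i\), \(m_i\) is the number of score steps of item \(i\), and the sum for \(k = 0\) is defined as zero.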
CONCLUSION AND
RECOMMENDATIONS
The factor analytic methodology was explored in this study to define the latent factors that characterize the technological knowledge of the teachers who participated. After executing the factor analysis routines of principal component analysis and exploratory factor analysis on the response data, and interpreting and discussing their outputs, the findings of this study were as follows:
(a) The PrFPACK inventory of comprehensive measures provides a useful instrument for exploring the technological knowledge of ODP teachers in an ICT-enhanced classroom environment.
(b) The application of the factor analytic methodology supports the identification of the technological knowledge of the ODP teachers as comprising procedural knowledge, functional knowledge, content knowledge and procedural functional content knowledge.
(c) This study finds no significant differences in the technological knowledge of the ODP teachers with respect to their teaching experience.
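As a concrete illustration of the routines named above, the sketch below applies principal component analysis to simulated survey responses (hypothetical data, not the study's PrFPACK instrument) and retains components using the Kaiser eigenvalue-greater-than-one criterion (Kaiser, 1960), one of the retention rules cited in this study's reference list:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Likert-style responses: 60 teachers x 8 survey items.
# Items 0-3 load on one latent factor, items 4-7 on another.
# (Simulated data, not the study's PrFPACK responses.)
f1 = rng.normal(size=(60, 1))
f2 = rng.normal(size=(60, 1))
noise = rng.normal(scale=0.5, size=(60, 8))
X = np.hstack([np.repeat(f1, 4, axis=1), np.repeat(f2, 4, axis=1)]) + noise

# Principal component analysis on the item correlation matrix.
R = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(R)[::-1]   # eigenvalues, largest first

# Kaiser criterion (Kaiser, 1960): retain components with eigenvalue > 1.
n_retained = int(np.sum(eigvals > 1.0))
print(n_retained)   # the two simulated factors are recovered
```

In a real analysis, the retained components would then be rotated and interpreted as the latent factors, as was done with the four PrFPACK factors in this study.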
The following recommendations are made
based on the findings in this study:
(a) Beginner and veteran teachers, particularly those teaching in ICT-enhanced classrooms, should have a basic understanding of the functions of the various components of the computer and of the purpose of the ICT infrastructure as a pedagogical tool.
(b) Beginner and veteran teachers should acquire the ability and skills to solve basic troubleshooting problems during classroom instruction, to avoid classroom disruptions and their frustration in achieving the lesson objectives.
(c) Educational stakeholders and teacher training institutions should include in their curricula novel methods of using new technologies as pedagogical tools, and the understanding of technological knowledge.
REFERENCES
Adegbenro, B. J., Mwakapenda, W. W., & Olugbara, O. O. (2013). Conceptual relationship between e-skills procedural and pedagogical knowledge of vocational business studies teachers in ICT-based classroom. Proceedings of the Global Business and Technology Association Fifteenth Annual International Conference. Finland: GBATA.
Adegbenro, B. J., Olugbara, O. O., & Mwakapenda, W. W. (2012). Development and validation of an assessment instrument for office data processing teachers' technology knowledge in ICT-based classroom context in South Africa FET Colleges. Proceedings of the Society for Information Technology and Teacher Education. Austin: AACE.
Aggarwal, J. C. (1999). Essentials of educational technology: Teaching learning innovations in education. Kerala: VIKAS.
Agyei, D. D., & Voogt, J. M. (2011). Exploring the potential of the will, skill, tool model in Ghana: Predicting prospective and practicing teachers' use of technology. Computers & Education, 56(1), 91–100. doi:10.1016/j.compedu.2010.08.017
Albion, R. P., Jamieson-Proctor, R., & Finger, G. (2010). Auditing the TPACK confidence of Australian pre-service teachers: The TPACK confidence survey (TCS). Chesapeake: SITE.
Alkali, S. C., Ishaku, J. M., Yusuf, S. N., & Aisha, S. (2013). Application of environmetric techniques on surface water and groundwater systems in Maiduguri metropolitan area, northeast Nigeria. Peak Journal of Physical and Environmental Science Research, 1(6), 87–94.
Anderson, J. R. (2011). Cognitive psychology and its implications (5th ed.). New York: Addison Wesley Longman.
Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development and assessment of ICT-TPCK: Advances in TPCK. Computers & Education, 52(1), 154–168. doi:10.1016/j.compedu.2008.07.006
Archambault, L., & Crippen, K. (2009). Examining TPACK among distance educators in the United States. Contemporary Issues in Technology and Teacher Education, 9(1). Retrieved November 8, 2013, from http://www.citejournal.org/vol9/iss1/general/article2.cfm
Bartlett, M. S. (1954). A further note on the multiplying factors for various χ2 approximations in factor analysis. Journal of the Royal Statistical Society. Series A (General), 16, 296–298.
Biggs, J. (1999). Teaching for quality at university: What the student does (2nd ed.). London: McGraw-Hill Education.
Biggs, J. (2003). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 18(1), 57–75. doi:10.1080/0729436990180105
Cox, S. (2009). A conceptual analysis of technological pedagogical content knowledge (Unpublished doctoral dissertation). Brigham Young University, Provo, UT.
Creswell, J. W. (2009). Research design: Qualitative, quantitative and mixed method approaches (3rd ed.). Los Angeles: Sage.
Cubukcuoglu, B. (2013). Factors enabling the use of technology in subject teaching. International Journal of Education and Development using Information and Communication Technology, 9(3), 50–60.
Davis, K. S., & Falba, C. J. (2002). Integrating technology in elementary preservice teacher education: Orchestrating scientific inquiry in meaningful ways. Journal of Science Teacher Education, 13(4), 303–329. doi:10.1023/A:1022535516780
Denzin, N. K. (1978). The research act. London: McGraw Hill.
Department of Basic Education. (2007). National examination assessments and measurement. Pretoria: Government Printers.
DiStefano, C., Zhu, M., & Mindrila, D. (2009). Understanding and using factor scores: Considerations for the applied researcher. Practical Assessment, Research & Evaluation, 14(20), 1–8.
Dugger, W., & Yung, J. E. (1995). Technology education today. Bloomington: Phi Delta Kappa Educational Foundation.
Ferris, T. L. J. (2009). On the method of research for systems engineering. 7th Annual Conference on Systems Engineering Research. Loughborough: Research School of Systems Engineering.
Foster, P. A., Dawson, V. M., & Reid, D. (2005). Measuring preparedness to teach with ICT. Australasian Journal of Educational Technology, 21(1), 1–18.
Friesen, S., & Lock, J. V. (2010). High performance districts in the application of 21st century learning technologies. Galileo Educational Network, University of Calgary.
Gess-Newsome, J., & Lederman, N. G. (Eds.). (2002). Examining pedagogical content knowledge: The construct and its implications for science education. Dordrecht, Netherlands: Springer. doi:10.1007/0-306-47217-1
Gomez, E. A., & Elliot, N. (2013). Measuring mobile ICT literacy: Short-message performance assessment in emergency response settings. IEEE Transactions on Professional Communication, 56(1), 16–32. doi:10.1109/TPC.2012.2208394
Gorder, L. M. (2008). A study of teacher perceptions of instructional technology integration in the classroom. Delta Pi Epsilon Journal, 50(2), 63–76.
Graham, C. R., Burgoyne, N., Cantrell, P., Smith, L., St. Clair, L., & Harris, R. (2009). Measuring the TPACK confidence of in-service science teachers. TechTrends, 53(5), 70–79.
Gumbo, M. T. (2003). Indigenous technologies: Implications for a technology education curriculum (Unpublished PhD thesis). Vista University, Pretoria, South Africa.
Herschbach, D. R. (1995). Technology as knowledge: Implications for instruction. Journal of Technology Education, 7(1), 31–42.
Hiebert, J., & Lefevre, P. (1986). Conceptual and procedural knowledge in mathematics: An introductory analysis. In J. Hiebert (Ed.), Conceptual and procedural knowledge: The case of mathematics (pp. 1–27). Hillsdale, NJ: Erlbaum.
Horn, J. (1965). A rationale and test for the number of factors in factor analysis. Psychometrika, 30(2), 179–185.
ISTE. (2008). National educational technology standards for teachers (NETS-T) 2008. Retrieved October 28, 2013 from http://www.Iste.org/Content/navigationMenu/NETS/ForTeachers/2008Standard/NETS_for_Teachers_208.htm
Kaiser, H. F. (1960). The application of electronic computers to factor analysis. Educational and Psychological Measurement, 20(1), 141–151. doi:10.1177/001316446002000116
Kawashima, N., & Shiomi, K. (2007). Factors of the thinking disposition of Japanese high school students. Social Behavior and Personality, 35(2), 187–194. doi:10.2224/sbp.2007.35.2.187
Koehler, M., & Mishra, P. (2009). What is technological pedagogical content knowledge? Contemporary Issues in Technology and Teacher Education, 9(1), 60–70.
Kumar, P., & Kumar, A. (2003). Effect of a web-based project on preservice and in-service teachers' attitudes towards computers and technology skills. Journal of Computing in Teacher Education, 19(3), 87–92.
Lau, C., & Sim, J. (2008). Exploring the extent of ICT adoption among secondary school teachers in Malaysia. International Journal of Computing and ICT Research, 2(2), 19–36.
Lee, M. H., & Tsai, C. C. (2010). Exploring teachers' perceived self-efficacy and technological pedagogical content knowledge with respect to educational use of the World Wide Web. Instructional Science, 38(1), 1–21. doi:10.1007/s11251-008-9075-4
Liu, C., Lin, K., & Hou, Y. (2003). Application of factor analysis in the assessment of groundwater quality in a blackfoot disease area in Taiwan. The Science of the Total Environment, 313(1-3), 77–89. doi:10.1016/S0048-9697(02)00683-6 PMID:12922062
Louw, L. P., & du Toit, E. R. (2010). Help! I'm a student teacher! Skills development for teaching practice. Pretoria: Van Schaik.
Lu, K. L., Liu, C. W., & Jang, C. S. (2012). Using multivariate statistical methods to assess the groundwater quality in arsenic-contaminated area of southwestern Taiwan. Environmental Monitoring and Assessment, 184(10), 6071–6085. doi:10.1007/s10661-011-2406-y PMID:22048921
Meskill, C., Mossop, J., DiAngelo, S., & Pasquale, R. (2002). Expert and novice teachers talking technology: Precepts, concepts, and misconceptions. Language Learning & Technology, 6(3), 46–57.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). London: SAGE Publications.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. doi:10.1111/j.1467-9620.2006.00684.x
National Council for Accreditation of Teacher Education. (2010). What makes a teacher effective? Retrieved October 16, 2013 from http://www.ncate.org/LikClick.aspx
Ndibalema, P. (2014). Teachers' attitudes towards the use of information communication technology (ICT) as a pedagogical tool in secondary schools in Tanzania: The case of Kondoa District. International Journal of Education and Research, 2(2), 1–16.
Niederhauser, D. S., & Stoddart, T. (2001). Teachers' instructional perspectives and use of educational software. Teaching and Teacher Education, 17(1), 15–31. doi:10.1016/S0742-051X(00)00036-6
Niess, M. L., Ronau, R. N., Shafer, K. G., Driskell, S. O., Harper, S. R., Johnston, C., & Kersaint, G., et al. (2009). Mathematics teacher TPACK standards and development model. Contemporary Issues in Technology and Teacher Education, 9(1), 4–24.
Nissen, M. E. (2006). Harnessing knowledge dynamics: Principled organisational knowing and learning. Hershey, PA: IRM Press. doi:10.4018/978-1-59140-773-7
Olugbara, O. O., Millham, R., Heukelman, D., Thakur, S., Wesso, H. W., & Sharif, M. (2014). Determining e-skills interventions to improve the effectiveness of service delivery by community development workers. Proceedings of the e-Skills for Knowledge Production and Innovation Conference 2014, Cape Town, South Africa, 305–334. Retrieved December 14, 2014 from http://proceedings.e-skillsconference.org/2014/e-skills305-334Olugbara879.pdf
Partnership for 21st Century Skills. (2004). 21st century learning environments white paper. Retrieved July 1, 2009 from http://www.21stcenturyskills.org/documents/le_white_paper-1.pdf
Petzer, L., & Steenkamp, K. (2004). Discovering technology: Grade 5 learners' book. Pietermaritzburg: Shuter & Shooter.
Pierson, M. E. (2001). Technology integration practice as a function of pedagogical expertise. Journal of Research on Computing in Education, 33(4), 413–430.
Price, K. (1987). The use of technology: Varying the medium in language teaching. In W. M. Rivers (Ed.), Interactive language teaching (pp. 155–169). Cambridge: Cambridge University Press.
Pudi, T. I. (2011). Understanding technology education from a South African perspective. Pretoria: Van Schaik Publishers.
Romizowski, J., & Alexander, T. (1980). Transfer of educational technology: International yearbook of educational and instructional technology. New York: Cambridge University Press.
Ryle, G. (1958). Knowing how and knowing that. The Philosophical Review, 46, 19–28.
Schmidt, D., Baran, E., Thompson, A., Koehler, M. J., Mishra, P., & Shin, T. (2009). TPACK: The development and validation of an assessment instrument for pre-service teachers. Proceedings of the 2009 Annual Meeting of the American Educational Research Association. San Diego: AERA.
Shavelson, R., Ruiz-Primo, A., Li, M., & Ayala, C. (2003). Evaluating new approaches to assessing learning (CSE Report 604). Los Angeles, CA: University of California National Center for Research on Evaluation.
Shinas, V. H., Yilmaz-Ozden, S., Mouza, C., Karchmer-Klein, R., & Glutting, J. J. (2013). Examining domains of technological pedagogical content knowledge using factor analysis. Journal of Research on Technology in Education, 45(4), 339–360. doi:10.1080/15391523.2013.10782609
Shulman, L. S. (1986). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–22. doi:10.17763/haer.57.1.j463w79r56455411
Tondeur, J., van Braak, J., Sang, G., Voogt, J., Fisser, P., & Ottenbreit-Leftwich, A. (2012). Preparing pre-service teachers to integrate technology in education: A synthesis of qualitative evidence. Computers & Education, 59(1), 134–144. doi:10.1016/j.compedu.2011.10.009
Uwameiye, R., & Adegbenro, B. J. (2007). Trainers' and trainees' assessment of the implementation of ICT program for secretaries at the staff training centers in Southwestern Nigeria. International Journal of Information and Communication Technology Education, 3(1), 1–9. doi:10.4018/jicte.2007010101
Velicer, W. F. (1976). Determining the number of components from the matrix of partial correlations. Psychometrika, 41(3), 321–327. doi:10.1007/BF02293557
Wetzel, K., Zambo, R., & Ryan, J. (2007). Contrasts in classroom technology use between beginning and experienced teachers. International Journal of Technology in Teaching and Learning, 3(1), 15–27.
Williams, M. D. (2003). Technology integration in education. In S. C. Tan & F. L. Wong (Eds.), Teaching and learning with technology: An Asia-Pacific perspective (pp. 17–31). Singapore: Prentice Hall.
Williams, T. I. (1992). A short history of twentieth-century technology. Oxford: Clarendon Press.
Zhao, Y. (2003). What teachers should know about technology: Perspectives and practices. Information Age Publishing.
ADDITIONAL READING
Adegbenro, B. J. (2013). Assessment of technological, pedagogical and content knowledge framework in the use of information communication technology for office data processing instruction (Unpublished doctoral dissertation). Tshwane University of Technology, Pretoria.
Buabeng-Andoh, C., & Totimeh, F. (2012). Teachers' innovative use of computer technologies in classroom: A case of selected Ghanaian schools. International Journal of Education and Development using ICT, 18(3), 22–34.
Cox, S. (2008). A conceptual analysis of technological pedagogical content knowledge (Unpublished doctoral dissertation). Brigham Young University, Provo, UT.
Dirckinck-Holmfeld, L., Hodgson, V., Jones, C., de Laat, M., McConnell, D., & Ryberg, T. (2010). Teacher use of ICT: Challenges and opportunities. Proceedings of the 7th International Conference on Networked Learning. Lancaster: Lancaster University.
Hismanoglu, M. (2012). Prospective EFL teachers' perceptions of ICT integration: A study of distance higher education in Turkey. Journal of Educational Technology & Society, 15(1), 185–196.
Kafyulilo, A. C. (2010). Practical use of ICT in science and mathematics teachers' training at Dar es Salaam University College of Education: An analysis of prospective teachers' technological pedagogical content knowledge (MSc thesis). University of Twente.
Kimbell, R., Stables, K., & Green, R. (1996). Understanding practice in design and technology. Buckingham: Open University Press.
Pudi, T. I. (2011). Understanding technology education from a South African perspective. Pretoria: Van Schaik.
Rodrigues, S., Marks, A., & Steel, P. (2003). Developing science and ICT pedagogy and content knowledge: A model of continuing professional development. Innovations in Education and Teaching International, 40(4), 386–394. doi:10.1080/1470329032000128413
Unwin, T. (2005). Towards a framework for the use of ICT in teacher training in Africa. Open Learning, 20(2), 113–129. doi:10.1080/02680510500094124
KEY TERMS AND DEFINITIONS
Functional Knowledge: Understanding how to describe the rules and steps to follow in the use of digital tools to teach.
ICT Tools: Information communication technology tools are digital infrastructures such as computers, laptops, desktops, data projectors, software programs, printers, scanners and the interactive teaching box.
Office Data Processing: Professional office skills and the operation of electronic systems for data processing and accounting in an organisation and industry.
Procedural Knowledge: Effective application of digital knowledge in the classroom, with the ability to solve recurrent problems while using digital tools.