Utilizing Internet Technologies To Support Learning
International Journal of Information Management 22 (2002) 27–46

Utilizing Internet technologies to support learning: an empirical analysis

Mihir Parikh (a,*), Sameer Verma (b)

a Institute for Technology and Enterprise, Polytechnic University, Five Metrotech Center, LC401, Brooklyn, NY 11201, USA
b College of Business, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132, USA

Abstract

The Internet has evolved into a universal platform to communicate and share information. It has profoundly impacted the way in which we organize, work, and learn. This paper proposes and evaluates a unifying framework that utilizes two Internet technologies, Web-based pull technology and push technology, to support classroom-based learning. We developed two fully operational education support systems based on the framework for two different types of courses. One system supports courses requiring extensive interactions, both communication and transfer of data files, among the course instructor, individual students, and student teams working on group projects. The other system supports courses requiring moderate interactions. One hundred and eighty-one students in eight classes across three semesters used and evaluated these systems. The study found that the systems supported learning by providing critical course information, study material, and assignments conveniently, in a timely manner, and in usable formats. The systems were user friendly and increased student productivity. The students were satisfied with the systems and found them useful. In addition, the study found that system support for highly interactive courses was rated higher than that for less interactive courses on all dimensions of system support. © 2002 Elsevier Science Ltd. All rights reserved.

Keywords: Internet technologies; Education support systems; Learning support; Pull technology; Push technology

An early version of this paper with preliminary results was published in the Proceedings of the Americas Conference on Information Systems (AMCIS), held in Milwaukee in August 1999.
*Corresponding author. Tel.: +1-718-260-3867; fax: +1-718-260-3874. E-mail addresses: mparikh@poly.edu (M. Parikh), sverma@sfsu.edu (S. Verma).
0268-4012/02/$ - see front matter © 2002 Elsevier Science Ltd. All rights reserved. PII: S0268-4012(01)00038-X

1. Introduction and background

As the Internet has grown to become a major channel for business communications, entertainment, and information exchange, academic institutions are expanding its use beyond
research to support teaching and learning. Emerging information technologies, like Internet technologies, can play varied but significant roles in all three major learning environments: the computer microworld, the classroom-based learning environment, and the virtual learning environment (Wilson, 1996). A computer microworld is a self-contained computer-based learning environment, such as computer-based training and intelligent tutoring systems, where students learn at their own pace using a computerized learning system. Here, Internet technologies can be used to distribute, maintain, and update learning systems. The classroom-based learning environment is the most widely used learning environment, in which students periodically meet face-to-face with their instructors. Here, Internet technologies can be used to distribute lecture notes and assignments through Web sites and to support instructor–student communication through e-mail. A virtual learning environment is a telecommunications-based learning environment, like distance learning, where students, dispersed over a large geographic area, learn through a communication medium. Here, the Internet can replace traditional telephone networks as the medium of communication for video conferencing and provide an enhanced, multimedia network. New Internet-based virtual universities, such as Western Governors University, Jones International University, California Virtual University, and Concord University School of Law, have emerged to take advantage of these Internet-enabled educational opportunities. Traditional universities have also started offering courses and degree programs through the Internet to reach geographically distant markets. The Internet is considered a preferred technology to improve instruction, increase access, and raise productivity in higher education (MacArthur & Lewis, 1996).
Analyzing the impact of the Internet on the two technology-based learning environments (computer microworld and virtual learning environment) and studying new Internet-enabled educational models, such as virtual universities, are fascinating research areas and the focus of many researchers. However, the role of the Internet in supporting and improving current classroom-based education is somewhat ignored. This study focuses on this under-studied but important research area. Several researchers have pointed out that traditional classroom-based education has benefited very little from the information technology revolution (Alavi, 1994; Alavi, Wheeler, & Valacich, 1995; Soloway, 1993). Unfortunately, this argument also extends to the Internet and related technologies. Many instructors place lecture notes, course information, class schedules, and assignments on the Web. However, a strong opinion has recently emerged that the current practice of using the Internet in teaching has not lived up to the high initial expectations and may even lead to sub-standard education (Huang, 2001; Mendels, 1999; Neumann, 1998; Nobel, 1998a–c; Young, 1998).

* Most course Web sites are passive. They lack the interactivity crucial for many learning activities and methods, such as group discussion, case study analysis, continuing unfinished class discussion, asking questions and immediately receiving answers, and clarifying what will be studied in the next class.
* When new course material is uploaded or old course material is revised on a course Web site, most students do not know about it unless they regularly check the site. Often, critical and time-sensitive information is not reviewed by students.
* There is no feedback loop unless a technically complex and cumbersome login process is added to the Web site. Therefore, the instructor cannot monitor the activities of each student on the course Web site and is never sure whether students regularly review the course material.
* Not all course material is available on the course Web site. Developing Internet-compatible course material involves substantial cost, specifically in terms of faculty time and effort, but brings little monetary or professional reward (Baer, 1998; Choren et al., 1999). In addition, many instructors are reluctant to put their intellectual work on the Web for security reasons and because of possible violation of their intellectual property rights. In some cases, even if they are willing to put their material on the Web, they simply do not have the advanced skills or software needed for Web site design and development (Sumner & Hostetler, 1999).
* Most institutions and instructors have put up course Web sites on an ad hoc basis. They do not utilize all major Internet technologies, or have utilized them at a sub-optimal level. Very few course Web sites use discussion bulletin boards, and even fewer use real-time, interactive question-and-answer sessions based on Internet messaging technology.

These problems are due to the inherent limitations of the Web and insufficient utilization of other Internet technologies. It is easy to put documents on course Web sites, but leveraging the full potential of the Internet requires integrating visual, aural, and textual material and providing nonlinear access to the material (Baer, 1998).
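The checking burden that the pull model places on students can be made concrete with a small polling sketch: the client must actively compare its last-visit time against the site's update times, and anything published after the last check goes unnoticed until the next poll. The function and file names below are illustrative assumptions, not part of any system described in this paper.

```python
# Sketch of the pull (request-and-reply) model: the student's client polls
# and compares timestamps; updates are missed unless the student checks.
# All names here are illustrative, not the paper's actual system.

def check_for_updates(site, last_seen):
    """Return course materials modified since the student's last visit."""
    return [name for name, modified in site.items() if modified > last_seen]

# name -> update time (arbitrary clock units)
course_site = {"lecture3.ppt": 10, "assignment2.doc": 25}
print(check_for_updates(course_site, last_seen=15))  # ['assignment2.doc']
```

If the student never calls `check_for_updates` (i.e., never revisits the site), the new assignment simply never reaches them; push technology, discussed below, inverts this responsibility.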
Developing the right combination of Internet technologies with appropriate teaching methods and instructional material to support learning activities is key to improving learning in the classroom-based learning environment (Huang, 2001; Mahoney, 1998; Spooner, Spooner, Algozzine, & Jordan, 1998; Sumner & Hostetler, 1999). If all major Internet technologies are used under a unifying framework, the Internet can make the educational process more efficient and effective and vastly improve classroom-based education. These technologies can revolutionize the way we learn and the way learning is supported in classrooms. In the following sections, we propose and empirically evaluate such a framework.

2. Learning support framework

Learning activities in classroom-based education are of three types: pre-class, in-class, and post-class. Pre-class activities include obtaining the outline of an upcoming class, reading the assigned cases and articles, completing assignments, and identifying difficult and ambiguous concepts. In-class activities include interacting with the instructor and classmates, learning new concepts, clarifying ambiguous issues, applying theories to case situations, and closing gaps in knowledge of the subject. Post-class activities include reviewing lecture notes, reinforcing knowledge of the subject, communicating with the instructor for any remaining clarifications, preparing for upcoming exams and projects, and reviewing the instructor's feedback. To support in-class activities, many academic institutions use collaborative systems, like groupware. Several studies have shown that collaborative systems successfully support in-class activities (Alavi, 1994; Alavi et al., 1995; Tyran & Shepherd, 1999). However, their use in pre-class and post-class activities is restricted due to the following limitations:

* Most collaborative systems are useful only in the "same time, same place" environment.
* They require proprietary and expensive hardware or software.
* They are mostly feasible only on local area networks, due to high bandwidth requirements.
* They are effective only with a small group.

Internet technologies can provide an open and ubiquitous platform for communications that delivers the major benefits of collaborative systems while minimizing their limitations. The Web is the most popular Internet technology among instructors. It is a pull technology, in which content is made available to the intended recipient only when the recipient specifically requests it. While pull technology has several advantages, it is also plagued by several limitations, as discussed in the previous section. It is useful only in those course activities that do not require interaction or delivery of time-sensitive information. For example, when new course material is added to the course Web site, pull technology requires an effort on the part of students to access the material. Some students invariably forget to check the material, creating communication and learning problems, which can increase dissatisfaction among the students and the instructor. Push technology, another major Internet technology, minimizes the problems of pull technology. It is defined as software that enables Internet, Intranet, and Extranet users to customize automatic delivery of information directly to their computers from a variety of sources (Levitt, 1997). While pull technology is based on a request-and-reply model, push technology is based on a publish-and-subscribe model (Raden, 1997). Push technology offers several advantages over pull technology:

* It allows secure delivery of dynamic content directly to the desktops of a pre-determined and intended group of recipients in real time, without even requiring them to open a Web browser.
This feature is useful in delivering time-sensitive information, such as news, stock quotes, current inventory levels, changes in prices, and new product offerings (Maddox, 1997; Moeller, 1997; Rivlin, 1997). In education, it can deliver changes in course schedules or updated notes for upcoming classes to all students in time, without any explicit effort by the students.
* It allows customization. Individual recipients can receive messages modified by the system to meet their needs. In education, instructors can utilize this feature to send customized messages, such as quiz grades along with the correct answers to the quiz questions, by matching student ID. As this does not require any effort on the part of the instructor, it can provide very valuable support to instructors and may increase student satisfaction in large classes.
* It enables the delivery of information only to the intended recipients. Therefore, it increases security and reduces the risk of exposing sensitive information to others. This feature protects intellectual property rights by not making copyrighted material available to everybody. In content-intensive education systems, protecting the intellectual property developed by instructors is important. In addition, delivering the content only to registered students (or subscribing receivers) is important in order to increase its value for the paying customers.
* It keeps a log that informs the sender that the recipient has received and viewed the information. This helps the sender immediately detect errors in message delivery and pre-empt resulting problems. In education, this can help instructors avoid many course management problems. For example, instructors can reduce anxiety and dissatisfaction among students by postponing a quiz if the reading material did not reach the students in time.
* It can automatically distribute new applications and data files.
This increases scalability and automates software application distribution, leading to a reduction in the total cost of ownership (Aragon, 1997; Rivlin, 1997). In education, this feature is useful in many pre-class
and post-class activities, such as automating term paper submission (drafts and final copies) and getting corrections and feedback from the instructor. This feature also automates the distribution of project data and worksheets as they become available during the semester.
* It automates client system maintenance. The automation enables centralized management of client software. When an upgrade is available, the system automatically sends it to all users through the network and installs it on their computers without the users even knowing it. This self-maintaining approach ensures the use of the most recent version by all users, especially technologically novice users. In education, this feature is very useful because the technical support provided by educational institutions to students is extremely limited due to low budgets. Instructors generally end up providing all technical support related to the use of technologies in their courses. Automating client system maintenance can reduce the need for extensive technical support and the drag on instructors' time. A recent study shows that release time (to develop and support the use of technology in education) and the availability of technical support are the two most significant incentives an educational institution can provide to its faculty (Sumner & Hostetler, 1999).

Despite all these advantages, push technology also has limitations, such as network clogging and information overload (Guenther, 1999; Hayes, 1997; Mosley-Matchett, 1997; Pflug, 1997). These limitations have led to its downfall and kept it from being a dominant force on the Internet. However, they are more evident in the general use of push technology (such as news and stock information services) and may not affect specialized uses (such as updating business manuals on corporate Intranets, sales and marketing support, and education support).

2.1. Hybrid push–pull model

Considering the benefits and limitations of both push and pull technologies, several researchers suggest that better results can be achieved by blending the two (Aragon, 1997; Hibbard, 1997; Malhotra, Gosain, & Lee, 1997; Rivlin, 1997). We also take this intermediate approach and utilize a hybrid model for learning support (Fig. 1). This model maps different types of systems on three dimensions from the perspective of the information recipient: perceived control, conformance to needs, and information processing requirements. It is adapted from a generic model proposed by Malhotra et al. (1997) for information delivery and systems design. We have extended their model to the context of the classroom-based learning environment and the needs of students in pre-class and post-class activities. Perceived control refers to user perceptions about who controls the information retrieval process: the users or the system. It is high when users control the information retrieval process and low when the system controls it. It is an important variable in system usage, as users are more likely to use a system if perceived control is high. In an ideal system for general users, perceived control should be high (Malhotra et al., 1997). However, it can be low in an ideal system for learning support, because the instructor, rather than the students, should control the information retrieval process through the system, for two reasons. First, in a well-designed course, the instructor can better determine what minimum course information the students should receive and when. For example, in a statistics course, students must get information on how to form a statistical hypothesis before learning how to test it. Second, when new information becomes available,
[Fig. 1. The hybrid push–pull model.]

it must immediately reach all students. Perceived control, as discussed here, should not be confused with control over the "drill-down" ability and the pace of learning. Students should have control over how much deeper they go into a specific concept than required, how much time they spend on each concept, and when to engage or disengage themselves in the learning activity, as this type of control enhances their learning ability (Merrill, 1983; Ahmad & Piccoli, 1998). Our framework encourages students to have this type of control by using Internet technologies with the hypertext format, which enables them to "drill down" and learn at their own pace (Barron & Orwig, 1997). If lower perceived control leads to dissatisfaction and reduces system usage, instructors should encourage or even require students to use the system. Additionally, with some creativity, the system can be designed in a way that creates an illusion of self-control.

Conformance to needs refers to the extent to which the delivered information matches the needs of the information recipient. In an ideal learning support system, conformance should be high. The instructor should determine the content as per the objectives of the course and the students' knowledge levels. Allowing students, within the course guidelines, to determine their own user profiles, which can be used by the system in filtering and sending appropriate content, can increase conformance. Artificial intelligence technologies can also be used in this process.

Information processing requirements refer to the amount of information that users need to process to access the right information. This amount depends on the complexity and equivocality of the information (Daft & Huber, 1987).
In a pure pull system, such as the Web, the processing requirements are high because the user has to search through many Web pages and links to find the right information. Several studies show that hypermedia environments disorient users, and learning suffers as the disorientation increases (Beasley & Waugh, 1995; Beekman, 1994; Randolph & Griffin, 1999; Tripp & Roby, 1990). In a pure push system, the processing requirements and the resultant disorientation are low because the right information is automatically delivered to the user by the system. In an ideal learning support system, the processing
requirements should be low. All course-related information should be delivered automatically, without the student asking for it. Building an education support system (ESS) additionally requires paying special attention to the following critical areas needing system support:

* A collaborative environment that develops group skills and builds communities. Education is not just about knowing academic facts; it is also about knowing people, developing social strategies for dealing with the world, and fostering reliable friendships and complex social structures (Brown & Duguid, 1996). An ESS should foster an environment that enables and encourages communication among the students.
* Enhancing written and oral communication skills. Effectively communicating without the help of body language requires special communication skills. In the increasingly virtual, networked, and global organizations of the new economy, these skills are becoming very critical.
* Feedback. Feedback strongly influences learning performance (Kulik & Kulik, 1988; Gagne, 1977). Feedback can be used for both positive and negative reinforcement. Feedback can be "explicit", where learners are provided with the correct answer to match against their own answer, or "implicit", where learners are given the correct reasoning to match against their own reasoning in finding the right answer. An ESS should provide a mechanism to deliver feedback from other students, the instructor, or built-in system modules.
* Increase instructor visibility. Instructor visibility refers to student feelings that the instructor is available beyond a specific time frame (Hislop, 1998). In traditional classroom environments, instructors are available to their students only during class time and office hours.
An ESS using Internet technologies enables instructors to communicate and coordinate with students beyond the constraints of time and space.
* Make efficient use of time. An ESS should provide a virtual platform for discussing course-related issues between students and the instructor. Such a platform reduces unnecessary student–instructor face-to-face meetings, saving time and increasing convenience for both sides. In addition, these discussions (only the common issues, not the confidential ones) should be catalogued in discussion group archives, so that other students facing the same issues can simply refer to the archives and need not contact the instructor.
* Utilize a proper "learning design". Most instructors have neither the interest nor the expertise to develop a highly sophisticated ESS. Therefore, minimize system development by the instructors and avoid having instructors learn new technologies; rather, enable them to use the technologies with which they are most familiar. For example, most instructors use MS PowerPoint for lecture slides, while some use Macromedia Director to make the slides interactive. The system should be flexible enough to incorporate any of these formats.

2.2. Integrated support system architecture

A simple architecture (Fig. 2) can be utilized to develop an ESS that integrates multiple Internet technologies as per the model discussed above. The architecture has three primary components:

(The architecture is discussed in detail in our other paper: Verma and Parikh (2001), "ActiveBook: Optimizing Internet Technologies in Education", Campus-Wide Information Systems, 18(1), 28–42.)
[Fig. 2. An integrated support system architecture.]

Servers and Databases, Instructor Client, and Student Client. Servers and Databases include multiple software-based servers that employ different protocols for Internet-enabled communications, along with databases of course material including student information, lecture notes, course schedules, and assignments. The Instructor Client assists instructors in uploading course material and administering the course by setting preferences for a particular class, such as the files available for transfer, timestamps, class schedules, message board information, and software/operational information. The Student Client assists students in accessing lecture notes, assignments, the schedule, and other course-related information. It allows students to download information with a single click of a button and access the information later, even when the student is offline. The architecture is generic enough to develop different ESS incorporating support features unique to different courses. It remotely helps the instructor and students organize course material, assists communications between the instructor and students, and uses intelligent interfaces to provide dynamic delivery of course content.
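As a rough illustration of the profile-driven synchronization performed between the servers and a student client, the following sketch shows incremental transfer: the student's profile records what has already been delivered, so nothing is sent twice. All names and the data layout are assumptions made for illustration, not the actual implementation.

```python
# Minimal sketch of profile-based incremental synchronization: the per-student
# profile acts as the "local replicated data" store, so each sync transfers
# only objects not yet delivered. Names are illustrative assumptions.

def synchronize(server_files, profile):
    """Download only objects not yet recorded in the student's profile."""
    new_files = {name: data for name, data in server_files.items()
                 if name not in profile["downloaded"]}
    profile["downloaded"].update(new_files)   # update the local replica
    return sorted(new_files)                  # names transferred this sync

profile = {"downloaded": {"lecture1.ppt": "..."}}
server = {"lecture1.ppt": "...", "lecture2.ppt": "...", "assignment1.doc": "..."}
print(synchronize(server, profile))  # ['assignment1.doc', 'lecture2.ppt']
print(synchronize(server, profile))  # [] -- no file is transferred twice
```

Once synchronized, the local replica can be browsed offline, which matches the single-click, work-anywhere behavior the architecture describes.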
Kendall and Kendall (1999) identified four categories of pull technologies and four categories of push technologies, based on what users feel they want (alpha), what they think they want (beta), what they really want (gamma), and what they really need (delta). All of these types of technologies can be used in this architecture. Alpha technologies can be used to broadcast a change in the syllabus or announce the availability of a new assignment. Beta technologies can be used to filter and categorise discussions by concept or thread; a student can subscribe or unsubscribe to a concept to be included in or excluded from the discussion. Gamma technologies can be used to filter information that the instructor thinks a specific student must get, based on the instructor's evaluation of the student. Delta technologies can be used to automate the monitoring of student activities and dynamically provide information based on student needs.
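A minimal sketch of the publish-and-subscribe filtering that beta technologies would perform (students subscribing to and unsubscribing from concept threads) might look like the following; the class and method names are hypothetical, not taken from the systems described here.

```python
# Sketch of publish-and-subscribe filtering for discussion threads: a student
# receives a pushed message only while subscribed to its concept ("beta"
# filtering as described above). All names are hypothetical.

class DiscussionBroker:
    def __init__(self):
        self.subscribers = {}          # concept -> set of student IDs

    def subscribe(self, student, concept):
        self.subscribers.setdefault(concept, set()).add(student)

    def unsubscribe(self, student, concept):
        self.subscribers.get(concept, set()).discard(student)

    def publish(self, concept, message):
        """Deliver the message only to students subscribed to this concept."""
        return {s: message for s in self.subscribers.get(concept, set())}

broker = DiscussionBroker()
broker.subscribe("s101", "hypothesis-testing")
broker.subscribe("s102", "hypothesis-testing")
broker.unsubscribe("s102", "hypothesis-testing")
print(broker.publish("hypothesis-testing", "New thread posted"))
# {'s101': 'New thread posted'}
```

The same routing table could be populated by the instructor rather than the student, which is essentially the gamma case: delivery filtered by the instructor's evaluation instead of the student's own subscription.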
3. Empirical evaluation

We utilized an evaluation model (Fig. 3) consisting of four dimensions (content, technology, interface, and functionality) to analyze the learning support provided by an ESS developed based on the framework. In the model, each dimension has two components. The content dimension refers to the characteristics of the information provided by the system to its users. How precise the information is, whether it meets the needs of the user, whether it is sufficient, and whether it is relevant are aspects related to the conformance of content. How useful the information is and whether it increases the productivity of the user are aspects related to the usefulness of content. The technology dimension refers to utilizing appropriate technologies to deliver the right information at the right time to the right recipient. Whether the information is available when needed, whether it is up-to-date, and how fast the system responds to needs are aspects related to timeliness. Whether the information provided by the system is accurate, whether the user has confidence in the information, and whether the system is dependable are aspects related to accuracy. The interface dimension deals with how easy the system is to learn and use. Whether the system is easy to learn, whether it is flexible, and whether it is user friendly are issues related to ease of use. Whether the layout of the screen is appropriate, whether the instructions are clear, and whether the information is shown in a useful format are issues related to the format of the user interface. The functionality dimension represents the practicality, as opposed to the attractiveness, of the system. Whether the system is convenient to use, whether it is convenient to get the required course-related information, and whether it is convenient to communicate with other students and the instructor are issues related to the convenience provided by the system.
Whether the system is satisfactory overall is an overarching issue related to satisfaction with the system. An effective ESS should provide enhanced support on all four dimensions. Therefore, we postulate: An ESS developed based on the framework provides above-average support for learning on the content dimension (H1A), the technology dimension (H1B), the interface dimension (H1C), and the functionality dimension (H1D).

[Fig. 3. The evaluation model.]
Internet technologies can be used not only to deliver course material but also to support interactions and coordination among students and their instructor. In some courses (let us say, Course-A), like introductory computer information systems, management science models, and business statistics courses, students spend a considerable amount of time in pre-class and post-class activities, but they require few interactions and little coordination among themselves. In other types of courses (let us say, Course-B), like simulation gaming and business strategy courses, students not only spend an equal amount of time in pre-class and post-class activities, but also extensively interact and coordinate with other students and the instructor. For example, in a simulation gaming course, the interactions can include discussing the relative position of one's own team compared to other teams in the simulated business environment, forecasting product demand, predicting cash flow, developing new business models, and transferring decision and simulation result files. Courses that involve low interaction (Course-A) are more suited for distance education than courses that require extensive interaction (Course-B) (Van Slyke, Kittner, & Belanger, 1998). However, to effectively carry out Course-B in the classroom-based environment, a suitable interactive platform is needed to support interactions and coordination. An effective ESS provides such a platform, extending support for synchronous and asynchronous interactions and coordination without time and space constraints. Additionally, it supports regular broadcasts from the instructor, frequent information flow among the students, continual contributions by individuals, and coordination by partially automating the workflow. Thus, the benefits of the ESS would be more apparent in Course-B than in Course-A.
We postulate: system support for learning is perceived to be higher in Course-B than in Course-A on the content dimension (H2A), the technology dimension (H2B), the interface dimension (H2C), and the functionality dimension (H2D).

3.1. Measurement of variables

3.1.1. Systems

To test the hypotheses, we developed two fully operational ESSs, System-A and System-B, based on the proposed framework. System-A supports Course-A (e.g., introductory computer information systems, management science, and statistics) and System-B supports Course-B (e.g., simulation gaming). The two systems are similar in that both provide the same basic features. However, System-B additionally provides some advanced, "add-on" features required to support Course-B. Both systems provide basic content types such as MS PowerPoint-based lecture notes and MS Word-based assignments. System-B, in addition, provides the software content of a simulation game for download and installation and a module to upload decision and result files. All downloads, uploads, and installations are managed by the system. The interfaces of the two systems are identical. Although the systems provide different levels of support, they work in a similar fashion. Both allow students to synchronize all course information with a single click of a button, or automatically when they open the system. The synchronization occurs based on a student profile created on the server. This profile contains information about the specific objects to be downloaded to or uploaded from the student's computer. This approach enables the systems to incrementally
download content or upload information. No files are transferred twice unless specified. The synchronized information can then be accessed by the student later, even when his or her computer is offline. The student accesses the information by going to various sections of the system, each of which presents the most recently updated information. These sections pertain to different features such as lectures, assignments, e-mail, and chat.

3.1.2. Dependent measures

To evaluate the learning support provided by Systems A and B, a questionnaire (see Appendix A) was developed and administered. The questionnaire measured user perceptions of the learning support provided by the systems on four dimensions comprising eight variables, as discussed in the evaluation model (Fig. 3). The questionnaire contained twenty-five questions, developed from a comprehensive review of previous studies related to the assessment of a system on multiple dimensions (Aldag & Power, 1986; Doll & Torkzadeh, 1988; Kettinger & Lee, 1994; King, Premkumar, & Ramamurthy, 1990; Raymond, 1987; Subramanian, 1994). Each question was a statement that the participants were asked to rate on a seven-point Likert-type scale, with 1 being strongly disagree and 7 being strongly agree. We asked the participants to answer the questions by comparing their experiences in similar courses without a computerized support system. We did not use performance measures (scores on examinations or assignments) to evaluate the effects of the systems on student performance, for three reasons. First, system usage may have an effect on student performance, but the purpose of this study is not to measure such effects. Second, the assignments and exams were based on the content of the course and not on the delivery methodology.
Third, because the questionnaire was administered anonymously, we were unable to connect student responses with their performance.

3.1.3. The participants

One hundred and eighty-one undergraduate students in eight classes across three semesters participated in the study. One hundred and twenty-nine students in five classes used System-A.

Table 1. Demographics

                                      System-A   System-B   Overall
Sample size                           129        52         181
Average age (yr)                      25.02      28.66      25.77
Gender(a): female                     57         19         76
           male                       69         23         92
Average GPA group (4.0 scale)         3.00–3.24  3.00–3.24  3.00–3.24
Work experience (yr)                  6.69       7.16       6.78
Computer experience(b)                5.23       5.47       5.28
Confidence in computer abilities(b)   5.38       5.66       5.44
Internet/Web experience(b)            5.59       5.38       5.55

(a) Values missing in thirteen (13) observations.
(b) On a scale of 1 to 7, where 1 is low and 7 is high.
Fifty-two students in three classes used System-B. Demographic data (Table 1) suggest that the subjects were mature, with over 6 years of work experience on average, and had sufficient experience in using computers and the Internet. Average age, work experience, computer experience, and confidence in computer abilities were slightly higher among the System-B users because System-A was used mostly by junior-level students while System-B was used mostly by senior-level students. Overall, the sample was a good representation of the population of junior and senior undergraduate students at urban business schools in large US cities.

4. Data analysis and discussion

4.1. Validity and reliability

Although the questionnaire was developed from past studies, it was pilot tested before its use in the main study to ensure the content validity of the measurement instrument. In addition, a correlation analysis and a factor analysis were performed to ensure the construct validity of the questionnaire. The correlation analysis shows that, with one exception, all correlations among the questions within each variable were significant (p < 0.001). The one statistically insignificant correlation (p = 0.001) was between two of the four questions asked for the ease-of-use variable. In the factor analysis, a principal component analysis was performed followed by Varimax rotation (Table 2). The solution was limited to four factors (64.9 percent of variation explained). The factor analysis shows a reasonable alignment of the variables along the four dimensions of the evaluation model. Reliability of the questionnaire was evaluated using Cronbach's alpha (Cronbach, 1951). An alpha value of 0.7 or above is preferred for preliminary, exploratory research such as this (Nunnally, 1978).
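For reference, Cronbach's alpha for a k-item scale is alpha = k/(k-1) x (1 - (sum of the item variances)/(variance of the respondents' item totals)). The following sketch shows the computation; the response data are invented for illustration and are not the study's data.

```python
def cronbach_alpha(items):
    """items: list of per-item response lists, all of equal length
    (one value per respondent). Returns Cronbach's alpha for the scale."""
    k = len(items)          # number of items in the scale
    n = len(items[0])       # number of respondents

    def pvar(xs):
        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(pvar(it) for it in items)
    # total score per respondent across all items
    totals = [sum(it[r] for it in items) for r in range(n)]
    return k / (k - 1) * (1 - item_var_sum / pvar(totals))

# Hypothetical 7-point responses from five students on a three-item scale
items = [
    [6, 7, 5, 6, 7],
    [6, 6, 5, 7, 7],
    [5, 7, 4, 6, 6],
]
alpha = cronbach_alpha(items)  # internally consistent items -> alpha above 0.7
```

An alpha near 1 indicates that the items move together across respondents, i.e., they measure the same underlying variable.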
Table 3 shows that the alpha values are above 0.7 for all variables and dimensions except one, the accuracy component for System-B. However, the alpha values for the technology dimension, both for System-B and for the two systems combined, are considerably higher than 0.7. In addition, we took several precautions during the study to reduce the effects of extraneous variables on the results and to increase external validity:

1. Multiple independent instructors participated in the study (in addition to one researcher of this study). This should have reduced experimenter bias.
2. The study was conducted over three semesters. This should have removed any time-related extraneous effects.
3. Three different courses were used in the Course-A category. This should have reduced biases related to one specific course and to the tasks involved in that course.

4.2. Hypotheses testing

A preliminary analysis of response percentiles (Table 4) shows that the participants overwhelmingly agreed with the statements in the questionnaire. Only a small number of participants disagreed on the content dimension. This provides preliminary evidence that the systems supported learning.
Table 2. Factor analysis (Varimax-rotated loadings; Factor 1: Content, Factor 2: Interface, Factor 3: Functionality, Factor 4: Technology)

Conformance:  M21 0.7897;  M22 0.7731;  M23 0.7774, 0.4301;  M24 0.7567
Usefulness:   U5 0.8569;   U6 0.8167
Timeliness:   T7 0.4992, 0.4286;  T8 0.4534, 0.4059;  T9 0.4290, 0.5277
Accuracy:     A18 0.6185;  A19 0.7191;  A20 0.7789
Ease of use:  E1 0.6608;   E2 0.5192, 0.4810;  E3 0.6965;  E4 0.7421
Format:       F15 0.5984;  F16 0.5769;  F17 0.5479
Convenience:  C10 0.4771, 0.5962;  C11 0.6829;  C12 0.5058, 0.4415;  C13 0.5913;  C14 0.4640
Satisfaction: S25 0.5060, 0.4423

(Items listed with two values cross-loaded on a second factor.)

Table 5 shows the mean values and standard deviations of the student responses on each dimension and each component for the overall data (both systems combined). Hypotheses 1A–D postulate that the mean value of responses on each dimension is higher than the average value, four, on the scale of 1–7. To test these hypotheses, four one-sample, one-sided, upper-tailed t-tests were conducted. In each t-test, the observed value of t was higher than the critical t-value (2.35 at the 99% confidence level with 180 degrees of freedom), implying rejection of the null hypothesis. In addition, the p-values (the probability of a mean value being equal to or less than the average value, four) were < 0.005. In other words, the mean values were significantly higher than the scale average (four) on all dimensions. Thus, hypotheses 1A–D were all supported, indicating that the systems provided above-average support for learning. In addition, we performed eight similar t-tests on the individual components. In all of these tests, the observed t-values were higher than the critical t-value; thus, even at the level of each individual component, the systems supported learning. Table 6 shows the mean values and standard deviations of the student responses on each dimension and each component for Systems A and B.
Hypotheses 2A–D postulate that the mean values of responses for System-B should be higher than those for System-A on each of the four
dimensions.

Table 3. Cronbach's alpha

                 System-A  System-B  Overall
Conformance      0.8153    0.8722    0.8417
Usefulness       0.9252    0.7669    0.9086
Content          0.8776    0.8976    0.8915
Timeliness       0.7561    0.8383    0.7860
Accuracy         0.7437    0.6586    0.7364
Technology       0.8131    0.8309    0.8274
Ease of use      0.7462    0.7494    0.7486
Format           0.7265    0.7816    0.7472
Interface        0.8232    0.8694    0.8387
Convenience      0.7553    0.7298    0.7583
Satisfaction(a)  NA        NA        NA
Functionality    0.7927    0.7802    0.7980

(a) Only one question measured overall satisfaction with the system, so alpha is not applicable.

Table 4. Response percentiles

                 Disagree       Indifferent          Agree
                 (≤2.33) (%)    (2.33<X<5.67) (%)    (≥5.67) (%)
Conformance      1.7            28.1                 70.2
Usefulness       1.7            34.8                 63.5
Content          1.7            32.6                 65.7
Timeliness       0.0            27.1                 72.9
Accuracy         0.0            20.4                 79.6
Technology       0.0            21.5                 78.5
Ease of use      0.0            14.4                 85.6
Format           0.0            14.4                 85.6
Interface        0.0            15.5                 84.5
Convenience      0.0            24.3                 75.7
Satisfaction     0.0            19.3                 80.7
Functionality    0.0            27.6                 72.4

To test these hypotheses, four independent-samples, one-sided, upper-tailed t-tests were conducted. Except for the interface dimension (hypothesis 2C), in each t-test the observed value of t was higher than the critical t-value (2.35 at the 99% confidence level with 180 degrees of freedom), implying rejection of the null hypothesis. In addition, except for the interface dimension, the p-values (the probability of the two means being equal, or of the mean for System-B being less than the mean for System-A) were < 0.005. In other words, the mean response values for System-B were significantly higher than those for System-A on three dimensions: content, technology, and functionality. Thus, hypotheses 2A, 2B, and 2D were supported, indicating that the systems supported learning better in Course-B than in Course-A on three of the four dimensions of learning support.
Table 5. Above-average support (hypotheses 1A–D)

                 Mean   SD     t      p <
Conformance      5.92   0.877  29.38  0.000
Usefulness       5.76   1.194  19.79  0.000
Content          5.84   0.958  25.80  0.000
Timeliness       6.08   0.740  37.80  0.000
Accuracy         6.20   0.747  39.64  0.000
Technology       6.14   0.663  43.46  0.000
Ease of use      6.27   0.603  50.55  0.000
Format           6.22   0.666  44.87  0.000
Interface        6.24   0.580  52.04  0.000
Convenience      6.15   0.645  44.80  0.000
Satisfaction     6.18   0.827  35.52  0.000
Functionality    6.17   0.666  43.78  0.000

Table 6. Differences between the two courses (hypotheses 2A–D)

                 System-A        System-B        Significance test
                 Mean    SD      Mean    SD      t      p <
Conformance      5.75    0.867   6.33    0.762   4.24   0.000
Usefulness       5.57    1.269   6.23    0.819   4.17   0.000
Content          5.66    0.980   6.28    0.736   4.67   0.000
Timeliness       5.95    0.741   6.39    0.642   3.74   0.000
Accuracy         6.11    0.778   6.43    0.617   2.91   0.002
Technology       6.03    0.669   6.41    0.567   3.59   0.000
Ease of use      6.23    0.615   6.37    0.565   1.40   0.081
Format           6.15    0.657   6.41    0.659   2.45   0.008
Interface        6.19    0.572   6.39    0.580   2.13   0.017
Convenience      6.07    0.639   6.35    0.619   2.76   0.003
Satisfaction     6.09    0.852   6.40    0.721   2.32   0.011
Functionality    6.08    0.673   6.38    0.602   2.79   0.003

We also performed eight similar t-tests on the individual components. The observed t-values were higher than the critical t-value for all components except two: ease of use and satisfaction. For the ease-of-use component, we believe that because both systems had identical user interfaces and similar ways of interacting with the system, the subjects did not find one system easier to use than the other. This may also be why we could not reject the null hypothesis for the interface dimension.
For the satisfaction variable, we cannot offer an explanation other than to note that the observed t-value (2.32) is very close to the critical t-value (2.35), indicating higher system support at a somewhat lower (97.5%) confidence level.
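The t statistics in Tables 5 and 6 can be re-derived from the reported means and standard deviations. The sketch below assumes a one-sample t for hypotheses 1A–D and a pooled-variance two-sample t for hypotheses 2A–D (our reading of the tests used; because the published means and SDs are rounded, the recomputed values only approximate the reported ones).

```python
from math import sqrt

def t_one_sample(mean, sd, n, mu0=4.0):
    """Upper-tailed one-sample t: is the mean above the scale midpoint mu0?"""
    return (mean - mu0) / (sd / sqrt(n))

def t_two_sample_pooled(m1, s1, n1, m2, s2, n2):
    """Pooled-variance two-sample t (positive when the System-B mean m2
    exceeds the System-A mean m1)."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return (m2 - m1) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Conformance row, using the rounded values from Tables 5 and 6
t_h1 = t_one_sample(5.92, 0.877, 181)                          # reported: 29.38
t_h2 = t_two_sample_pooled(5.75, 0.867, 129, 6.33, 0.762, 52)  # reported: 4.24
```

Both recomputed statistics land within rounding error of the reported values and far above the critical t of 2.35 used in the paper.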
5. Conclusions

This study shows that when multiple Internet technologies are used under the proposed framework, they can effectively support learning. The framework enables interactivity among students and their instructor and extends learning beyond the classroom environment. It enables instructors of all skill levels to utilize emerging Internet technologies while still creating and manipulating content with the software applications they are familiar with. While it protects the intellectual property rights of instructors and enables timely delivery of content using push technology, it also enables students to obtain additional information when they want it using pull technology. Information technologies, specifically the Internet and the Web, are rapidly transforming the education and training industry with new educational models such as the virtual university and just-in-time learning. However, their role in traditional classroom-based education has been limited to automating information delivery (Baer, 1998; Ives & Jarvenpaa, 1996; Leidner & Jarvenpaa, 1995; Magal & Raghunathan, 1998). The framework proposed in this study goes beyond delivering content to students, as most course Web sites do, to reengineering classroom-based education. By providing a common platform for students and instructors to communicate and coordinate, it eliminates the barriers of time and space. It assists students and instructors in automating the chores of course management and in creating a virtual community that may last beyond a course. It attempts to combine different learning models, specifically traditional objectivism with constructivism, collaborativism, and socioculturism (Leidner & Jarvenpaa, 1995). This study was conducted in a field setting: real students used real education support systems in real courses.
As in all field studies, control for some unknown, extraneous effects may have been unintentionally compromised. However, the benefits of testing the systems in actual use, and the ability to apply the results across other settings, outweigh the unascertained loss due to the lack of control. Another contribution of this study is the development and validation of the learning support measurement instrument. The questionnaire used in this study to measure system support for learning is among the most comprehensive available. Empirical testing of its validity through correlation and factor analyses, and of its reliability through Cronbach's alpha, strengthens its standing as a valid and reliable instrument. Other researchers in this area will be able to use it in future studies. This was an exploratory study. Although the results indicate that the systems supported learning, one cannot claim that the systems also improved learning. Such a conclusion would require a controlled experiment with a control group (using a generic course Web site) and an experimental group (using the framework-based system), measuring the effects of the two treatments on variables related to improvement in learning. One way to measure improvement in learning is matching student performance on each learning objective (recall and recognize, comprehend, apply, analyze, synthesize, and evaluate) as defined by Bloom (1956, 1981). Such effects should additionally be evaluated over a period of time to avoid transitory improvements due to novelty effects (Alavi et al., 1995). In this study, we assumed a "closed learning model", in which the right course material is determined and provided by the course instructor. Opposed to this is an "open learning
model", which assumes autonomous learners who find most of the relevant course material on their own. Closed learning models are widely used in most degree courses at most universities. However, some advanced degree courses and executive courses utilize an open learning model, and the benefits found in this study may not be found in those courses. A future study should look into the finer aspects of these two learning models. Finally, this study has a strong US orientation. Educational use of Internet technologies, and the availability of the technologies themselves, may differ from one country to another and may lead to different results. In addition, student competence in information handling and in using information technologies in different cultural settings may also lead to different results. We encourage other researchers to carry out similar studies in drastically different settings in the future.

Appendix A. Measurement questionnaire

Please circle your response to each statement on a scale of 1 (strongly disagree) to 7 (strongly agree).

Ease of use
1. Learning to operate ActiveBook was easy for me.
2. I found ActiveBook to be flexible to interact with.
3. ActiveBook is user friendly.
4. I found ActiveBook overall easy to use.

Usefulness
5. ActiveBook increased my productivity in the course-related activities.
6. I found ActiveBook overall useful.

Timeliness
7. ActiveBook gave me the information I needed in time in the course-related activities.
8. ActiveBook provided up-to-date information in the course-related activities.
9. Response/turnaround time in ActiveBook is ideal.

Convenience
10. ActiveBook is convenient to use in the course-related activities.
11. ActiveBook provided convenience in getting lecture notes.
12. ActiveBook provided convenience in getting general information, including the class schedule.
13. ActiveBook provided convenience in getting time-sensitive information such as news/instructions from the instructor and changes in the schedule.
14. ActiveBook provided convenience in communicating with the instructor.

Format
15. I am happy with the layout of the ActiveBook screen.
16. The instructions provided by ActiveBook were clear.
17. The information provided by ActiveBook was in a useful format.

Accuracy
18. The information provided by ActiveBook is accurate.
19. I have complete confidence in the information provided by ActiveBook.
20. ActiveBook is dependable.

Conformance
21. ActiveBook provides the precise information I need.
22. Information content in ActiveBook meets my needs.
23. ActiveBook provides sufficient information.
24. ActiveBook provides relevant information.

Satisfaction
25. I am overall satisfied with ActiveBook.

Note: For convenience, we named both systems "ActiveBook".

References

Ahmad, R., & Piccoli, G. (1998). Virtual learning environments: An information technology basic skills course on the Web. In Proceedings of the fourth Americas conference on information systems (pp. 1022–1024).
Alavi, M. (1994). Computer-mediated collaborative learning: An empirical evaluation. MIS Quarterly, 18(2), 159–174.
Alavi, M., Wheeler, B. C., & Valacich, J. S. (1995). Using IT to reengineer business education: An exploratory investigation of collaborative telelearning. MIS Quarterly, 19(3), 293–312.
Aldag, R. J., & Power, D. J. (1986). An empirical assessment of computer-assisted decision analysis.
Decision Sciences, 17, 572–588.
Aragon, L. (1997). When shove comes to push. PC Week, A1, A4.
Baer, W. S. (1998). Will the Internet transform higher education? The emerging Internet: Annual review of the Institute for Information Studies. Aspen, CO: The Aspen Institute. http://www.aspeninst.org/dir/polpro/CSP/IIS/98/98.html
Barron, A. E., & Orwig, G. W. (1997). New technologies for education: A beginner's guide. Englewood, CO: Libraries Unlimited.
Beasley, R. E., & Waugh, M. L. (1995). Cognitive mapping architecture and hypermedia disorientation: An empirical study. Journal of Educational Multimedia and Hypermedia, 4(2/3), 239–255.
Beekman, G. (1994). Computer currents. Redwood City, CA: Benjamin/Cummings.
Bloom, B. S. (Ed.). (1956). Taxonomy of educational objectives: Cognitive domain. New York: Longmans.
Bloom, B. S., Madaus, G. F., & Hastings, J. T. (1981). Evaluation to improve learning. New York: McGraw-Hill.
Brown, J. S., & Duguid, P. (1996). Universities in the digital age. Change, 11–19.
Choren, R., Laufer, C., Blois, M., Torres, V., Ferraz, F., Robichez, G., Daflon, L., de Lucena, C. J. P., & Fuks, H. (1999). Orchestrating technology for Web-based education. In Proceedings of the fifth Americas conference on information systems (pp. 130–132).
Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16, 297–334.
Daft, R. L., & Huber, G. P. (1987). How organizations learn: A communication framework. Research in the Sociology of Organizations, 5, 1–36.
Doll, W. J., & Torkzadeh, G. (1988). The measurement of end-user computing satisfaction. MIS Quarterly, 12(2), 259–274.
Gagne, R. M. (1977). The conditions of learning (3rd ed.). New York: Holt, Rinehart and Winston.
Guenther, K. (1999). The Web gets pushy. CIO, Enterprise Section, 12(7), 66.
Hayes, F. (1997). Just say no to pushers. Computerworld, 31(17), 113.
Hibbard, J. (1997). Intranet advocates: Don't push users, let them browse. Computerworld, 31(17), 8.
Hislop, G. W. (1998). Instructor visibility in online. In Proceedings of the fourth Americas conference on information systems (pp. 1054–1056).
Huang, A. H. (2001). Problems associated with using information technology in teaching: A research proposal. In Proceedings of the seventh Americas conference on information systems (pp. 39–40).
Ives, B., & Jarvenpaa, S. L. (1996). Will the Internet revolutionize business education and research? Sloan Management Review, 37(Spring), 33–41.
Kendall, K. E., & Kendall, J. E. (1999). Information delivery systems: An exploration of Web pull and push technologies. Communications of the AIS, 1, 14.
Kettinger, W. J., & Lee, C. C. (1994). Perceived service quality and user satisfaction with the information services function. Decision Sciences, 25(5/6), 737–766.
King, W. R., Premkumar, G., & Ramamurthy, K. (1990). An evaluation of the role and performance of a decision support system in business education. Decision Sciences, 21(3), 642–659.
Kulik, J. A., & Kulik, C. C. (1988). Timing of feedback and verbal learning. Review of Educational Research, 58(1), 79–97.
Leidner, D. E., & Jarvenpaa, S. L. (1995). The use of information technology to enhance management school education: A theoretical view. MIS Quarterly, 19(3), 265–291.
Levitt, J. (1997). Rating the push products. Information Week, 628, 53–59.
MacArthur, D., & Lewis, M. (1996). Untangling the Web: Applications of the Internet and other information technologies to higher education (DRU-1401-IET). Santa Monica, CA: RAND.
Maddox, K. (1997, February 24). Online data push. Information Week, 619, 61–68.
Magal, S., & Raghunathan, M. (1998). Have the transformational information technologies really transformed the academia? In Proceedings of the fourth Americas conference on information systems. Baltimore, MD.
Mahoney, J. (1998). Higher education in a dangerous time: Will technology really improve the university? Journal of College Admission, 24(3), 161.
Malhotra, A., Gosain, S., & Lee, Z. (1997). Push–pull: The information tug of war, a framework for information delivery and acquisition systems design. In Proceedings of the third Americas conference of the Association for Information Systems. Indianapolis, IN.
Mendels, P. (1999, March 13). Online education gets a credibility boost. New York Times.
Merrill, M. D. (1983). Component display theory. In C. M. Reigeluth (Ed.), Instructional design theories and models. Hillsdale, NJ: Lawrence Erlbaum Associates.
Moeller, M. (1997). Delivering push. PC Week, 14(19), 1, 16.
Mosley-Matchett, J. D. (1997). Webcasting: It's important to learn pros and cons of push and pull.
Marketing News, 31(8), 31.
Neumann, P. G. (1998). Risks of e-education. Communications of the ACM, 40(10), 136.
Nobel, D. F. (1998a). Digital diploma mills: The automation of higher education. First Monday, 3(1). http://firstmonday.dk/issues/issue3_1/noble/
Nobel, D. F. (1998b). Digital diploma mills, Part II: The coming battle over online instruction. Working paper. http://communication.ucsd.edu/dl/ddm2.html
Nobel, D. F. (1998c). Digital diploma mills, Part III: The bloom is off the rose. Working paper. http://communication.ucsd.edu/dl/ddm3.html
Nunnally, J. C. (1978). Psychometric theory (2nd ed.). New York, NY: McGraw-Hill.
Pflug, O. (1997). 'Push' technology: Dead on arrival. Computerworld, 31(4), 37.
Raden, N. (1997). Push back in push technology: Is push technology the next big thing for data warehousing, or is it just a distraction? DBMS and Internet Systems. http://www.dbmsmag.com/9711i15.html
Randolph, G. B., & Griffin, J. A. (1999). Disorientation and hypermedia structure. In Proceedings of the fifth Americas conference on information systems (pp. 939–941).
Raymond, L. (1987). Validating and applying user satisfaction as a measure of MIS success in small organizations. Information & Management, 12, 173–179.
Rivlin, G. (1997). When push comes to shove. Upside, 9(5), 98–110.
Soloway, E. (1993). Technology in education. Communications of the ACM, 36(5), 28–29.
Spooner, F., Spooner, M., Algozzine, B., & Jordan, L. (1998). Distance education and special education: Promises, practices, and potential pitfalls. Teacher Education and Special Education, 21(2), 121–131.
Subramanian, G. H. (1994). A replication of perceived usefulness and perceived ease of use measurement. Decision Sciences, 25(5/6), 863–874.
Sumner, M., & Hostetler, D. (1999). Factors influencing the adoption of technology in teaching. In Proceedings of the fifth Americas conference on information systems (pp. 951–953).
Tripp, S. D., & Roby, W. (1990). Orientation and disorientation in a hypertext lexicon.
Journal of Computer-Based Instruction, 17, 120–124.
Tyran, C. K., & Shepherd, M. (1999). Collaborative technology in the classroom: A research framework. In Proceedings of the fifth Americas conference on information systems.
Van Slyke, C., Kittner, M., & Belanger, F. (1998). Distance education: A telecommuting perspective. In Proceedings of the fourth Americas conference on information systems (pp. 666–668).
Wilson, B. G. (1996). Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology Publications.
Young, J. R. (1998). A year of Web pages for every course: UCLA debates their value. The Chronicle of Higher Education, 44(36). http://chronicle.com/free/v44/i36/36a02901.htm

Mihir Parikh is an assistant professor at the Institute for Technology and Enterprise of Polytechnic University in New York City, where he leads several research initiatives and teaches executive degree courses in the management of information technology and systems, information and knowledge management, and digital strategy. As an active researcher, he has published over twenty papers. He has presented papers and lectured at various national and international forums in the USA, Europe, Taiwan, India, and Israel. He also consults in the information technology services, healthcare, financial services, and retail industries in the areas of information and knowledge management, strategic IT planning, digital strategy, and decision support. For his recent work, visit http://www.ite.poly.edu/people/mparikh.

Sameer Verma is an assistant professor of information systems at San Francisco State University, San Francisco, CA. His areas of interest include online learning systems, content management and delivery, event-driven architectures, and e-commerce. His focus is on the implementation of these systems via state-of-the-art broadband technologies and services. Details about his work can be found at http://verma.sfsu.edu/
