This article was downloaded by: [Debopriyo Roy]
On: 19 December 2012, At: 16:12
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Computer Assisted Language Learning
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/ncal20

Website analysis as a tool for task-based language learning and higher order thinking in an EFL context
Debopriyo Roy, Center for Language Research, School of Computer Science and Engineering, University of Aizu, Aizuwakamatsu, 9650001, Japan
Version of record first published: 19 Dec 2012.

To cite this article: Debopriyo Roy (2012): Website analysis as a tool for task-based language learning and higher order thinking in an EFL context, Computer Assisted Language Learning, DOI: 10.1080/09588221.2012.751549
To link to this article: http://dx.doi.org/10.1080/09588221.2012.751549

Full terms and conditions of use: http://www.tandfonline.com/page/terms-and-conditions
Computer Assisted Language Learning
2012, 1–27, iFirst article

Website analysis as a tool for task-based language learning and higher order thinking in an EFL context

Debopriyo Roy*
Center for Language Research, School of Computer Science and Engineering, University of Aizu, Aizuwakamatsu 9650001, Japan

Besides focusing on grammar, writing skills, and web-based language learning, researchers in CALL and second language acquisition have also argued for the importance of promoting higher-order thinking skills in ESL (English as a Second Language) and EFL (English as a Foreign Language) classrooms. There is solid evidence supporting the effectiveness of teaching analytical and critical thinking skills in a language classroom. This article argues that website analysis exercises and related design education might be one way to involve students in constructive writing practices and to promote critical thinking. Research evidence supports the website design process as a potentially valuable and energizing experience, and a rhetorical exercise in the technical communication classroom. Twenty-eight students participated in the in-class experiment reported in this article. The six-week experiment involved analyzing websites with open-ended questions, indirectly based on the established design guidelines of the web user experience model developed by Garrett (2011). Accuracy scores suggest that readers did reasonably well with questions on audience analysis and product goals, and showed promise with questions on navigation/information/interface design, with enough indication that, with more feedback and a structured assessment mechanism, the analytical ability of these non-native readers would improve, resulting in superior English text production.
However, variability in accuracy scores across weeks also indicates that more practice, feedback, and contextual exposure are required for performance improvement.

Keywords: website evaluation; user experience; EFL; higher-order thinking skills; analysis

1. Introduction

Imagine some of the problems that non-native speakers with low English language proficiency might confront when exposed to a language learning context in Moodle (a learning management system), where they have to experience an English website and then analyze it by answering several open-ended design queries in English. In a typical scenario, as observed by the author in class, such a task involved reading and comprehending the website content in English; performing selective translation of the content into the native language, or switching back and forth between the English and native versions if available; comprehending the design queries and their scope in the native language; and framing the responses to the design

*Email: droy@u-aizu.ac.jp
ISSN 0958-8221 print/ISSN 1744-3210 online
© 2012 Taylor & Francis
http://dx.doi.org/10.1080/09588221.2012.751549
http://www.tandfonline.com
questions mentally in their native language, before responding: some will try to respond directly in English, while others will take the help of translation software (Google Translate in most cases) and online and/or portable dictionaries (e.g. Weblio). It is probably worth exploring whether such use of multiple interfaces and applications might create a sense of a holistic learning environment, and whether it has the potential to enrich the process and quality of language learning.

Using website analysis as a tool for task-based learning for second language acquisition and higher-order thinking is a possible pedagogical approach that has not yet been researched in the CALL (computer-assisted language learning) literature. Task-based language learning (TBLL) focuses on the use of authentic language and on asking students to do meaningful non-linguistic tasks using the target language. Assessment is primarily based on task outcome (in other words, the appropriate completion of tasks) rather than on accuracy of language forms (http://en.wikipedia.org/wiki/Task-based_language_learning).

This study considered English website analysis for task-based language learning. Design analysis is basically a non-linguistic task, but here it necessitates authoring meaningful text (reporting the design analysis) in English. This study is important because we still have no understanding of the extent and efficiency with which novice L2 learners (with limited English language proficiency) engaging in introductory website analysis might be able to explain introductory design rationales in the target language (with limited background knowledge and feedback). Recent literature in CALL suggests a very strong focus on the use of applications like wikis (e.g.
theoretical or procedural information in Wikipedia or WikiHow could act as a strong aid for task-based language learning and help explain a phenomenon) and blogs (Wheeler & Wheeler, 2009; Williams & Jacobs, 2004), social networking sites (Harrison & Thomas, 2009), mobile-assisted language learning applications (Kukulska-Hulme & Shield, 2008), Second Life (Stevens, 2006), gaming (Ersoz, 2000), etc. for second or foreign language acquisition. The literature in CALL related to the use of these interfaces and programs focuses on the following aspects:

(1) How students could possibly use these programs and interfaces for target language acquisition.
(2) Different cognitive issues in learning, related to learner variability, cognitive styles, etc. (Crosby & Iding, 2007; Heift, 2008; Shang, 2007).
(3) How to develop learner-centered design rationales for virtual environments and for hypermedia design (Peterson, 1998a).
(4) How these interfaces should be designed so that students are able to use them for more efficient language acquisition (Lonfils & Vanparys, 2001; Peterson, 1998a,b).

The concept of interface design and usability has been approached from a variety of standpoints in the CALL literature, related to building websites, learning environments, courseware packages, exercises and authoring systems, etc. (Levy, 2002; Muir, Shield, & Kukulska-Hulme, 2003; Plass, 1998). Literature on usability in mainstream HCI facilitates a holistic understanding of the field. The above-mentioned areas of research focus almost entirely on design and usability guidelines for building elearning websites, courseware packages, etc., with some emphasis on how students might learn from these superior models of design and evaluate the worthiness of these "authentic" target language websites.
However, there has been hardly any attempt in the CALL literature to encourage students to understand how principles of website design and usability actually work. In this study, the focus is on teaching web design in a second language classroom and including students in the core of the design process. The central focus of this project is the efficiency with which EFL readers are able to write in English, expectedly as an outcome of design thinking, subsequent brainstorming, formal analysis, and explanation. The purpose of this research is NOT to measure higher order thinking, but to explore how such thinking processes could possibly promote readers' ability to write logically in English. The focus is on writers' analytical writing ability in English.

The strength of design education (in an analyzing context) lies in its ability to promote higher-order thinking skills (such as those in Bloom's revised taxonomy), original content production (Anderson & Krathwohl, 2001), and its potential for intensive writing in the discipline. For example, L2 learners use wikis for collaborative text production in the target language. They often use technical websites in the target language. But do they understand, and could they explain, what it is in the interface that they like to use, and what makes working with the interface or application fun/boring, engaging/disengaging, interactive/stand-alone, and informative/non-informative? Students' understanding of the design process and related decision-making will not only lead to better use of the interface, but will also facilitate and support research dealing exclusively with usability testing of elearning websites in an L2 context (Shield & Kukulska-Hulme, 2003).
In a language context, website or other interface analysis with specific design-based queries might be one way to promote analytical thinking through its focus on creating, evaluating, and analyzing (Atherton, 2002), and to promote active participation, argumentation, problem solving, conducting investigations, and tackling subject matter that is complex (Tytler, 2004). This proposed exploratory approach with website analysis will equip students to better understand the interfaces they use for elearning applications. The analysis of the results from the experiment reported in this article mainly discusses students' preliminary understanding of the website content, design, navigation, and usability, rather than whether their use of English (grammatically) in responding to design queries during website analysis was correct or deficient. The major point in the literature review concerning the need to think and analyze in the target language is all the more relevant since the entire analysis was conducted in English by non-native speakers.

2. Research focus and significance

There is literature reflecting on how critical thinking could be developed through problem-based learning as a pedagogical approach in a learning and teaching context (Kek & Huijser, 2011). But there is no relevant literature in CALL or second language acquisition that discusses how well students are able to accept design education in their mainstream language learning culture.

2.1. Research question

The major research question for this study relates to readers' (students with computer science majors in this EFL context) ability to successfully explain design decisions by studying specific websites of interest, in an English language course.
Specifically, the experimental study reported here attempts to understand whether readers are more comfortable with specific types of design queries as opposed to others, and whether they fare consistently better or worse over time. Is there any significant difference between user responses on specific design questions over time? Is there any pattern or practice effect in design scores to suggest improved analysis and writing (probably resulting from systematic design thinking over time)?

2.2. Significance

The experiment reported in this article is contingent on what introductory EFL readers can successfully process, and provides an indication as to whether, with more formative assessments along the way in terms of design guidance, sustained exposure to any specific website, task-oriented web-based activities, and feedback, an EFL reader could improve his/her ability to explain design decisions more diligently and selectively draw from the field of web design studies. The website analysis assignment in an EFL classroom is significant for various reasons:

(1) It allows for a comprehensive understanding of the extent to which readers could understand information organization, design, and layout.
(2) This assignment helped explain whether readers, with their current levels of English language processing ability and use of metacognitive reading strategies (e.g. skimming, scanning, reading headings only, bulleted points, etc.) in a foreign language context, are comfortable answering some questions as opposed to others, i.e. whether there is a statistically significant difference between responses to different questions that represent different categories in design analysis. Mean responses within and between categories will help explain if there is a practice effect between weeks of analysis.
Different web-based queries (both design and inference) pose different kinds of intellectual challenge in terms of design comprehension and text authoring as responses.
(3) Overall, the major purpose was to understand the degree to which the assignment helps EFL speakers produce original, analytical, and inference-based textual explanations of what they understand about the website design.

The exploration of writing itself is not central to the design of the study. Rather, the focus is on readers' ability to comprehend design queries and respond to them with some language competence and substantial content that demonstrates higher-order thinking skills. Also, the focus is not on measuring higher order thinking, unlike IQ measurement. Rather, the ability for purposeful website analysis indicates higher order thinking.

3. A review of the literature

Website analysis is an integral part of professional/technical communication courses (Anderson, 2006). Website evaluations or usability studies have been successful over the past several years (Zhang & Dran, 2000). They include conceptual discussions on what should be evaluated and how to do it (Instone, 1997; Nielsen, 2000). Literature in the area of technical communication suggests that web analysis might be a beneficial exercise (Spyridakis, 2000), and more so in an EFL context (Shield &
Kukulska-Hulme, 2003). Lee (2000), considering the motivational aspects of tasks that involve the Internet, describes creating and publishing web pages as a task that is "one of the most potentially valuable and energizing". Furthermore, there is solid research supporting the use of the internet and web-based writing in various forms, including the motivational and psychological aspects of networked and collaborative writing processes in ESL/EFL classrooms (Belisle, 1996; Krajka, 2000; Tan, Gallo, Jacobs, & Kim-Eng Lee, 1999; Trokeloshvili & Jost, 1997). Although writing in the target language is central to this literature, there is only marginal or secondary focus on systematic thinking, schematization, and the ability for information comprehension, design analysis, and inference-based responses.

3.1. Importance of analytical approach and higher-order thinking skills

There is research suggesting the integration of writing, critical thinking, and active learning in the classroom (Bean, 2011). Educators have argued for the importance of promoting higher-order thinking skills in ESL and EFL classrooms (Chamot, 1995; Tarvin & Al-Arishi, 1991), and empirical evidence supports the effectiveness of teaching critical thinking skills along with English as a second or foreign language (Chapple & Curtis, 2000; Davidson, 1994, 1995). Research findings related to cognitive development have shown that many aspects of reading and writing are pertinent to important thinking skills (Moffett & Wagner, 1983; Pearson & Tierney, 1984; Staton, 1984). Spada and Lightbown (1993) hold that thinking skills operate effectively when students voice their analysis and take part in the learning process occurring in the classroom.
Although there is little argument among theorists and educators about the interrelatedness of the development of language and thinking skills, in typical school settings language learning and thinking skills are often treated as independent processes (Suhor, 1984). In the tradition and transition of English language teaching methodology, the integration of language and thinking has been peripheral (Pica, 2000). Language as a way of thinking and learning has been more of a pedagogical catchphrase than an instructional practice. Kabilan (2000) argued that for learners to be proficient in a language, they need to be able to think creatively and critically when using the target language. Content-based teaching is an approach considered by many as an effective way to teach language skills while supporting the development of critical thinking. Through content-based instruction, learners develop language skills by thinking and learning in the target language (Brinton, Snow, & Wesche, 1989).

Krug (2000) pointed out that design needs to be intuitive. This is where the importance of a website analysis assignment as a critical thinking exercise makes a strong argument as a technique worth exploring. Van Hoosier-Carey (1997) described the website assignment as a rhetorical exercise in the technical communication classroom. It is important for readers to understand the conceptual process of designing a website, including the rationale of the project, the target audience, the purpose of the website, etc. (December & Ginsberg, 1995). In a more advanced EFL classroom, the experimental and goal-oriented nature of web design projects involves tasks such as deciding with a partner where to place a picture on a page being constructed, or browsing, which requires active choices of where to search next.
These tasks are claimed to help promote higher-order thinking skills (Mike, 1996), which include reviewing, scanning, selecting, and negotiating; particularly important for EFL students doing further studies in other disciplines are the research and rhetorical skills that may be developed. Furthermore, Warschauer
(1997) pointed out that web design skills (and collaboratively accessing and interpreting information) incorporate "situated learning": that which allows students "to carry out meaningful tasks, and solve meaningful problems, in an environment that reflects their own personal interests as well as the multiple purposes to which their knowledge will be put in the future" (Collins, Brown, & Newman, 1989). With the goal of designing and publishing web pages, students can actively make use of new technologies, skills, and knowledge. Although higher order thinking could not be directly measured as part of this study, the primary focus of this in-class experiment was to promote analytical thinking through intricate design analysis, and the quality of responses helped us indirectly assess the extent to which analytical design thinking was involved.

3.2. Reading, understanding and using English text in a Japanese context

In order to understand performance with website analysis, it is important to comprehend how typical Japanese readers process English text in general. Atkinson (1997) and Fox (1994) depict Japanese learners as group-oriented, harmony-seeking, hierarchical, and non-critical thinkers. This study is thus an attempt to promote critical thinking and reflective writing. Research on web literacy for novice internet users suggests that web page reading might often result in hasty and random choices with little thought and evaluation, resulting from individual reader preference, ability to process information, language skills, and need for comprehension or task completion (Eagleton, 2001). Also, some web pages will be heavy on text, while others might not be. Thus, readers automatically and naturally confront linguistically variable, simple, and complex text.
Research suggests that exposure to these text modifications will result in different levels of comprehension among Japanese students (Yano, Long, & Ross, 1994). Also, students have other strategies to make reading and comprehension relatively more effective. Students in the Japanese EFL context have extensively adopted metacognitive reading strategies like skimming, scanning, reading headlines, using dictionaries, translation, etc. in analyzing technical texts, and results have shown superficial and often surface-level ability to comprehend, analyze, and write about a technical text (Roy, 2010); similar results were observed in other EFL situations (Maghsudi & Talebi, 2009). An English website analysis assignment in an EFL context is not only a reading exercise; it also incorporates readers' ability to successfully analyze information organization, design and layout, grouping, navigation, audience analysis, etc. Nielsen (1997) has demonstrated that the website analysis task is different from any other reading task, probably because it requires an analytical mindset, analysis, evaluation, and resultant English text production in a specific design context. Also, the ability to explain a design and layout might not always require reading and comprehending the entire text of the web page. At other times, readers might get by with understanding the headlines, the menu items, the introductory sentence of a paragraph, etc. (Nielsen's Alertbox, 1997).

3.3. Web design principles

Creation of a successful web page should logically involve a proper understanding of web design principles. To ensure the best possible outcome, designers should
consider a full range of user interface issues (e.g. audience analysis; user requirements; user expectations related to navigation, content and organization; colors and graphics, etc.) for the best possible human performance (Adkisson, 2002; Baca & Cassidy, 1999; Koyani, Bailey, & Nall, 2004; Lynch & Horton, 2002; Spyridakis, 2000; Zimmerman & Akerelrea, 2002). Based on these relatively broad guidelines, and on the different aspects of user experience design evident in the model proposed by Garrett (2011) and shown in Figure 1, specific questions were designed on audience and task analysis, product goals, information and interaction design, information architecture, etc. Garrett's (2011) model defined the following design classifications; these definitions, with examples, were explained in class before students engaged with the weekly assignments reported in the following sections.

(1) Information Design – The presentation of information in a way that facilitates understanding.
(2) Interface Design – Arranging interface elements to enable users to interact with the functionality of the system.
(3) Navigation Design – The set of screen elements that allow the user to move through the information architecture.
(4) Interaction Design – How the system behaves in response to the user.
(5) Information Architecture – The arrangement of content elements to facilitate human understanding.
(6) Functional Specifications – A detailed description of the "feature set" of the product.
(7) Content Requirements – A description of the various content elements that will be required.
(8) Audience Analysis – A profile of the users who would be using the website.
(9) User Needs – What users would like to see in the website.
(10) Site Objectives – Business or other kinds of goals.
Figure 1. Total score for the websites analyzed.

Garrett's (2011) model for user-centered web interface design was adopted for this study due to its clarity about the structure of the design, its promotion of sequential and structured thinking about website design, the explicit purpose behind
each step in the design process, and the range of ideas behind how to think about design.

(1) Exploratory research: This paper argues that using website analysis as a tool for task-based learning engages readers in the critical thinking required to enhance language learning. The measure of critical thinking lies indirectly in the keywords and sentences that make up the responses to the design and inference-based queries.

(2) Participants: Twenty-eight (N = 28) junior-level students with advanced standing took part in the study. The sample adopted for this study represents the general EFL population in countries like Japan. The participants were computer science majors at a Japanese technical university, in the process of completing 15 credits of English language study as part of the university requirement, and had also taken or were taking other courses where English is the medium of instruction. The general EFL population in this university has demonstrated pre-intermediate English language processing ability, as evidenced by regular class observations and similar writing assignments over the semesters and across different student groups. The pre-test questionnaire suggests that the participants (male = 21; female = 7), like other students in this university, are mostly without any background in website analysis and design. This was their first experience with information and user-centered design, and they represent an EFL population whose English processing skills are pre-intermediate at best. This sample represents any population where individuals do not have any specific background in information design or web analysis, but have experience reading English text and processing it with the strategies (metacognitive approaches) mentioned previously.
The above characteristics would fit any similar EFL context where students might have some technical skills but no analytical skills of the kind used in this study, which is why the sample chosen for this study could be aptly generalized to similar EFL learning situations.

4. Methods

The entire experiment was conducted in English as part of an elective course in writing and design. At the beginning of the course (hosted in Moodle), and before the experiment formally started, the participants were given a trial website evaluation exercise with a detailed explanation of the questions to be asked during the actual class assignment that followed over the subsequent weeks. Further, a detailed in-class lecture and discussion followed, with examples and demonstrations of why website evaluation is an important topic, and with discussions on content analysis, navigation, page layout, etc. The practice forum assignment ensured that every student could see what others had done and adopt/adjust their writing strategy accordingly. Students had over two weeks to adjust to the questions and the lectures and to get more ideas from what other students had written. The graded website evaluation assignment did not start until the third week of the course and continued until the eighth week (six weeks in total).
As part of the weekly website evaluation exercise, students had eight questions to answer on a given website every week, on topics related to content, navigation, usability, audience analysis, marketability of the website, technical efficiency, etc., and submitted their assignment in Moodle. A different website was used every week during the course of the experiment. Table 1 shows the design categories and the related questions. The evaluation questions were indirectly based on the web design guidelines and models mentioned above (Garrett, 2011). However, since this is an EFL web evaluation context, with limited English language efficiency and reading and comprehension skills, the questions were modified, trimmed, simplified, and summarized to suit a more efficient level of understanding and task completion. More emphasis was placed on whether students understood the overall scope of the questions, had an overall idea of how to read/scan through the English website to get an answer, and could then explain the answers with adequate English writing proficiency.

Standard corporate websites were chosen that might interest computer science students. Every chosen website belonged to a brand-name company. Interest in the content, the technology used, scope for research, laboratories, product diversity, marketing strategies, etc. were the preconditions on the basis of which the websites were selected for evaluation. The final decisions on the choice of websites were based on informal in-class discussion. For the assigned eight questions every week, students had a week to consult their group partner and reach a consensus on the best way to answer each question. However, each participant wrote their own response in Moodle.
Following the end of the third weekly web evaluation exercise, students were asked to present orally on what they had completed during the first three weeks (this exercise was not scored). The short presentations focused on a summary discussion of their website evaluation strategies and how they carried them out. Each person's presentation was followed by a short discussion session. This was considered a limited formative assessment strategy within the context of the course. At the end of the sixth week of the scored (graded) website evaluation exercise, students were handed a post-test questionnaire designed to elicit a reflective self-report of how they approached the website and the evaluation process.

4.1. Instruments

Moodle was used for assignment posting and submission. Each of the eight weekly questions was open-ended and had a minimum length of around 60 words. Open-ended questions were designed so that participants were able to voice their opinions and approaches openly, without being influenced or constrained in any way by pre-defined options presented to them, and without being affected by the randomness of response inherent in multiple-choice answers. Also, open-ended questions captured the sense of how participants interpreted each question. It is possible that, for a given question, participants ended up interpreting the question in a way not intended by the instructor. Creativity in thinking might be rewarded in this context. The assessment criteria and approach ensured that the subjectivity of the score for each response was minimized and reflected the true quality of the answer.
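The roughly 60-word minimum for each open-ended response could be checked mechanically; the sketch below is purely illustrative (the function name and the exact threshold handling are assumptions, not part of the study's Moodle setup).

```python
# Illustrative sketch only: the article states a minimum of around 60
# words per open-ended response. The names used here are hypothetical,
# not taken from the study's materials.

MIN_WORDS = 60  # approximate minimum length mentioned in the article

def meets_minimum(response: str, minimum: int = MIN_WORDS) -> bool:
    """Return True if the response reaches the minimum word count."""
    return len(response.split()) >= minimum

# A one-sentence answer falls well short of the minimum:
print(meets_minimum("The navigation menu is clear and easy to use."))
```

Splitting on whitespace is a crude word count, but it is enough to flag obviously short answers before grading.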
Table 1. Website analysis questions and focus (based on Garrett's model).

Information Design
  Q1. Explain whether the organization of information in the site is user-friendly or not.
  Content criteria for assessment: 1. Possible interaction with the website. 2. Clicks on menu items.

Navigation Design, Interface Design
  Q2. Explain whether the presentation of content is appealing or not.
  Content criteria for assessment: 3. Overall impression of visual layout. 4. Text-graphic balance, headings, paragraphs, white space, etc.

Interaction Design, Interface Design
  Q3. Explain whether effective use of technology is demonstrated.
  Content criteria for assessment: 1. Were the technologies available in the website for user access actually accessed? 2. Are technologies mentioned in the website to be purchased/used by customers?

Audience Analysis, User Experience
  Q4. Who is the target audience? Is the website appropriate for the project audience?
  Content criteria for assessment: 1. Who is most likely to visit the website? 2. Who might be interested in the products/services/research?

Text-Graphic Content
  Q5. Explain the quality of the content.
  Content criteria for assessment: 1. Do you like the graphics? 2. Have you read the text? 3. Did you read enough to check whether the graphics support the text content?

Navigation Design
  Q6. Is the information accessible?
  Content criteria for assessment: 1. Did you click on different items in the drop-down/side/bottom menus and visit linked pages successfully? 2. If you were looking for any specific information, could you access it successfully? Is the most important information easily available?

How Practical is the Content?
  Q7. Explain whether the resources use real-world scenarios.
  Content criteria for assessment: 1. Were you able to locate the technologies/services/products available for sale? 2. What are the web technologies mentioned? 3. What do you think is the importance of the products/services on sale? Do you think the product/service/website successfully meets your needs?

Product Goals
  Q8. Here are some common reasons for building this website. Rank them in order of importance. Do you have a reason that is not listed?
  Content criteria for assessment: 1. Provide an argument for each (or most) of the points mentioned in the list, OR 2. Provide a summarized argument to support the reasons for building the website. 3. Some preference demonstrating the need for building the website observed.
4.1.1. Overall assessment mechanism and scoring (evaluating higher-order thinking skills)

The Facione Holistic Scoring Rubric (Facione & Facione, 1994), developed to assess critical thinking skills, was referenced during grading. The Facione scoring rubric was used as an overall reference, along with the overall list of broad criteria (as mentioned below) and the content criteria mentioned in Table 1, to measure higher-order thinking skills.

4.1.2. Actual scoring

Each open-ended question could be allotted a maximum of six points for an accurate response. The scoring method for each question is described below, with each criterion carrying one point. Examples had to be provided in support of each response, and there had to be clear evidence to support each response.

(1) Did the reader understand the question completely?
(2) Did the reader make an attempt to answer the question as it was asked?
(3) Could the reader explain what he/she saw during interaction with the website?
(4) Is there enough evidence to suggest that the reader made an attempt to understand the given website?
(5) Is the answer grammatically reasonable and of acceptable quality?
(6) Is there evidence to suggest that the reader consulted his group partner? Did they collaborate on the chosen answer?

4.1.3. Qualifying for negative points

(1) If it appeared that a significant portion of the text had been copied from an earlier week, without adequate analysis of the website assigned for the week, one point was deducted from the overall score (out of 6) for each question.
(2) Another important criterion was whether students made reference to the website under study, to explicitly demonstrate that they had taken a look at it. Otherwise, one point was deducted from the total score out of 6.
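The scoring scheme above (six one-point criteria per question, minus up to two penalty points) can be sketched as a small function. This is an illustrative reconstruction, not the instrument actually used in the study; the function name and the clamping of scores at zero are assumptions.

```python
def score_response(criteria_met, recycled_text=False, missing_website_reference=False):
    """Score one open-ended response.

    criteria_met: six booleans, one per scoring criterion (1 point each).
    recycled_text / missing_website_reference: penalty conditions (-1 point each).
    """
    assert len(criteria_met) == 6
    score = sum(bool(c) for c in criteria_met)
    if recycled_text:
        score -= 1
    if missing_website_reference:
        score -= 1
    return max(score, 0)  # assumption: a response cannot score below zero

# a weekly total over the 8 questions (maximum 8 * 6 = 48 points)
weekly_total = sum(score_response([True] * 6) for _question in range(8))
```

This makes the arithmetic behind the weekly maximum of 48 points explicit.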
Next, readers were expected to provide some level of detail about what they saw in the webpage and what they thought it meant. The reason a general explanation without any reference to specific headlines, web page features, etc. counted for a lower grade is that there is often no way to ensure whether the explanation is about the webpage under discussion, or whether it has simply been recycled from an earlier week and a different topic. A general explanation without explicitly mentioned details, headlines, features, etc. might fit similar but different contexts, and there is no way to measure actual reader effort. A tendency to recycle materials has often been observed during classroom teaching practices and when grading assignments in Moodle. The responses were evaluated on the basis of whether readers had an overall understanding of what the website suggests, and/or whether the responses touched upon what was intended in the designed question. It is NOT expected in this EFL context that
readers can make finer differentiations between the responses, or go very deep into the analysis. A relatively accurate yet surface-level analysis was considered acceptable. It was considered natural that readers would pick only one or two specific points to answer the questions in the context under study. Table 1 demonstrates an example of the open-ended questions as posted in Moodle. This assignment was designed to develop insight into web page analysis, with performance that might improve over time with repeated practice and more guidelines-based feedback. That is precisely why, along with web evaluation, each week students worked on developing a web page on a given topic in computer science, towards an all-round development. Moreover, students also designed a preliminary concept map each week, explaining the design of a website they analyzed as part of this study.

4.2. Webpage design and focus

The weekly websites were chosen with an effort to maintain relative uniformity and consistency in terms of content complexity, topic and presentation. Table 2 demonstrates the characteristics of the websites chosen for testing.

5. Findings

In this section, the focus is on how readers performed over the six weeks and how the web evaluation accuracy scores compare across the six weeks. SPSS software was used for statistical data analysis. The analysis examined whether there is a significant difference in performance across weeks, whether there is a significant practice effect between performance measurements, and how performance in the first week compared with subsequent weeks. The analysis also examined any significant differences between the accuracy scores for each of the eight questions asked every week, over the 6 weeks.
A reliability analysis was performed for the eight questions each week, to test whether the responses to each question and the accuracy scores they received could reliably be counted towards the total score each week. The Cronbach's alpha test produced a coefficient of .86 across the accuracy scores for the eight questions during the first week, suggesting a strong correlation between responses to individual questions. A test–retest reliability analysis produced significant correlations between the weekly total scores in the majority of cases (correlations were significant at the .01 level). A split-half reliability value of .74 was also obtained. Table 3 shows the mean accuracy scores for the different categories of questions across the 6 weeks. Before the actual experiment, an inter-coder reliability analysis was performed with multiple-choice answers (four options), asking readers to tick the option that best represented the design query asked. Each coder had to choose the correct option for each of the eight design queries, and 90% agreement was reported on the choices. This helped us understand whether the questions were comprehended as expected. Table 3 shows that readers were relatively more successful with questions related to audience analysis/user experience and explaining the product goals. Question 4 started with a high mean value (M = 4.75) and maintained around M = 3.5 on average for the following weeks. Similarly, the question on product goals started with M = 5.04 and maintained an average of M = 3.5 or more over subsequent
Table 2. Website purpose, student interest and interface features.

IBM Research Tokyo (Week 1)
  Website focus: Promotes research in computer science at the IBM research laboratories.
  Potential interest for students (based on preliminary class discussions): Students interested in research topics in computer science, industry research, and working with IBM.
  Interface features on the home page: Heavy on text. A nice, short arrangement of a clickable menu list on the left. The top menu takes the visitor to the general IBM webpage. The top center of the home page provides a clickable list of other research facilities globally. The center of the page gives an overview of the Tokyo facility, provides press releases, and offers a clickable list of the core research competencies.

Oracle Argus Japan (Week 2)
  Website focus: Provides a business model to enable global pharmaceutical companies to integrate Japan into their worldwide business operation.
  Potential interest for students: Provides a model for a single global database with a unified workflow for management of worldwide drug safety information.
  Interface features on the home page: Moderately heavy on text. The left menu provides a list of major items related to its operation. The right menu presents a detailed list of items making news. The top menu provides general Oracle items like products, downloads, training, software, etc. Any click from the home page takes the reader to a globally pertinent Oracle page.

Microsoft Japan (Week 3)
  Website focus: Focus on Microsoft products and services available in the market; provides information on products available in the Japanese market.
  Potential interest for students: Shows the product specifications, the recent models and training available for customers, and gives an overall general idea of how a computer science major might contribute.
  Interface features on the home page: A general Microsoft page is provided, but readers often switched between the English and Japanese pages. Short chunks of text, with a heavy balance between text and graphics. Standard top menu with products, services, downloads, etc. No textual description on the home page; the focus is on recent products, popular downloads, product support, etc.

Dell USA (Week 4)
  Website focus: Focus on Dell products and services available in the market; provides information on products available in the USA market.
  Potential interest for students: Shows the product specifications, the recent models and training available for customers, and gives an overall general idea of how a computer science major might contribute.
  Interface features on the home page: Very similar interface arrangement to Microsoft. Very similar text-graphics coordination quantitatively. The top menu is defined on the basis of the type of customer for whom the products are designed. A roll-over on the menu items allows a click on the product from a drop-down menu.

Mitsubishi TV (Week 5)
  Website focus: A Japanese TV electronics company; Mitsubishi TV focuses on the range of high-definition televisions available in the Japanese market. Demonstrates products designed for home and office entertainment and work.
  Potential interest for students: These are products not directly linked to computers and/or computer applications. Computer science majors get an idea about the user interface and specific hardware installations, and the site opens up the floor for students to explore how computer science majors might contribute.
  Interface features on the home page: Completely graphical interface with little text. Provides a nice visual effect. Provides a menu with the three major types of TV that the company specializes in. Each menu click takes the reader to an audio-visual page where a flash video in English plays inside a TV interface. Short text-based product descriptions are provided in linked pages.

Japan Tourism (Week 6)
  Website focus: Mainly promotes the country Japan with descriptions of places, people, language, food, tourism destinations, hotels, things to do, etc. Tourism as a service and product is demonstrated.
  Potential interest for students: No direct relation to computer science products and services. Students were told to explore the website as user-experience specialists and discuss what draws their attention.
  Interface features on the home page: The front page first asks visitors to choose a language. The home page is moderately heavy on text, with a nice overall balance of text and graphics. The top menu provides a range of options, from photo galleries to movie channels. The center of the page provides a clickable list of guest testimonials, recent topics, news and events. The left menu provides a clickable list of amusement, shopping, gastronomy, etc.
weeks. As for the design queries, most of the design scores started around 4, but the overall averages were lower in later weeks. Question 3 (interaction design/interface design) started with M = 3.39, but performance suffered over the later weeks, with an average value around M = 3.00. The overall success with Q1 and Q2 is also interesting. Overall, the accuracy score for Question 8 is the best, and it also maintained a relatively high value across all 6 weeks. Table 4 shows the Pearson correlation values between the total scores each week; a strong correlation between the weekly total scores suggests similar complexity of the web pages used and/or similar performance with different websites. Only significant values are shown. Table 4 suggests significant correlations between weekly totals (significance values around .001–.005), indicating overall similarity between weekly performances. Although the participants in this analysis were part of the same course, the lack of specific information about their background with web analysis, other courses taken on design or analysis, and their variable levels of English language proficiency did not

Table 3. Mean accuracy scores for different design considerations (Weeks 1–6).

Q1 Information Design: 4.36, 4.43, 3.79, 3.46, 3.57, 3.14
Q2 Navigation Design/Interface Design: 4.14, 4.00, 3.50, 3.14, 3.18, 3.32
Q3 Interaction Design/Interface Design: 3.39, 3.39, 2.89, 3.00, 2.93, 2.93
Q4 Audience Analysis/User Experience: 4.75, 3.96, 3.79, 3.29, 3.07, 3.50
Q5 Text-graphic content: 4.04, 3.32, 3.00, 2.50, 2.36, 3.00
Q6 Navigation design: 3.21, 2.89, 2.79, 2.54, 2.11, 2.68
Q7 How practical is the content?: 2.86, 2.71, 2.43, 1.68, 1.93, 2.39
Q8 Product goals: 5.04, 4.39, 4.21, 3.96, 3.50, 3.18

Table 4. Significant Pearson correlation values between the totals for each week.
Pearson correlation (weekly total scores) | Value | Significance (2-tailed)
W1–W2: .559, .002
W1–W3: .722, .000
W1–W6: .575, .001
W3–W2: .536, .003
W3–W4: .518, .005
W3–W5: .543, .003
W3–W6: .790, .000
W4–W5: .603, .001
W4–W6: .457, .012
W2–W5: .379, .047

Note: Wi = total score for week i based on the 8 questions asked. Each question could receive a maximum accuracy score of 6 points, so the weekly total could be a maximum of 6 × 8 = 48 points.
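The reliability statistics reported earlier (e.g. the Cronbach's alpha of .86 for the first week's eight accuracy scores) were computed in SPSS, but the alpha coefficient follows directly from the standard formula. Below is a minimal sketch in Python; the 28 × 8 score matrix is invented purely for illustration, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_participants x n_items) score matrix."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # per-question variance
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of participants' totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# hypothetical data: 28 participants x 8 questions, accuracy scores 0-6
rng = np.random.default_rng(0)
ability = rng.integers(2, 6, size=(28, 1))  # shared "ability" term induces correlation
scores = np.clip(ability + rng.integers(-1, 2, size=(28, 8)), 0, 6).astype(float)
alpha = cronbach_alpha(scores)
```

When the eight item scores move together across participants, as the study's .86 suggests they did, alpha approaches 1; perfectly identical items yield exactly 1.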
allow for the assumption that the sample was normally distributed. So, a non-parametric Friedman test was performed to test whether there is an overall statistically significant difference between the total mean ranks each week (for a total of 6 weeks); it also tested for differences between the weekly mean scores (over the 6 weeks) for each of the eight questions. Table 5 shows the total score, mean rank and standard deviation for the total score each week. The total score values from Week 1 to Week 6 suggest a drop in performance from 31.82 to 24.14, with the lowest weekly total recorded for Week 5 (total score = 22.64) and a slight improvement in Week 6 (total score = 24.14). The mean rank also shows a dramatic drop from 5.00 to 2.95. Overall, mean score and mean rank show a consistent drop in performance between Week 1 and Week 6. The best performance was recorded for the IBM Research Tokyo website (total score = 31.82) and the worst for Mitsubishi TV (total score = 22.64). Figure 1 shows the total score in terms of the website analyzed across weeks. Table 6 presents the results of the Friedman test for each of the eight questions (across the 6 weeks). This test shows whether there is any significant difference between the scores on any given question over the 6 weeks. The null hypothesis is that the distributions of the ranks of each type of score across weeks are the same. Table 6 provides the test statistic (chi-square). Results suggest a statistically significant difference between the mean ranks of the weekly total scores (χ² = 36.324; significance = .000). Results also suggest statistically significant differences between responses for most other individual questions over the 6 weeks of the trial (with significance values ranging from .000 to .005).
However, we see non-significant values for Q3 (.334) and Q2 (.011). The Friedman test only reports whether there are overall differences; it does not pinpoint which combinations of weekly scores (for any given question) differ from each other. In order to find exactly where the differences lie, post-hoc tests were performed: the Wilcoxon signed-rank test was run separately for each combination of scores for the same question over the six weeks. For each question there are five combinations over the 6 weeks.

5.1. Post-hoc analysis for the weekly scores on individual questions

Table 7 shows the values of the Wilcoxon signed-rank test for the same question between weeks, that is, whether there is any statistically significant difference between responses for each of the eight questions when compared across the 6 weeks.

Table 5. Total scores, mean ranks and standard deviation for weekly total scores.

Weekly total | Total score | Mean rank | Standard deviation
W1: 31.82, 5.00, 9.37
W2: 29.11, 4.00, 10.87
W3: 26.39, 3.82, 8.68
W4: 23.57, 2.68, 8.48
W5: 22.64, 2.55, 9.11
W6: 24.14, 2.95, 8.10
Table 6. Friedman test values for each of the 8 questions and the total score, Weeks 1–6 (df = 5).

Question | Chi-square | Asymptotic significance
Q1: 16.838, .005
Q2: 14.807, .011
Q3: 5.724, .334
Q4: 25.551, .000
Q5: 34.406, .000
Q6: 17.573, .004
Q7: 26.450, .000
Q8: 31.787, .000
Total score: 36.324, .000
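The study ran the Friedman test in SPSS; the same test is available in SciPy as `friedmanchisquare`. The sketch below applies it to hypothetical weekly totals for 28 participants with a downward drift loosely resembling the one reported; the data are invented for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_participants = 28

# hypothetical weekly total scores (out of 48), drifting downward over 6 weeks
weekly_totals = [np.clip(rng.normal(32 - 1.8 * week, 6, n_participants), 0, 48)
                 for week in range(6)]

# H0: the rank distributions of the six weekly totals are the same
chi_square, p_value = stats.friedmanchisquare(*weekly_totals)
```

A small p-value, as in the reported result (χ² = 36.324, p = .000), leads to rejecting the null hypothesis of identical rank distributions across weeks.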
Table 7. Wilcoxon signed-rank test between weekly scores for a given question.

Question #1: Week 2–Week 1: Z = −.038, p = .969; Week 3–Week 1: Z = −1.640, p = .101; Week 4–Week 1: Z = −2.103, p = .035; Week 5–Week 1: Z = −1.954, p = .051; Week 6–Week 1: Z = −3.202, p = .001
Question #2: Week 2–Week 1: Z = −.594, p = .552; Week 3–Week 1: Z = −1.729, p = .084; Week 4–Week 1: Z = −2.370, p = .018; Week 5–Week 1: Z = −2.369, p = .018; Week 6–Week 1: Z = −2.519, p = .012
Question #3: Week 2–Week 1: Z = −.077, p = .939; Week 3–Week 1: Z = −1.953, p = .051; Week 4–Week 1: Z = −.996, p = .319; Week 5–Week 1: Z = −1.167, p = .243; Week 6–Week 1: Z = −1.968, p = .049
Question #4: Week 2–Week 1: Z = −2.524, p = .012; Week 3–Week 1: Z = −2.845, p = .004; Week 4–Week 1: Z = −3.100, p = .002; Week 5–Week 1: Z = −3.530, p = .000; Week 6–Week 1: Z = −2.861, p = .004
Question #5: Week 2–Week 1: Z = −1.735, p = .083; Week 3–Week 1: Z = −2.509, p = .012; Week 4–Week 1: Z = −3.463, p = .001; Week 5–Week 1: Z = −3.685, p = .000; Week 6–Week 1: Z = −3.385, p = .001
Question #6: Week 2–Week 1: Z = −1.203, p = .229; Week 3–Week 1: Z = −1.360, p = .174; Week 4–Week 1: Z = −1.869, p = .062; Week 5–Week 1: Z = −3.115, p = .002; Week 6–Week 1: Z = −1.088, p = .277
Question #7: Week 2–Week 1: Z = −.225, p = .822; Week 3–Week 1: Z = −2.294, p = .022; Week 4–Week 1: Z = −3.476, p = .001; Week 5–Week 1: Z = −3.729, p = .000; Week 6–Week 1: Z = −1.665, p = .096
Question #8: Week 2–Week 1: Z = −2.029, p = .042; Week 3–Week 1: Z = −2.404, p = .016; Week 4–Week 1: Z = −2.750, p = .006; Week 5–Week 1: Z = −3.153, p = .002; Week 6–Week 1: Z = −3.531, p = .000

A Bonferroni adjustment was applied to the results of the Wilcoxon tests to account for multiple comparisons. With an acceptable significance level of .05, the Bonferroni-adjusted level is .05/5 = .01, where 5 is the number of tests run for each question. If p ≥ .01, there is no statistically significant difference between the two weekly scores for a given question. Table 7 reports the test statistics. For question #1, we see a significant difference between the Week 6 and Week 1 scores.
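The post-hoc procedure just described (Wilcoxon signed-rank tests of each later week against Week 1, judged at a Bonferroni-adjusted level of .05/5 = .01) can be sketched with SciPy as follows. The paired scores are hypothetical stand-ins for one question's weekly scores, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 28
week1 = rng.normal(4.4, 1.0, n)  # hypothetical Week 1 scores for one question
later_weeks = {week: week1 - rng.normal(0.3 * (week - 1), 0.8, n)
               for week in range(2, 7)}

bonferroni_alpha = 0.05 / len(later_weeks)  # .05 / 5 = .01
results = {}
for week, scores in later_weeks.items():
    statistic, p = stats.wilcoxon(scores, week1)  # paired, non-parametric
    results[week] = {"p": p, "significant": p < bonferroni_alpha}
```

Each comparison is declared significant only when its p-value falls below the adjusted threshold, mirroring the decision rule applied to Table 7.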
For question #4, statistically significant differences are observed between weekly scores in four out of five cases. For question #5, statistically significant differences are observed in three out of five cases. For question #6, we see a significant difference between the Week 5 and Week 1 scores. For question #7, statistically
significant differences are observed in two out of five cases, and for question #8, statistically significant differences are observed in three out of five cases. The Wilcoxon signed-rank test was performed in relation to Week 1 only, so that one can observe whether there is any significant difference in score for the subsequent weeks when compared with performance during the first week.

5.2. Post-hoc analysis for the weekly total scores

Wilcoxon signed-rank tests were performed on the different combinations of total weekly scores, to pinpoint whether there is a statistically significant difference between weekly total scores for the 28 participants. This test compares two sets of scores that come from the same participants. A Bonferroni adjustment was applied to the results of the Wilcoxon tests to account for multiple comparisons. With an acceptable significance level of .05, the Bonferroni-adjusted level is .05/15 = .003, where 15 is the number of tests run. If p ≥ .003, there is no statistically significant difference between the two weekly total scores. Table 8 demonstrates that in most cases there is no statistically significant difference between weekly total scores. However, statistically significant differences can be seen between Weeks 3 and 1, 4 and 1, 5 and 1, and 6 and 1.

5.3. Summary of responses to the post-test questionnaire

The post-test questionnaire asked readers about their metacognitive assessment strategies, such as scanning through the web pages, reading headings only, reading the first line of a paragraph, and so on. Metacognition is a complex construct, decontextualized from the skills learnt about design principles.
Metacognitive skills are not directly observable, and might involve both working memory and verbal ability (Kuhn & Dean, 2004).

Table 8. Wilcoxon signed-rank test between weekly total scores.

Comparison between weekly totals | Z score | Asymptotic significance (2-tailed)
W2–W1: −1.578, .115
W3–W1: −3.324, .001
W4–W1: −3.360, .001
W5–W1: −3.728, .000
W6–W1: −3.579, .000
W3–W2: −1.255, .210
W4–W2: −2.212, .027
W5–W2: −2.706, .007
W6–W2: −2.201, .028
W4–W3: −1.551, .121
W5–W3: −2.029, .042
W6–W3: −1.988, .047
W5–W4: −.798, .425
W6–W4: −.331, .741
W6–W5: .794, .427

Self-reports clearly suggested that
readers resorted to metacognitive strategies when reading the web pages every week. For example, there was reasonable agreement with the statement that, when reading text from the websites, they focused only on the title and the topic sentence of the first paragraph; further, there was some agreement that readers simply tried to read only the main ideas on a web page. They did not want to read through the entire text. They tried to guess meanings and used electronic dictionaries to check meanings, but they mostly disagreed with the statement that translator software was used for authoring responses to the queries (this self-report is in sharp contrast to the observed in-class behavior).

6. Discussion

The main purpose of this experiment was to expose readers to a wide variety of web pages with similar levels of interest and motivation, but variable degrees of content complexity. No specific task was assigned (related to the web pages), as is most often done in usability testing. This helped us understand whether design education in this specific EFL context should be framed within the scope of a specific website and task orientation, or whether design analysis should be conducted with open access and navigation, with variable content and individual choice of exposure. Total scores between weeks do not suggest any reasonable practice effect, and no marked improvement is seen as such. On the contrary, performance actually dropped, and the final total score in the sixth week is reasonably lower than performance during the first week. In some cases, performance dropped and then improved later on, but only to a limited extent. Also, with most individual questions, a drop in performance was observed when compared to the first week, although for a handful of questions some recovery in performance around the third week or later was recorded.
However, the lower scores were often due to language deficiency rather than the ability to analyze the design (as was evidenced by poor sentence construction in the responses). The data suggest that readers did reasonably well with the design analysis; in particular, the subjective responses in English to the questions on audience analysis and product goals were reasonably well thought out and reflect higher-order thinking skills. However, overall accuracy scores and subjective responses suggested that readers were not particularly comfortable with Q6 (navigation design) and Q7 (how practical is the content?). The likely reason readers did not do well with navigation design is that they were not exposed to a use-case scenario (a use-case scenario here represents a single navigational path that should be followed to complete the actions required to enable or abandon a website goal), where they could actually navigate between pages to get a specific task done. Unless a task-oriented situation is presented, it might be very difficult for novice readers to differentiate between interface design, navigation design, and information design as distinctly separate categories. Readers specifically found it difficult to comment on the practicality of the content. Some of them described the web technology used in the design of the website as practical, whereas others commented on the product sold by the company as practical. So, basically, it was difficult for them to understand the broad scope of the question. However, they did receive credit for improvisation in the thought process and for the fact that they attempted to answer it within the scope of what they thought made sense. Importantly, they
demonstrated reasonable higher-order thinking skills when attempting to answer most of the questions. They also received points for producing text to analyze, evaluate, and create an explanation of how the web page was created and what it meant. More often, they lost points because the explanation was not sufficient or did not conform to the language-level guidelines adequately. One possible reason for the overall drop in performance (the total score for each question and the overall total score each week) could be that students did not use the design guidelines as a reference and did not apply them to the web design analysis. As observed during class sessions, they often ignored the design guidelines. This behavior is similar to not using user manuals to understand a system; rather, a ''learn as you use'' approach often appears more practical to the user of a product (Schriver, 1997). Another issue that needs attention regarding the drop in performance is that the web analysis, as practiced, focused more on the extent to which readers could produce language as an analytical exercise, and was removed from the professional landscape needed to analyze design successfully. Unless readers spend adequate time with the content of the website to be analyzed and develop a deeper understanding of the context, with a more structured focus on web analysis (e.g. talking to the users of the website, product goals, how the website was originally intended to be used, etc.), the practice effect with web analysis over weeks might not show any visible improvement. Probably, it will quickly turn into a regular class assignment with very little challenge and few production benefits.
An extensive formative assessment (Crooks, 2001), through explanation of the context and a weekly structural analysis following Garrett's (2011) model, might add challenge and direction to what is expected from the web analysis exercise. Another reasonable idea might be to dedicate weekly topics to the design development stages (based on Garrett's model) and work on each draft in iterations. Each stage might be assigned 2–3 weeks, and students could develop guidelines to analyze stages like strategy, scope, structure, skeleton and surface. Students might be assigned two websites in a team, and they could design questions and responses for each stage in the web development process for the assigned websites. This would enable them to understand how each stage in the process works. However, there might be a different way to situate the problem of this almost non-existent practice effect. Changing the content every week (the website to be analyzed) might not help readers gain any interest in the topic, or understand the design and the possible applications intended. Readers might need more than a week to become familiar with the content and to be able to analyze it with some success (assuming they have, or have developed, interest in the website content). Once they are familiar with the content and the design principles, they can apply them to changing design situations with more success. But that will come through practice and the opportunity for more in-depth analysis. Students in an EFL context are not specialists and cannot be expected to function like independent professional consultants. They are still grappling with grammatical issues on the way to communication analysis in the given website context. Reasonably, they cannot embrace any website, analyze it professionally, and then write a report about it. In most cases, they just might not have the design skills and language proficiency to do that.
Thus, continuous exposure to a topic, formative evaluations and feedback, and sustained familiarity and application opportunities with design principles might actually show a significant practice effect over time.
7. Implications for teaching and recommendations

Teachers in a technical communication course (in an EFL context) might need to adopt a systematic approach to teaching web design analysis. Website analysis in a technical communication class should clearly let students explore the various design stages (e.g., navigation, interface features, content organization, etc.) in a systematic order. Readers in an EFL context might be overwhelmed by the different levels of analysis required for each question and each stage. If they are exposed to all the design principles and stages at once, they might fail to differentiate between the responses required for each of them. Although some similarity in the responses is possible, readers might simply ignore the complexities associated with each design principle and fail to differentiate it from the others. A detailed reading of the responses in this study suggests that readers confused interface design principles with navigation design principles: they conflated the layout of information, forms, buttons, etc. on the screen with how the design facilitates moving between pages and locating the required information. It probably remained underemphasized in class that a proper interface design facilitates a smoother navigation experience on the web. This causal relationship needs to be taught over time, and with proper examples. However, in an EFL context, the teacher might struggle to prioritize the language learning goals, because language production might receive more emphasis than the content and context of design. In the process, and as a result, a rigorous explanation and analysis of the connections between different design principles remains a low priority.
Importantly, teachers should cover all their bases when explaining design principles and should prepare adequate website-based examples to demonstrate what each principle means and how design queries should be analyzed and answered. A rubric-based approach, in which students can read about and observe web analysis decisions, might actually help. These rubrics could be prepared for each stage of the web design analysis process following Garrett's model. Following this approach, more advanced analysis within each stage in subsequent weeks of the course might help readers develop a mental schema and design orientation, and help them with language production in a more complete analytical scenario.

8. Conclusion

In response to the research question about L2 learners' ability to understand design queries, it is evident from the data reported in this article that L2 learners (within the scope of this learning context) broadly came to understand the design questions over time. However, in some cases the constructed arguments might have deviated from what was intended by the authors; in fact, that was embraced as a sign of creativity. Also, some responses, such as those on audience analysis, product goals, and the overall presentation of the website, showed better understanding than responses to more specific questions on the use of technology, navigation, etc. Strategically, the quality of open-ended responses and in-class observation broadly suggested the use of metacognitive reading strategies (e.g., scanning, reading headlines, etc.) in the context of the website under study. However, no significant practice effect or improvement in performance over time was visible. Also, within the scope of the curriculum only a limited design and
evaluation framework was made available to the students, and that was reflected in the surface level of the analysis. More importantly, the point was to explore whether, in this EFL context, readers were able to construct an acceptable argument in English, and overall, readers succeeded in doing that. A more advanced course in design (with more emphasis on design than on language) might yield better results in similar EFL contexts. Pedagogically, the existing structure of continuous assessment in design and language (as reported in this article within the scope of an introductory EFL language course) will not help students develop substantially unless they receive reasonable feedback and examples of design evaluations at regular intervals. Practicing the same questions every week, without a developing schema of added complexity, probably made things less interesting in this study. The quality of responses might vary depending on the relative complexity of the website and the familiarity of the content. But, in general, a reasonably more systematic approach to design pedagogy is probably feasible in an EFL context, where emphasis on language production and proficiency will always take center stage.

Notes on contributor

Debopriyo Roy is a Senior Associate Professor at the Center for Language Research, University of Aizu, Japan. His specialization includes information design, technical writing, and usability for computer science majors in an EFL context. He focuses on the cognitive and behavioral aspects of writing design for print and online media for non-native speakers. He obtained his PhD in Technical Communication from Rensselaer Polytechnic Institute, New York, and holds MA degrees in Communication and Economics.
He is an active board member of the IEEE and ACM chapters in Japan, directs his own laboratory in technical communication, supervises research projects, and is an active researcher with several publications in leading journals and conference proceedings.

References

Adkisson, H. (2002). Identifying de-facto standards for e-commerce web sites (Unpublished master's thesis). University of Washington, Seattle, WA.
Anderson, L.W., & Krathwohl, D. (Eds.). (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
Anderson, P.V. (2006). Technical communication: A reader-centered approach (6th ed.). Canada: Wadsworth Learning.
Atherton, J.S. (2002). Learning and teaching: Deep and surface learning. Retrieved March 6, 2003, from http://www.dmu.ac.uk/~jamesa/learning/deepsurf.htm
Atkinson, D. (1997). A critical approach to critical thinking in TESOL. TESOL Quarterly, 31, 71–94.
Baca, B., & Cassidy, A. (1999). Intranet development and design that works. Human Factors and Ergonomics Society Annual Meeting Proceedings, 43(13), 777–781.
Bean, J.C. (2011). Engaging ideas: The professor's guide to integrating writing, critical thinking, and active learning in the classroom. San Francisco, CA: Jossey-Bass.
Belisle, R. (1996). E-mail activities in the ESL writing class. The Internet TESL Journal, II. Retrieved December 1996, from http://iteslj.org/Articles/Belisle-Email.htm
Brinton, D.M., Snow, M.A., & Wesche, M.B. (1989). Content-based second language instruction. Boston, MA: Heinle & Heinle.
Chamot, A. (1995). Creating a community of thinkers in the ESL/EFL classroom. TESOL Matters, 5, 1–16.
Chapple, L., & Curtis, A. (2000). Content-based instruction in Hong Kong: Student responses to film. System, 28, 419–433.
Collins, A., Brown, J., & Newman, S. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In L. Resnick (Ed.), Knowing, learning, and instruction (pp.
453–494). Hillsdale, NJ: Lawrence Erlbaum.
Crooks, T. (2001). The validity of formative assessments. Paper presented at the British Educational Research Association Annual Conference, University of Leeds, September 13–15, 2001. Retrieved from http://www.leeds.ac.uk/educol/documents/00001862.htm
Crosby, M.E., & Iding, M.K. (1997). The influence of cognitive styles on the effectiveness of a multimedia tutor. Computer Assisted Language Learning, 10, 375–386.
Davidson, B. (1994). Critical thinking: A perspective and prescriptions for language teachers. The Language Teacher, 18, 20–26.
Davidson, B. (1995). Critical thinking education faces the challenge of Japan. Inquiry: Critical Thinking Across the Disciplines, 14, 41–53.
December, J., & Ginsburg, M. (1995). HTML and CGI unleashed. Indianapolis, IN: Sams.net.
Eagleton, M. (2001). Factors that influence Internet inquiry strategies: Case studies of middle school students with and without learning disabilities. Paper presented at the annual meeting of the National Reading Conference, San Antonio, TX.
Ersoz, A. (2000, June). Six games for EFL/ESL classroom. The Internet TESL Journal, 6. Retrieved February 11, 2005, from http://iteslj.org/Lessons/Ersoz-Games.htm
Facione, P.A., & Facione, N.C. (1994). Holistic critical thinking scoring rubric. Millbrae, CA: California Academic Press. Retrieved October 12, 2012, from http://npiis.hodges.edu/IE/documents/forms/Holistic_Critical_Thinking_Scoring_Rubric.pdf
Fox, H. (1994). Listening to the world: Cultural issues in academic writing. Urbana, IL: National Council of Teachers of English.
Garrett, J.J. (2011). The elements of user experience: User-centered design for the web. San Francisco, CA: Peachpit Press.
Harrison, R., & Thomas, M. (2009). Identity in online communities: Social networking sites and language learning. International Journal of Emerging Technologies and Society, 7, 109–124.
Heift, T. (2008).
Modeling learner variability in CALL. Computer Assisted Language Learning, 21, 305–321.
Instone, K. (1997). Usability heuristics for the web. Retrieved April 13, 2000, from http://webreview.com/97/10/10/usability/sidebar.html
Kabilan, M.K. (2000). Creative and critical thinking in language classrooms. The Internet TESL Journal, 6. Retrieved November 21, 2005, from http://itselj.org/Techniques/Kabilian-CriticalThinking.html
Kek, M.A., & Huijser, H. (2011). The power of problem-based learning in developing critical thinking skills: Preparing students for tomorrow's digital futures in today's classrooms. Higher Education Research and Development, 30, 329–341.
Koyani, S.J., Bailey, R.W., & Nall, J.R. (2004). Research-based web design & usability guidelines. Computer Psychology.
Krajka, J. (2000). Using the Internet in ESL writing instruction. The Internet TESL Journal, VI. Retrieved December 5, 2012, from http://iteslj.org/Techniques/Krajka-WritingUsingNet.html
Krug, S. (2000). Don't make me think! A common sense approach to web usability. Indianapolis, IN: Que.
Kuhn, D., & Dean, D. (2004). Metacognition: A bridge between cognitive psychology and educational practice. Theory into Practice, 43, 268–273.
Kukulska-Hulme, A., & Shield, L. (2008). An overview of mobile-assisted language learning: From content delivery to supported collaboration and interaction. ReCALL, 20, 271–289.
Lee, K. (2000). Energizing the ESL/EFL classroom through Internet activities. The Internet TESL Journal, VI. Retrieved April 2000, from http://www.aitech.ac.jp/~iteslj
Levy, M. (2002). CALL by design: Discourse, products and processes. ReCALL, 14(1), 58–84.
Lonfils, C., & Vanparys, J. (2001). How to design user-friendly CALL interfaces. Computer Assisted Language Learning, 14, 405–417.
Maghsudi, M., & Talebi, S.H. (2009). The impact of lingualuity on the cognitive and metacognitive reading strategies awareness of reading comprehension ability. Journal of Social Sciences, 18, 119–126.
Mike, D. (1996). Internet in the schools: A literacy perspective. Journal of Adolescent and Adult Literacy, 40(1), 1–13.
Moffett, J., & Wagner, B.J. (1983). Student-centered language arts and reading: A handbook for teachers (5th ed.). Boston, MA: Houghton Mifflin.
Lynch, P., & Horton, S. (2002). The web style guide (2nd ed.). Retrieved December 1, 2012, from http://www.webstyleguide.com
Muir, A., Shield, L., & Kukulska-Hulme, A. (2003). The pyramid of usability: A framework for quality course websites. In Proceedings of EDEN 12th Annual Conference of the European Distance Education Network, The Quality Dialogue: Integrating Quality Cultures in Flexible, Distance and eLearning, 15–18 June 2003, Rhodes, Greece (pp. 188–194). Milton Keynes: TLRG.
Nielsen, J. (1997, October). How users read on the web. Jakob Nielsen's Alertbox (ISSN 1548–5552). Retrieved November 15, 2012, from http://www.useit.com/alertbox/9710a.html
Nielsen, J. (2000). Designing web usability. Indianapolis, IN: New Riders.
Pearson, P.D., & Tierney, R. (1984). On becoming a thoughtful reader: Learning to read like a writer. In A. Purves & O. Niles (Eds.), Becoming readers in a complex society (pp. 144–173). Chicago: University of Chicago Press.
Peterson, M. (1998a). The virtual learning environment: The design of a website for language learning. Computer Assisted Language Learning, 11, 349–361.
Peterson, M. (1998b). Creating hypermedia learning environments: Guidelines for designers. Computer Assisted Language Learning, 11, 115–124.
Pica, T. (2000). Tradition and transition in English language teaching methodology. System, 29, 1–18.
Plass, J. (1998). Design and evaluation of the user interface of foreign language multimedia software: A cognitive approach. Language Learning and Technology, 2(1), 35–45. Retrieved December 2, 2003, from http://llt.msu.edu/vol2num1/article2/
Roy, D. (2010, July). Reading strategies for procedural information in EFL business writing environment: An exploratory analysis. In Proceedings of the IEEE Professional Communication Society Conference (pp. 143–151), Enschede, the Netherlands.
Schriver, K.A. (1997). Dynamics in document design. New York: John Wiley.
Shang, H. (2007). An exploratory study of e-mail application on FL writing performance. Computer Assisted Language Learning, 20(1), 79–96.
Shield, L., & Kukulska-Hulme, A. (2003). Language learning websites: Designing for usability. Paper presented at EUROCALL'03: New Literacies in Language Learning and Teaching, University of Limerick, Ireland.
Spyridakis, J.H. (2000). Guidelines for authoring comprehensible web pages and evaluating their success. Technical Communication, 47, 301–310.
Staton, J. (1984). Thinking together: Language interaction in children's reasoning. In C. Thaiss & C. Suhor (Eds.), Speaking and writing, K-12 (pp. 144–187). Champaign, IL: National Council of Teachers of English. (EDRS No. ED 233 379)
Stevens, V. (2006). Second Life in education and language learning. TESL-EJ, 10. Retrieved February 15, 2012, from http://tesl-ej.org/ej39/int.html
Suhor, C. (1984). Thinking skills in English—and across the curriculum. (ERIC Document Reproduction Service No. ED 250693)
Tan, G., Gallo, P.B., Jacobs, G.M., & Kim-Eng Lee, C. (1999). Using cooperative learning to integrate thinking and information technology in a content-based writing lesson. The Internet TESL Journal, 5. Retrieved December 5, 2012, from http://iteslj.org/Techniques/Tan-Cooperative.html
Tarvin, W., & Al-Arishi, A. (1991). Rethinking communicative language teaching: Reflection and the EFL classroom. TESOL Quarterly, 25(1), 9–27.
Trokeloshvili, D.A., & Jost, N.H. (1997). The Internet and foreign language instruction: Practice and discussion. The Internet TESL Journal, 3. Retrieved December 6, 2012, from http://aitech.ac.jp/~teslj
Tytler, R. (2004). Higher order thinking: Support reading for EME244/502 (pp. 1–7). Australia: Deakin University.
Van Hoosier-Carey, G. (1997). Rhetoric by design: Using web development projects in the technical communication classroom. Computers and Composition, 14, 395–407.
Warschauer, M. (1997).
Computer-mediated collaborative learning: Theory and practice. Modern Language Journal, 81, 470–481.
Wheeler, S., & Wheeler, D. (2009). Using wikis to promote quality learning in teacher training. Learning, Media and Technology, 34(1), 1–10.
Williams, J.B., & Jacobs, J.S. (2004). Exploring the use of blogs as learning spaces in the higher education sector. Australasian Journal of Educational Technology, 20, 232–247.
Yano, Y., Long, M.H., & Ross, S. (1994). The effects of simplified and elaborated texts on foreign language reading comprehension. Language Learning, 44, 189–219.
Zhang, P., & Dran, G. (2000). Satisfiers and dissatisfiers: A two-factor model for website design and evaluation. Journal of the American Society for Information Science, 51, 1253–1268.
Zimmerman, D.E., & Akerelrea, C.A. (2002). A group card sorting methodology for developing informational web sites. In Proceedings of the International Professional Communication Conference (pp. 437–445). Portland, OR: IEEE Professional Communication Society.