Performance with Website Analysis in an EFL Classroom: Exploring Consistency in Coding
Takahide Ishii (s1170047)
Supervised by Prof. Debopriyo Roy

University of Aizu, Graduation Thesis. March, 2012. s1170047

Abstract

Literature on computer-assisted language learning is mostly silent on how web-based design analysis could be used effectively as a tool and framework for developing critical thinking skills and language proficiency in an EFL classroom. This article reports on how EFL learners perform with English website analysis tasks in a language reception and production context. English website analysis is challenging for an EFL learner with low-to-moderate English language proficiency. The website analysis experiment with the Belize tourism website reported in this article was performed with a group of 16 students in an EFL classroom. The results mainly discuss students' preliminary understanding of the website content, design, navigation and usability, rather than whether their use of English (grammatically) in responding to design queries during website analysis was correct or deficient. Further, this study also analyzed how the three coders with non-native English language proficiency, who were used to analyze the responses to the open-ended design questions asked of the participants, interpreted responses based on the criteria grading rubric used for the purpose. Results show relatively higher levels of proficiency when answering questions related to overall website organization, design, layout and audience analysis. However, performance scores dropped for more inference-based queries related to the overall use of technology, validity of content, etc. Some relative variations in scoring could be observed between coders: a relatively large variation in the scores could be seen in the case of Coder A when compared to Coders B and C.

1 Introduction

There is substantial research in language studies and cognition that establishes cognition and language development to be closely related (Liaw et al., 2007). Theorists and educators have long argued about the close relationship between language and thinking skills (Piaget, 1971; Vygotsky, 1962). It is believed that developing students' ability to reflect on their own learning process can help them progress in learning. Literature in foreign language studies has clearly established that higher-order thinking skills promote higher-order learning skills, which in turn enable students to reach higher levels of language proficiency (Renner, 1996). Educators have identified multiple features and elements of reading and writing that have always influenced thinking skills to a large extent (Moffett & Wagner, 1983; Pearson & Tierney, 1984; Stanford & Roark, 1974; Staton, 1984). There has always been a strong appeal to promote higher-order thinking in ESL and EFL classrooms, and research has clearly focused on the need to foster critical thinking in a foreign language classroom (Chamot, 1995; Tarvin & Al-Arishi, 1991; Chapple & Curtis, 2000; Davidson, 1994, 1995). Unfortunately, however, language learning and thinking skills have almost always been treated as independent processes (Miraman & Tishman, 1988; Suhor, 1984; Pica, 2000).

This study on website analysis as a tool for critical thinking in an EFL classroom is partly influenced by the Kasper (2000) study, which focused on extensive and sustained content analysis using information technology resources and established that such an attempt helps with both linguistic and cognitive information processing ability. This study is also influenced by the fact that we have shifted from Web 1.0 towards Web 2.0, where there is an increased emergence of computer-mediated communication, social networking and active interaction between the user and the web environment. Website analysis in this environment helps EFL readers not only to scan the website under consideration, but also encourages them to access online discussion forums and testimonials on the web, consult additional resources from other sources, talk live with a service agent, chat, etc. It brings to life an environment that was long considered passive, one that only generated information which it was entirely up to the user to receive; information reception was one-way traffic.

In this web-based communicative environment, however, English website analysis is challenging for an EFL learner with low-to-moderate English language proficiency. English website analysis in a typical EFL environment involves reading and comprehending the website content in English; performing selective translation of the content into the native language, or switching back and forth between the English and the native version if available; comprehending the design queries and their scope in the native language; and framing the responses to the design questions mentally in the native language. Some learners will then try to respond directly in English, while others will take the help of translation software (Google Translate in most cases) and online and/or portable dictionaries (e.g., Weblio). Besides, because of the way a typical Web 2.0 environment is used, a typical user might also get involved in searching for and reading comments, in English or their native language, about a specific place, hotel, location, etc. The website analyzer might also want to ask questions of a service agent. On a completely different note, if the EFL website analyzer is not sure how to approach a website design query, he/she could also start reading about website design, and so on.

The point here is that we can never be exactly sure about the combination of tools being used for critical thinking and language processing during website analysis in an EFL context. Individual differences in responses to website analysis could arise not only from the ability to think through the problem, but also from differences in the online resources solicited while a response is being processed.

Whatever the case may be, in a language context, website or any other interface analysis with specific design-based queries might be one way to promote analytical thinking through its focus on creating, evaluating and analyzing (Atherton, 2002), and it promotes active participation, argumentation, problem solving, conducting investigations and tackling subject matter that is complex (Tytler, 2004). The experiment with website analysis performed with a group of 16 students in an EFL classroom will equip students to better understand the interfaces they use for e-learning applications.
The analysis of the results from the experiment reported in this article mainly discusses students' preliminary understanding of the website content, design, navigation and usability, rather than whether their use of English (grammatically) in responding to design queries during website analysis was correct or deficient. Further, this study is also designed to analyze how coders with non-native English language proficiency, used for analyzing the open-ended design questions, interpret responses based on the criteria grading rubric used for the reported experiment.

The major point in the literature review concerning the need to think and analyze in the target language is all the more relevant since the entire analysis was conducted in English for non-native speakers. This study is important for various reasons:

● The study focuses on the extent to which readers could process a critical response to a website design query.
● The study tries to identify whether readers are able to suitably understand and differentiate between design queries asked during website analysis.
● The study tries to identify the extent to which novice website designers in an EFL context are able to assess a design response based on a specific criteria list.

2 Research Questions

The following research questions provide the backbone of the experiment reported in this study.

● How did the EFL readers perform with the various design questions asked during website analysis?
● Is there any significant difference between the coders who graded responses to the design questions, suggesting a significant difference between responses to a design question and/or suggesting that one or more coders did not understand the questions and responses correctly and consequently could not use the assessment rubric correctly for grading the design responses?

This is an exploratory analysis because the literature on computer-assisted language learning is not rich on how website analysis could be used as a tool for promoting critical thinking and
language proficiency in a language classroom.

3 Review of the Literature

Using the Internet for ESL/EFL (English as a second/foreign language) writing instruction is now common practice (Krajka, 2000). The issue of using web pages for teaching writing is raised in Tan et al. (1999). Trokeloshvili and Jost (1997) concluded that publicly displaying student text on a student home page highly motivates students to write and publish, and helps to remove mental blocks associated with publishing ordinary writing. There is research indicating that web analysis has the potential to be a beneficial exercise (Bunz, 2001; Spyridakis, 2000), and more so in an EFL context.

The information processing strategies reported earlier will influence readers' ability to analyze information organization, design and layout, grouping, navigation, audience analysis, etc. Nielsen (1997) has demonstrated that the website analysis task is different from any other reading task, because it requires an analytical mindset, analysis and the resulting English text production in a specific design context. Also, the ability to explain a design and layout might not always require reading and comprehending the entire text on the web page. Readers might get away with merely understanding
the headlines, the menu items, the introductory sentence of a paragraph, etc. (Nielsen's Alertbox, 1997). Research (Lynch et al., 2001) suggests that extending critical thinking skills to the web is important in a first-language context, and there is nothing in the literature to suggest why the same argument should not be valid in a foreign language context.

Van Hoosier-Care (1997) describes the website assignment as a rhetorical exercise in the technical communication classroom. It is important for the reader to understand the conceptual process of designing a website, including the rationale of the project, the target audience, the purpose of the website, etc. (December and Ginsberg, 1995). The experimental and goal-oriented nature of web design projects involves tasks such as deciding with a partner where to place a picture on a page being constructed, or browsing, which requires active choices of where to search next. These are claimed to help promote higher-order thinking skills (Mike, 1996), which include reviewing, scanning, selecting and negotiating; particularly important for EFL students doing further studies in other disciplines are the research and rhetorical skills that may be developed. Furthermore, Warschauer (1997) points out that web design skills incorporate situated learning: that which allows students "to carry out meaningful tasks, and solve meaningful problems, in an environment that reflects their own personal interests as well as the multiple purposes to which their knowledge will be put in the future" (Collins, Brown, & Newman, 1989). With the goal of designing and publishing web pages, students can actively make use of new technologies, skills, and knowledge.
Warschauer (1997) also acknowledges this, and supports the view that many skills, in particular those involved in collaboratively accessing and interpreting worldwide information together with people from different cultures, will be critical for success in the 21st century.

The design and other questions asked during this website analysis were based on the model proposed by Garrett (2011). Specific questions were designed based on audience and task analysis, product goals, information design, interaction design, information architecture, etc. Figure 1 shows an explanation of Garrett's (2011) user experience model that has been simplified and referred to as part of this study.

Figure 1. Garrett's (2011) User Experience Model: An Explanation
(Source: http://www.netmagazine.com/features/content-first-content-left-right-and-centre)

4 Sample and Context

Participants (N=17) are junior-level students (age group: 18-20 years) in the third year of an undergraduate program specializing in computer science at a Japanese technical university. In this specific elective course, named Writing and Design for the World Wide Web, students mostly focused on the process of online writing, and on designing and analyzing websites based on design principles, besides designing concept maps for the websites they analyzed. So, for most weeks during the course, there was an all-round effort to sharpen student skills in writing and thinking. Students were given brief lectures on basic design principles for website design, followed by at least four weeks of regular practice in website design, analysis and brainstorming activities on website content, which included designing concept maps.

5 Methods

5.1 Preparing for the Experiment

During the first couple of weeks of the course, a proper in-class lecture was delivered on the basics of website design.
The lecture focused on how issues like organization, layout, formatting, typography, content chunks, simple wording, headings, titles, use of white space, etc. are important design considerations.

5.2 In-Class Website Analysis Assignment
As part of the website analysis assignment (named Assignment A), students were asked to study a specific website in a chosen domain (e.g., education, entertainment, government, tourism, sports) and then provide open-ended responses to 8 standard questions asked of them. The questions were related to content, presentation, navigation, technology used, real-world application of content, website and content usability, audience analysis and product goals. Readers had one complete week to complete this assignment. Readers confronted the same set of design questions every week, but the website to be analyzed changed every week.

5.3 Actual Experiment

The experiment was conducted in a controlled environment as an in-class activity, over two weeks. The first part of the website analysis activity was conducted entirely in Moodle. The actual experiment started in the 7th week of the course, by which time students had already had two weeks of design lectures and four weeks of experience with website design, planning and analysis. During the first week of the actual experiment, students analyzed the Belize tourism website based on the same 8 open-ended questions used in the assignment that had run over the previous weeks (but with a different website each week). They had one week to complete the analysis, besides the 90 minutes of class time where they could consult their friends. Students entered their responses in Moodle (a learning management system) as in-line text in open-ended format. To encourage writing and proper explanation, the minimum word limit for the assignment was set at 500 words. They had to write the responses in their own words.

5.4 Instruments

The Belize tourism website was chosen with the following reasonable conditions in mind.

● The content of the website is not text-heavy, and clear navigation is possible.
● Information can be searched directly from the home page.
● Attractive pictures are available to keep the reader engaged in the task of finding information.
● A Japanese version of the webpage is NOT available, so readers are forced to look for information from the English version alone.

The instructions for the first week of the assignment (where readers had to respond to 8 open-ended questions) were all in English, largely because readers already had practice from the preceding weeks regarding what was expected of them.

The questions asked for the design analysis were as follows.

Table 1. Open Design Questions Asked of Participants

1. Explain whether the organization of information in the site is user-friendly or not.
2. Explain whether the presentation of content is appealing or not.
3. Explain whether the effective use of technology is demonstrated.
4. Who is the target audience? Is the website appropriate for the projected audience?
5. Explain the quality of the text content.
6. Is the information accessible?
7. Explain whether the resources use real-world situations.
8. Here are some common reasons for building this website. Rank them in order of importance to you. Do you have a reason that is not listed?

5.5 Data Analysis – Use of Coders

Three undergraduate students (not part of the class with the sample) who had taken the same class in an earlier semester were appointed as coders, with the task of grading the first-week assignment in which readers participated in an open-ended evaluation of the Belize tourism website. The coders were advanced undergraduate students with reasonable English language proficiency and greater experience with website design and analysis, and had the ability to grasp design lectures with reasonable success. The coders were given a set of criteria (discussed in the next section and shown in Figure 3), on the basis of which they graded each open-ended response, for all 8 questions assigned.
Coders were specifically trained in how to grade each response (for each of the 8 questions) on the basis of six criteria. Each of the 8 open-ended responses from each of the 17 participants was rated three times, once by each of the three coders. The three coders went through a practice session during the first week, wherein they graded one student and then wrote a verbal report for each criterion, justifying their grade. The group (including the project supervisor and the three coders) then discussed each grade for each question and criterion. The three coders were handed all 17 response sheets (with answers to each of the 8 questions) for coding.

6 Findings
The first important finding readers will be interested in is how the participants reading the Belize tourism website in this EFL context performed with the 8 design questions, which they had to answer based on their understanding of the content of the website and their impression of the website's design, capabilities, levels of comfort while navigating through the website, etc.

Each coder graded each of the 8 questions separately, based on 6 criteria, as reported in the methods section. So, the mean and SD values we see above are the sum of the mean scores obtained from the three graders. The maximum sum of mean values for each question could be no more than 18 (6 * 3, 6 being the maximum score for each question). We see that the highest mean value was obtained for the first question, on "whether the organization of the website is user-friendly" (sum of means = 14.56). A high sum-of-means score does not indicate that the Belize tourism website is highly user-friendly. Rather, it means that the readers exploring the website could explain the answer with maximum efficiency, in line with the six criteria. Mostly we see sum-of-means scores in the range of 11 ~ 12, indicating an average score for a single coder in the range of 3.8 ~ 4.0 on a 6-point scale.

Next, let us see how the three coders graded the responses based on the 6 criteria used for each of the 8 questions. Each coder could assign a grade of 1 or 0 for each of the six criteria for each of the 8 questions. That means each criterion has a minimum value of 0 and a maximum value of 8 across the 8 questions. Now, if we consider all 6 criteria together, each criterion would be in the range of 0 ~ 8. So, the maximum total that a person can score for the 8 questions, for a given coder, is 48 (6 criteria * 8 questions).
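The scoring arithmetic above can be sketched in a few lines of code. This is a minimal illustration, not the study's analysis script; the all-ones ratings array is made up to show the best case, in which every bound is reached.

```python
# Hypothetical best case: every coder awards every criterion (all 1s).
# Structure: 3 coders x 8 questions x 6 binary criteria.
N_CODERS, N_QUESTIONS, N_CRITERIA = 3, 8, 6
ratings = [[[1] * N_CRITERIA for _ in range(N_QUESTIONS)]
           for _ in range(N_CODERS)]

# One coder's score for one question: sum of its 6 binary criteria (0..6).
per_coder_question = [[sum(q) for q in coder] for coder in ratings]

# "Sum of means" per question: adding the three coders' scores gives
# a maximum of 6 * 3 = 18 per question.
sum_over_coders = [sum(per_coder_question[c][q] for c in range(N_CODERS))
                   for q in range(N_QUESTIONS)]
print(max(sum_over_coders))                   # 18 in this best case

# A participant's total from one coder is at most 6 criteria * 8 questions
# = 48, so the mean per criterion is at most 48 / 6 = 8.
totals = [sum(scores) for scores in per_coder_question]
print(max(totals), max(totals) / N_CRITERIA)  # 48 8.0 in this best case
```

Any real ratings array of 0s and 1s with this shape stays within the same bounds, which is why an observed sum of means of 14.56 sits comfortably below the ceiling of 18.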
So, the maximum mean score could be 48 divided by 6 (criteria) = 8 (for each criterion), and the minimum mean score could be 0 (for each criterion).

The data show the highest sum of mean scores for Q1 (organization of information is user-friendly) at 14.56, with the score for Q2 (explain whether the presentation of content is appealing) close behind at 12.56. We see relatively lower scores for Q3 ~ Q7, in the range of 10.33 ~ 11.67. A low score of 10.33 is observed for Q7 (whether the resources use real-world situations). Analysis of the actual responses in Moodle shows that readers were not exactly sure what to include in their response to Q7. While in some cases participants talked about technologies, in other cases they actually analyzed the content of the website and whether it included any mention of technology or anything important for real-life applications. However, coders observed relatively high levels of accuracy and response quality for the question related to audience analysis (Q4).

The data further show the mean and SD values for participants as marked by Coder A. For Coder A, we see that for most participants the mean of the total mean score is in the range of 4 ~ 5.5 or a little more. However, participants S6, S13 and S14 did very well, with high scores around 6.5 ~ 7. These scores are the average for all 6 criteria combined. The data also show the mean and SD values for participants as marked by Coder B. For Coder B, we see that for most participants the mean of the total mean score is in the range of 6 ~ 7.5 or a little more. These mean scores are significantly higher than what we observed for Coder A. However, participants S3, S7 and S8 scored lower than the other participants. We see some similarity between the results of Coders A and B. For Coder C, similar to Coder B, we see a comparatively high mean of total mean scores for almost all the participants.
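Agreement between two coders of the kind described above can be quantified with a Pearson correlation, the same statistic the study applies elsewhere to the question scores. The sketch below uses made-up per-participant mean scores for two coders (not the study's actual data), with the second coder tracking the first closely but grading roughly 1.5 points higher throughout.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation: covariance divided by the product of the
    standard deviations of the two score vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-participant mean scores from two coders.
coder_a = [4.2, 5.1, 3.8, 4.9, 5.5, 6.5, 4.4, 2.1]
coder_b = [6.1, 6.8, 5.0, 6.5, 7.2, 7.9, 6.0, 3.5]

r = pearson_r(coder_a, coder_b)
print(round(r, 3))  # close to 1: the coders rank participants similarly
```

A high r combined with a constant offset, as in this made-up example, would be consistent with one coder simply being stricter than the other, which is the pattern discussed below for Coder A.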
For S8 we see a consistently low score across all the coders, confirming very low levels of proficiency in writing the answers to the open-ended design questions. In all other cases, we see consistently high scores, in the range of almost 7 and above, with S11 securing a full score of 8. Surprisingly, Coder A gave only 5.83 to S11, and Coder A's scores are consistently lower for the other participants as well.

Further data allow us to see an overview of the significant Pearson correlation values between the scores on the 8 open-ended questions. We see a rather high number of significant correlation values, suggesting similarity of response. The score for each question, as calculated in Table 6, is a mean of the scores obtained from all three coders.

7 Discussion

Results show a relatively stronger performance on the question of whether the Belize tourism website is user-friendly, and readers could successfully explain whether the content of the site is user-friendly. The mean scores for the above two questions are relatively higher when compared to the other scores. A high score also suggests that readers could provide concrete examples in support of their argument, and also that, technically, the language proficiency demonstrated while writing the answers is of acceptable quality. However, when we observe the sum of mean scores for each question, we do not see much of a difference that could qualify as significant. Given the fact that each coder was grading the same response, it could be safely concluded that there is a large
variation in the scores in the case of Coder A, but relatively less variation in the scores assigned by Coders B and C. This makes one wonder whether Coder A was relatively stricter than Coders B and C. However, strictness is a relative term, and it also makes us wonder how Coders B and C interpreted the results for any given criterion. One reason for choosing multiple coders is to get a weighted average of the scores and then use it to explain the actual scores on the open-ended design questions as asked. In spite of this, however, the fact remains that the ratings of individual coders will remain subjectively varied in this EFL context. The variability arises from the EFL orientation of the coders themselves, their own moderate levels of language proficiency and their moderate levels of experience with interface design.

8 Conclusion

This is an interesting preliminary analysis of how L2 learners in a typical EFL context could approach English website analysis, and of how coders with better language proficiency and understanding of the specific design context could interpret responses in terms of rubrics. For the coders, it was a test not only of understanding what constitutes valid information and good organization of a response; it also required the ability to read through the criteria rubric used in the study and to demonstrate at least moderate levels of language proficiency. Future studies could look extensively at website analysis in an EFL context with more structured design questions, with each design question having specific sub-questions to channel readers' thought processes in the right direction. Similarly, coders could be trained in very specific assessment mechanisms and standards to bring about uniformity and eliminate subjectivity among responses. Finally, these experiments or assignments, when used in a language-learning context, should clearly explain the language-learning outcomes, processes and expectations.
One serious limitation of the study was the lack of systematic, continuous feedback and input from the primary investigator or instructor of the course to the coders and to the respondents of the open-ended questions. That process could have enriched the assessment, but only in a repeated-measures design context, a scenario that is clearly beyond the scope of this specific experiment. However, this study is one of a kind, given the lack of any substantial literature on website analysis in an EFL language learning context in the field of computer-assisted language learning.

References

Hasnah Tang King Yee, Wong Su Luan, Ahmad Fauzi Mohd Ayub and Rosnaini Mahmud, "A Review of the Literature: Determinants of Online Learning Among Students", European Journal of Social Sciences, vol. 8, no. 2, 2009, pp. 246-247.

F. Genesee, G. R. Tucker and W. E. Lambert, "Communication Skills of Bilingual Children", Child Development, vol. 46, no. 4, Dec. 1975, pp. 1010-1014.

James E. Bailey, Adriana Sburlati, Vassily Hatzimanikatis, Kelvin Lee, Wolfgang A. Renner and Philip S. Tsai, ISI Journal Citation Reports: Biotechnology and Bioengineering. New York: Thomson Reuters, 1996.

Meei-Ling Liaw, "Content-Based Reading and Writing for Critical Thinking Skills in an EFL Context", English Teaching & Learning, vol. 31, no. 2, Summer 2007, pp. 45-87.

Servat Shirkhani and Mansour Fahim, "Enhancing Critical Thinking in Foreign Language Learners", 1st International Conference on Foreign Language Teaching and Applied Linguistics, May 5-7, 2011, pp. 1091-1095.

Loretta F. Kasper, "New Technologies, New Literacies: Focus Discipline Research and ESL Learning Communities", Language Learning & Technology, vol. 4, no. 2, pp. 105-128, September 2000.

Gail Chittleborough, Wendy Jobling, Peter Hubber, and Gerard Calnin, The Use of Web 2.0 Technologies to Promote Higher Order Thinking Skills. Canberra: Australian Association for Research in Education, 2008.
Binnur Genc Ilter, "Effect of Technology on Motivation in EFL Classrooms", Turkish Online Journal of Distance Education-TOJDE, ISSN 1302-6488, vol. 10, no. 4, Article 9, October 2009.

J. Krajka, "Using the Internet in ESL Writing Instruction", The Internet TESL Journal, vol. 6, no. 11, 2000.

Lawrence Chao, "Learning to Write in English via the Internet", January 2009.

Ulla K. Bunz, Usability and Gratifications [microform]: Towards a Website Analysis Model. Washington, D.C.: ERIC Clearinghouse, 2001.

Magda Pieczka, Public Relations: Critical Debates and Contemporary Practice. London: Routledge, 2006.
Jakob Nielsen, "How Users Read on the Web". [Online] http://www.nngroup.com/articles/how-users-read-on-the-web/

Ulla Bunz, "The Website Assignment as a Valuable Exercise – Beyond Establishing Presence to Creating Significance", German Online Research Conference, Göttingen, May 2001, pp. 10-15.

John December and Mark Ginsberg, HTML and CGI. Indiana: Sams Publishing, 1995.

D. Mike, "Internet in the Schools: A Literacy Perspective", Journal of Adolescent and Adult Literacy, vol. 40, no. 1, pp. 1-13, 1996.

M. Warschauer, "Computer-Mediated Collaborative Learning: Theory and Practice", Modern Language Journal, vol. 81, no. 3, pp. 470-481, 1997.

Iain Davey, "The Use of Collaborative Web Page-Design Projects for Teaching EFL, with a Focus on Japanese University Students". [Online] http://callej.org/journal/3-1/davey.html