EVALUATION OF THE APPLICATION OF E-LEARNING METHODOLOGIES TO THE EDUCATION OF ENGINEERING


EVALUATION OF THE APPLICATION OF E-LEARNING METHODOLOGIES TO THE EDUCATION OF ENGINEERING

RITA RODRIGUES CLEMENTE FALCÃO DE BERREDO

Dissertation submitted in fulfilment of the requirements for the degree of Doutor em Media Digitais, in the specialty of Criação de Audiovisual e de Conteúdos Interactivos

Supervisor: Professor Doutor Alfredo Augusto Vieira Soeiro

August 2012
Versão revista em Abril 2013
Dissertação desenvolvida no âmbito do Programa Doutoral em Media Digitais da Faculdade de Engenharia da Universidade do Porto em colaboração com a Universidade do Texas em Austin

Revised version, April 2013
Dissertation submitted to the Doctoral Programme in Digital Media at the School of Engineering of University of Porto, in collaboration with University of Texas in Austin
ACKNOWLEDGMENTS

I want to express my sincerest gratitude to Professor Alfredo Soeiro for encouraging and supervising my research. His constant support, guidance and the several productive discussions were essential for reaching this final stage.

My sincerest appreciation is extended to Professor Gráinne Conole and Professor Joan Hughes for the useful conversations and their activity in my committee.

I am also grateful to Professor Raul Vidal, Professor Rui Maranhão and Professor José Fernando Oliveira for their willingness to participate in this research study. I am especially thankful to Professor Gabriel David for his willingness to participate in the study and his continuous disposition to help in the development of the technical component of the project.

I would also like to acknowledge my appreciation to Dra. Lígia Ribeiro and to all the members of the Unit for New Technologies in Education of Universidade do Porto for the constant support in my pursuit to extend my studies. My gratitude also goes to the Fundação para a Ciência e Tecnologia for their financial aid and the opportunity to research the area of e-learning and assessment.

My appreciation also goes to my closest friends and family for their support and understanding, which helped me to overcome the difficulties created by research work. A very special thanks goes to Nuno, for being there, for his support, encouragement and patience, and especially for his commitment to my researcher's life. Finally, my sincerest gratitude goes to my Mother, who was always there, supporting me in every possible way during this long period, and in particular for being the best Grandmother that Francisco could ever wish to have. To Francisco, my son, I dedicate this work.

Your education is the only thing that you will always carry with you, no matter what happens. Nobody can take it away.
Luis Falcão de Berredo

To you, my Father, only one word: saudade…
TABLE OF CONTENTS

ACKNOWLEDGMENTS
TABLE OF CONTENTS
ABSTRACT
RESUMO
RESUME
LIST OF TERMS AND ACRONYMS IN USE
LIST OF TABLES
LIST OF FIGURES

CHAPTER 1 - INTRODUCTION
1.1 Background
1.2 Field of Study and Intended Stakeholders
1.3 Background and motivation of the researcher
1.4 Statement of the problem and research questions
1.5 Impact expectations of the study
1.6 Structure of the study
1.7 Publications related with the research study

CHAPTER 2 - RELATED WORK
2.1 Introduction
2.2 Assessment
2.3 Overview on e-assessment
2.4 Learning Outcomes
2.5 The alignment question
2.6 Learning Outcomes in engineering education
2.7 Conclusions from the literature review

CHAPTER 3 - RESEARCH METHOD
3.1 Research design
3.2 Development of the conceptual model
3.3 Multiple case-studies approach
3.4 Data collection
3.5 Data analysis
3.6 Validity, reliability and limitations
3.7 Ethical considerations

CHAPTER 4 - THE ALOA CONCEPTUAL MODEL
4.1 The rBloom matrix
4.2 Assessment methods
4.3 E-assessment tasks
4.4 The description of LO in EE
4.5 Defining relations

CHAPTER 5 - IMPLEMENTATION
5.1 Implementation scenarios
5.2 Practical tools
5.3 Case studies

CHAPTER 6 - INTERPRETATION OF RESULTS
6.1 Application of the ALOA model
6.2 Adequacy of the chosen LOs in EE (RQ1)
6.3 Adequacy of the selection of assessment methods (RQ2)
6.4 Adequacy of the ALOA model to describe assessment
6.5 The alignment question (RQ3 and RQ4)

CHAPTER 7 - CONCLUSIONS
7.1 Conclusions regarding the research problem
7.2 Theoretical implications of the research project
7.3 Practical implications of the research project
7.4 Implications for future research
7.5 Final remarks

REFERENCES

ANNEXES
Annex I - DATABASE DIAGRAMS
Annex II - TEMPLATES FOR THE DIFFERENT TYPES OF CASE STUDIES
Annex III - CASE STUDY 1
Annex IV - CASE STUDY 2
Annex V - CASE STUDY 3
Annex VI - CASE STUDY 4
ABSTRACT

This study researched the question of alignment between the intended learning outcomes and assessment in the field of engineering education. Based on the initial problem, four research questions were developed that focused on: defining and describing learning outcomes in engineering; defining and describing assessment methods and e-assessment tasks; defining a model for achieving alignment between learning outcomes and assessment; and, finally, applying the model to link specific learning outcomes with specific assessment methods.

During this study a conceptual model was developed with the goal of providing an answer to the research problem: the ALOA model (Aligning Learning Outcomes with Assessment). The model was derived from literature research and included different components directly linked with the research questions. The first one is a selection and description of learning outcomes in the field of engineering that was based on existing qualification frameworks of the sector: ABET and EUR-ACE. The second component was a selection and description of assessment methods. This selection was derived from literature and compiled in a structured list that included six general assessment categories, which are then divided into specific assessment methods. Both the learning outcomes and the assessment methods were described using an adapted version of the Taxonomy Table developed by Anderson et al. in their work A Taxonomy for Learning, Teaching and Assessing. The Taxonomy Table describes the assessment items and the learning outcomes using a two-dimensional classification system based on knowledge and cognitive processes. During the development of the ALOA model each selected learning outcome and specific assessment method was analysed and classified in terms of the type of knowledge and cognitive processes addressed. This detailed description was used for answering the main question of the research problem. By using the same classification system for both types of items, it was possible to develop an alignment proposal.

The ALOA conceptual model was developed with the intention of being used by different stakeholders in Higher Education Institutions. The study defined four different scenarios of implementation of the ALOA model: to verify current alignment in existing courses; to develop an assessment strategy based on statements of learning outcomes; to verify vertical alignment of courses with higher-level learning outcomes; and to verify horizontal alignment of learning outcomes defined at the same level but in different contexts, as in mobility situations. The ALOA model was translated into practical tools and tested for applicability using a multiple case study approach. The results of the case studies are mainly concerned with the improvement of the model by detecting problems and suggesting changes.

The main theoretical findings of the study were related with the clarification of core concepts in terms of assessment methods and assessment tasks. There was also an attempt to structure general information concerning assessment and e-assessment. In terms of the concept of alignment, this study contributes an innovative perspective that includes the clarification of the meaning of alignment and the definition of four criteria for achieving alignment: match, emphasis, coverage and precision.
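The two-dimensional classification at the core of the model can be pictured as a small matrix of (knowledge type, cognitive process) cells. The Python sketch below is a minimal illustration of that idea, assuming each learning outcome and each assessment method is represented as a set of cells of an rBloom-style matrix and comparing them by simple overlap. The cell assignments, the method names and the overlap score are hypothetical examples for illustration only; they are not the classifications or the full match/emphasis/coverage/precision criteria derived in Chapter 4.

```python
from itertools import product

# The two dimensions of the adapted Taxonomy Table (rBloom matrix).
KNOWLEDGE = ["factual", "conceptual", "procedural", "metacognitive"]
PROCESSES = ["remember", "understand", "apply", "analyse", "evaluate", "create"]
VALID_CELLS = set(product(KNOWLEDGE, PROCESSES))


def cells(*pairs):
    """Build a set of (knowledge type, cognitive process) cells, validating each pair."""
    cell_set = set(pairs)
    assert cell_set <= VALID_CELLS, "unknown knowledge type or cognitive process"
    return cell_set


# Hypothetical classifications, for illustration only.
lo_routine_problem = cells(("conceptual", "understand"), ("procedural", "apply"))
mcq = cells(("factual", "remember"), ("conceptual", "understand"))
closed_problem_solving = cells(("conceptual", "understand"),
                               ("procedural", "apply"),
                               ("procedural", "analyse"))


def overlap(outcome, method):
    """Fraction of the outcome's cells that the assessment method also addresses."""
    return len(outcome & method) / len(outcome)


if __name__ == "__main__":
    for name, method in [("MCQ", mcq), ("closed problem solving", closed_problem_solving)]:
        print(f"{name}: overlap with the outcome = {overlap(lo_routine_problem, method):.2f}")
```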
RESUMO

Este estudo investigou a questão do alinhamento entre os resultados de aprendizagem e a avaliação, na área da educação da engenharia. A partir do problema inicial foram definidas quatro questões a investigar: como definir e descrever os resultados de aprendizagem em engenharia; como selecionar e descrever métodos de avaliação e métodos de e-assessment; como definir um modelo que permita atingir o alinhamento entre os resultados de aprendizagem e os métodos de avaliação; como aplicar o modelo de forma a obter a ligação entre certos tipos de resultados de aprendizagem e métodos de avaliação específicos.

Durante este estudo foi desenvolvido um modelo conceptual que tem como objetivo dar resposta ao problema inicial, o modelo ALOA (Aligning Learning Outcomes with Assessment). O modelo foi obtido a partir de investigação de literatura e inclui componentes que abordam cada uma das questões acima mencionadas. A primeira componente é uma seleção e descrição dos resultados de aprendizagem na área da engenharia que foi obtida a partir de quadros de qualificações existentes no sector: ABET e EUR-ACE. A segunda componente é uma seleção e descrição de métodos de avaliação que foi obtida a partir da análise de obras de referência na área. A seleção dos métodos de avaliação foi organizada numa lista com seis categorias gerais em que cada uma inclui vários métodos específicos. Tanto os resultados de aprendizagem como os métodos de avaliação foram descritos recorrendo a uma versão adaptada da Tabela Taxonómica desenvolvida por Anderson et al. na obra A Taxonomy for Learning, Teaching and Assessing. A Tabela Taxonómica define um sistema de classificação com duas dimensões: tipo de conhecimento e processos cognitivos. Durante o desenvolvimento do modelo ALOA cada resultado de aprendizagem e cada método de avaliação foi analisado e classificado em termos de cada uma destas dimensões. Esta descrição detalhada dos itens foi utilizada para responder à questão principal deste projeto de investigação, que era o alinhamento entre os resultados de aprendizagem e a avaliação. Ao utilizar o mesmo sistema de classificação para os dois tipos de itens foi possível desenvolver uma proposta de alinhamento.

O modelo conceptual ALOA foi desenvolvido com o objetivo de ser usado por diferentes tipos de utilizadores em instituições de ensino superior. O estudo definiu quatro cenários de implementação para o modelo: verificação do alinhamento em unidades curriculares existentes; definição de uma nova estratégia de avaliação tendo como base os resultados de aprendizagem pretendidos; verificação do alinhamento vertical de uma unidade curricular por comparação com resultados de aprendizagem definidos a um nível superior; verificação do alinhamento horizontal de duas unidades curriculares definidas ao mesmo nível mas em contextos diferentes, como ocorre em situações de mobilidade. O modelo ALOA foi traduzido em ferramentas práticas e a sua aplicabilidade foi testada recorrendo a múltiplos casos de estudo.

As principais conclusões teóricas deste estudo estão relacionadas com a clarificação dos conceitos de métodos de avaliação e práticas de avaliação. Houve também uma tentativa da qual resultou uma proposta de estruturação de informação relativa à avaliação e ao e-assessment. Em relação ao conceito de alinhamento, este estudo contribui com uma perspectiva inovadora que inclui a clarificação do significado do termo alinhamento e a definição de quatro critérios para que se possa atingir o alinhamento: coincidência, ênfase, cobertura e precisão.
RESUME

Cette étude a examiné la question de l'alignement entre les acquis d'apprentissage et l'évaluation dans le domaine de la formation des ingénieurs. À partir du problème initial, quatre questions de recherche ont été définies : comment définir et décrire les résultats d'apprentissage en ingénierie ; comment choisir et décrire les méthodes d'évaluation et les méthodes d'évaluation électronique ; comment définir un modèle qui permette d'atteindre l'alignement entre les résultats d'apprentissage et les méthodes d'évaluation ; comment appliquer le modèle de façon à obtenir la liaison entre certains types de résultats d'apprentissage et des méthodes d'évaluation spécifiques.

Au cours de cette étude, on a développé un modèle conceptuel qui vise à répondre au problème de départ, le modèle ALOA (Aligning Learning Outcomes with Assessment). Le modèle a été dérivé de la littérature scientifique et comprend des composantes qui répondent à chacune des questions mentionnées ci-dessus. La première composante est une sélection et une description des résultats d'apprentissage en ingénierie, obtenue à partir des cadres de qualifications du secteur : ABET et EUR-ACE. La deuxième composante est une sélection et une description des méthodes d'évaluation, obtenues à partir de l'analyse des ouvrages de référence dans le domaine. Le choix des méthodes d'évaluation a été organisé dans une liste de six grandes catégories, chacune regroupant plusieurs méthodes spécifiques. Les résultats d'apprentissage et les méthodes d'évaluation ont été décrits à l'aide d'une version adaptée de la taxonomie développée par Anderson et al. dans l'ouvrage A Taxonomy for Learning, Teaching and Assessing. Cette taxonomie définit un système de classification à deux dimensions : type de connaissance et processus cognitif. Au cours du développement du modèle ALOA, chaque résultat d'apprentissage et chaque méthode d'évaluation ont été analysés et classés selon chacune de ces dimensions. Cette description détaillée des éléments a été utilisée pour répondre à la question principale de ce projet de recherche, qui était l'alignement entre les résultats d'apprentissage et l'évaluation. En utilisant le même système de classification pour les deux types d'éléments, il a été possible de développer une proposition d'alignement.

Le modèle conceptuel ALOA a été élaboré pour être utilisé par différents utilisateurs dans les établissements d'enseignement supérieur. L'étude a défini quatre scénarios de déploiement pour le modèle : vérification de l'alignement dans les cours existants ; définition d'une nouvelle stratégie d'évaluation basée sur les résultats d'apprentissage escomptés ; vérification de l'alignement vertical d'une unité d'enseignement par rapport aux résultats d'apprentissage définis à un niveau supérieur ; vérification de l'alignement horizontal de deux cours définis au même niveau, mais dans des contextes différents, comme en situation de mobilité. Le modèle ALOA a été traduit en outils pratiques et son applicabilité a été testée à l'aide de plusieurs études de cas.

Les principales conclusions théoriques de cette étude sont liées à la clarification des concepts de méthodes et de pratiques d'évaluation. Il y a eu aussi une tentative qui a mené à une proposition de structuration de l'information sur l'évaluation et l'e-évaluation. En ce qui concerne le concept d'alignement, cette étude fournit une nouvelle perspective qui inclut la clarification de la signification du terme et la définition de quatre critères permettant de parvenir à un alignement : coïncidence, accent, couverture et précision.
LIST OF TERMS AND ACRONYMS IN USE

ABET: Organization that accredits college and university programs in the fields of applied sciences and engineering. Formerly known as Accreditation Board for Engineering and Technology
ALOA: Model for Aligning Learning Outcomes with e-Assessment
AT: Assessment task
CDIO: Conceiving - Designing - Implementing - Operating real-world systems and products
CE: Continuing education
CEI: Continuing education institutions
EE: Engineering education
EUR-ACE: European quality label for engineering degree programmes
HE: Higher education
HEI: Higher education institutions
ICT: Information and communication technologies
iLO: Intended learning outcome
LO: Learning outcome
LT: Learning technologies
MCQ: Multiple choice question
PL: Prior learning
rBloom: Adapted version of the revised Bloom's taxonomy
RPL: Recognition of prior learning
RQ: Research question
SAQ: Short answer question
TLA: Teaching and Learning activities
TT: Taxonomy Table
LIST OF TABLES

TABLE 1 - EXAMPLES OF SEARCH EXPRESSIONS USED
TABLE 2 - TYPES OF SOURCES USED AND EXAMPLES
TABLE 3 - TRENDS IN ASSESSMENT. FROM BROWN ET AL. [7, P.13]
TABLE 4 - PRINCIPLES AND PRACTICES ON ASSESSMENT (EXTRACTED AND ADAPTED FROM WOODS [50])
TABLE 5 - REASONS FOR ASSESSING STUDENTS
TABLE 6 - SUMMARY OF ASSESSMENT METHODS
TABLE 7 - FAMILIES OF ESSAY QUESTIONS
TABLE 8 - GENERAL CRITERIA FOR ASSESSING ESSAYS (ADAPTED FROM BROWN ET AL. [7])
TABLE 9 - PROBLEM SOLVING STAGES AND COGNITIVE PROCESSES (ADAPTED FROM WOODS [55, P.448])
TABLE 10 - SUMMARY OF PROBLEM-SOLVING AS VIEWED BY PLANTS ET AL. [56]
TABLE 11 - SUMMARY OF PRACTICAL ENQUIRY AS VIEWED BY HERRON
TABLE 12 - ASSESSING EXPERIMENTAL PROJECT (ADAPTED FROM BROWN ET AL. [7, P.128-9])
TABLE 13 - BRIEF ANALYSIS OF CRISP'S E-ASSESSMENT ITEMS
TABLE 14 - DIFFERENCES BETWEEN SPECIFICITY LEVELS OF OBJECTIVES (FROM ANDERSON ET AL. [40, P.17])
TABLE 15 - SUMMARY OF BLOOM'S TAXONOMY [41]
TABLE 16 - SUMMARY OF THE KNOWLEDGE DIMENSION OF RBLOOM
TABLE 17 - SUMMARY OF THE COGNITIVE DIMENSION
TABLE 18 - AN EXAMPLE OF A TAXONOMY TABLE USED TO VERIFY ALIGNMENT
TABLE 19 - COMPARISON BETWEEN QUALIFICATIONS IN EE
TABLE 20 - SUMMARY OF DATA COLLECTION
TABLE 21 - SUMMARY OF THE ANALYSIS OF DATA
TABLE 22 - THE TAXONOMY TABLE BY ANDERSON ET AL.
TABLE 23 - FACTUAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29])
TABLE 24 - CONCEPTUAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29])
TABLE 25 - PROCEDURAL KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29])
TABLE 26 - METACOGNITIVE KNOWLEDGE (EXTRACTED FROM ANDERSON ET AL. [40, P. 29])
TABLE 27 - REMEMBER CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])
TABLE 28 - UNDERSTAND CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])
TABLE 29 - APPLY CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])
TABLE 30 - ANALYSE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])
TABLE 31 - EVALUATE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/8])
TABLE 32 - CREATE CATEGORY (ADAPTED FROM ANDERSON ET AL. [40, P. 67/68])
TABLE 33 - RBLOOM MATRIX, AN ADAPTED VERSION OF THE TAXONOMY TABLE BY ANDERSON ET AL.
TABLE 34 - MAPPING OF ASSESSMENT BASED ON MCQ TO BLOOM'S REVISED TAXONOMY
TABLE 35 - MAPPING OF ASSESSMENT BASED ON ESSAYS TO BLOOM'S REVISED TAXONOMY
TABLE 36 - MAPPING OF SIMPLE CLOSED-ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY (DIAGNOSIS + ROUTINE ACTIVITIES)
TABLE 37 - MAPPING OF COMPLEX CLOSED-ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY
TABLE 38 - MAPPING OF OPEN ENDED PROBLEM SOLVING TO BLOOM'S REVISED TAXONOMY
TABLE 39 - MAPPING OF PRACTICAL WORK TO BLOOM'S REVISED TAXONOMY
TABLE 40 - MAPPING OF ASSESSMENT USING SHORT-ANSWER QUESTIONS TO BLOOM'S REVISED TAXONOMY
TABLE 41 - MAPPING OF REFLECTIVE PRACTICE ASSESSMENT TO BLOOM'S REVISED TAXONOMY
TABLE 42 - COMPLETE MAPPING OF ABET PROGRAMME OUTCOMES TO BLOOM'S REVISED TAXONOMY
TABLE 43 - COMPLETE MAPPING OF EUR-ACE PROGRAMME OUTCOMES TO BLOOM'S REVISED TAXONOMY
TABLE 44 - MAPPING OF THE ABILITY TO USE NEWTON'S FIRST LAW IN ROUTINE PROBLEM SOLVING
TABLE 45 - STAKEHOLDERS OF DIFFERENT SCENARIOS FOR ALIGNMENT
TABLE 46 - NUMBERING OF THE CELLS TO TRANSFORM TWO DIMENSIONS INTO ONE
TABLE 47 - LINK BETWEEN ILOS AND THE LOS OF ABET
TABLE 48 - LINK BETWEEN ILOS AND THE LOS OF EUR-ACE
TABLE 49 - SUMMARY OF METHODS OF ASSESSMENT USED IN THE CASE STUDY
TABLE 50 - OVERLAPPING OF RBLOOM FOR SIMPLE PROBLEM SOLVING WITH ITEM CS.02.A03.14A
TABLE 51 - OVERLAPPING OF RBLOOM FOR OPEN PROBLEM SOLVING WITH ITEM CS.02.A03.14B
TABLE 52 - OVERLAPPING OF RBLOOM FOR PROJECT WITH ITEM CS.02.A02
TABLE 53 - ALIGNMENT OF LO1 WITH REAL ASSESSMENT
TABLE 54 - ALIGNMENT OF LO2 WITH REAL ASSESSMENT
TABLE 55 - ALIGNMENT OF LO3 WITH REAL ASSESSMENT
TABLE 56 - ALIGNMENT OF LO4 WITH REAL ASSESSMENT
TABLE 57 - ALIGNMENT OF LO1 WITH STANDARD ASSESSMENT
TABLE 58 - ALIGNMENT OF LO2 WITH STANDARD ASSESSMENT
TABLE 59 - ALIGNMENT OF LO3 WITH STANDARD ASSESSMENT
TABLE 60 - ALIGNMENT OF LO4 WITH STANDARD ASSESSMENT
TABLE 61 - MATCHING SCORES OBTAINED FOR EACH ILO WHEN COMPARED WITH STANDARD ASSESSMENT METHODS
TABLE 62 - OVERLAPPING OF STANDARD ALOA SIMPLE PROBLEM SOLVING MATRIX WITH REAL ASSESSMENT FROM CASE-STUDY
TABLE 63 - OVERLAPPING OF STANDARD ALOA OPEN PROBLEM SOLVING MATRIX WITH REAL ASSESSMENT FROM CASE-STUDY
TABLE 64 - OVERLAPPING OF STANDARD ALOA PROJECT MATRIX WITH REAL ASSESSMENT FROM CASE-STUDY
TABLE 65 - RELATION BETWEEN THE PROBLEM AND THE ALOA MODEL
LIST OF FIGURES

FIGURE 1 - FIELD OF STUDY
FIGURE 2 - ASSESSMENT CYCLE AS DESCRIBED BY BROWN ET AL.
FIGURE 3 - PURPOSES OF ASSESSMENT
FIGURE 4 - SOURCES OF ASSESSMENT
FIGURE 5 - CURRENT CHALLENGES IN ASSESSMENT
FIGURE 6 - DIFFERENT DIMENSIONS OF RELIABILITY OF ASSESSMENT
FIGURE 7 - TYPES OF VALIDITY RELATED WITH CURRENT RESEARCH
FIGURE 8 - THE LEARNING CYCLE AS VIEWED BY KOLB
FIGURE 9 - GROUPS OF ACTIVITIES RELATED WITH E-ASSESSMENT IMPLEMENTATION [26]
FIGURE 10 - THE SAME LO SHOULD BE PRESENT IN ALL THE ACTIVITIES
FIGURE 11 - RELATION BETWEEN THE CONCEPTUAL MODEL AND THE RESEARCH QUESTIONS
FIGURE 12 - INITIAL REPRESENTATION OF THE PROBLEM
FIGURE 13 - SECOND VERSION OF THE MODEL
FIGURE 14 - THIRD VERSION OF THE MODEL
FIGURE 15 - FOURTH AND FINAL VERSION OF THE MODEL
FIGURE 16 - THE ALOA CONCEPTUAL MODEL
FIGURE 17 - ALIGNMENT POSSIBILITIES FOR ONE UNIT OR COURSE
CHAPTER 1 - INTRODUCTION

1.1 BACKGROUND

Since the latter half of the 20th century, the World has been experiencing rapid transformation in the field of Education, led by the changing Knowledge Society [1]. As Peter Drucker explained in 1996, in this new society access to work is only gained through formal education and not acquired through apprenticeship. Almost two decades have passed and this is already what is happening in some parts of the World. Education and schooling have become a major concern for society and a priority in national and transnational policies. The strategic document Europe 2020 [2] defines as a priority the development of an economy based on knowledge and innovation. All educational institutions play an important role in achieving this goal.

Higher Education (HE), Continuing Education (CE) and Vocational Training have been most affected by this transformation, adapting to the demand for new skills from the labour market and at the same time responding to the needs of an increasing number of students [3]. The global economy created the opportunity and the need for the mobility of students and workers, demanding efficient recognition of qualifications and increasing competitiveness in this field. The labour market demands more qualified and updated workers, and this trend is mirrored in educational policies in Europe [4-6]. All this generates pressure towards a quality-based approach for all education providers, as Drucker predicted in his article about the knowledge society [1].

One visible effect of this transformation is the shift from a content-based approach in Education to an approach centred on the student and what he/she has learned and achieved. This transformation has led to different trends in HE, as identified by Brown et al. [7]. One of the trends is the change from structuring education based on what should be taught, by defining educational objectives, to structuring education by defining what students should learn, the Learning Outcomes (LOs). This approach is underpinning the development and implementation of most European Education policies at international and national levels [8-10]. In Europe, HEIs and CEIs are redefining programmes in terms of Learning Outcomes, harmonizing them with national, international and sector-level frameworks of qualifications that are also based on Learning Outcomes [11-14]. Several projects and initiatives are working towards the definition of LOs, specific and transversal, that can be used as a common reference [15, 16]. Learning Outcomes are also becoming fundamental for structuring the standards and guidelines of quality assessment of HE and CE institutions in Europe and worldwide [17-20]. The field of engineering education has also been affected by these current trends. Learning outcomes are being used in qualification frameworks specific to engineering, like ABET, EUR-ACE and CDIO. In this context, the assessment of Learning Outcomes becomes a crucial process for the Educational System. It should be a major concern of educational institutions to ensure that assessment of student learning is being guided by what students should be learning, i.e. assessment should be aligned with the intended learning outcomes (iLOs).

Another major revolution in our society has been the introduction of Information and Communication Technologies. The use of ICT applied to Education, e-learning, has been increasing, and its use creates new opportunities for teaching, learning and assessment and has huge potential as an answer to some of the current challenges of Education. The change to digital media has impact on the availability, reusability, accessibility and cost of learning resources, complemented by the communication and networking potential of the Internet that takes Education to a global level [21] [22, 23]. The application of ICT to education, and in particular to assessment, is a subject of great discussion. Some of the issues related with the use of e-learning in assessment concern the validity and reliability of the process. This study will be focusing on this particular issue.

1.2 FIELD OF STUDY AND INTENDED STAKEHOLDERS

The current study will be looking into the relationship between learning and assessment. The focus of the study is placed in the intersection of three areas, as illustrated by Figure 1: learning outcomes, assessment of students' learning and e-learning. It will be focusing on Higher Education, and more specifically on Engineering Education. The question of alignment of e-assessment and education is represented by the area where the three circles overlap.

Figure 1 - Field of study (intersection of Learning Outcomes, e-Learning and Assessment)

The wide use of e-learning is already impacting Education, promoting change and innovation in different aspects including pedagogy, technology, organization, accessibility and flexibility, among others [24]. It is a complex and multidisciplinary area and, given its impact, it is important that e-learning research be informed by evidence [23]. Current literature reviews in this area indicate that e-learning approaches to assessment lack a pedagogical framework and that most research describes implementation studies at course level [25]. The present research intends to contribute to establishing a conceptual model for the implementation of e-assessment in Engineering Education.
Assessment is a crucial process of Education and is seen by current trends as part of the learning process and not as a separate event [7]. Assessment of student learning encourages the involvement of students and provides feedback to students and teachers [26]. It has an important role in validation and certification and is deeply related with quality issues, as will be explained below. The current study will be focusing on the specific link between what should be learned and what should be assessed. The purpose of this study is to research the applicability of e-assessment strategies to the field of engineering education. It intends to help in the definition of adequate e-assessment strategies based on the stated intended learning.

This research study mainly applies to engineering courses that use or intend to use e-assessment strategies. The implementation stage addressed existing courses of Faculdade de Engenharia da Universidade do Porto (FEUP). It focused on the stated learning outcomes of the courses and on the existing assessment and e-assessment strategies as important components of the alignment question. As the study focuses at course level, it is strongly related with the activities of faculty, who are usually the ones that define the iLOs and the assessment tasks. However, the results of the study have impact on other levels, including programme and qualification frameworks. It is expected that other stakeholders, including curriculum designers, mobility staff, and quality and accreditation staff, will be using the results of this study. Students are not considered direct users of the study and tools. Still, the project is based on a learner-centred perspective of education and students will benefit from a more efficient and effective educational process.

1.3 BACKGROUND AND MOTIVATION OF THE RESEARCHER

Since 1998 I have been working in the field of ICT applied to education. I started working as an editor of multimedia projects in an educational publishing company. Later, in 2001, I continued the work in this field as the person responsible for the e-learning unit of Universidade do Porto. This position gave me the opportunity to contribute to the implementation of an institutional strategy for e-learning. At the same time, it allowed me to build a transversal perspective of the implementation of learning technologies at the institutional course level, including different learning management systems (LMS) and other learning technologies (LTs). This role included giving individual support to teachers in terms of pedagogical and technological issues. It was possible to learn from the different strategies used by the teachers in terms of teaching and assessment and to perceive the perspective of the students. Additionally, it included active participation in European research projects in the field of e-learning, which was important to learn and update knowledge and skills in this area.

During this period I was a teacher in traditional courses at the university. As I developed blended learning strategies to work with the students, I could better understand the perspective of teachers and learners when dealing with e-learning. I was involved in the development, teaching and tutoring of distance e-learning courses, which contributed to improving my knowledge in this area. Finally, I was a student in a Master's programme of the School of Engineering that included an e-learning course with e-assessment tasks. This gave me the perception of the student view in terms of e-learning.
This summary of activities shows that the motivation for the field of study comes from both the academic and the professional area. When dealing with e-learning strategies from each of these perspectives, assessment has always been a component that raises questions and problems. The flexibility of the e-learning environment creates opportunities for experimenting, and many interesting and creative examples of e-assessment tasks can be found at Universidade do Porto. A recent trend at the institution showed that teachers are using e-learning based assessment for summative purposes. Most of the e-assessment implementations were online exams, but the teachers successfully developed other types of tasks [27-29]. All of these represent a very interesting field of research.

The application to the field of engineering comes basically from the same background. From the researcher's perspective, there are two subject areas that are quite innovative in the use of Learning Technologies: medicine and engineering. Also, both areas have strong civil responsibility and the need for regulation by professional bodies. Both aspects have impact on the mobility of students and workers. The affiliation to the School of Engineering as a student and as a former invited teacher both contributed to my decision on the scope of application of the research.

Finally, I worked closely in the creation and development of the European project VIRQUAL, related with the field of qualification frameworks, learning outcomes and assessment in the context of international mobility [30-33], which developed my interest in the area of transnational Education policies, Qualification Frameworks and Quality Assurance in HE, a field that has gained increased importance in Higher Education.

The combination of these different areas of interest contributed to the definition of the problem and to the research approach that followed the initial research.

1.4 STATEMENT OF THE PROBLEM AND RESEARCH QUESTIONS

The research problem is placed in the area of interaction of three fields of study: Assessment, Learning Outcomes and e-Learning. The scope of application is engineering education. Engineering education has specificities and requisites [34] that may constitute challenges in the implementation of assessment based on Learning Technologies (LT). This study intends to contribute to developing e-learning strategies for assessing EE, helping to deal with current challenges in education.

In general terms, the approach chosen was to develop a model that matches specific e-assessment methods to specific LOs in the field of engineering. This means that it might be possible for teachers to define the intended Learning Outcomes (iLOs) of an online course and, from this definition, to have an indication of the e-assessment methods they might consider using. Formally, this problem is defined as:

To what extent may e-assessment methods be used to measure the achievement of Learning Outcomes in engineering education?

Given this problem, it was necessary to recognize that there is a wide variety of engineering schools, engineering programmes and engineering courses. There are different qualification frameworks that use LOs in the engineering sector. So, the first challenge was how to select the LOs that were going to be used for the purpose of the research. The selection process had to take into consideration the subject area, the level and even the nature of the LOs. The same problem existed in relation to the assessment methods. Assessment tasks are usually defined at course level, even though some examples can be found at a higher level. Again, there is a considerable variety of assessment and e-assessment tasks that could be used in this study. Another aspect of assessment is that in most cases it is deeply embedded in the structure of the course or unit and highly contextualized.
The first two research questions translate this need for definition of the central concepts of the problem:

RQ1) Which type of Learning Outcomes in the field of Engineering are relevant and should be considered?

RQ2) What are the e-assessment methods that should be considered?

After defining the concepts, it is necessary to deal with the core of the problem, which is the relationship between assessment and LOs. This link between the iLOs and the assessment tasks is part of what is considered the alignment of a course [35, 36]. The first concern with alignment is whether every iLO is assessable. It is a recognized issue in HE that curricula need to include transversal skills and that these are difficult to assess. In engineering, curricula are pressured to include these types of LOs among the technical LOs specific to the sector. This is a known challenge in EE [34] that may also represent a challenge to effective assessment. The last two research questions of this study are related with the alignment question:

RQ3) What type of intended Learning Outcomes can be measured by e-assessment methods?

RQ4) Is it possible to propose specific e-assessment strategies for each type of LO in EE?

1.5 IMPACT EXPECTATIONS OF THE STUDY

Assessment is an important issue in HE that has high impact on learning [3, 7, 37, 38]. It raises questions about the efficiency, effectiveness and adequacy of different methods and strategies. E-assessment, due to the technological component and the association with distance learning, brings additional controversies, some of them intrinsic to e-learning. As described by Conole and Martin, e-learning is a complex area that may have impact in a diversity of fields including the teaching and learning process (T/L), organizational structures, and political and socio-cultural issues, among others [22].

It is expected that the findings of this research project will:

• Help teachers to decide which assessment tasks are more suitable for the LOs stated for a specific course or module. The ALOA model developed includes practical tools that will support the decision process.
• Help teachers and other stakeholders to verify the current alignment of existing courses. This is one of the scenarios of application of the ALOA model.
• Contribute to developing a pedagogical framework for the implementation of aligned e-assessment strategies, based on the definition of Learning Outcomes. The ALOA model was developed as a conceptual and theoretical model that intends to promote the alignment between iLOs and assessment.
• Facilitate accreditation processes and navigation between different qualification frameworks, by providing a common tool for the description of LOs. This was defined in the current study as vertical alignment and is one of the implementation scenarios of the ALOA model.
• Promote mobility and recognition of prior learning, formal or informal, by allowing the comparison of the LOs of previous experiences with the intended ones.
1.6 STRUCTURE OF THE STUDY

This dissertation is divided into seven chapters.

This chapter defines the context of the current research study and introduces the main thematic areas that will be explored during the development of the work. After providing a rationale for the work, the chapter presents the background of the researcher and the motivations for conducting this study. Following that, the research problem is described, as well as the research questions that guided the project. Finally, the chapter addresses the expected impact of the research and includes a description of the structure of this dissertation, with a brief explanation of the contents of each chapter.

Chapter 2 provides a theoretical background for the problem and a critical analysis of the published work in the field. From the analysis of theory and published work it was possible to define the approach that was followed to address the problem. The ALOA conceptual model was derived from the literature research, so this chapter grounds most of the decisions that were taken during the development of the work.

Chapter 3 describes the different components of the research method, including the theoretical development of the conceptual model and the multiple case studies approach used in the implementation stage. The chapter also explains how the research project dealt with validity, reliability and ethical issues.

In Chapter 4 the conceptual model is described in detail. It includes the description of the main concepts of the model: LOs in engineering and assessment methods. The chapter also includes a definition of the relationships between the concepts that are included in the ALOA model.

Chapter 5 describes the implementation of the model. It starts by identifying and describing some implementation scenarios and the practical tools that were developed. Finally, the chapter describes the application of the ALOA model to the case studies that were used to test the applicability of the model.

In Chapter 6 the results of the applicability test are interpreted from the perspective of each case and from a transversal point of view. In this chapter, the results of the implementation stage are analysed from the perspective of the research questions.

Chapter 7 is the final chapter of the dissertation. It starts by presenting the global conclusions of the work from the perspective of the research problem. Then, the results are analysed in terms of the contributions to theory and practice. Finally, it explores future work that could be developed based on the current project.

1.7 PUBLICATIONS RELATED WITH THE RESEARCH STUDY

• "E-ASSESSMENT OF LEARNING OUTCOMES IN ENGINEERING EDUCATION", Falcão, R., Proceedings of WEEF 2012, Buenos Aires, Argentina, October 2012 (accepted)
• "A CONCEPTUAL MODEL FOR E-ASSESSING STUDENT LEARNING IN ENGINEERING EDUCATION", Rita Falcão, Proceedings of ICL2012, Villach, Switzerland, September 2012 (accepted)
• "A CONCEPTUAL MODEL FOR E-ASSESSING STUDENT LEARNING IN ENGINEERING EDUCATION", Falcão, R., SITE 2012, Austin, Texas, USA, March 2012
• "ASSESSING LEARNING OUTCOMES IN ENGINEERING EDUCATION: AN E-LEARNING BASED APPROACH", WEE 2011, Lisboa, Portugal, September 2011
  29. 29. Evaluation of the Application of e-Learning Methodologies to the Education of Engineering7• “Assessment of Learning Outcomes through e-Learning”, WCCEE2010, Singapore, October2010• "MEASURING IMPACT OF E-LEARNING ON QUALITY AND LEARNINGOUTCOMES – A PROPOSAL FOR A PHD PROJECT", Falcão, R., Soeiro, A., EDEN2007, Junho 2007;
CHAPTER 2 - RELATED WORK

2.1 INTRODUCTION

The field of study of this dissertation is defined by the intersection of three areas: assessment, learning outcomes and e-learning. Within this field, the research focused on the area of Higher Education and, in particular, Engineering Education. Given this general field, what was the literature review expected to provide?

What guided the literature research was the analysis of the problem and of the research questions.

To what extent e-assessment methods may be used to measure the achievement of intended Learning Outcomes in engineering education?

The problem is composed of two dimensions and intends to explore a specific relation between them. The dimensions “Learning Outcomes in Engineering Education” and “e-assessment methods” are the focus of the first two research questions.

RQ1) Which Learning Outcomes in the field of Engineering are relevant and should be considered?

RQ2) Which are the online assessment methods that should be considered?

What is intended with these two questions is a clarification of the dimensions that will be analysed, so that it is possible to obtain a valid selection, or at least an informed selection process. Literature research was fundamental to analyse this aspect of the problem. The first dimension to be analysed was the LOs in EE. Starting with the general approach, it was necessary to clarify certain aspects related with LOs, including terminology, definitions and the theory behind them. Concerning EE, the review examined the role of LOs in EE, how they were implemented and whether there was a trend towards a common set of LOs in engineering courses.

Concerning the second dimension, e-assessment, it was possible to find a wide variety of published work with a corresponding variety of assessment methods.
However, most of the published work was based on individual cases of application. It was difficult to find fundamentals of e-assessment or a comprehensive study of e-assessment methods. Early in the literature research on this subject it was considered necessary to go back to the basics of assessment. Donnan [39] faced a similar situation in his work: educational developers considered that e-assessment or online assessment was not distinct from general assessment. So the analysis again started from the perspective of general assessment to clarify several issues, and finally specific e-assessment methods were researched.

The third component of the problem is a specific relation between the dimensions, the alignment between LOs in EE and e-assessment methods. That is the core of the problem and the focus of research questions 3 and 4.

RQ3) What type of intended Learning Outcomes can be measured by e-assessment methods?

RQ4) Is it possible to propose specific e-assessment strategies for each type of LO in EE?

In this stage, literature research was centred on finding suggestions on using specific assessment methods to measure or evaluate specific types of Learning Outcomes, in general and in the field of Engineering. An important finding in terms of related work was the book “A Taxonomy for Learning, Teaching, and Assessing” [40], which became a central piece of this research project.

2.1.1 LITERATURE RESEARCH METHOD

The initial stage of the literature research was exploratory, with a view to obtaining a framework for the research problem. The more systematic approach was guided by the analysis of the problem and previous knowledge on the subject. Table 1 summarizes the main search expressions used.

Table 1 - Examples of search expressions used
RQ1: (Learning Outcomes or Learning Objective) and (Engineering; Higher Education; qualification frameworks; ABET; EUR-ACE)
RQ2: (assessment or e-assessment or evaluation) and (methods; tools; online; CBT (computer based testing); CBA (computer based assessment); CAA (computer assisted assessment); web based)
RQ3: Learning Outcomes + assessment; Learning Outcomes + assessment methods; Learning Outcomes + evaluation; Bloom + assessment; Bloom + Learning Outcomes + assessment
RQ4: (assessment or e-assessment or portfolios or multiple choice questions (MCQ) or essays or short answer questions (SAQ)) and (engineering; learning outcomes + engineering; learning outcomes + ABET; learning outcomes + EUR-ACE; learning outcomes + Bloom; ABET + Bloom; EUR-ACE + Bloom)

The research was conducted using resources available at the library of the School of Engineering, including books, journals and databases of journals and conference proceedings. Additionally, online searches were performed using web tools including Google, Google Scholar and specific websites. Table 2 presents the main sources used during this research project.
Table 2 - Types of sources used and examples
Books: Taxonomy of educational objectives [41]; A taxonomy for learning, teaching, and assessing: a revision of Bloom's taxonomy of educational objectives [40]; Teaching for quality learning at university: what the student does [42]; Assessing student learning in higher education [7]; Assessment for learning in higher education [43]; Designing better engineering education through assessment [44]; Contemporary perspectives in e-learning research: themes, methods, and impact on practice [24]; The Sage handbook of e-learning research [21]; The e-assessment handbook [26]
Journals: Assessment and Evaluation in Higher Education; European Journal of Engineering Education; Journal of Engineering Education; British Journal of Educational Technology; ALT-J, Research in Learning Technology; Studies in Higher Education
Conference proceedings: CAA; Frontiers in Education; Teaching and Learning in Higher Education; EDUCOM; Computers in Education
Policy papers: European Commission; Bologna Process; Copenhagen Process; national institutions
Websites: ABET; EUR-ACE; AHELO; Tuning; JISC; E-learning Papers

2.2 ASSESSMENT

Assessment is an important process in education and is intimately related to how students learn, as pointed out by different authors. Brown, Bull and Pendlebury [7, p. 7] state that assessment is at the centre of the learning experience and should be a concern for those involved in the learning process, including learners, educators and institutions. In their view, assessment defines what students will consider important, how they will spend their time and even how they will see themselves in the future. They consider that to change learning we need to change the way we assess. This perspective is shared by Biggs [42] when he describes the backwash effect of assessment. Biggs defends that students will look strategically at assessment to determine what and how they will learn. Again, the main idea is that assessment is the central driver for the learning process.

Another author, Peter Knight [43, p.11], shows a similar but more dramatic view when he asks how we can make students work without assessment. In the introduction to the book “Assessment for Learning in Higher Education”, Knight considers that assessment is a moral activity that states the values of the teachers and institutions. What is assessed and the methods used give a clear indication to the students of what is valued in a course or in a programme.
In the view of the author, even though the goals of a full programme are stated in documents like the mission statement and the programme goals, it is in fact in the assessment tasks that the essence of the programme and of the learning experience lies.

In the same book, David Boud [3] considers that assessment is the aspect of HE where there is most bad practice, and the effects of this bad practice have a strong impact on students' learning and success. As the author refers, students may avoid bad teaching methods but they cannot avoid being assessed if they want to graduate. Boud, based on the work of other authors, considers that as a consequence of bad assessment practices students may be hurt in their self-esteem and may even reject some subjects. Race [45] also explores the negative effects of assessment, in particular exams, in terms of feelings. He describes the worst nightmares students get before or after an exam and how these bad feelings affect learning.

Reinforcing this idea, Brown et al. [46] write that even though assessment is an important element of the learning process, new or existing faculty rarely have training to improve their assessment skills. An interesting perspective from the works of these authors is that in most cases assessing and marking are activities done privately, and most of the time teachers only receive feedback from students and not from their peers (other teachers) to validate their decisions. From the perspective of the students, the problem is similar: most of the time students are assessed only by their teachers and do not receive feedback from other students.

Race [45] goes even further and considers that one of the principles of assessment is the need for transparency, not only towards students and staff but also towards employers. In fact, recent calls for accountability of institutions have placed assessment on the main stage of the educational process. Assessment provides important evidence for quality and accreditation, which means that assessment methods, criteria and results won't be as private as before.

Even though some statements of these authors may be considered a bit strong, there are some interesting points made concerning the importance of assessment that may be summarized as follows:

• Assessment is one of the main drivers for learning
• It has a strong impact on what and how students learn
• It affects students' self-esteem and confidence
• It embeds the values of the teacher and the institution
• It is a compulsory activity to obtain a graduation/certification
• It provides evidence for quality and accreditation

Current trends, as Knight, Boud and other authors refer [3, 7, 43, 47], treat assessment as an activity that is part of the T/L process and cannot be separated from it. As Erwin puts it [47], deciding what to teach and assess is one issue, not two. Assessment should be seen as a learning activity, centred on the learner.

Brown et al. present an overview of current trends in assessment, focused on practical issues. A summary of these trends is presented in Table 3. It can be said that these changes are somewhat related with the overarching trend of student-centred learning. These trends focus on promoting the formative function of assessment, increasing feedback to students and contributing to improve learning. It can be said that assessment is becoming more personalized and students are becoming more involved in the learning process through the use of explicit criteria and learning outcomes, allowing self- and peer-assessment and facilitating the recognition of prior learning.

From the trends referred in Table 3, three are of particular importance to the present work: the change from objectives to outcomes, from content to competences and from implicit to explicit criteria.
As referred by Brown et al., using learning outcomes is useful to clarify the relationship between course design and assessment. Also, LOs play an important role in the recognition of prior learning, since they separate learning from teaching activities, opening the possibility of assessing what was learned independently of how it was learned.

Table 3 - Trends in assessment. From Brown et al. [7, p.13]
From written examinations towards coursework
From tutor-led assessment towards student-led assessment
From competition towards collaboration
From product assessment towards process assessment
From objectives towards outcomes
From content towards competencies
From course assessment towards modular assessment
From advanced levels towards assessed prior learning

The change from content to competences is related with the change to outcomes. As referred by Brown et al., competences are clusters of skills students are able to use in different situations, and they provide a framework for defining LOs and transferable skills. Finally, the use of explicit criteria in assessment plays an important role in the pedagogical process. When the assessment criteria or marking schemes are explicit they may provide important information for the learning process. If shared with students, these instruments will give them an indication of what is expected from them. By defining explicit criteria for an assessment task one is also defining the links between the task and the iLOs. Explicit criteria may have an important role in the reliability of the assessment task, since they may reduce differences between assessors. As referred by Brown et al. [7], there is considerable controversy around the efficiency of the use of implicit and explicit instruments. However, the focus of this work is not on the instruments but on the methods of assessment.

Boud [3] provides a useful description of the evolution of assessment that gives some background understanding on this matter. In the conventional conception, assessment follows learning and aims at finding out how much was learned; it is a quantitative perspective. There is no questioning of the link between the assessment task and learning. This conception is followed by educational measurement, which follows the same principles but intends to be more rational, more efficient and more reliable. It includes ideas and concepts from psychometrics. Nowadays we are still influenced by this conception, as can be seen by the wide use of multiple-choice question exams, a typical instrument of educational measurement. The latest perspective identified by Boud is competency-based and authentic assessment. It resulted from concerns about the validity of assessment, focusing on the link between what was assessed and what students were expected to have learned. Authentic assessment includes the direct assessment of complex performance and includes methods such as portfolios, open-ended problems and hands-on lab work. It contrasts with indirect assessment methods like multiple-choice questions that measure, among other things, indirect indicators of performance [48].

This conception of assessment questions the validity of educational measurement approaches and promotes performance-based assessment and the importance of learning outcomes.
As Boud refers, what is important is to assess whether students are achieving the iLOs, independently of how they reached them. Good assessment is, as described by Boud, assessment that is linked with the iLOs and that promotes learning. Erwin [47] also supports that the first step in educational design is to define the learning outcomes clearly. This should be done before deciding teaching and assessment strategies, both at programme and course level. Another author, Race [45, p. 67], provides ten principles of assessment and, again, the first ones are related with clearly defining the purpose of assessment and integrating assessment in the course activities rather than treating it as a separate event. Race defends the importance of assessment providing feedback to students, in agreement with the perspective of Boud.

Race and other authors [46] consider that a key question about assessment is knowing what we are assessing. However, the analysis proposed by these authors is not based on the iLOs but on the assessment tasks. For each task it should be clear what is being assessed: the content, the process, the structure, the product, the style, the presentation, etc. The authors recommend that assessment criteria should be clearly stated and then provided and explained to students. They go even further, saying that these criteria should be negotiated with the students, to share the ownership of assessment and help them understand the whole assessment process.

Another interesting perspective is provided by Brown et al. [46, p.82] when they suggest that the outcome of an assessment task should not be merely a grade but a description of what students know and can do. Again, the link between assessment and LOs is highlighted as being valuable in the educational process for the future development of students.

Given the impact of assessment on student learning, Boud considers it important to reflect on how assessment affects learning and what students learn from assessment. He considers that assessment gives a message to students about what they should be learning, but the message is not clear and most likely will not be interpreted the same way by teachers and students. For this reason, assessment will most likely have non-intended consequences on student learning. Students will respond strategically to assessment based on past experiences, choosing an approach that will lead them to success.

Linn [48, p. 16] shares the same opinion, saying that assessment may have intended and unintended effects both on learning and on teaching. As an example, both teachers and students may spend more time teaching or studying concepts that will be explicitly included in the assessment tasks and neglect those that will not be included. This concept is called consequential validity and is approached by Linn and other authors like Messick [49]. It is related with the backwash effect of assessment on learning. As Boud explains, the backwash effect is positive if it encourages the intended learning outcomes and negative when it encourages ways of learning that are not desired, like memorizing instead of understanding. Linn [48, p. 19] suggests that to understand the real cognitive complexity of an assessment task one must analyse the task, the familiarity of the student with the task and, ideally, the process students follow to solve it. An apparently complex task may be addressing lower level thinking skills if the student is only recalling previous knowledge about the task.

To summarize, the following ideas are important when defining assessment:

• Assessment should be seen as part of the teaching and learning process (T/L)
• The first step to have good assessment is to define the iLOs of the course or module
• It is important to define clearly the assessment tasks and what is being considered for assessment
• Assessors have to realize that assessment tasks will have intended and unintended outcomes
• The real cognitive complexity of an assessment task depends not only on the task but on many other factors, some related with the learner, others with the T/L process
2.2.1 KEY CONCEPTS ABOUT ASSESSMENT

The term assessment, as analysed by Brown et al. [7], may have different interpretations. The origin of the term is Medieval Latin, meaning “sitting beside”¹ to determine tax value. The general meaning of assessment is to estimate worth, to judge value. In traditional views of education, as described above, judging and determining value is the core function of assessment. However, current trends in education give assessment a new and important role, contributing directly to learning. For the purpose of this work, we will use the definition of assessment proposed by Brown et al. [7, p.11], since it is broad enough to include most assessment tasks:

Any procedure used to estimate student learning for whatever purpose.

Brown et al. [7, p.8] describe what can be considered an assessment cycle. It consists of three essential steps: taking a sample of what students do, making inferences and estimating the worth of what was done.

Figure 2 - Assessment cycle as described by Brown et al. (1. Sampling; 2. Making inferences; 3. Estimating value)

The first step, sampling, may include a traditional assessment method like an essay or exam, but may also include solving problems, carrying out a project, performing a procedure, etc. Samples are analysed by the assessors, who will draw conclusions about what was achieved by the students when compared with what was intended, as described in the statements of LOs. Finally, the assessor will make an estimate of the value of what was achieved by attributing marks or grades. This research is mostly concerned with the first two steps of the cycle. The third step is concerned with marking and grading and falls outside the goals of the current work.

Woods [50] defines assessment as a judgement, based on measurable criteria and pertinent evidence, of the degree to which the goals have been achieved. This definition includes a strong emphasis on the judgemental purpose of assessment. Based on this definition, Woods and other referenced authors present five assessment principles and six good practice recommendations, which are summarized in Table 4.

Concerning the purpose of assessment, Brown et al. [46, p.77] analyse existing motivations for assessment and identify different reasons to assess. Brown et al. [7, p.11] presented a similar list of reasons. A synthesis of both lists is presented in Table 5.

¹ http://dictionary.reference.com/browse/assess
Table 4 – Principles and practices on assessment (extracted and adapted from Woods [50])
Assessment principles:
• Assessment is a judgement based on performance, not personalities.
• Assessment is a judgement based on evidence, not feelings.
• Assessment should be done for a purpose with clearly defined performance conditions. The student should know when he/she is being assessed.
• Assessment is a judgement done in the context of published goals, measurable criteria and pertinent, agreed-upon forms of evidence.
• Assessment should be based on multidimensional evidence.
Principles in practice:
• What is being assessed? Have the goals been expressed unambiguously in observable terms? Who creates the goals? Are the goals explicit and published?
• Are there criteria that relate to the goals? Can each criterion be measured? Who creates the criteria? Are the criteria explicit and published?
• What evidence is consistent with the criteria? Do both the assessor and the student know that this form of evidence is acceptable?
• Are the goals and the collection of the evidence possible to achieve in the time and with the resources available?
• What is the purpose of the assessment? Under what conditions is the student's performance assessed? Who assesses? What type of feedback is given by the assessor?
• Have both the student and the assessor received training in assessment?

Table 5 - Reasons for assessing students
Developmental:
• To provide feedback to learners and improve learning
• To motivate learners and help them focus
• To diagnose students' knowledge and skills
• To consolidate student learning
• To diagnose the learning status of the student
Judgmental:
• To rank, classify or grade student achievement
• To estimate students' value and to allow them to progress in their studies
• To select for future courses or employment
• To provide a license to practice
Quality of teaching:
• To give feedback on teaching efficiency and improve teaching
• To provide data for quality and accreditation processes

As stated by Brown et al., the results of assessment are used mainly for developmental and judgemental purposes. The developmental purpose is related with improving student learning. The judgmental purpose is usually concerned with providing a license to proceed to the next level. In fact, in the model presented by these authors, the purpose of an assessment event is placed on a continuum between developmental and judgemental and it is never purely at one extreme of the continuum. Or at least it shouldn't be, in the opinion of these authors.

This perspective is supported by other authors like Boud [3], who distinguishes the purposes of assessment as formative (developmental) and summative (judgemental). The former aims to improve learning and provides feedback to the student; the latter aims at making decisions and judgements and is related with grading and marking. Boud considers that it is not advisable to separate both types of assessment, since students will most likely be more concerned with the summative tasks that determine their grades. It is recommended that both aspects of assessment should be approached together to be effective, since all assessment leads to learning.

Brown et al. [46] also consider the formative component of assessment to be of great importance. Students should receive feedback about their performance and accomplishments in a timing that allows them to be informed and to improve. Again, they consider that there is not a clear separation between these types of assessment. Even though formative assessment should not contribute to the grade of the course, this is not what happens, due to time and workload constraints. Coursework assignments will provide formative feedback to students but will also, in most cases, contribute to the final grade.
Summarizing, in terms of function, assessment tasks can be classified as summative or judgmental and formative or developmental, as described above. Additionally, assessment may have a diagnostic function when it intends to assess prior knowledge. Assessment results may also be included as indicators in quality and accreditation processes of the institution.

Figure 3 - Purposes of assessment (formative, summative, diagnostic, institutional)

Another issue concerning assessment is the source of assessment, the person who performs the role of assessor. Traditionally, the assessors were the tutors, but current trends in assessment defend the involvement of other sources, including the assessed student, peers or employers². The use of students as assessors, be it in self- or peer-assessment, has many advantages [7, 46].

From the learning perspective, by assessing their own work or the work of others, students may increase their evaluating skills and their meta-cognitive knowledge, which are important for lifelong learning and for professional life. Using self- and peer-assessment also facilitates dealing with time constraints when facing large numbers of students, for instance. Another benefit of using students as a source of assessment is to take advantage of their privileged perspective in situations where the tutor is not able to make an informed judgment, as happens in group work.

Peer- and self-assessment may include providing feedback to the students or marking students' work. In the perspective of Brown et al., self- and peer-assessment are more suitable for formative than for summative assessment. These authors consider that using different sources of assessment requires providing training and clear information about the task and the assessment instrument (marking scheme, grading criteria, etc.).

One of the reasons is that these sources of assessment are not always well accepted by students. As assessors, students don't like to assess others and take responsibility for their grades. When being assessed, students prefer receiving feedback or being judged by someone with expertise in the subject area. Both self- and peer-assessment are present when using reflexive practice assessment methods like portfolios.

² Brown et al. provide a list of sources of assessment.
Figure 4 - Sources of assessment (tutor, self, peer, others)

Boud [3] identifies two factors that are increasing the pressure on assessment and that are present in current challenges of higher education. One is the trend of larger numbers of students in HE, which increases the load of assessment on teachers and institutions. The other factor identified by Boud is the pressure to incorporate competence frameworks into existing HE, together with the need to deal with the accountability of institutions.

Assessment of transversal learning outcomes or skills, included in existing frameworks, seems to be a difficult task. Erwin [47] identifies several of these that should be assessed by faculty, given the importance they assume in society:

• Ability to commit oneself through reasoned beliefs and actions
• Work cooperatively
• Independent work
• Accept criticism
• Manage personal stress
• Self-discipline

Brown et al. suggest that cuts in resources and modularization, along with larger numbers of students, increase the pressure on education and assessment. Based on the work of other authors, they suggest several strategies for dealing with this pressure, which are not relevant for the current work.

Figure 5 – Current challenges in assessment (large numbers, transferable skills, available resources, personalized and authentic assessment)

A final issue concerning assessment is the question of validity and reliability. The discussion about validity and reliability of assessment is important because it helps to clarify important issues of the assessment process. Brown et al. [7, p.234] use an interesting metaphor to explain these two concepts. If assessment were a watch, it would be reliable if it were precise, if it measured time consistently. But it would only be valid if it showed the right time. The opposite is also true: a watch may be telling the right time at a specific moment but may not be consistent, running slower than it should.
Also, the observer of the watch must know how to tell the time using that type of watch. So, an assessment task must be reliable (or consistent), must be valid (or accurate), and the assessor must know how to use the instruments (marking schemes, grading criteria) to achieve this.

Reliability may be described as being related with fairness and consistency between different assessors and within one assessor [7, 26, 46]. Reliability is concerned with different assessors awarding the same marks or grades to the same assessment, or the same assessor awarding the same marks or grades at different moments. Reliability is strongly related with fairness and replicability. Marking schemes, grading criteria and anonymous marking are all assessment instruments that intend to improve reliability. It is suggested by Brown et al. that even when using explicit criteria and other instruments it is not easy to achieve the intended reliability. As an example, a detailed marking scheme may be difficult to implement and, if so, reliability is compromised. The same authors suggest that some assessment methods are more reliable than others. Reliable methods include MCQs and other methods with a well-defined solution. In the particular case of MCQs it is common practice to measure the internal consistency of the test by analysing the results (a small illustrative calculation is sketched after Figure 6). Student performance also affects reliability: the same student may respond to the same task differently at separate moments, due to many reasons. Figure 6 summarizes the main issues related with reliability.

Figure 6 - Different dimensions of reliability of assessment (between assessors, within one assessor, intrinsic to the student, intrinsic to the method)
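The text does not specify which internal-consistency statistic is meant; one statistic commonly used for objective tests is Cronbach's alpha. The following sketch is purely illustrative: the score matrix, the function name and the example values are hypothetical, not data from this study.

```python
# Illustrative sketch: Cronbach's alpha as one common measure of the internal
# consistency of an objective (e.g. MCQ) test. Rows are students, columns are
# items; for MCQs, 1 means a correct answer and 0 an incorrect one.

def cronbach_alpha(scores: list[list[float]]) -> float:
    """Return Cronbach's alpha for a students-by-items score matrix."""
    n_items = len(scores[0])

    def variance(values: list[float]) -> float:
        # Sample variance with Bessel's correction (n - 1 in the denominator).
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_variances = [variance([row[i] for row in scores]) for i in range(n_items)]
    total_variance = variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_variances) / total_variance)

if __name__ == "__main__":
    # Hypothetical results of four students on a five-item MCQ test.
    results = [
        [1, 1, 1, 0, 1],
        [1, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1],
    ]
    print(f"Cronbach's alpha: {cronbach_alpha(results):.2f}")
```

A value close to 1 suggests that the items behave consistently as measures of the same construct; a low value would question the reliability of the test in the sense discussed above.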
Validity in assessment is concerned with measuring the right thing, matching what is intended to be measured with what is actually measured. For the purpose of this work, the concept of validity of assessment is more relevant than reliability. Validity is concerned with the sampling phase of the assessment cycle, while reliability is closer to inferring and estimating value (see Figure 2).

Authors identify different forms of validity [7, 48, 49], but for the purpose of this work only some will be explored. Brown et al. [7] describe face validity as being the first impression of the assessment task: an assessment task should explain in a clear manner the purpose of the assessment and what is expected from the student. Another form of validity is the consequential validity of assessment, referred to above. This is related with a broader view of assessment, how it impacts the teaching and learning processes, and the intended and non-intended outcomes of an assessment task. The same authors explain the concept of intrinsic validity of assessment, which means that an assessment task is measuring the iLOs of the course or module. The authors point out that to achieve this type of validity, the iLOs must be clearly described, and described in measurable terms. They identify one risk associated with this type of validity: very detailed descriptions of iLOs and of the assessment tasks may be unmanageable by the assessors, affecting the reliability of the process. This type of validity is the one with greatest relevance to this research project.

Linn et al. [48] propose eight validity criteria for complex performance-based assessment. The intention was to propose a framework to help decide on the adequacy of new forms of assessment. The criteria are: consequences, fairness, transfer and generalizability, cognitive complexity, content quality, content coverage, meaningfulness, and cost and efficiency. Consequential validity was already addressed and is related with the impact, intended and unintended, of assessment on learning and teaching. Cognitive complexity introduces an interesting perspective to the current work. To truly understand whether an assessment is promoting the use of specific cognitive processes, it is not enough to have a detailed analysis of the specifications of the task. It is also necessary to consider the familiarity of the student with the task and analyse how he or she solves the problem. This perspective is shared by both Bloom et al. and Anderson et al. [40, 41], who identify the student background as a problem when trying to use the taxonomy to classify assessment tasks.

Figure 7 - Types of validity related with the current research (intrinsic, face, consequential, cognitive)

Concerning different assessment methods, Brown et al. [7] suggest that the ones related with remembering knowledge or well-defined solutions are the ones with higher reliability. But in their view, this is not necessarily true for validity. Given the explained concepts of reliability and validity, it is easily understood why Brown et al. [7] refer that validity and reliability have conflicting needs of effectiveness and efficiency. In Erwin's comments about this issue [47] it is suggested that validity and reliability of assessment are of particular concern when the focus of assessment is not on the individual student but when issues of accreditation and the quality of an institution come up. With the evaluation processes of institutions, the number of people interested in the results of assessment increases and the institution needs to be accountable for the results. Brown et al. show a different opinion when they refer that validity and reliability are crucial features of a fair and effective assessment system. These opinions are not necessarily conflicting, as we may consider that reliability is important when analysing results from a broader perspective, when comparing students or groups of students. This is important not only from the institutional perspective but also to ensure fairness among different students. Validity is more focused on the learner and the learning process. Still, some aspects of validity may be approached from a broader perspective, as in accreditation processes.

2.2.2 OVERVIEW OF METHODS OF ASSESSMENT

Brown et al. [7] start by splitting assessment into two main categories: examinations and coursework. Traditionally, examinations would be written or oral and were typically blind, meaning the student would only know the questions when the examination started. Nowadays, the line dividing examinations from coursework is not so clear. Examinations are not necessarily blind: in some cases students may know the questions in advance or may even take the examination home to solve; in other cases students are allowed to take their notes to the exam. Traditionally, coursework was made of essays, problems and reports of practicals, and it served mainly, but not exclusively, formative purposes. Changes in HE are affecting this category, and coursework now includes a variety of tasks that are currently being used for summative and formative assessment.
Erwin [47, p.53], based on the work of other authors, identified two broad classes of assessment formats: constructed response and selective response. In the former, students produce something, like a case study or a report, or must perform a task. In the latter, students are presented with several possible answers and recognize/select the correct one.

Brown et al. [46] propose the following list of nineteen assessment methods:

• Activities putting into perspective a topic or issue
• Case studies and simulations
• Critical reviews of articles, viewpoints or opinions
• Critiques
• Dissertations and theses
• Essay plans
• Essays, formal and non-traditional
• Fieldwork, casework and other forms of applied research
• Laboratory reports and notebooks
• Literature searches
• In-tray exercises
• Oral presentations
• Poster exhibitions
• Practical skills and competences
• Projects (individual or group)
• Reviews for specific audiences
• Seen written exams
• Unseen written exams
• Strategic plans

In later work, Brown et al. [7] present a shorter list of assessment methods in a more structured approach that was closer to what was intended for the purpose of the current work:

• Essays
• Problems
• Reports on practicals
• Multiple choice questions (MCQ)
• Short answer questions (SAQ)
• Cases and open problems
• Mini-practicals
• Projects, group projects and dissertations
• Orals
• Presentations
• Poster sessions
• Reflective practice assignments
• Single essay exams

After comparing and analysing both proposals it was possible to group them into similar activities. The first finding was that the first list does not include any type of questions or exercises that are a common component of exams (MCQs, problems, SAQs) and was mostly centred on coursework. But for the purpose of the current work it was necessary to include the summative perspective. Another issue that resulted from the analysis of both lists was the inclusion of orals or poster exhibitions as methods of assessment. It was considered that these were formats of delivery of other methods, which could be a report on practical work, a synthesis of an essay or a verbal answer to a SAQ.

From this type of analysis of assessment as presented by different authors, it was decided, for the purpose of the current work, to differentiate between assessment methods and assessment tasks. The main difference is that the methods are the essence of the assessment tasks, independent of the context of implementation, the grading criteria or the media chosen for delivery. The assessment tasks are the assessment methods in practice, as adopted by the teachers in their courses.
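To make this distinction concrete, the sketch below is a minimal illustration (hypothetical class and field names; it is not part of the tools developed in this study): an assessment method is represented free of any context, while an assessment task binds a method to a particular course, delivery medium and grading criteria.

```python
# Illustrative sketch of the method/task distinction described above.
# Class and field names are hypothetical, chosen only for this example.
from dataclasses import dataclass

@dataclass(frozen=True)
class AssessmentMethod:
    """The essence of an assessment activity, independent of any course context."""
    name: str                 # e.g. "MCQ", "essay", "practical work"

@dataclass
class AssessmentTask:
    """An assessment method in practice: bound to a course and its context."""
    method: AssessmentMethod
    course: str               # context of implementation
    delivery_medium: str      # e.g. "online quiz", "paper exam", "oral"
    grading_criteria: str     # marking scheme or rubric reference

# The same method can give rise to very different tasks.
mcq = AssessmentMethod(name="MCQ")
online_quiz = AssessmentTask(mcq, course="Structural Analysis I",
                             delivery_medium="online quiz",
                             grading_criteria="1 point per correct answer")
paper_exam = AssessmentTask(mcq, course="Thermodynamics",
                            delivery_medium="paper exam",
                            grading_criteria="negative marking for wrong answers")
```

The same method can thus appear in many different tasks, which is the separation used to build the general categories summarized in Table 6.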
Using this differentiating principle, it was possible to summarize the findings from the literature and compile a list of six general categories of assessment methods that could be used for answering RQ2. Table 6 presents a summary of both lists, distributed across the identified general categories.

Table 6 - Summary of assessment methods
MCQ
  From Brown et al. [7]: Multiple choice questions (MCQ); Orals
SAQ
  From Brown et al. [7]: Short answer questions (SAQ); Orals
Essays (scripts)
  From Brown et al. [7]: Essays; Single essay exams; Dissertations; Presentations; Poster sessions
  From Brown et al. [46]: Dissertations and theses; Essay plans; Essays, formal and non-traditional; Activities putting into perspective a topic or issue; Critical reviews of articles, viewpoints or opinions; Critiques; Literature searches
Practical work
  From Brown et al. [7]: Projects; Group projects; Mini-practicals; Presentations; Poster sessions; Reports on practicals
  From Brown et al. [46]: Projects (individual or group); Fieldwork, casework and other forms of applied research; Laboratory reports and notebooks; Practical skills and competences; Oral presentations; Poster exhibitions; In-tray exercises
Problems
  From Brown et al. [7]: Problems; Cases and open problems
  From Brown et al. [46]: Case studies and simulations
Reflexive practice
  From Brown et al. [7]: Reflective practice assignments
  From Brown et al. [46]: In-tray exercises

Brown et al. [7] suggest that different methods may have different applications. As an example, MCQs are more suitable for sampling comprehensive knowledge, while essays are better for assessing understanding, synthesis and evaluation skills. Still, they consider that almost every method could be used for any purpose, although with some sacrifice of validity. Another interesting statement by these authors, which is of great importance for the current work, is that the effectiveness or validity of assessment depends not only on the method but on the specificities of the assessment task. It is necessary to match the purpose of assessment with the iLOs and the assessment tasks, including the methods and the instruments. To achieve this match, Brown et al. propose that the course designer or tutor answer a list of questions related with iLOs, assessment methods and grading schemes/criteria that will lead to the detailed definition of the assessment task.

Multiple Choice Questions (MCQ) or objective test questions

As defined by Brown et al. [7], a MCQ consists of a question followed by several alternative answers from which the student has to choose the correct one. This type of question is frequently used in objective tests. Bull and McKenna [51] define objective tests as the ones where students are required
