Blooming CQF


Report on the Blooming CQF project by Linda Steedman.

Published in: Technology, Education


  1. CQF-QMT metadata standard – JISC PALS R&D by eCom Scotland – Project: Blooming CQF
  2. Project Aim
     • The project's primary aim is to specify a system that supports the calculation of an SCQF level when supplied with a CQF/SCQF-compliant repository of questions which make use of Bloom's Taxonomy (BT).
  3. Blooming CQF-QMT (Credit Qualification Framework – Question Meta Tags)
     Key Deliverables:
     • Develop a specification for a new metadata standard, CQF-QMT, for questions held in assessment banks.
     • Produce a prototype web service based on Bloom's Taxonomy and linked to SCQF level, to give a representative output on question levelling.
  4. Why is this of interest?
     • The rise in the use of randomised question banks for e-assessment has created an identified requirement for SCQF levelling.
     • Embedding the methodology into good practice for assessment authors.
  5. Bloom's taxonomy – cognitive domains
  6. SCQF framework
  7. Web service
     • A web service was developed to return a list of metadata in XML format, using the Bloom's query approach.
     • It was used to try out assessment questions and check the values of existing question banks.
     • Of particular interest was how the output values from Bloom's taxonomy could be referenced against the SCQF level.
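As an illustration of the query approach the web service takes, here is a minimal sketch of a Bloom's-verb matcher that emits question metadata as XML. The verb lists and the Bloom-to-SCQF level hints below are illustrative assumptions for the sketch, not the project's published CQF-QMT tables:

```python
import xml.etree.ElementTree as ET

# Illustrative verb lists only: the actual CQF-QMT verb tables and the
# Bloom-to-SCQF mapping used by the prototype are not reproduced here.
BLOOM_VERBS = {
    "Knowledge": {"define", "list", "name", "state"},
    "Comprehension": {"describe", "explain", "summarise"},
    "Application": {"apply", "demonstrate", "solve"},
    "Analysis": {"analyse", "compare", "differentiate"},
    "Synthesis": {"design", "construct", "formulate"},
    "Evaluation": {"evaluate", "justify", "critique"},
}

# Hypothetical mapping from Bloom domain to an indicative SCQF level.
SCQF_HINT = {
    "Knowledge": 4, "Comprehension": 5, "Application": 6,
    "Analysis": 7, "Synthesis": 9, "Evaluation": 10,
}

def level_question(stem: str) -> str:
    """Return question metadata as XML, tagging any Bloom verbs found."""
    words = {w.strip(".,?;:").lower() for w in stem.split()}
    item = ET.Element("question")
    ET.SubElement(item, "stem").text = stem
    for domain, verbs in BLOOM_VERBS.items():
        hits = words & verbs
        if hits:
            tag = ET.SubElement(item, "bloom", domain=domain,
                                scqfHint=str(SCQF_HINT[domain]))
            tag.text = ",".join(sorted(hits))
    return ET.tostring(item, encoding="unicode")

print(level_question("Evaluate and justify the design of the framework."))
```

Note that a single stem can match more than one domain (here "design" triggers Synthesis as well as Evaluation), which mirrors the ambiguity the project later reports when levelling real question banks.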
  8. QTI Input/Output (Question and Test Interoperability)
     • Provides a well-documented content format for storing and exchanging items, independent of the authoring tool used to create them.
     • Supports the deployment of item banks across a wide range of learning and assessment delivery systems.
     • Provides a well-documented content format for storing and exchanging tests, independent of the test construction tool used to create them.
  9. QTI Input/Output (continued)
     • Supports the deployment of items, item banks, and tests from diverse sources in a single learning or assessment delivery system.
     • Provides systems with the ability to report test results in a consistent manner.
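For readers unfamiliar with the format, the following sketch parses a cut-down QTI 2.1-style item with Python's standard library and pulls out the fields a levelling tool would need. The item here is a minimal illustration only; real exported items also carry response declarations, interactions, and response processing:

```python
import xml.etree.ElementTree as ET

# A minimal QTI 2.1-style item (illustrative, not a complete exported item).
QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"
ITEM = f"""
<assessmentItem xmlns="{QTI_NS}" identifier="Q001"
                title="Define the term metadata"
                adaptive="false" timeDependent="false">
  <itemBody><p>Define the term metadata.</p></itemBody>
</assessmentItem>
"""

def item_summary(xml_text: str) -> dict:
    """Extract the identifier, title, and stem text from a QTI item."""
    root = ET.fromstring(xml_text)
    # Element names in the item are namespace-qualified, so the find path
    # must carry the QTI namespace on every step.
    stem = root.find(f"{{{QTI_NS}}}itemBody/{{{QTI_NS}}}p")
    return {
        "identifier": root.get("identifier"),
        "title": root.get("title"),
        "stem": stem.text if stem is not None else "",
    }

print(item_summary(ITEM))
```

Extracting the stem text like this is the natural input point for the Bloom's-verb parsing described on the web-service slide.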
  10. Learning Object Model used
  11. Activity Diagram
  12. Use Case Diagram
  13. Prototype
  14. Natural language results
  15. Negative
     • After evaluation of a range of electronic assessments across a number of curricular areas, it was determined that the relationship between the language used in assessment stems and SCQF/QCF levels could not be accurately or consistently identified through the prototype software tool at present.
  16. Reasons for difficulties
     The following reasons were identified:
     • There was often a limited amount of text within existing question stems for the software to parse and level.
     • The language used within the question stems did not always reflect the learning domain/level being addressed and, as a consequence, provided no indication of the potential SCQF/QCF level.
     • The language used in assessments at different SCQF/QCF levels was often similar or identical, providing no discrimination between SCQF/QCF levels.
     • As a consequence, the SCQF/QCF level could only be consistently and accurately identified through the tool together with reference to programme/unit descriptors.
  17. Positive
     • Where assessments were well designed, it was possible, through parsing the question stem, to identify the Bloom's domain and associated level, and the relevant SCQF/QCF generic outcome/indicator, using the prototype.
  18. Good Practice Guidelines
     • Through discussion in practitioner focus groups and the delivery of workshops on assessment design, it was clear that identifying these characteristics was of value to practitioners: it supports assessment design and alignment with other aspects of the curriculum (such as programme and unit descriptors and learning materials), and helps ensure that assessments are deployed to address specific learning objectives.
  19. Further information