Digital Educational Content
Quality Assurance Process
Preliminary Discussion
on Aspects of Quality
30/5/2016
Nikos Palavitsinis, PhD
Metadata Expert
Computer Technology Institute & Press “Diophantus”
1
Slide of Contents
• Breaking down Digital Educational Content
• Instructional Design
– Quality Aspects
• Digital Assets
– Quality Aspects
• Metadata
– Quality Aspects
• Conclusions
2
Digital Educational Content
Metadata | Instructional Design | Digital Asset
Digital Educational Content
What is instructional design? Existing approaches
How’s instructional design assessed?
What would a workflow look like?
3
Instructional Design (1/2)
• Instructional design is the practice of creating instructional
experiences which make the acquisition of knowledge and
skill more efficient, effective, and appealing
• The process consists broadly of:
– Determining the state and needs of the learner,
– Defining the end goal of instruction, and
– Creating some "intervention" to assist in the transition
• Ideally, the process is informed by pedagogically
and andragogically tested theories of learning and may take
place in student-only, teacher-led or community-based
settings
Instructional
Design
4
Instructional Design (2/2)
• The outcome of this instruction may be directly observable
and scientifically measured or completely hidden and
assumed
• There are many instructional design models, but many of them are
based on the ADDIE model, with its five phases: analysis,
design, development, implementation, and evaluation
• As a field, instructional design is historically and
traditionally rooted in cognitive and behavioral
psychology, though recently constructivism has influenced
thinking in the field
Instructional
Design
5
Instructional Design as a Process
Instructional
Design
Analyze, Design, Develop, Implement, Evaluate (ADDIE)
6
Instructional Design as a Process
Instructional
Design
1a. Analyze your learners and gather as much information as you can on them
1b. Analyze your broad goals for the lesson or unit. What are you hoping to achieve?
2a. Identify learning objectives. What specific skills/knowledge do you wish students to obtain?
2b. Identify outcomes. How will you know that the students have achieved your learning
objectives? How will you assess learning?
3a. Develop your instructional strategies. How will you facilitate students in learning the
objectives you identified so that they are able to achieve the outcomes you’ve set for them?
3b. Plan the logistics. How will you group students? How will you arrange the classroom? What
tools and materials will students have available to them?
4. Implement. Try out your lesson plan or unit with students
5. Evaluate. Were you successful? Did students learn? Did they have fun doing it? What should
you change for the next time?
7
Learning Theories (1/3)
• Conceptual frameworks describing how
information is absorbed, processed, and
retained during learning
– Behaviorism: Learning is the acquisition of a new
behavior through conditioning and social learning
• Classical conditioning: Behavior becomes a reflex
response to an antecedent stimulus
• Operant conditioning: an antecedent stimulus is followed
by a consequence of the behavior, through a reward
(reinforcement) or a punishment
• Social learning theory: an observation of behavior is
followed by modeling
Instructional
Design
8
Learning Theories (2/3)
– Transfer of Learning: What one learns in school
somehow carries over to situations different from
that particular time and that particular setting
– Cognitivism: learning is an internal mental process
where the educator focuses on building
intelligence and cognitive development. Individual
learner is more important than the environment
Instructional
Design
9
Learning Theories (3/3)
– Constructivism: Importance of active involvement of
learners in constructing knowledge for themselves.
Students are thought to use background knowledge
and concepts to assist them in their acquisition of
novel information
– Transformativism: Takes place by discussing with
others the reasons presented in support of competing
interpretations, by critically examining evidence,
arguments, and alternative points of view.
Transformative learners move toward a frame of
reference that is more inclusive, discriminating, self-
reflective, and integrative of experience
Instructional
Design
10
Quality of Instructional Design
• BECTA
– Core Pedagogic Principles
• They underpin effective learning and teaching, drawing
from learning theory and commonly accepted best
practice
– Core Design Principles
• They cover issues such as resource design, accessibility
and interoperability
Instructional
Design
11
BECTA – Core Pedagogic Principles
• Inclusion & Access
• Learner Engagement
• Effective Learning
• Assessment to Support Learning
• Robust Summative Assessment
• Innovative Approaches
• Ease of Use
• Match to the Curriculum
Instructional
Design
12
BECTA – Core Design Principles
• Digital Learning Resource Design
• Robustness & Support
• Human-computer Interaction
• Quality of Assets
• Accessibility
• Interoperability
• Testing & Verification
• Effective Communication
Instructional
Design
13
Association of American Publishers
• Clearly articulated learning goals and objectives
• Appropriate grade and reading levels
• Clearly stated reputable sources
• Engaging, relevant, and up-to-date content
• Highly vetted content that is accurate, objective, and reliable
• Differentiated learning opportunities
• Standards- and evidence-based lessons/learning aligned with high-
quality assessments
• Well-designed and attractive materials for students, teachers, and
other education professionals
• Adaptable materials for individual learning styles and needs
• Comprehensive teacher guide or instructional support materials
Instructional
Design
Quality Criteria
14
Association of American Publishers
1. Determine Content
2. Research & Planning
3. Early Development
4. Editing & Review
5. Quality Reviews of 1st Version
6. Continuing Quality Reviews
7. Subsequent Editions/Versions
Content Development Process
Instructional
Design
15
ACHIEVE OER Rubrics
I. Degree of Alignment to Standards
II. Quality of Explanation of the Subject Matter
III. Utility of Materials Designed to Support Learning
IV. Quality of Assessment
V. Quality of Technological Interactivity
VI. Quality of Instructional and Practice Exercises
VII. Opportunities for Deeper Learning
VIII. Assurance of Accessibility
Instructional
Design
16
Learning Object Review Instrument
1. Content Quality
2. Learning Goal Alignment
3. Feedback & Adaptation
4. Motivation
5. Presentation Design
6. Interaction Usability
7. Accessibility
8. Reusability
9. Standards Compliance
Instructional
Design
17
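Instruments like LORI are typically applied by rating each item on a 1–5 scale and aggregating the ratings into an overall score. Below is a minimal scoring sketch in Python, assuming a plain unweighted average over the applicable items; the aggregation rule and the handling of "not applicable" items are illustrative assumptions, not part of the published instrument.

```python
# The nine LORI items, each rated 1-5 (None = "not applicable").
LORI_ITEMS = [
    "Content Quality", "Learning Goal Alignment", "Feedback & Adaptation",
    "Motivation", "Presentation Design", "Interaction Usability",
    "Accessibility", "Reusability", "Standards Compliance",
]

def lori_score(ratings: dict) -> float:
    """Average the applicable item ratings into one overall 1-5 score."""
    applicable = [ratings[i] for i in LORI_ITEMS if ratings.get(i) is not None]
    if not applicable:
        raise ValueError("no applicable ratings")
    return sum(applicable) / len(applicable)

example = {item: 4 for item in LORI_ITEMS}
example["Standards Compliance"] = None   # reviewer marked it not applicable
print(round(lori_score(example), 2))     # -> 4.0
```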
Quality Aspects
• Use a clear, quantitative way of assessing the
quality of instructional design
– Cost effective, easy-to-use, time-efficient,
objective and reliable
• Define the process (workflow) as well as the
actors
– Best practices from existing platforms/services
(MERLOT, Stanford Encyclopedia of Philosophy)
Instructional
Design
18
Simple Quality Review Process
19
Instructional
Design
[Workflow] Content Deposited → Assigned to Peer-Reviewers → Peer Review Form → Review Score → Received Recommendation → Content Appears as Approved (or rejected)
• Set up a process that re-examines (preserves) existing content periodically
• Metrics/Rubrics to be used in Peer Review Form
• Storing Review Scores
• Linking Review Scores to content creators, reviewers and skills
• Offer incentives to reviewers
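As a rough illustration of how such a workflow could be tracked in a repository platform, and how review scores could be stored and linked to content creators and reviewers as the bullets above ask, here is a minimal Python sketch; every class, field and threshold is a hypothetical placeholder mirroring the steps on this slide, not an existing system.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class ReviewState(Enum):
    """States a deposited learning object passes through in this workflow."""
    DEPOSITED = auto()
    ASSIGNED = auto()    # assigned to peer-reviewers
    APPROVED = auto()    # recommendation received, content appears as approved
    REJECTED = auto()    # sent back to the creator

@dataclass
class PeerReview:
    reviewer: str
    score: float         # aggregate score from the peer review form rubric
    comments: str = ""

@dataclass
class ContentItem:
    title: str
    creator: str
    state: ReviewState = ReviewState.DEPOSITED
    reviews: list = field(default_factory=list)

    def assign(self, reviewers):
        self.reviewers = list(reviewers)
        self.state = ReviewState.ASSIGNED

    def add_review(self, review: PeerReview, threshold: float = 3.0):
        """Store the review score and move to APPROVED or REJECTED."""
        self.reviews.append(review)
        self.state = (ReviewState.APPROVED if review.score >= threshold
                      else ReviewState.REJECTED)

item = ContentItem("Photosynthesis lesson", "A. Creator")
item.assign(["B. Reviewer", "C. Reviewer"])
item.add_review(PeerReview("B. Reviewer", 4.2))
print(item.state)   # ReviewState.APPROVED
```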
Digital Educational Content
Metadata | Instructional Design | Digital Asset
Digital Educational Content
How’s Quality Defined? Technology-related issues:
Compatibility, Accessibility, Format problems
What would a workflow look like?
20
Digital Asset Formats
• Video Extensions
– .mkv, .flv, .vob, .avi, .mov, .mp4, etc., with different
video and audio formats
• Audio Extensions
– .3gp, .aac, .flac, .mp3, .wav, .wma
• Image Extensions
– .jpeg, .tiff, .gif, .bmp, .png
• …and many more
Digital Asset
21
Quality
• Quality varies widely across digital asset
formats, and along many different dimensions
• Technology advances also affect digital asset
quality for a learning resource
• As formats emerge and evolve and output
devices get more sophisticated, learning
objects quickly become obsolete
22
Digital Asset
Examples of Quality
• Video
– Frames per second: 24 fps
– Interlaced vs. progressive
– Aspect ratio: 4:3, 16:9
– Color space & bits per pixel
• Audio
– Sample rate: 96 kHz, 192 kHz; bit depth: 16-bit, 24-bit
• Image
– Sharpness, Noise, Dynamic Range, Tone Reproduction,
Contrast, Color Accuracy, etc.
23
Digital Asset
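A hedged sketch of how such parameters could be checked automatically: the snippet below compares already-extracted media properties against illustrative minimum thresholds. The threshold values and property names are assumptions for illustration only, and the extraction itself (e.g. with ffprobe or an image library) is left out of scope.

```python
# Illustrative minimum-quality thresholds per asset type (assumed values).
THRESHOLDS = {
    "video": {"fps": 24, "width": 1280, "height": 720},
    "audio": {"sample_rate_hz": 44100, "bit_depth": 16},
    "image": {"width": 800, "height": 600},
}

def check_asset(asset_type: str, properties: dict) -> list:
    """Return the list of threshold violations for one digital asset.

    `properties` holds technical metadata already extracted from the file
    (frame rate, resolution, sample rate, ...).
    """
    violations = []
    for key, minimum in THRESHOLDS.get(asset_type, {}).items():
        value = properties.get(key)
        if value is None or value < minimum:
            violations.append(f"{key}: {value!r} is below the minimum of {minimum}")
    return violations

# A 1080p clip at 30 fps passes; a low-resolution clip does not.
print(check_asset("video", {"fps": 30, "width": 1920, "height": 1080}))  # []
print(check_asset("video", {"fps": 15, "width": 640, "height": 480}))
```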
Compatibility
• Even the most widely-used file formats may
face compatibility issues
– Flash support, Java support
• Digital Educational Content developed in
proprietary formats, or in formats that are no
longer supported, can quickly become useless
– Cost of development stays the same but
depreciation stops when the object stops being
used (or not?)
24
Digital Asset
Accessibility
• Not all asset formats can support the
accessibility standards set forth
– IMS Access for All Digital Resource Description 3.0
– ACCMD Specification
– IMS Access for All Personal Needs and Preferences
(PNP) 3.0
25
Digital Asset
Digital Asset Examination
26
[Diagram] Two cases: one LO with one asset, or one LO with many assets (audio, video, photo, text, scenarios, simulations, lesson plans, ...). In both cases the assets are checked against type-specific quality criteria and either approved or rejected.
• Different content type, different criteria. How much automation is possible?
• Where do we store the information coming out of the media file examination?
• Can this lead to a certification of some form?
• When to re-enter an existing and approved resource into the cycle?
Digital Asset
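One way to approach the automation question above is to route each asset to type-specific criteria and keep the examination result as a record stored next to the learning object, so it can be re-checked periodically. A minimal sketch using only the Python standard library; the checker functions and record fields are hypothetical placeholders, not an existing service.

```python
import mimetypes
from datetime import date

def asset_category(filename: str) -> str:
    """Map a file to a broad category so the right criteria can be applied."""
    mime, _ = mimetypes.guess_type(filename)
    return mime.split("/")[0] if mime else "unknown"   # "video", "audio", ...

def examine(filename: str, criteria_checkers: dict) -> dict:
    """Run the type-specific checker and return a storable examination record."""
    category = asset_category(filename)
    checker = criteria_checkers.get(category)
    passed = checker(filename) if checker else None    # None = needs a human
    return {
        "file": filename,
        "category": category,
        "passed": passed,
        "checked_on": date.today().isoformat(),        # supports periodic re-checks
    }

# Placeholder checkers; real ones would inspect the media file itself.
checkers = {"image": lambda f: True, "video": lambda f: True}
print(examine("lesson_intro.mp4", checkers))
print(examine("legacy_applet.swf", checkers))   # no checker -> passed is None
```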
Digital Educational Content
Metadata | Instructional Design | Digital Asset
Digital Educational Content
How “fluid” is metadata? Does it change regularly?
How is quality defined and measured?
What would a workflow look like?
27
Metadata for Digital Educational Content
• Metadata is a part of the instructional use of a
learning object
– Facilitating the finding of material
– Outlining its main characteristics
• Poor metadata leads to limited re-use of
resources
– A comprehensive quality approach for digital
educational content has to look at metadata as well
Metadata
28
Metadata Quality Metrics (1/4)
• Europeana (not educational)
1. Resulting from trusted processes
2. Findable
3. Readable
4. Standardized
5. Meaningful to Audiences
6. Clear on re-use
7. Visible
29
Metadata
Metadata Quality Metrics (2/4)
• Bruce & Hillman
1. Completeness
2. Accuracy
3. Provenance
4. Conformance to Expectations
5. Logical Consistency and Coherence
6. Timeliness
7. Accessibility
30
Metadata
Metadata Quality Metrics (3/4)
• Metadata Quality Assurance Certification
Process (My PhD metrics, tested & validated)
1. Completeness
2. Accuracy
3. Consistency
4. Objectiveness
5. Appropriateness
6. Correctness
31
Metadata
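Of these metrics, completeness is the easiest to quantify automatically: the share of mandatory fields that are actually filled in. A minimal sketch, assuming a flat dictionary record and a hypothetical list of mandatory elements; the field names are illustrative and not taken from any particular application profile.

```python
# Hypothetical mandatory elements of an application profile (illustrative only).
MANDATORY_FIELDS = ["title", "description", "language", "keywords",
                    "educational_level", "rights"]

def completeness(record: dict, fields=MANDATORY_FIELDS) -> float:
    """Fraction of mandatory fields that carry an actual value."""
    filled = sum(1 for f in fields if record.get(f) not in (None, "", []))
    return filled / len(fields)

record = {"title": "Photosynthesis 101", "description": "Intro lesson",
          "language": "el", "keywords": ["biology"], "rights": ""}
print(round(completeness(record), 2))   # 4 of 6 mandatory fields filled -> 0.67
```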
Metadata Quality Metrics (4/4)
• Metadata Quality Control Workflow (UCLA Library)
1. Primary quality control check
• Completeness, Appropriateness, Correctness, etc.
2. Secondary quality control check
• Accuracy, Consistency, Completeness, etc.
3. Pre-upload quality control check
• MODS XML validation
4. Post-upload quality control check
• Is the metadata displaying appropriately? (sample)
32
Metadata
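The pre-upload MODS XML validation step lends itself to automation with an XML Schema check. A minimal sketch using lxml, assuming the MODS schema has been saved locally as mods.xsd and the record sits in record.xml; both file names are placeholders.

```python
from lxml import etree

# Load the MODS XML Schema (assumed to have been downloaded as mods.xsd).
schema = etree.XMLSchema(etree.parse("mods.xsd"))

# Parse the metadata record and validate it against the schema.
record = etree.parse("record.xml")
if schema.validate(record):
    print("record.xml is valid MODS")
else:
    # error_log lists the line number and reason for each validation failure
    for error in schema.error_log:
        print(error.line, error.message)
```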
Metadata Quality Considerations
• Metadata Quality is mostly manual
– Automation helps, but for important metadata
manual review is still the only way to go
• Metadata Quality is primarily pro-active
– When annotation finishes, the harm is done
• Metadata Quality is still difficult to quantify
– Some quantitative metrics exist, but they are not
yet comprehensive
33
Metadata
Metadata Quality Assurance Certification Process
34
Metadata
http://www.slideshare.net/nikospala/metadata-quality-issues-in-learning-repositories
Draft Workflow
35
Metadata
[Workflow] Metadata Record Assignment → Metadata Record Examination → Peer Review Form → Review Score → Metadata Record Review Result → Metadata Record Approval (or rejected)
• Set up a process that re-examines (preserves) existing metadata periodically
• Metrics/Rubrics to be used in Peer Review Form
• Storing Review Scores
• Linking Review Scores to metadata creators, reviewers and skills
• Offer incentives to reviewers
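To cover the bullets above on storing review scores and linking them to metadata creators, reviewers and their skills, one lightweight option is a small record structure like the one sketched below; it is a hypothetical data model for discussion, not an implemented system.

```python
from dataclasses import dataclass, field

@dataclass
class Person:
    name: str
    role: str                        # "creator" or "reviewer"
    skills: list = field(default_factory=list)

@dataclass
class MetadataReview:
    record_id: str
    creator: Person
    reviewer: Person
    score: float                     # aggregate score from the peer review form
    approved: bool

    def summary(self) -> str:
        verdict = "approved" if self.approved else "rejected"
        return f"{self.record_id}: {self.score:.1f} ({verdict}) by {self.reviewer.name}"

creator = Person("A. Creator", "creator", ["biology", "IEEE LOM"])
reviewer = Person("B. Reviewer", "reviewer", ["metadata quality"])
print(MetadataReview("rec-001", creator, reviewer, 4.2, True).summary())
```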
Conclusions
• Need to incorporate all the processes and/or
workflows into a comprehensive approach
• Involve specific actors, roles, inputs and
outputs
• Consider scalability, costs and added-value
brought to education in general
36
Content-Asset-Metadata
37
[Combined diagram] The three workflows shown earlier, side by side: the content peer-review workflow (slide 19), the digital asset examination flow (slide 26) and the metadata record review workflow (slide 35).
Resources Used
• https://en.wikipedia.org/wiki/Instructional_design
• http://www.instructionaldesigncentral.com/
• http://mirandanet.ac.uk/wp-content/uploads/2015/05/quality_principles.pdf
• https://iktsenteret.no/sites/iktsenteret.no/files/attachments/quality_criteria_dlr.pdf
• http://education.gov.sk.ca/learning-resource-evaluation-guidelines
• http://publishers.org/our-markets/prek-12-learning/quality-content-learning-resources
• http://www.aare.edu.au/data/publications/2006/sir06100.pdf
• http://www.achieve.org/files/AchieveOERRubrics.pdf
• http://www.ifets.info/journals/10_2/5.pdf
• http://pro.europeana.eu/files/Europeana_Professional/Publications/Metadata%20Quality%20Report.pdf
• http://www.slideshare.net/nikospala/metadata-quality-issues-in-learning-repositories
• https://ecommons.cornell.edu/handle/1813/7895
• https://www.library.ucla.edu/sites/default/files/Guidelines_MetadataQualityControl.pdf
38
Digital Educational Content
Quality Assurance Process
Preliminary Discussion
on Aspects of Quality
30/5/2016
Nikos Palavitsinis, PhD
Metadata Expert
Computer Technology Institute & Press “Diophantus”
39
