AIED99 authoring tools talk (Murray)

Overview of the state of the art for authoring tools for intelligent and adaptive educational software

Speaker notes (per slide):
  • 1. Still in formative stages, though some are commercial or pre-commercial
    2. Overall question: how easy or cost-effective can it be to build an ITS?
  • - This is the difference between ITS and CBT
    - and dynamic generation and sequencing of instructional content
  • ITSs are difficult and expensive to build
  • - Examples: the over-representation of systems I am most familiar with (including the tools developed in our lab at UMass) does not reflect on the quality of the tools.
  • Many systems listed are only the latest in a lineage of attempts.
    So there have been many systems.
    Let's hope we have learned something!
  • - dangerous to talk about other people's systems! Errors and omissions
    - everything I say about a given system or category of systems is partly wrong or a simplification
    - almost unapologetically, I will use a disproportionate number of figures from the Eon authoring system developed at UMass
    - bag of tricks vs. a shelf of tools
    - more an overview and look to the future than a critique
    - difficulty in coming up with categories
    - see my IJAIED paper for more details (**give URL ***)
    - you might be surprised, you might be disappointed!
  • Defining characteristic: separates content from strategy
    (skip this slide?? But not the title?)
  • - no system does everything
  • - the rest of the categories build upon these basics
    Student perspective is like CAI
  • - again, student perspective is like CAI
  • - operation and component identification are fairly generic and ubiquitous tasks.
    - Performance monitoring and feedback tend to be straightforward: "that is not the wing pressure cutoff valve"; "you should have checked the temperature first."
    - diagnosis is harder, and authoring is not as far along
  • - in the "device simulation" categories we see how having a consistent underlying representation can lead to instruction for free.
    - Systems in this category capitalize on this fact even more.
    - examples: facts are taught with mnemonic devices and drill and practice; concepts are taught with examples and analogies; procedures are taught one step at a time (see the sketch below).
    - for the student these are similar to the Multiple Teaching Strategies category: the main difference is for the author, since authoring is both easier and less flexible.
    - mention "transaction shells" in Merrill's Transaction Theory
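
To make the note above concrete, here is a minimal sketch in Python (the template names and data layout are hypothetical, not taken from any of the surveyed tools): tagging a content unit with a knowledge type buys a default instructional sequence for free.

```python
# A toy illustration (hypothetical template names and data layout) of the
# knowledge-types idea: tagging content with a type yields a default
# instructional sequence without any per-item authoring.

INSTRUCTIONAL_TEMPLATES = {
    "fact":      ["present mnemonic", "drill and practice"],
    "concept":   ["define", "show examples", "show analogies", "classification exercise"],
    "procedure": ["demonstrate step", "practice step", "practice whole task"],
}

def plan_instruction(unit):
    """Expand a typed knowledge unit into a default sequence of teaching actions."""
    return [f"{action}: {unit['name']}" for action in INSTRUCTIONAL_TEMPLATES[unit["type"]]]

print(plan_instruction({"name": "Ohm's law", "type": "concept"}))
```
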
  • - Takes the ideas just discussed a step further: rather than having templates for knowledge types, create them for specific types of tasks.
    - more of a production line model
    - examples:
    - I'll say more about special purpose vs. general purpose systems later
    - template for investigate-and-decide lessons
    - LAT has conversational grammars; used to train telephone customer service people
  • -
  • - sequencing PLUS navigation issues
  • - most ITS authoring tools don't allow the author to build the interface or design highly interactive screens from scratch
    - most leave this out because it's what COTS authoring systems do well
    - but it is important...
    - Items on screen are objects; animation and other visual effects are achieved by attaching properties of the object, such as its location, bitmap, or color, to dynamic simulation parameters (see the sketch below).
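
A minimal sketch of the property-binding idea in the note above, assuming a hypothetical object model (this is not the RIDES or Eon API):

```python
# A minimal sketch (hypothetical API, not the RIDES or Eon object model):
# screen items are objects, and animation comes from binding an object
# property (location, bitmap, color) to a dynamic simulation parameter.

class ScreenObject:
    def __init__(self, name):
        self.name = name
        self.properties = {}   # e.g. {"color": "green", "needle_angle": 0}
        self.bindings = {}     # property name -> function of simulation state

    def bind(self, prop, fn):
        self.bindings[prop] = fn

    def refresh(self, sim_state):
        """Recompute every bound property from the current simulation state."""
        for prop, fn in self.bindings.items():
            self.properties[prop] = fn(sim_state)

gauge = ScreenObject("pressure-gauge")
gauge.bind("needle_angle", lambda s: s["linePressure"] / 4000 * 270)           # sweep
gauge.bind("color", lambda s: "red" if s["linePressure"] > 3000 else "green")  # warning

gauge.refresh({"linePressure": 3200})
print(gauge.properties)   # {'needle_angle': 216.0, 'color': 'red'}
```
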
  • Visual and textual (next slide) method
  • Note: multiple knowledge types; objective type; difficulty and abstraction level
    - separating the learning objectives from the knowledge
    - Merrill's content types, and
    - uses Gagne's taxonomy of learning objectives
  • - A connection has to be made between the abstract world of topics and concepts and the concrete world of the screens and digital resources that illustrate them (make them concrete)
    - separating the domain relationships (part-of) from the pedagogical relationships (prerequisite)
    - Capabilities C, Objectives O, Resources R, Pedagogy P
    - Resource model: Ax abstraction, CP particular case, An analogy
    - Resource types in middle; resource linking tool on right
  • - graphic objects (e.g. a pump) have alternate graphic representations. The representation (open or closed valve) is set according to an author-defined state variable. The author then defines rules like: if linePressure > 3000 then control valve position = open (see the sketch below).
    - Theory of operations in XAIDA--aircraft maintenance domain
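
Here is a minimal sketch, with hypothetical names, of the authored rule quoted in the note above: an author-defined state variable selects among alternate graphic representations, and the rule sets that variable from a simulation value.

```python
# A minimal sketch (hypothetical names) of the authored rule quoted above.

valve_graphics = {"open": "valve_open.bmp", "closed": "valve_closed.bmp"}

def control_valve_rule(sim):
    # Authored rule: if linePressure > 3000 then control valve position = open
    sim["controlValvePosition"] = "open" if sim["linePressure"] > 3000 else "closed"

sim = {"linePressure": 3400, "controlValvePosition": "closed"}
control_valve_rule(sim)
print(valve_graphics[sim["controlValvePosition"]])   # valve_open.bmp
```
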
  • - constraint-based and event-based simulation event processing
    - more complicated simulations than XAIDA, but harder to author
  • - most authoring tools don't allow for authoring the teaching strategy…
    - this is a structured interview technique
  • - like Authorware or IconAuthor
    - more powerful and flexible but less usable
  • - only system with customizable student model
  • Authoring is: design, knowledge acquisition, and fabrication.
  • - many of the tools involve helping authors visualize.
    - helping authors visualize conversational grammars
  • - interactive knowledge elicitation
    - stepping the author through the design process by asking questions
    - previously we had VISUAL and STRUCTURAL organization; this is PROCEDURAL scaffolding; next slide is CONSTRAINT-BASED
  • We’ll revisit scriptable and customizable later
    SKIP THIS?
  • - so far described a wide variety of features and techniques.
    - no AT has all of them.
    - each has strengths and weaknesses
    - sometimes due to the amount of effort or attention paid to certain components, but also due to philosophical differences.
  • - systems strong in one area tend to be weak in some other
  • - there are a number of hot issues; areas of friendly controversy
    - (add:) worse breadth, flexibility
    - like Chandrasekaran's abstract task types
  • - there are a number of hot issues; areas of friendly controversy
    - (add:) worse breadth, flexibility
  • - some think we can build systems an order of magnitude more powerful than traditional CAI with an order of magnitude less effort. I am hopeful yet skeptical. Seems only possible for template-based systems.
    - should be able to expect authors to have some degree of training, sophistication, and dedication
  • - who is going to tell these machines how to teach?
    (other suggestions: students, market analysis, legislative committees…)
    SKIP?
  • - leave the most difficult authoring to the experts.
    - one way to deal with the tradeoffs between usability and power
    - use a general purpose AT to build special purpose ones
  • - category 1: toy domains
    - category 4: next slide
  • Robust: usually included user documentation, some level of support, etc.
    - not sure how many were built with SIMQUEST or Training Express (and whether they were used)
  • - XAIDA figure: including training time
    - Demonstr8 informal study: a model-tracing multi-column addition or subtraction tutor built in < 20 minutes.
    - Summary: inconclusive estimates, but all indications are that non-model-tracing tutors can be authored in the same time as traditional multimedia instruction
    - since building an ITS presumably takes additional effort for knowledge engineering (to acquire a model of the domain and/or teaching strategies) over and above traditional CAI, we must assume that there are real efficiencies involved in the modular and reusable nature of the knowledge in ITSs, and the generation of content on the fly
  • - "Existence proofs" (see authoring tool USE above) as an alternative to evaluation
    - how do you evaluate something like an authoring tool?
    - no clear methods or metrics, but …
  • - mostly the physical characteristics part of the XAIDA shell
    Brenda Wenzel, Henry Halff, and colleagues
  • - to give an idea of the types of results researchers are getting. The individual results are suggestive only, as most have not been reproduced.
Slide transcript:

    1. 1. AIED July ITS Authoring Tools Survey 1 ITS Authoring Tools: an Overview of the state of the art Tom Murray University of Massachusetts & Hampshire College, Amherst, MA www.cs.umass.edu/~tmurray • References in Murray 1999, IJAIED 10(1): Authoring Intelligent Tutoring Systems: An analysis of the state of the art
    2. 2. AIED July ITS Authoring Tools Survey 2 OR: Cottage industry forms as thousands build intelligent tutoring systems in their basements--NOT YET! OR: ITS construction: How Easy Can It Be?
    3. 3. AIED July ITS Authoring Tools Survey 3 What is an ITS, such that one can be “authored?” – Any CBI system that separates content (what) from strategy (how) – Usually makes inferences about “what the student knows” – I.E. Contains a “model” of domain, strategy, and/or student • --> We’re talking about pretty basic ITSs
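
To make the defining characteristic on slide 3 concrete, here is a minimal sketch in Python (hypothetical structures, not any surveyed system's design): the content ("what"), the tutoring strategy ("how"), and the student model are kept separate, and the strategy consults the student model to choose the next action.

```python
# A minimal sketch (hypothetical structures) of separating content, strategy,
# and student model, as described on slide 3.

content = {
    "ohms-law":       {"type": "concept",   "explanation": "...", "exercise": "..."},
    "series-circuit": {"type": "procedure", "explanation": "...", "exercise": "..."},
}

student_model = {"ohms-law": 0.9, "series-circuit": 0.3}   # estimated mastery per topic

def tutoring_strategy(content, student_model, threshold=0.7):
    """Pick the least-mastered topic, then explain or exercise based on mastery."""
    topic = min(content, key=lambda t: student_model.get(t, 0.0))
    action = "explain" if student_model.get(topic, 0.0) < threshold else "exercise"
    return topic, action

print(tutoring_strategy(content, student_model))   # ('series-circuit', 'explain')
```
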
    4. 4. AIED July ITS Authoring Tools Survey 4 Purposes of ITS Authoring Tools • (Caveat: Authoring shells vs. authoring tools) • Cost-effective production of ITSs • Decreased skill threshold for authors • Insure good quality by content validation or constraining the ITS to a particular model • Allow more participation of practicing educators in ITS design and evaluation • Provide a test bed for evaluation of alternative strategy or content models
    5. 5. AIED July ITS Authoring Tools Survey 5 How many ITS authoring tools have been built? 29 projects. Category: projects/systems
       1 Curriculum Sequencing and Planning: DOCENT, IDE, ISD Expert, Expert CML
       2 Tutoring Strategies: Eon, GTE, REDEEM, SmartTrainer AT
       3 Device Simulation and Equipment Training: DIAG, RIDES, MITT-Writer, ICAT, SIMQUEST, XAIDA
       4 Domain Expert System: Demonstr8, D3 Trainer, Training Express
       5 Multiple Knowledge Types: CREAM-Tools, DNA, ID-Expert, IRIS, XAIDA
       6 Special Purpose: IDLE-Tool/IMap, LAT
       7 Intelligent/adaptive Hypermedia: CALAT, GETMAS, InterBook, MetaLinks
    6. 6. AIED July ITS Authoring Tools Survey 6 How many ITS authoring tools …MORE: 29 + 17 = 46 systems, ~1982 to present. Systems and their precursor systems:
       CALAT: CAIRNEY
       DEMONSTR8: TDK, PUPS
       DOCENT: Study
       Eon: KAFITS
       ID EXPERT: Electronic Trainer, ISD-Expert
       IDLE-Tool: IMAP, INDIE, GBS-architectures
       REDEEM: COCA
       RIDES: IMTS, RAPIDS, and see DIAG
       SIMQUEST: SMISLE
       Smart-Trainer AT: FITS
       • References in Murray 1999, IJAIED 10(1): Authoring Intelligent Tutoring Systems: An analysis of the state of the art
    7. 7. AIED July ITS Authoring Tools Survey 7 Overview: Multiple perspectives describing the field • What kinds of ITSs have been authored? • Authoring the Interface, Domain, Teaching, and Student Models • What Authoring/Knowledge Acquisition Methods Have Been Used? • How Are Authoring Systems Designed? (Design Tradeoffs & Open Issues) • Pragmatics and Use (Are ITS authoring systems “real?”)
    8. 8. AIED July ITS Authoring Tools Survey 8 What kinds of ITSs have been authored? • Both pedagogy-oriented and performance-oriented ITSs • Seven Types of ITSs • Tools constrain ITSs
    9. 9. AIED July ITS Authoring Tools Survey 9 Seven Categories of Authored ITSs • Strengths, Limits, Variations, student perspective • Categories 3, 4, & 6 are mostly "performance-oriented" • Category: projects/systems
       1 Curriculum Sequencing and Planning: DOCENT, IDE, ISD Expert, Expert CML
       2 Tutoring Strategies: Eon, GTE, REDEEM, SmartTrainer AT
       3 Device Simulation and Equipment Training: DIAG, RIDES, MITT-Writer, ICAT, SIMQUEST, XAIDA
       4 Domain Expert System: Demonstr8, D3 Trainer, Training Express
       5 Multiple Knowledge Types: CREAM-Tools, DNA, ID-Expert, IRIS, XAIDA
       6 Special Purpose: IDLE-Tool/IMap, LAT
       7 Intelligent/adaptive Hypermedia: CALAT, GETMAS, InterBook, MetaLinks
    10. 10. AIED July ITS Authoring Tools Survey 10 1. Curriculum Sequencing and Planning Systems: DOCENT, IDE, ISD Expert, Expert CML • Basic and early historical systems • Separates content from presentation and sequencing • Rules, constraints, or strategies for “intelligently” sequencing content--at the “macro level” (topic level) • Usually low fidelity interfaces, canned content, simple student models
    11. 11. AIED July ITS Authoring Tools Survey 11 2. Tutoring Strategies Systems: REDEEM, Eon, GTE, Smart Trainer AT #1 above PLUS: • Micro-level and explicit tutoring strategies – Instructional primitives for hints, explanations, examples, reviews, feedback… – Instruction can have more of a dialogue or conversational feel • Some include multiple teaching strategies and meta-strategies • Often have low fidelity interfaces, canned content, simple student models
    12. 12. AIED July ITS Authoring Tools Survey 12 Tutoring strategies category: Example REDEEM Genetics Tutor Content from (ToolBook based) CAI courseware
    13. 13. AIED July ITS Authoring Tools Survey 13 3. Device Simulation and Equipment Training Systems: DIAG, RIDES, MITT-Writer, ICAT, SIMQUEST, XAIDA • Micro-world/simulation-based learning environments • Most focus on equipment/device operation and maintenance procedures • Building the simulation is time consuming, but much of the “tutoring” then comes for free.
    14. 14. AIED July ITS Authoring Tools Survey 14 Examples from RIDES Tutors
    15. 15. AIED July ITS Authoring Tools Survey 15 4. Domain Expert System Systems: Demonstr8, D3 Trainer, Training Express • Deep/runnable models of problem solving expertise • Fine grained student diagnosis and modeling • Building an expert system is very difficult -- but then instruction can come “for free”
    16. 16. AIED July ITS Authoring Tools Survey 16 D3's Medical Tutor
    17. 17. AIED July ITS Authoring Tools Survey 17 Demonstr8’s Subtraction Tutor
    18. 18. AIED July ITS Authoring Tools Survey 18 5. Multiple Knowledge Types Systems: CREAM-Tools, DNA, ID-Expert, IRIS, XAIDA • “Gagne Hypothesis:” There are different types of knowledge --> Each has its own instructional methods and representational formalism • Template-like framework for decomposing content into facts, concepts, and procedures • Many based on instructional design theory principles • Limited so far to relatively simple facts, concepts, procedures
    19. 19. AIED July ITS Authoring Tools Survey 19 6. Special Purpose Systems: IDLE-Tool/IMap, LEAP Authoring Tool • Build tutors for a particular type of task • Can provide strong authoring guidance and constraints • Design and pedagogical principles can be enforced • The task, interface, and pedagogy must fit relatively inflexibly to the given model
    20. 20. AIED July ITS Authoring Tools Survey 20 Example: IDLE-Tool: Sickle Cell Counselor
    21. 21. AIED July ITS Authoring Tools Survey 21 7. Intelligent/Adaptive Hypermedia Systems: CALAT, GETMAS, InterBook, MetaLinks • Similar to Category #1 but also deals with Navigation and (dis)orientation issues • Accessibility and UI uniformity benefits associated with the WWW • Limited interactivity and learning environment fidelity • Potential for making inferences from large numbers of students
    22. 22. AIED July ITS Authoring Tools Survey 22 Example: InterBook
    23. 23. AIED July ITS Authoring Tools Survey 23 Authoring the Interface, Domain, Teaching, and Student Models • Interface • Domain Model – Curriculum knowledge structures – Simulations of Devices and Phenomena – Expert Systems • Teaching Model • Student Model
    24. 24. AIED July ITS Authoring Tools Survey 24 1. Authoring the Interface • Systems with interface authoring tools: RIDES (below), Eon, SIMQUEST RIDES
    25. 25. AIED July ITS Authoring Tools Survey 25 Example 2 Eon’s Interface Editor EON
    26. 26. AIED July ITS Authoring Tools Survey 26 2. Authoring the Domain model Curriculum knowledge and structures Simulations/models of devices and phenomena Domain Expertise models (expert system)
    27. 27. AIED July ITS Authoring Tools Survey 27 Authoring Curriculum Knowledge and Structures • Topics/KUs • Relationships (e.g. prerequisite) • Knowl. Type (concept, procedure…) • Objectives • Importance • Difficulty Eon
    28. 28. AIED July ITS Authoring Tools Survey 28 Example: IRIS
    29. 29. AIED July ITS Authoring Tools Survey 29 Example: CREAM Tools
    30. 30. AIED July ITS Authoring Tools Survey 30 Authoring Simulations of Devices and Phenomena XAIDA
    31. 31. AIED July ITS Authoring Tools Survey 31 Example 2 RIDES
    32. 32. AIED July ITS Authoring Tools Survey 32 Authoring Domain Expertise (Expert systems): D3 Trainer
    33. 33. AIED July ITS Authoring Tools Survey 33 3. Authoring the Teaching Model Example: REDEEM
    34. 34. AIED July ITS Authoring Tools Survey 34 Authoring the Teaching Model Example 2: Eon
    35. 35. AIED July ITS Authoring Tools Survey 35 4. Authoring the Student Model Eon’s SM Editor
    36. 36. AIED July ITS Authoring Tools Survey 36 What Authoring/Knowledge Acquisition Methods Have Been Used? • 1. Scaffolding knowledge articulation with models • 2. Embedded knowledge and default knowledge • 3. Knowledge management • 4. Knowledge visualization • 5. Knowledge elicitation and work flow management • 6. Knowledge and design validation • 7. Knowledge re-use • 8. Automated knowledge creation
    37. 37. AIED July ITS Authoring Tools Survey 37 1. Scaffolding knowledge articulation with models • Ex. 1: Templates: IDLE-Tools • Ex. 2: Ontology-Aware tools: SmartTrainer AT
    38. 38. AIED July ITS Authoring Tools Survey 38 REDEEM 2. Embedded knowledge and default knowledge
    39. 39. AIED July ITS Authoring Tools Survey 39 3. Knowledge management • Topics/KUs • Lesson Objectives • Interface objects & screens • Exercises, examples, pictures • Teaching Strategy actions CALAT
    40. 40. AIED July ITS Authoring Tools Survey 40 4. Knowledge visualization • LEAP-AT
    41. 41. AIED July ITS Authoring Tools Survey 41 5. Knowledge elicitation and work flow management • Author: "What do I do next?" "Where do I start?" • Prompts in ID-Expert and DNA: "Which of the following describes what the student will learn: a. What it is? b. How to do it? c. How does it work?" (see the elicitation sketch after this slide) • Top down vs opportunistic design – DNA: Semi-structured interactive dialog has prompts with choices • REDEEM: Agenda mechanism for authoring tasks
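
A minimal sketch of the kind of prompt-driven elicitation quoted on slide 41; the prompt text comes from the slide, but the answer mapping and follow-up steps are hypothetical, not the actual ID-Expert or DNA dialogues.

```python
# A minimal sketch (hypothetical mapping and follow-up steps) of prompt-driven
# knowledge elicitation: the tool asks the author a question and uses the
# answer to pick the next authoring step.

PROMPT = ("Which of the following describes what the student will learn?\n"
          "  a. What it is\n"
          "  b. How to do it\n"
          "  c. How does it work\n> ")

KNOWLEDGE_TYPE = {"a": "concept", "b": "procedure", "c": "process"}

NEXT_STEP = {
    "concept":   "List defining attributes plus a few examples and non-examples.",
    "procedure": "List the steps in order, with a practice exercise per step.",
    "process":   "Describe the components and the causal chain among them.",
}

def elicit():
    """Ask the author one question and return the inferred knowledge type."""
    while True:
        answer = input(PROMPT).strip().lower()
        ktype = KNOWLEDGE_TYPE.get(answer)
        if ktype:
            print(f"Treating this as {ktype} knowledge. Next: {NEXT_STEP[ktype]}")
            return ktype
        print("Please answer a, b, or c.")

if __name__ == "__main__":
    elicit()
```
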
    42. 42. AIED July ITS Authoring Tools Survey 42 6. Knowledge and design validation • Opportunistic & Open ended --> more flexibility & more errors • Constraint-based advice (see the validation sketch after this slide): – "The estimated time for all Lesson-2 topics exceeds the estimated time for Lesson-2" – "The engine maintenance procedure has no sub-steps defined" – "Lesson-3 objectives include procedural and conceptual knowledge, but there are no conceptual topics linked to Lesson-3."
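
A minimal sketch of constraint-based validation as described on slide 42, over a hypothetical lesson data model; the advice strings mirror the examples quoted above.

```python
# A minimal sketch (hypothetical lesson data model) of constraint-based
# design validation that returns advice strings.

def validate(lesson):
    """Check authored lesson data against cross-cutting constraints."""
    advice = []
    topic_total = sum(t["est_minutes"] for t in lesson["topics"])
    if topic_total > lesson["est_minutes"]:
        advice.append(f"The estimated time for all {lesson['name']} topics "
                      f"exceeds the estimated time for {lesson['name']}.")
    for t in lesson["topics"]:
        if t.get("type") == "procedure" and not t.get("sub_steps"):
            advice.append(f"The {t['name']} procedure has no sub-steps defined.")
    if "conceptual" in lesson["objective_types"] and \
       not any(t.get("type") == "concept" for t in lesson["topics"]):
        advice.append(f"{lesson['name']} objectives include conceptual knowledge, "
                      f"but there are no conceptual topics linked to {lesson['name']}.")
    return advice

lesson2 = {"name": "Lesson-2", "est_minutes": 45, "objective_types": {"procedural"},
           "topics": [{"name": "engine maintenance", "type": "procedure",
                       "est_minutes": 60, "sub_steps": []}]}
print("\n".join(validate(lesson2)))
```
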
    43. 43. AIED July ITS Authoring Tools Survey 43 7. Knowledge re-use • Libraries of Content, Graphics, Strategies, etc. • Flexible reconfiguration of components
    44. 44. AIED July ITS Authoring Tools Survey 44 8. Automated knowledge creation • Example-Based programming – Inferring a general procedure/rule from an example procedure/rule DEMONSTR8
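
A toy illustration of example-based programming as named on slide 44 above (this is not Demonstr8's actual inference algorithm): a concrete demonstrated step is generalized into a condition-action rule by recognizing which relation links the observed values.

```python
# A toy illustration of example-based programming: generalize one demonstrated
# column-subtraction step into a rule. Names and logic are hypothetical.

from dataclasses import dataclass

@dataclass
class Rule:
    condition: str
    action: str

def generalize(demo):
    """Turn one demonstrated column-subtraction step into a general rule."""
    top, bottom, written = demo["top"], demo["bottom"], demo["written"]
    if written == top - bottom:
        return Rule("top >= bottom in the current column",
                    "write top - bottom")
    if written == top + 10 - bottom:
        return Rule("top < bottom in the current column",
                    "borrow from the next column, then write top + 10 - bottom")
    return None  # no known relation fits the demonstration

print(generalize({"top": 7, "bottom": 4, "written": 3}))
print(generalize({"top": 3, "bottom": 8, "written": 5}))
```
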
    45. 45. AIED July ITS Authoring Tools Survey 45 Ex. 2: Automated knowledge creation: DIAG [figure: a fault/indicator network relating faulty RUs (RU1-RU3) to readings of Indicator A and Indicator B (normal or abnormal outcomes), with author-assigned likelihood weights on a 0-6 scale: 6 ALWAYS, 5 USUALLY, 4 VERY_OFTEN, 3 OFTEN AS NOT, 2 SOMETIMES, 1 RARELY, 0 NEVER]
    46. 46. AIED July ITS Authoring Tools Survey 46 Suggestions for a Full-Featured Authoring System • Visual reification of conceptual and structural elements • Assistance: design steps or agenda; constraint-based validation • Content reusability and object libraries • Scriptable and customizable • WYSIWYG editing, Opportunistic design, Easy design/test iteration (interpreted vs compiled), Reasonable default values – for rapid prototyping
    47. 47. AIED July ITS Authoring Tools Survey 47 How Are Authoring Systems Designed? Design Tradeoffs & Open Issues • The space of design tradeoffs • General vs. special purpose authoring systems • Who are the authors? • Who should author ITS instructional strategies? • Meta-Level Authoring
    48. 48. AIED July ITS Authoring Tools Survey 48 The Space of Design Tradeoffs [figure: components (Domain Model, Tutoring Strategy, Student Model, Learning Environment) crossed with attributes (Power/Breadth, Flexibility, Depth, Learnability, Usability, Productivity, Fidelity, Cost). The design space has 24 (6x4) independent dimensions or axes.]
    49. 49. AIED July ITS Authoring Tools Survey 49 General vs. special purpose authoring systems • E.G. special purpose systems: LAT and IDLE-Tool – Greater usability, fidelity, depth -- but only for design goals that match the tools. – Does the “demand” balance the inflexibility? – How to make more customizable while maintaining ease of use? • Types of abstraction/specialization? – 1. Real-world tasks – 2. Abstract tasks – 3. Knowledge types
    50. 50. AIED July ITS Authoring Tools Survey 50 Abstracting ITSs for special purpose authoring systems • 1. Abstracting real-world tasks: Investigate & Decide; Evidence-Based reporting; Run an Organization • 2. "Abstract tasks:" Equipment operation & maintenance (RIDES); Conversational Grammars (customer service; LAT) • 3. Knowledge types: Facts, concepts, procedures, principles (CREAM-Tools, DNA, ID-Expert, XAIDA)
    51. 51. AIED July ITS Authoring Tools Survey 51 Who are the authors? What level of skill & training should be expected? • Authoring skill sets: instructional design, classroom pragmatics, graphics/UI, domain knowledge, knowledge engineering, script-level programming... • IDLE, XAIDA, REDEEM: try to allow authoring by teachers and “off the street” domain experts with minimal training
    52. 52. AIED July ITS Authoring Tools Survey 52 Suggested authoring scenario • Effort level: Building an ITS is more like writing a book than creating a greeting card! • Skill level: equivalent to accounting applications, CAD, spreadsheet macros, 3-D modeling, advanced Photoshop… -- special training but reasonable • Sophistication level: Authors need to look at the big picture and do ongoing quality assessment of what they build • ITSs are built by design teams, not individuals (distributed skill sets)
    53. 53. AIED July ITS Authoring Tools Survey 53 Who should specify/author ITS instructional strategies? Pros and cons:
       Teachers (PRACTICAL). Pro: practical experience. Con: not good at articulating or abstracting expertise.
       Instructional Designers (ANALYTIC). Pro: theories are widely used in some circles. Con: limited to basic knowledge types that are easily represented.
       Psychologists (THEORETICAL). Pro: know "how the mind works". Con: use 'first principles'; only useful for simple knowledge structures.
       Educational researchers (EMPIRICAL). Pro: empirical studies of tutoring and classrooms. Con: after many years still don't agree on much.
       Computer scientists (ACTUAL?!). Pro: …end up building the systems… Con: "Isn't it just all common sense?"…
       Domain Experts (i.e., NO acquisition of instructional knowledge). Pro: experts just show how they do a task and the authoring tool infers the instructional methods. Con: fixed instructional method.
    54. 54. AIED July ITS Authoring Tools Survey 54 Meta-Level Authoring • Custom/extensible interface widgets • Customizable descriptive vocabulary • Pre-configured tutoring strategies and student models Eon
    55. 55. AIED July ITS Authoring Tools Survey 55 Use & Pragmatics (Are ITS authoring systems “real?”) • Authoring system Use • Authoring system Productivity • Authoring system Evaluation
    56. 56. AIED July ITS Authoring Tools Survey 56 Authoring Tool Use Examples: • XAIDA domains: equipment operation and maintenance, algebra, medicine, computer literacy, biology • IDLE-Tool: three informal trials with 21, 8, 8 grad student and grade school teacher authors 1. Early prototypes and proofs of concept D3 Trainer, Demonstr8, DIAG, IRIS, Expert-CML, SmartTrainer AT 2. Evaluated or used prototypes CREAM-Tools, DNA, Eon, GTE, IDLE-Tool, LAT 3. Moderately evaluated or used ISD-Expert/Training Express, REDEEM, SIMQUEST, XAIDA 4. Heavily used (relatively) IDE, CALAT, RIDES
    57. 57. AIED July ITS Authoring Tools Survey 57 (Relatively) Heavily Used Authoring Tools • Build a dozen or more ITSs • Many ITSs used in real educational settings • Robust enough for use independent of original design team • RIDES: many project spin-offs and diverse domains • CALAT: over 300 Web-based courses (used at NTT)
    58. 58. AIED July ITS Authoring Tools Survey 58 Authoring Tool Productivity • For traditional CAI: Estimated 300:1 ratio of development to instruction time • ITS authoring: goals and some spotty evidence – ID-Expert’s goal: 30:1 – XAIDA’s goal 10:1; evidence of a first time user at 16:1 – KAFITS Physics tutor w/ six hours of instruction: 100:1 – CALAT: ITS development in about the same time as traditional instruction – REDEEM: 2:1 to segment CAI content & make intelligent • Implication: AI Knowledge Representation does provide ITSs with inherent efficiencies
    59. 59. AIED July ITS Authoring Tools Survey 59 Authoring Tool Evaluations • Existence proofs: Usability; Productivity; Breadth • Examples: – XAIDA – REDEEM – IDLE-Tools, COCA, LAT, KAFITS, DNA
    60. 60. AIED July ITS Authoring Tools Survey 60 Evaluations of XAIDA • Eight authoring field studies with average of 10 instructor participants each • 13 studies of students using the built tutors Data: • Learnability: abilities assessment, self-report skills, cognitive assessment, task-based performance • Acceptability: open-ended questionnaire • Productivity: use analysis • Usability: questionnaire
    61. 61. AIED July ITS Authoring Tools Survey 61 XAIDA Evaluation [charts: valence of comments (positive, negative, neutral) across training, measured before training, at the end of days 1-3, and after training; and frequency of comments at each point: before training (22), end of day 1 (35), end of day 2 (24), end of day 3 (34), after training (17)]
    62. 62. AIED July ITS Authoring Tools Survey 62 Proficiency Using XAIDA [chart: proficiency on a 1-10 novice-to-expert scale, measured before training, at the end of each of days 1-4, and after training]
    63. 63. AIED July ITS Authoring Tools Survey 63 REDEEM Evaluation: • 1 SME author, 3 teacher authors, 7 "virtual students" • Data: authoring sub-task time, variations among authors, appreciation of added "intelligence" • Time spent by teacher practitioner, by task:
       Training: 2 hours
       Course familiarisation: 1 hour
       Describing pages and sections: 4 hours
       Reflection points & non-computer-based tasks: 1 hour
       Authoring questions: 2 hours
       Classifying students: 15 mins
       Developing teaching strategies: 15 mins
       Relating students to sections: 15 mins
       Relating students to strategies: 5 mins
       Total: 10 hours 50 mins
    64. 64. AIED July ITS Authoring Tools Survey 64 Some Formative Evaluation Results (And see Productivity and Use above): • Authors' cognitive model of the domain had a structure closer to the SME's after tool use (XAIDA) • Considerable differences between authors in content structure, strategy specification, categorization of students (REDEEM) • Teacher reactions in general positive, but difficulty with complex relationships among content pieces • Teachers thought AI technology could simulate reasonable teaching strategies (COCA) • Tools needed to give users an abstract view of the content (IDLE-Tools)
    65. 65. AIED July ITS Authoring Tools Survey 65 Some Formative Evaluation Results (cont.) • Including examples for design steps/information was very helpful (IDLE-Tools) • Graphic representations for knowledge elicitation much less error-prone than text-based (LAT) • Overestimated the level of expertise authors would gain in a short amount of time (LAT) • Authors have difficulty conceptualizing non-linear, modular content (KAFITS) • Comparing automated knowledge elicitation to coded-by-hand task analysis: the automated method covered most of the domain knowledge in a small fraction of the time (DNA)
    66. 66. AIED July ITS Authoring Tools Survey 66 Summary • Many types of ITSs have been "authored" • A wide variety of knowledge acquisition and authoring methods has been used. Too early to know when each is most appropriate. • Some tools have significant use and a few are in commercial or near-commercial form • Promising results from evaluations of usability and productivity, with more rigorous evaluations just starting • What are the foreseeable limits?
    67. 67. AIED July ITS Authoring Tools Survey 67 Conclusions--How Easy Can It Be? • There are limits! • Limited use of cookie-cutter special purpose authoring tools-- too restrictive for most authors • Limited ability to reduce ITS authoring to easy, small, independent steps (recipes) • Authors need to think about the big picture and need skills and tools to do this
    68. 68. AIED July ITS Authoring Tools Survey 68 ...Back to the Future • Customizability requirements will usually lead to the author specifying BEHAVIORS (choices, rules, algorithms) as well as static information • This requires ability to RUN, test, and modify these behaviors • This is (simple) PROGRAMMING • Debugging skills and tools will be needed! (Tracing, stepping, inspecting states, etc.)
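
A minimal sketch of the point on slide 68 above, with hypothetical names: once authors specify behaviors (rules) rather than only static content, they need to run and debug them, so even simple tracing support matters.

```python
# A minimal sketch (hypothetical names): an author-specified behavior is just a
# small program, so running it with a simple trace is part of authoring it.

def trace(fn):
    """Print each call and its result so an author can follow the rule's behavior."""
    def wrapped(*args):
        print(f"TRACE: calling {fn.__name__}{args}")
        result = fn(*args)
        print(f"TRACE: {fn.__name__} returned {result!r}")
        return result
    return wrapped

@trace
def choose_feedback(errors_on_topic):
    # Author-specified behavior: escalate feedback as errors accumulate.
    if errors_on_topic == 0:
        return "praise"
    if errors_on_topic < 3:
        return "hint"
    return "worked example"

choose_feedback(2)   # traces the call and returns "hint"
```
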
    69. 69. AIED July ITS Authoring Tools Survey 69 ------------------------------------
    70. 70. AIED July ITS Authoring Tools Survey 70 ITS Authoring Tools: an Overview of the state of the art Tom Murray University of Massachusetts & Hampshire College, Amherst, MA www.cs.umass.edu/~tmurray
    71. 71. AIED July ITS Authoring Tools Survey 71 People #1 CALAT (& CAIRNEY) Kiyama, M., Ishiuchi, S., Ikeda, K., Tsujimoto, M. & Fukuhara, Y. (1997). CREAM-TOOLS Frasson, C., Nkambou, R., Gauthier, G., Rouane, K. (1998). Nkambou, R., Gauthier, R., & Frasson, M.C. (1996). D3-TRAINER Reinhardt, B., Schewe, S. (1995). DEMONSTR8 (& TDK, PUPS) Blessing, S.B. (1997). Anderson, J. R. & Pelletier, R. (1991). Anderson, J. & Skwarecki, E. (1986). DIAG Towne, D.M. (1997). EON (& KAFITS) Murray, T. (1998,1996). IDLE-Tool (& IMAP, GBS-archits) Bell, B. (1999). Jona, M. & Kass, A. (1997). INTERBOOK (& ElM-Art) Brusilovsky, P., Schwartz, E., & Weber, G. (1996). IRIS Arruarte, A., Fernandez-Castro, I., Ferrero, B. & Greer, J. (1997). LAT (LEAP Authoring Tool) Sparks, R. Dooley, S., Meiskey, L. & Blumenthal, R. (1999). Dooley, S., Meiskey, L., Blumenthal, R., & Sparks, R. (1995). REDEEM (& COCA) Major, N., Ainsworth, S. & Wood, D. (1997). Major, N.P. & Reichgelt, H (1992). RIDES (& IMTS, RAPIDS, DIAG) Munro, A., Johnson, M.C., Pizzini, Q.A., Surmon, D.S., Towne, D.M, & Wogulis, J.L. (1997). Towne, D.M., Munro, A., (1988). Smart Trainer AT (& FITS) Jin, L, Chen, W., Hayashi, Y., Ikeda, M. Mizoguchi, R. (1999); Ikeda, M. & Mizoguchi, R. (1994) XAIDA Hsieh, P., Halff, H, Redfield, C. (1999). Wenzel, B., Dirnberger, M., Hsieh, P., Chudanov, T., & Halff, H. (1998).
    72. 72. AIED July ITS Authoring Tools Survey 72 People #2 DNA/SMART Shute, V.J. (1998). DOCENT (& Study) Winne P.H. (1991). Winne, P. & Kramer, L. (1988). EXPERT-CML Jones, M. & Wipond, K. (1991). GETMAS Wong, W.K. & Chan, T.W. (1997). GTE Van Marcke, K. (1998,1992). ID EXPERT (& Electronic Trainer) Merrill, M.D., & ID2 Research Group (1998). Merrill, M. D. (1987). IDE (& IDE Interpreter) Russell, D. (1988). Russell, D., Moran, T. & Jordan, D. (1988). MetaLinks Murray, T., Condit, C., & Haaugsjaa, E. (1998). SIMQUEST (& SMISLE) Jong, T. de & vanJoolingen, W.R. (1998). Van Joolingen, W.R. & Jong, T. de (1996). TRAINING EXPRESS Clancey, W. & Joerger, K. (1988).
    73. 73. AIED July ITS Authoring Tools Survey 73
