'What matters (?)': As university educators, what should we be concerned about as we train, teach and educate our students in an environment where technology plays an increasingly large part in the process? What can we learn from the so-called 'Google Generation' (do they exist?) about the use of technology to create learning environments that support the strategies our students are likely, or supposed, to adopt? Does technology get in the way and hinder learning? Does it really 'enrich' the learning experience? While technology is great at automation and standardisation, and hence at the economic delivery of 'bundles' of knowledge, how can we also use it to inculcate ingenuity, industry and innovation, in the hope that our students will go on to invent futures that make for a better society? In this talk, I will reflect on these questions as I review some of the projects we have been involved with, to see what we can learn from them about 'What matters (?)' in educating our students for the future.
Participants were recruited randomly, with male and female participants represented almost equally. The study was carried out in two stages. In stage 1 we ran focus groups (two groups, nine participants in total) to identify the vocabulary that users understand and use in the context of resource discovery systems, the vocabulary used during information search and query formulation, and which electronic resources are used by the different user groups. From this information we developed three task scenarios of varying levels of difficulty and ambiguity that were used in the observation sessions. Electronic resources refer to. In stage 2, part A combined an observational study with a 'think aloud' method and lasted about one hour; in part B we combined an in-depth interview with a cue recall technique. We adapted the Critical Decision Method (CDM) and combined it with Cued Recall. CDM is a structured retrospective interview method used for learning about users' expertise and the strategies invoked during a specific incident. ETA is a technique that uses a concept distillation process to rapidly and systematically identify broad themes, that is, similar ideas and concepts reported across interviews and observations, so that the data can be identified, indexed and collated. The themes were collated and analysed across the different user groups at all three studied institutions.
During the observation studies we identified a set of steps that participants took while searching for information. In brief here; I'll explain each step in detail next. (i) Users start with 'Initiate', the process of starting the search activity: they define the subject, make sure that they understand the concepts, and define the keywords. They also decide on the selection of resources to use, such as internal, external or personal/social networks. (ii) Next is 'Search'. Many of the users in the UBiRD study followed Spencer's (2006) modes that people experience when seeking information: 'known-item search' (users know the accurate keywords); 'exploratory search' (users might not know where to start looking, but will recognise when they have found the right answer); 'don't know what you need to know' search (users may not have the right vocabulary but will recognise when they have found the right information); and 're-finding' (users look for information that they found before on the topic, e.g. a history search). These modes can be supported by a variety of tools, such as single-keyword search, advanced multi-word and Boolean search, links and others; I'll talk about them in detail later. (iii) The next step is 'Select & Review', where users evaluate their list of results. Based on that, they can (iv) 'Change resources', (v) 'Refine or re-formulate' the query, or, if they found something interesting, (vi) 'View Details' (where they evaluate individual documents). (vii) We added one additional mode, 'Store', to Spencer's modes, where users kept their relevant information. (viii) Finally, users may 'Abandon' the search for various reasons, which I'll discuss later.
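The steps and transitions above can be sketched as a simple state model. The step names follow the talk, but the specific transitions encoded here are an illustrative reading of the description, not a definitive model from the study:

```python
from enum import Enum, auto

class SearchStep(Enum):
    """Steps observed during information searching (names from the talk)."""
    INITIATE = auto()          # define subject, concepts, keywords, resources
    SEARCH = auto()            # known-item / exploratory / unknown / re-finding
    SELECT_REVIEW = auto()     # evaluate the list of results
    VIEW_DETAILS = auto()      # evaluate individual documents
    REFINE = auto()            # refine or re-formulate the query
    CHANGE_RESOURCES = auto()  # switch to a different resource
    STORE = auto()             # keep relevant information (added mode)
    ABANDON = auto()           # give up the search

# Illustrative transitions, as suggested (but not exhaustively specified)
# by the description above.
TRANSITIONS = {
    SearchStep.INITIATE: {SearchStep.SEARCH},
    SearchStep.SEARCH: {SearchStep.SELECT_REVIEW, SearchStep.ABANDON},
    SearchStep.SELECT_REVIEW: {SearchStep.VIEW_DETAILS, SearchStep.REFINE,
                               SearchStep.CHANGE_RESOURCES, SearchStep.ABANDON},
    SearchStep.VIEW_DETAILS: {SearchStep.STORE, SearchStep.REFINE,
                              SearchStep.ABANDON},
    SearchStep.REFINE: {SearchStep.SEARCH},
    SearchStep.CHANGE_RESOURCES: {SearchStep.SEARCH},
    SearchStep.STORE: {SearchStep.SEARCH, SearchStep.ABANDON},
    SearchStep.ABANDON: set(),
}

def is_valid_path(path):
    """True if every consecutive pair of steps is an allowed transition."""
    return all(b in TRANSITIONS[a] for a, b in zip(path, path[1:]))
```

A model like this makes it easy to check whether an observed session (a sequence of coded steps) fits the proposed process, and to spot sessions that do not.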
Lostness, or the participant's disorientation within the task (Smith, 1996), was calculated using the following equation: Lostness = √[(Upages/Pvisited − 1)² + (Opath/Upages − 1)²], where Pvisited is the total number of pages visited, Upages is the number of unique pages visited, and Opath is the length of the optimal path. Lostness ranges from 0 to √2 (each ratio lies between 0 and 1, so each squared term is at most 1). If these values were similar the user was not lost, as there was no diversion from the optimal path; the closer the value gets to √2, the more the participant was believed to have diverted from the optimal path, or lost.
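As a worked example, the lostness measure can be computed directly from the three counts. This is a minimal sketch; the function and parameter names are mine, not from the study:

```python
from math import sqrt

def lostness(pages_visited, unique_pages, optimal_path):
    """Smith's (1996) lostness measure.

    pages_visited : total number of pages visited (including revisits)
    unique_pages  : number of distinct pages visited
    optimal_path  : minimum number of pages needed to complete the task
    """
    return sqrt((unique_pages / pages_visited - 1) ** 2
                + (optimal_path / unique_pages - 1) ** 2)

# A participant who follows the optimal path exactly scores 0:
# lostness(5, 5, 5) -> 0.0
```

Revisits drive the first ratio below 1, and visiting many pages beyond the optimal path drives the second ratio down; both push the score towards √2.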
Data characteristics:
- Very large amounts of data, about many different topics, some possibly related but much unrelated; within each topic area there may be fragmentary information relating to several threads
- Supplied by many different sources, residing in possibly unconnected or loosely coupled data sets
- Of different formats, such as numerical data, video, photos, unstructured text
- Of varying quality and reliability; ambiguous; similar yet different
- Incomplete, with missing data, and out of sequence
- Entities with unknown and unexpected relationships
- Lacking context

Structuring analytical problems:
- Decompose: identifying the components of the problem or situation, understanding their structure, and how the parts relate to each other. Often the parts are not clearly related (ambiguity, uncertainty, missing data). What assumptions are needed? Are they valid? How does the evidence come together to give a conclusion?
- Externalize: basic tools include tables, lists, diagrams, and other methods for manipulating the data and their relationships in order to see and test alternative views and conclusions: re-arranging (substituting, changing sequences), deleting, combining, modifying (what if ...?), re-conceptualizing, assigning more weight or significance
- Generating alternative possibilities: Analysis of Competing Hypotheses
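The last step, Analysis of Competing Hypotheses (Heuer's ACH), can be sketched as a small consistency matrix: score each hypothesis by how much evidence is inconsistent with it, and prefer the hypothesis with the least disconfirming evidence. The matrix below is purely illustrative:

```python
# Consistency codes: +1 evidence is consistent with the hypothesis,
# 0 neutral, -1 inconsistent. Hypotheses and evidence are illustrative.
matrix = {
    "H1": {"E1": +1, "E2": +1, "E3": -1},
    "H2": {"E1": -1, "E2": 0,  "E3": -1},
}

def inconsistency_score(hypothesis):
    """ACH weighs disconfirming evidence: count items that contradict."""
    return sum(1 for v in matrix[hypothesis].values() if v < 0)

# The preferred hypothesis is the one with the fewest inconsistencies.
best = min(matrix, key=inconsistency_score)  # -> "H1" here
```

Externalizing the matrix in this way makes the analyst's assumptions visible and testable: changing one cell immediately shows whether the preferred hypothesis still holds.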
(a) State of the art in terms of realism and interactivity; (b) state of the art in terms of how the games are controlled, at the user interface level as well as at the substantive 'story' level:
- Visualisation: realism of objects and characters, including their rendering as well as their movement in the scene
- Interaction: the physical I/O devices, such as mouse, keyboard, PS3 controller, etc., used to control actions (complex and simple), and their compatibility with the type of game or training simulation, e.g. AA3 vs TruSim
(c) Storyline control: how situational information is presented to enable assessment, and determining and selecting what can be done (e.g. the list in 'Preview Simulation' for training the paramedics is ridiculously long!)
(d) Multi-player coordination: what techniques are used (for visualisation of the coordination, for interaction with the game through devices and interfaces so that affordances are clear, and for controlling the storyline) so that players can see and coordinate their actions and maintain situation as well as team awareness?
(e) Voice communication (as in 'Ambush') is nicely used; what should we do here, and how?

Training focus: situation assessment and (team) decision making; expertise, considerations, emotive state
Realism: rendering, movement and physics; socio-behavioural and believable scenarios
Engagement: collaborative, project-based, authentic
System performance: good frame rate; number of simultaneous players

Moving away from procedural training into problem assessment, hypothesis formulation, and action design and execution. Decision makers learn by applying their knowledge in realistic and interactive scenarios. Interactivity provides immediate feedback on decisions and actions, both right and wrong. Learning to recognise and understand how knowledge is structured in the (simulated) world. Team work and collaboration: individual, in a team of avatars, in teams of people.