Expert workshop report, Birmingham, February 2011 (C-SAP collections project)
 


An expert workshop and user-testing session on OER repositories were held in Birmingham on 24th February 2011 to investigate the discovery and use of digital resources and OERs in research methods teaching. The workshop was attended by Alan Bryman, Dave Harris, Sean Moley, Kate Orton-Johnson, Sara Ryan and Antje Lindenmeyer.


Report on C-SAP (Higher Education Academy Subject Centre for Sociology, Anthropology and Politics) expert focus group and user testing, February 2011

Isabelle Brent
C-SAP, April 2011

This content is licensed under Creative Commons Attribution-NonCommercial-ShareAlike 2.0 UK: England & Wales http://creativecommons.org/licenses/by-nc-sa/2.0/uk/
Contents

Expert Workshop Report
  Trust
  Student Expectations
  Resistance to Digital Materials
  Digital Resources as 'Additional'
  Personal Connection
  Career Stage
  Preparation Rather Than Re-Use?
  Cross-disciplinary Materials
  Attitudes towards OERs
  Evolving Attitudes
  Google as Orientation
Individual Interviews Summary
  Dave Harris
  Sara Ryan
  Sean Moley
  Alan Bryman
  Antje Lindenmeyer
  Kate Orton-Johnson
User Testing Review
  Relevance
  Interface Difficulties
  An Example of Usability Problems
  Barriers to Use in OER Sites
  Triton Blog
Conclusion
Expert Workshop Report

As part of the C-SAP Discovering Collections Project, an expert workshop was held in Birmingham on 24th February 2011 to investigate the discovery and use of digital resources and OERs in research methods teaching. The workshop was attended by Alan Bryman, Dave Harris, Sean Moley, Kate Orton-Johnson, Sara Ryan and Antje Lindenmeyer. The first session was a focus group chaired by Graham Gibbs. Topics covered included what resources were used, how trust was established in resources, disciplinary considerations and student engagement. During the coffee break, short individual interviews were filmed with the participants. The rest of the workshop was devoted to a user-testing session of Jorum, Xpert, Connexions, MethodSpace and the Triton politics blog.

The major themes to emerge from both sessions are outlined below.

Trust

In general a preference was expressed for in-house materials that integrate with courses and have been checked for quality. Reasons for this included trust issues and relevance. Resources produced by other universities were cited; particularly good ones for methods included Leicester for qualitative materials and Methods at Manchester.

In the user testing, a preference was expressed for Google because of the quick recognition of academic institutions through the URL in results pages. In contrast, the affiliation of some OER resources was not immediately clear, which caused frustration.

In addition to university websites, book companion sites (such as those by Sage) were also trusted to have relevant and accurate information.
Student Expectations

The importance of responding to student expectations was raised at various points during the day; participants feel they must respond to these expectations in the contemporary political context (several mentioned the National Student Survey).

Resistance to Digital Materials

One interesting theme to emerge from the workshop was an identified resistance to digital learning resources amongst students, who are not sure whether such materials count as being 'properly academic'. In addition, there seemed to be a lot of confusion amongst students as to what constituted 'legitimate' sources to refer to in assessed work and how citations should be made. It was suggested that assessment strategies have not really adapted to include digital sources, which accounts for these student responses, and that universities should be re-evaluating their assessment strategies.

Digital Resources as 'Additional'

Participants also agreed that student expectations are informed by how digital resources are presented. Most pointed students to online materials as additional or extension activities, which may communicate the notion that such materials are not really necessary. It was felt that only at postgraduate level could such attitudes be challenged, in the context of training researchers.

Personal Connection

During the user-testing session one of the groups expressed a clear reservation about using an OER teaching resource developed by someone else. A distinction was made between research and teaching institutions, and it was felt that students at research-led institutions expect their teachers to use personal examples:

  I think there is an element of students, particularly in the current climate, of thinking, 'why are you just foisting me off with some other resource?'
This consideration was not just an expectation of students; all the group agreed that making the connection between teaching and research was important and represented part of their own professional identity:

  there is kudos if you can have all your examples from things you have done

In addition, the practical difficulties of using other people's materials for methods teaching were referred to. For example, in teaching coding, participants felt it would be hard to get access to data which shows the various stages of coding. Their own data was trusted more because they could refer to the different stages and had a detailed knowledge of the whole process:

  with your own data you have coded it and are familiar with it

Career Stage

All our participants were at mid- or later career stage, and all agreed that teaching resources could be useful for lecturers starting out or moving into an area they were unfamiliar with:

  If I was starting out as a lecturer, it would be good to see other people's slides and see how they do it

  It could be helpful if you knew nothing and didn't have a lecturer guiding you

Preparation Rather Than Re-Use?

The reservations about using teaching resources did not mean that participants felt such materials were irrelevant for themselves. There was a general feeling that seeing other people's resources was positive in a more diffuse manner, as part of a general educational process. This perspective is indicated in the first quotation above, where the emphasis is on 'seeing how things are done'. Several participants said they would use the resources as part of preparation and research but would construct their own teaching resource.
Cross-disciplinary Materials

Participants did not see any difficulties in using materials from other disciplines; however, finding examples that appealed to students from a wide range of backgrounds was a challenge. Examples were given from health studies and education that students were able to relate to on a personal level.

Attitudes towards OERs

A variety of opinions were expressed over OERs by participants, reflecting their differing exposure to the field. The majority had no experience of (consciously) using OERs. Most participants take for granted the assumption that they can use online materials for teaching purposes as long as use doesn't go outside the institution. This places a major hurdle in the way of accepting the value of OERs, because the CC licence is not seen as relevant.

Apart from the one participant who had already submitted materials to Jorum, there was no enthusiasm about this aspect of OERs. Some participants cited intellectual property reasons: not wanting to share a resource that has been worked hard on. More commonly, the expressed view was that participants could not guarantee that all elements of their resources were free from copyright. This was cited as a reason why some institutions (such as Southampton) have pulled back from making more materials open. A further institutional barrier arose where the university held copyright over teaching materials, which made sharing impossible.

Above and beyond this, the OER sites that stressed the ease with which it was possible to submit materials caused a general lack of trust amongst those unfamiliar with the concept of submitting open materials. In particular, the slogan 'Creating content in Connexions is as easy as 1, 2, 3' caused both amusement and mistrust of the site. One participant commented, 'you could be anybody'. In discussion it was suggested that information about submitting resources should not be on the homepage.
Evolving Attitudes

Despite these reservations, the potential advantages of sharing materials were acknowledged. During the workshop all agreed that a resource that had been quality checked would be useful for methods. Doctoral training centres were suggested as providing a potential mechanism for sharing resources. The challenge of how quality was to be assessed, and by whom, was discussed.

Whilst not embracing OERs in the sense of wanting to make personal contributions to repositories, attitudes did evolve during the day, as reflected in some post-workshop feedback we received:

  I came away thinking that I was made aware of a lot more online resources I was using on a regular basis, but there were a lot more 'teaching' resources such as presentations and slides that I could be using

This was a common sentiment. Another participant acknowledged the habitual nature of online searching, which predisposes people to rely on 'tried and trusted sites'. A further participant suggested that OER could have a valuable role to play in providing methods-related examples and data-sets that will appeal to the diverse personal experiences students bring to methods courses.

Google as Orientation

A key theme throughout the day was the pervasiveness of Google in determining how people make sense of searching online. Phrases such as 'like Google' and 'a kind of focussed Google' were common, particularly in the user testing.

Even when Google was not mentioned, some of the difficulties described in the user testing below can be understood in the context of how Google operates (such as assuming the left-hand menu represents a narrowing option rather than a new search, consistent use of font colour to denote links, and descriptions of resources).
One participant also suggested that Google's ability to target adverts (as has Amazon's) has increased expectations that results will be personalised. People now expect similar technologies in academic searches.

Individual Interviews Summary

During the break between the focus group and user testing, the participants gave individual interviews that were filmed. Anna Gruszczynska asked what participants felt were the major issues facing methods teaching and what elements they would like to see in an online collection of research methods.

Dave Harris

Dave emphasised the need for methods to be embedded in the social sciences curriculum and taught alongside substantive issues, as opposed to being limited to a separate module. Dave articulated the value of incorporating online resources in methods teaching as supplementary materials, but stressed the need for face-to-face teaching for communicating core academic values.

The elements he identified as important to an online resource collection were defined in terms of classification and stratification, as well as a degree of agreement about suitability. However, he accepted that there needs to be diversity reflecting the different needs and preferences of users (such as those preferring videos or PDFs).
Sara Ryan

Sara described the research group at Oxford University that she is part of, which uses qualitative methods to collect people's experiences of different health conditions (through filmed interviews). The materials are freely available online and are used for methods courses at undergraduate and postgraduate level. This approach is helpful in communicating the 'messiness' of the research process and in getting students to think critically about it. Sara identified a difference in emphasis when using the resource to teach different levels of students: with undergraduates more generic themes are pursued, whereas with postgraduates the interviews act as discussion points for issues such as validity and transferability.

The elements Sara identified as important for an online resource collection were that it be well organised, well referenced and well signposted. She argued that she wanted to have a degree of trust in the materials, but part of this was a subjective judgement process she must engage in individually.

Sean Moley

Sean was speaking from the perspective of the NCRM, which provides resources for advanced methods teaching, some of which are delivered through online courses. He identified a lack of resources available for advanced methods and described NCRM's work to provide step-by-step guides that take learners through more advanced techniques.

The element Sean described as important for an online resource collection was a 'collection of collections', where providers of resources could present what they are producing in a clear way that is easy to find. This, he argued, was better than needing to go to the individual websites of different providers.
Alan Bryman

Alan highlighted the differences between teaching undergraduates and postgraduates, with the former being more difficult to persuade about the need for research methods teaching. Resources should reflect the different needs and levels of undergraduate and graduate students.

Alan described his own use of online resources as private and supplementary, enhancing lectures rather than providing the core content.

The element Alan described as important for an online resource collection was a gateway model that would be an access point for different methods and analytical approaches.

Antje Lindenmeyer

Antje identified engagement with students as the most important issue facing methods teaching, particularly in the health context within which she works. Antje described the difficulties of teaching qualitative methods when they are judged in terms of quantitative criteria.

Kate Orton-Johnson

[Recording problems made part of this interview inaudible]

Kate identified time as a key issue in using additional resources for courses. With the limited time available to prepare, the number of resources that can be integrated is restricted.

The element Kate described as important for an online resource collection was a sort of portal model indexed by subject areas, interests, different methodological approaches and ethical issues. The categorisation would be done in such a way that users could draw on generic or specific resources.
User Testing Review

In addition to the themes outlined in the previous section, the user testing raised specific issues that need to be considered separately.

The user testing looked at Jorum, Xpert, Connexions, MethodSpace and the Triton politics blog. Participants were asked to find materials on qualitative interviewing in each of the OER sites. The Triton blog was included as part of the reciprocal relations between the two projects and to provide an example of an alternative approach to presenting OERs.

Four recurrent themes emerged from the user testing:

1. Lack of relevant results
2. Interface difficulties
3. Barriers to use
4. Difficulties downloading materials

Relevance

The user testing was coloured by the general lack of resources that participants could find on the topic of qualitative interviewing. Trust in the sites was weakened by the appearance of irrelevant results (such as 'A history of ragtime music' and 'Great unsolved mysteries in Canadian history'). One participant commented, 'how do you get to elementary algebra even though you are looking for a resource on interviewing?'

Interface Difficulties

Part of the reason for not being able to locate relevant resources related to a lack of filtering options. Advanced searches did not offer better options, and it was felt that it ought to be possible to narrow results by alternative criteria to those offered, such as:
- Subject
- Topics (within subject)
- Level (undergraduate/postgraduate)
- Resource type

Having to specify a subject or area from the outset in order to narrow down results was disliked. For example, Merlot forces a choice between materials, learning exercises and members that was considered unhelpful. Participants tried searching within the social science area to see if more relevant materials would emerge, but the process was cumbersome and achieved no better results.

Two distinct approaches to searching for materials emerged. One participant (the only one to have submitted OER materials) took what might be termed the 'browsing' approach, wanting to explore different categories for the sake of it. However, the rest of the participants were focused on a particular search and wanted fast results to a specific query.

Within this latter approach, the preferred search strategy was to perform a general search and then narrow down. Tag clouds were popular, as was the ability to select more than one term. Participants tried searching within the social science area in Jorum to see if they could find more appropriate results; however, they did not like being restricted to this area and found the number of stages cumbersome. The Boolean search features were not popular and seemed to generate no better results.

An Example of Usability Problems

The Jorum website (which was liked on the grounds of aesthetics) demonstrates many of the usability shortcomings. A screenshot is provided below. [1]

[1] The Leicester e-link button is specific to this particular computer and is not part of the reviewed websites.
1. Misunderstanding/disliking the left-hand menu

People initially thought the left-hand menu was a means of narrowing down results rather than an entirely fresh search. This is the natural area to look for such refining options, and participants felt that there should have been options to narrow down the results at this stage.

2. Following the wrong link

Although the 'Download Package' button seems unambiguous, it was not the first choice when participants attempted to locate the resource. The most common link participants tried
first was the 'persistent link' URL. The 'web resource' URL was also tried, and confused people because it appeared twice in the results. The 'Download Package' button was the third choice. This may be because people expect links from specific features like URLs, particularly if a colour is used to denote links. The Jorum site's use of colour is inconsistent with regard to links; for example, sometimes the mid-teal colour is used to denote a clickable link (as in the 'license' information) but at other times (as in the title) it isn't.

3. Not scrolling down to the individual files

When participants realised what the 'Download Package' was, they no longer looked for information below it, which in some cases offered the possibility of downloading individual files.

Barriers to Use in OER Sites

Participants identified various 'barriers' in all of the OER sites that would put off all but the most enthusiastic users. As one person commented:

  There is so much on the internet, once you reach the barrier you just move on to something that hasn't got the barriers . . . I appreciate sites that are easy to use and intuitive

Inadequate Information about the Nature of the Resource

Participants frequently asserted that they did not have enough clear information about what a particular resource was before committing to download it. This information included:

- A clear, short description of the resource
- Author and institutional affiliation
- A description of what form the resource takes (PowerPoint etc.)
Requiring Email Addresses

It was felt that having to enter an email address in Jorum to get the resource presented another hurdle:

  You wouldn't know how long it would take, will it come in a week's time, will it end up in spam when it comes?

Difficulties Downloading Materials

The time it took to conduct searches and to load materials caused frustration and confusion. Users weren't sure they had clicked the right option when there was no immediate response, and several said that they would give up.

Presentation/Accessibility

Participants commented on the 'feel' of the sites, preferring the presentation of Connexions, which is more like a standard website. Xpert was considered to be 'uninspiring', 'boring' and 'flat'.

This issue goes beyond aesthetics, and in some instances participants equated usability with appearance. For example:

  Participant 1: I don't like the, it's not a very useful interface
  Facilitator: What's not nice about it?
  Participant 1: It's ugly

The link between appearance and usability also relates to the organisation of the information on screen. Confusion was one of the central responses within the user-testing session. Reflecting towards the end of the session, one participant commented:

  I'm always struck by the mess of these things; it still looks messy, it is still hard to find. There is still a lot of sifting to do. I don't know how you would do it. I suppose
  it is a combination of getting people to put better information on where stuff is lodged in these repositories and then a better way of tidying it.

Concerns were also raised that some of the sites (particularly Xpert and Jorum) used pale fonts on white backgrounds that are difficult to read and might present accessibility issues for those with a visual impairment.

YouTube videos were cited as useful resources for particular methods areas but, again, presentation was cited as a reason not to use them (such as bad sound quality).

Triton Blog

The Triton blog was presented at the end of the user testing as an alternative way of presenting OERs, tied to topical posts. The blog received a positive response; participants liked the topicality of the posts and the tag cloud idea as a means of linking to resources. The magazine-like presentation was also preferred to a 'list of lists'.

However, there was an (inaccurate) assumption that, because of the Oxford/Cambridge affiliation, the OER resources generated through the Xpert widget had been reviewed by those institutions.

Conclusion

The different activities that constituted the day's workshop provided a wide range of valuable information relating to methods teaching that will benefit the Collections Project. Since the participants come from outside the OER community, they provide an essential balance to the perspective of those committed to OERs.

An unexpected (and possibly the most significant) issue to emerge was the reservation relating to digital resources in general, not just OERs. For a variety of reasons, digital resources do not seem to be well integrated into higher education, particularly at
undergraduate level. Reasons seem to revolve around assessment, which is still mostly traditional, and student expectations. Without reform of how assessment is carried out, the value systems of students will continue to reflect this traditional approach, where resources are valued as 'academic' according to limited criteria. Problems citing digital resources compound this difficulty, as does the tendency of teaching staff to direct students to digital resources as supplementary materials.

In addition to student attitudes towards digital materials, student expectations in general and student engagement emerged as an important theme. It was recognised that methods can be an unpopular and difficult subject to teach, particularly to students from diverse subjects and backgrounds. This issue was more important than disciplinary considerations; in general, participants were happy to draw resources from subjects other than their own, but described the challenge of making them personally relevant to students.

Trust emerged as an important issue that lay behind the preference for in-house produced materials. During the individual interviews, several participants suggested that some form of gateway to methods resources, where materials would be vetted, was most appropriate; however, during discussions the consensus was that with the volume of online materials now available, such a model is not sustainable.

The user testing indicated difficulties with all the OER sites used, associated with inaccurate search results and usability problems. This is reflected in the behaviour of one group, which unilaterally decided to compare the results with a Google search and insisted that the Google search produced better results. The significance of Google in orientating people's approach to the internet is important to acknowledge.

All but one of the participants were unwilling to submit materials to OER sites; however, over the course of the day attitudes towards OERs in general did change.
It is clear that for most participants the case for OERs has not been put persuasively within their own institution, and that were it, attitudes might change. This suggests that more progress might be made in establishing OERs in university teaching if this were part of the professional development of teaching staff.