3. Are the tools working for you?
• Informing emerging conversations with latest research
• Supporting experimentation with emerging tools
• Engaging in early education / outreach
4. • Searching for your literature review
• Maximising your research profile: how to be seen better
• Measuring research performance
• Copyright essentials for researchers
• Mindmapping for researchers
• Keeping up to date with new research
• Open Access at Northumbria
• Research and collaboration using web tools & social media
10. “The way that publishing has been developing and the need to let the world know about my outcomes”
How to monitor your impact as a researcher through citation alerts, altmetrics, ego searches, etc.
“I found all the lessons very interesting, in particular the parts on ‘Choosing a journal’ and using social media to promote our research.”
“Reinforced what I thought I knew but was unsure of. Handout to accompany the session useful.”
“The examples of how staff at Northumbria had used different media, etc.; it was really inspiring.”
11. Next steps
• Visits
• Research metrics service
• Collaboration with internal partners
Photo credit: https://pixabay.com/en/step-steps-path-direction-shoes-163948/
12. Are the tools working for you?
• Informing emerging conversations with latest research
• Supporting experimentation with emerging tools
• Engaging in early education / outreach
13. Over to you!
• What stage are your researchers / institution at?
• What do you do currently to support researchers regarding altmetrics?
• What would you like to do?
Photo credit:
https://upload.wikimedia.org/wikipedia/commons/3/32/King%27s_College_London_Students_Evacuated_To_Bristol%2C_England%2C_1940_D430.jpg
Editor's Notes
I’m talking about altmetrics with a small ‘a’ and an ‘s’, not the company Altmetric, but in general those measurements beyond traditional metrics.
There are few case studies at the moment about universities implementing Altmetric or another product. This presentation is not about that, but about the process of opening researchers’ eyes to these new methods of measuring research.
It’s been an evolution of support and a growth of knowledge rather than a project to implement a new system.
Briefly, what are people’s experiences of researchers and altmetrics?
This session will look at the development of altmetrics at Northumbria, the rationale behind these developments and the feedback gathered from researchers.
I will discuss Northumbria’s future potential developments for supporting users with altmetrics and then I will ask you to share your experiences and what the next steps you might be taking in engaging researchers with altmetrics.
As Lapinski, Piwowar and Priem wrote in 2013 (‘Riding the crest of the altmetrics wave: How librarians can help prepare faculty for the next generation of research impact metrics’, College & Research Libraries News, 74(6), 292-300):
“…libraries are in a unique position to help facilitate an informed dialogue.”
They further state that librarians can support in 3 main ways:
Informing emerging conversations with latest research
Supporting experimentation with emerging tools
Engaging in early education / outreach
Northumbria has so far supported mainly the informing-conversations and early-education sides.
We will revisit this later when we look at plans at other institutions that people might like to share, and at any other ways librarians might support engagement with altmetrics.
Background:
Researcher Development Week (RDW) has run in November and March since 2010 and has gone from strength to strength during this time.
It is a well-known Library training event in the academic calendar, and a good proportion of first-year postgraduate researchers attend, as well as staff. Attendance is voluntary, not compulsory.
Over this time the sessions have constantly evolved and been updated through our evaluation processes. Sessions have been retired and new ones born.
The week comprises workshops of 1.5 hours and bitesize briefings of 45 minutes.
We run workshops on a variety of areas such as Searching for your Literature Review, Keeping up to Date, Research and Collaboration with Web Tools and Social Media, Open Access, and Copyright.
Measuring Research Performance and Maximising your Research Profile: How to be Seen Better (introduced March 2014) are the two which look at metrics, traditional and alternative.
Researcher Development Week programme: http://library.northumbria.ac.uk/skillsdev-resdev
Measuring Research Performance
This has been present in our delivery since the beginning but the content has evolved over time.
At first we only had Web of Science and so it was quite difficult to cater for all the subjects.
We covered impact factors and h-indexes using Web of Science, and looked briefly at Google Scholar.
We had to shy away from using the term ‘impact’ because of its connections to the REF and the different meanings given to it, so we talk about ‘performance’ instead and quite clearly mention citations, so it is distinct from the other workshops on impact.
We have developed the session over the years to include alternative journal ranking tools such as SJR, and when we bought Scopus we added in how to calculate an h-index from Scopus and Web of Science. This has broadened the range of research areas we can support, and our professional expertise has grown too, giving us more examples and knowledge with which to engage possibly reluctant attendees.
Feedback from our researchers (academic staff and PGR students) was largely positive.
But it was always difficult with social science and arts and humanities PhDs; “not relevant to me” was some of the feedback we would get.
We did try to offer alternatives (different subject listings, other criteria) and not be too science-oriented. JIFs are not the be-all and end-all.
Photo credit: http://www.vads.ac.uk/results.php?cmd=search&words=NAP_4_3_13_19&mode=boolean&submit=search
Corridor at Coronation Road Test Tunnels, engineers and technicians monitor engine tests...Mechanical Engineering Image Collection
We have created various handouts to support researchers on journal ranking tools and h-indexes.
These are all available online via Skills Plus, our online repository: www.northumbria.ac.uk/researchskills
Our presentations are also available and are created with extra notes to enable them to be a useful standalone resource.
The rationale is that we want to be able to provide good quality and accurate information to researchers at any point, so they can make informed judgements and we know we cannot see all our researchers face to face.
We had always run a course on web tools and their use in research. We made this practical and offered possibilities to people rather than issuing rules on use or which app to use. People enjoyed the sessions and generally came away with positive ideas and a personal plan.
With the proliferation of social media and its growing use and awareness in HE, we developed social media videos in partnership with an academic, to give not just our voice but a real researcher’s perspective.
We felt this would add an extra dimension to our sessions, and it was the first resource where we worked with researchers to share their experiences for the benefit of new researchers.
But I felt we needed to address what people might do with social media once they had got on board. There was a gap between metrics and social media, and at the time there was buzz around altmetrics and demonstrating impact for the REF. People were trying anything to get a grip on what impact could mean.
We also didn’t want to cram any more into our existing sessions.
So I created a Prezi on ‘Maximising your Research Profile: How to be Seen Better’, which brought in ORCID, the discoverability of research, social media and altmetrics.
This pushed me professionally in using a new tool, and it served as a pilot as it was our first use of Prezi.
The session was very well received by academics and professional library colleagues and people responded positively to seeing the links between existing sessions, so it was added to the menu of sessions we could offer to researchers.
I then went on to create a leaflet on altmetrics, with feedback from colleagues, as more and more researchers started to mention the topic in the professional blogs and other places I read to stay up to date.
At this point there wasn’t a great deal of demand from NU academic staff, but we were hearing that our colleagues in Research and Business Services were looking at altmetrics tools, so we felt demand would come soon and it was best to start preparing something.
When I looked around at other universities there was not much aside from webpages with a short explanation. It took some time to decide what would be useful to researchers rather than just creating a list of suppliers.
So we cover the benefits, the main players and how they differ, what you can measure and how, and then how you might increase your scores, linking back to our social media materials, with further reading via hashtags and a Mendeley altmetrics group so it stays current.
We also added a slide or two on altmetrics to the Measuring Research Performance session (the traditional one),
and we ask researchers what they know about altmetrics, so people can see the connections between traditional and alternative metrics.
Hopefully we are creating these linkages between sessions and materials so researchers can see how altmetrics fits in with what they do already, or, if they are new to it all, how to get started and why it’s useful.
QUESTION: Does anyone do anything similar and what is the reaction?
We still wanted to reach more research-active staff, as we mostly see new PhD students or staff starting PhDs, to gauge where our support for them stood.
We took the Prezi and helpsheet and spoke at our university’s internal research conference, making the session more interactive to get audience participation and validation. Although we saw a small number of researchers, they were all very positive about the messages we were giving. It struck a chord with people and seemed pitched at the right level: awareness of social media, a sense that they should be doing ‘things’ with it, and having heard of altmetrics without being sure what it all meant for them.
We also presented the Measuring Research Performance session (the traditional metrics session) at our Researcher Development Essentials event at the end of June, with our partners the Graduate School and Research and Business Services, and again found very similar feedback from attendees.
Obviously we do have prolific tweeters and social-media-savvy researchers at NU, and we are developing relationships with several, asking for their feedback and for real-life examples of impact to use as case studies in teaching, as a way of giving our teaching some extra bite.
Feedback so far about the developments has been really positive. These are all quotes taken from the last year, from ‘Maximising your Research Profile’ and ‘Measuring Research Performance’.
Researchers have been positive and can see the relevance to their work.
I think we are generally ahead of most of the researchers in the university regarding altmetrics, so we are well placed to provide them with good information, using the helpsheet as a starting point, and the library has a role to play in supporting them.
There are a few keen and well informed individuals but they are the minority as far as we have been able to find out.
There is a growing appetite for more information: a keen PhD student asked if we would run a separate session on altmetrics.
The proliferation of the ‘donut’ on various platforms, and people asking what it means, shows that awareness is growing.
We are taking a softly-softly approach, with no extra funds available at present to buy something or mount a big campaign at the university (obviously this could change tomorrow, but because of all the work we have already done we are in a really good position to hit the ground running).
We have considered adding altmetric information to the repository, but for many items (books, etc.) we would find zero citations, so this could be a turn-off rather than a turn-on. For the moment we are holding fire on this.
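The sort of per-item check described above can be prototyped without buying a product, since Altmetric exposes a free public details API keyed by DOI. As a hedged sketch only (the endpoint, its 404-for-no-data behaviour, and the `score` response field are assumptions to verify against Altmetric's current API documentation), a minimal Python approach to checking whether a repository item has any attention data might look like:

```python
# Sketch: look up a repository item's Altmetric attention data by DOI
# via Altmetric's public details endpoint. Endpoint and field names are
# assumptions to check against Altmetric's current API documentation.
import json
import urllib.error
import urllib.request

API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi: str) -> str:
    """Build the details-API URL for a given DOI."""
    return API_BASE + doi

def fetch_altmetric_score(doi: str):
    """Return the attention score for a DOI, or None when the item has
    no attention data (the API is assumed to return 404 in that case)."""
    try:
        with urllib.request.urlopen(altmetric_url(doi)) as resp:
            data = json.load(resp)
        return data.get("score")
    except urllib.error.HTTPError as err:
        if err.code == 404:  # no altmetric data for this output
            return None
        raise
```

A screening pass like this would let a repository show the donut only for items with a non-None score, avoiding the "zero attention" turn-off mentioned above.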
We have had a couple of visits from various suppliers of alternative metric data, and we have looked at these with RBS, but no firm strategic decisions have yet been made about purchasing a university system.
We may well look to see what other universities have done and will seek best practice to inform our service.
If we do purchase something, I feel this will be a huge jump for us and in people’s awareness, and it will raise the level of enquiries, but we have supporting materials to give to and guide researchers.
I think we will have a role to play in education and awareness of the tools and what they can and cannot do.
We acquired SciVal last year, so we are looking at our research metrics provision overall and creating a service for the university.
Altmetrics will be part of that and how we roll out access and awareness will be very important.
We will need to work closely with RBS, as they deal with funders, have links with the researchers, and managed the REF.
We will need to work with the university impact manager, as they will have ideas on how to convey impact from altmetrics and will need to be able to convey that information to their users. Impact was a very tricky thing to get right in the last REF, so by the next REF we want to have all the right tools and methods at our disposal.
Photo credit: https://pixabay.com/en/step-steps-path-direction-shoes-163948/
So, as you have heard, we at Northumbria are currently addressing two of the three areas.
We have yet to look at supporting experimentation with emerging tools,
but this may come, depending on partners like Research and Business Services looking at tools, or keen researchers coming to us directly via the repository.
Personally I think there is an illusion of ease and simplicity with these scores, especially for people seeing the donut for the first time. Actually understanding what is behind a score, and what it really means, is more complex. It will be a balance between how much people want to know and what we should try to make researchers aware of, so they can get real meaning from altmetrics, see how it could relate to impact, and build it into their research workflow.
I think also that, depending on the discipline, we may well have to tailor our advice, provision and support, and this may go hand in hand with how we roll out the research metrics service.
Having the same team that knows the institutional repository, deals with APCs and open access, and delivers traditional metrics and altmetric data will be a bonus in cementing linkages in researchers’ minds, I feel.
We have already done this with some of the training and the linkages we have made, and researchers have picked up on the fact that altmetrics fits in with what they are doing or might need to do.
There is much more to altmetrics than Twitter; engaging computer scientists with GitHub, or life scientists and economists with Figshare or Dryad, could be a way to make it relevant to a wider group than just the twitterati.
Have people begun in these 3 main areas and what sort of activities / conversations would people like to share?
In groups / with your neighbour, discuss the following:
What stage of development are your researchers / institutions at?
What do you do currently to support researchers regarding altmetrics?
What would you like to do?
What types of things would you like to be doing, and who would you approach, in the next 12 months?
Photo credit: https://upload.wikimedia.org/wikipedia/commons/3/32/King%27s_College_London_Students_Evacuated_To_Bristol%2C_England%2C_1940_D430.jpg