University of Liverpool Library Researcher KnowHow session 2 of 3, presented by Michelle Maden PhD MA FHEA, postdoctoral research associate in evidence synthesis at the University of Liverpool, on 22nd November 2021.
Researcher KnowHow session presentation by Sarah Roughley Barake, Scholarly Communications Librarian at the University of Liverpool.
Covers:
*What to consider when choosing a journal
*Tools to help you choose
*Where NOT to publish
Researcher KnowHow session at the University of Liverpool from 15th March 2021 presented by Ruaraidh Hill, Angela Boland, Michelle Maden.
The session provided advice on conducting key activities in a systematic review. It can also serve as a ‘top-up’ to the three-part series of workshops on systematic reviews which ran earlier in the academic session. It is suitable for postgraduates and staff planning or doing a systematic review for the first time, or who wish to brush up on their knowledge.
It focuses on key steps in doing a systematic review, offering brief practical advice, showcasing tools and sharing top tips for progressing your review.
Researcher KnowHow session presented by Catherine McManamon, Liaison Librarian at the University of Liverpool Library. Supported by Clair Sharpe, Liaison Librarian.
Data “publication” attempts to appropriate for data the prestige of publication in the scholarly literature. While the scholarly communication community substantially endorses the idea, it hasn’t fully resolved what a data publication should look like or how data peer review should work. To contribute an important and neglected perspective on these issues, we surveyed ~250 researchers across the sciences and social sciences, asking what expectations “data publication” raises and what features would be useful to evaluate the trustworthiness and impact of a data publication and the contribution of its creator(s).
Traditional metrics, such as the h-index and journal impact factors, are used to measure the scholarly impact of research. However, in the current climate of accountability by funding providers, fund recipients would benefit from a more comprehensive impact management system (IMS) to facilitate the capture and reporting of narratives (including metrics) about research impact in the academy, on social policy, in industry, and ultimately with the public.
Librarians have always been good at telling and facilitating stories. Research support librarians can use their storytelling skills to contribute to the implementation and administration of an impact management system. Being able to translate research impact into harvestable and reportable metadata is the key.
Learn more about peer review from the perspectives of an Editor-in-Chief, Online Publishing Systems Administrator, Associate Editor, Associate Editor Mentee and a Reviewer.
Using Library Resources for your Dissertation, by Gaz Johnson
Talk given to education distance learning postgraduate students studying at Leicester. Covers data resources available to them, along with basic Boolean searching practice.
10 SIMPLE STEPS TO BUILDING A REPUTATION AS A RESEARCHER, IN YOUR EARLY CAREER, by Micah Altman
A talk sponsored by the MIT Postdoctoral Association with support from the Office of the Vice President for Research.
In the rapidly changing world of research and scholarly communications, researchers face a growing range of options to publicly disseminate, review, and discuss research—options which will affect their long-term reputation. Junior scholars must be especially thoughtful in choosing how much effort to invest in dissemination and communication, and what strategies to use.
In this talk, I briefly discuss a review of bibliometric and scientometric studies of quantitative research impact, a sampling of influential qualitative writings offering advice in this area, and an environmental scan of emerging researcher profile systems. Based on this review, and on professional experience on dozens of review panels, I suggest some steps junior researchers may consider when disseminating their research and participating in public review and discussion.
An interactive workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
A recording of the workshop is available here:
https://youtu.be/GBQK62_qCLw
Identifying Journals for Publication, by Dr. Chinchu C
The presentation is about how to be careful while selecting academic journals for publication.
Malayalam YouTube video based on this presentation is available at https://youtu.be/z5_LD7qqzbw
Content:
When to start searching for journals
General and Specialized Journals
Acceptance Rates
Journal Selection Tools
Journal Indexing
Web of Science
Scopus
Medline, PubMed, and PubMed Central
UGC CARE
Journal Metrics
Impact Factor
CiteScore
Checklist for Journal Selection
Predatory Journals
Cloned/Hijacked Journals
Some Useful Places
Finding Information for Foundation Degree in MVCO (DL) Students, by Gaz Johnson
Slides for the 19th April lecture given to Foundation Degree in Managing Community & Voluntary Organisations students, detailing data resources and good searching practice.
EAHIL CPD Pilot Program: Search filters - what are they good for? Uploaded by maria gp
In this one-hour webinar, Julie reviewed how to find filters, how to assess the quality of filters, and occasions when filters may or may not be helpful. This webinar is part of the EAHIL CPD pilot program. Visit http://eahil.eu
CareSearch creates and publishes search filters for palliative care clinicians to have ready reliable access to the best palliative care evidence. Presentation by Sarah Hayman and Jennifer Tieman to Palliative Care Australia Conference, 2013
A presentation from the joint CILIP Information Literacy Group and Library and Information Research Group's Writing Research Proposals and Publication event.
Researcher KnowHow session presented by Michelle Maden PhD MA FHEA, Postdoc research associate in evidence synthesis, Liverpool Reviews and Implementation Group
We will review a general PowerPoint template and discuss the main components that fill the slides for the final defense presentation. We will also go over tips for how to prepare the presentation and think through what types of questions might be asked. A question-and-answer session follows.
In this presentation, we go over the most common qualitative research designs. We go over the main components of Chapter 3, including methodology and rationale, the role of the researcher, the selection of participants, instrumentation, procedure, data analysis plan, and issues of trustworthiness.
This sample answer sheet corresponds to the eighth webinar in the online journal club series, "How do young people interpret evidence about cannabis?"
The National Collaborating Centre for Methods and Tools is funded by the Public Health Agency of Canada and affiliated with McMaster University. The views expressed here do not necessarily reflect the official position of the Public Health Agency of Canada.
The NCCMT is one of six National Collaborating Centres for Public Health in Canada. The Centres promote and improve the use of knowledge from scientific research and other sources to strengthen public health practice and policy in Canada.
Similar to 'Knowing how good our searches are: an approach derived from search filter development methodology', by Sarah Hayman, Flinders University (20)
Palliative Care NSW Biennial State Conference
'Transforming our Landscapes'
13-15 October 2016
Broken Hill, NSW
By Christine Sanderson, Deb Rawlings, Deb Parker, Lauren Miller-Lewis, Jennifer Tieman
The Dying2Learn MOOC participants were invited to create a message that can be shared with the community as part of Dying to Know Day. The message is their statement about what being deathwise is. We have taken a selection of these posters to create a slideshow. We hope you enjoy them.
How CareSearch uses social media to promote palliative care and interact with consumers and health professionals. Originally presented at the CNSA Winter Congress, July 2012.
This presentation on developing resources for heart failure was first given by CareSearch Director Jennifer Tieman at the Australasian Cardiovascular Nursing College 5th Annual Conference, 18-19 March 2011.
More from CareSearch palliative care knowledge network (14)
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. for more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
Levelwise PageRank with Loop-Based Dead End Handling Strategy: SHORT REPORT ..., by Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation of ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by a large submission of small workloads, and is expected to be a non-issue when the computation is performed on massive graphs.
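For contrast with the levelwise scheme the abstract describes, here is a minimal sketch of standard (monolithic) PageRank by power iteration, with dead ends handled by redistributing their leaked rank uniformly; the graph representation, damping factor, and tolerance are illustrative, not taken from the report:

```python
def pagerank(graph, d=0.85, tol=1e-10, max_iter=100):
    """Monolithic PageRank by power iteration.

    graph: dict mapping each vertex to a list of its out-neighbours.
    Dead ends (vertices with no out-links) leak rank, which is
    redistributed uniformly so the ranks keep summing to 1.
    """
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(max_iter):
        # Rank held by dead ends this round, spread evenly over all vertices.
        leaked = d * sum(rank[v] for v in nodes if not graph[v])
        new = {v: (1 - d) / n + leaked / n for v in nodes}
        for v in nodes:
            if graph[v]:
                share = d * rank[v] / len(graph[v])
                for w in graph[v]:
                    new[w] += share
        converged = sum(abs(new[v] - rank[v]) for v in nodes) < tol
        rank = new
        if converged:
            break
    return rank
```

Levelwise PageRank instead runs this kind of iteration one strongly connected component at a time, in topological order, so ranks in earlier levels never need revisiting.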
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
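The "automated data validation" point above can be illustrated with a minimal rule-based check; the field names and rules here are hypothetical, and a production system would typically use a dedicated validation library:

```python
# Each rule maps a field name to a predicate; rows failing any rule are flagged.
RULES = {
    "order_id": lambda v: isinstance(v, int) and v > 0,
    "amount":   lambda v: isinstance(v, (int, float)) and v >= 0,
    "country":  lambda v: isinstance(v, str) and len(v) == 2,  # ISO alpha-2
}

def validate(rows):
    """Return (row_index, field) pairs for every failed or missing field."""
    errors = []
    for i, row in enumerate(rows):
        for field, ok in RULES.items():
            if field not in row or not ok(row[field]):
                errors.append((i, field))
    return errors
```

Running checks like these at ingestion time catches errors at the source, which is exactly the downstream-issue prevention the section recommends.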
'Knowing how good our searches are: an approach derived from search filter development methodology', by Sarah Hayman, Flinders University
1. Knowing how good our searches are: an approach derived from search filter development methodology
Sarah Hayman
8th International Evidence Based Library and Information Practice Conference
Brisbane, July 2015
2. Smart Searching
• Introduction
• Search Filters at CareSearch and Flinders Filters
• Developing the learning modules
• Evaluation of the learning modules
3. The Importance of Searching
• Need for evidence
• Impact of missing evidence
• Volume of information
• Complexity of sources
4. Search Filters
• CareSearch
• Flinders Filters
• Some search filters worldwide
– ISSG Search Filters Resource
– BMJ Clinical Evidence
– Canadian Agency for Drugs and Technologies in Health (CADTH)
– HL WIKI International
– McMaster University. Health Information Research Unit
– Scottish Intercollegiate Guideline Network (SIGN)
5. A search filter …
• is a validated search strategy of known performance effectiveness
• is designed for and built in a particular bibliographic database
• can be methodology- or subject-based
• can be expressed as a URL and embedded in a web page for quick reliable access to evidence
• may be sensitive or specific
6. Using Search Filters
• Understand the nature and purpose of the search filter
• Critically appraise the search filter (tools available at e.g. ISSG Search Filters Resource)
• Look at the terms in the search filter and use as a starting point for your search
• Make search filters available to your users as useful tools for accessing evidence reliably
• Note that search filters may not be appropriate for the comprehensive searching required for systematic reviews
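Publishing a filter as a clickable link, as suggested above, mostly amounts to URL-encoding the search strategy. A minimal Python sketch against the current PubMed base URL; the query terms are illustrative, not an actual CareSearch filter:

```python
from urllib.parse import urlencode

def filter_to_url(strategy):
    """Wrap a search strategy in a PubMed URL so one click runs the search."""
    return "https://pubmed.ncbi.nlm.nih.gov/?" + urlencode({"term": strategy})

url = filter_to_url("bereavement[tw] OR grief[tw]")
```

Such links can be embedded in a web page or LibGuide so users get real-time search results without retyping the strategy.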
7. Example: the Heart Failure Search Filter
As a URL:
http://www.ncbi.nlm.nih.gov/pubmed?term=((heart+failure[tw]+OR+ventricular+dysfunction,+left[mh:noexp]+OR+cardiomyopathy[tw]+OR+left+ventricular+ejection+fraction[tw])+AND+Medline[sb])+OR+((heart+failure[tw]+OR+left+ventricular+dysfunction[tw]+OR+cardiomyopathy[tw]+OR+left+ventricular+ejection+fraction[tw]+OR+cardiac+resynchronization[tw]+OR+LV+dysfunction[tw]+OR+left+ventricular+systolic+dysfunction[tw]+OR+left+ventricular+diastolic+dysfunction[tw]+OR+cardiac+failure[tw])+NOT+medline[sb])+AND+english[la]
As a PubMed query:
8. [slide image: the filter shown as a PubMed query]
9. Smart Searching
Filter development step → learning module adaptation:
Expert Advisory Group (EAG) → Subject experts
Gold Standard Set → Sample set
Term identification → Term identification
Validation → Testing
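Validation in this process means measuring a search's retrieval against the gold standard set. A minimal sketch of the usual counts, with hypothetical record IDs and database size:

```python
def filter_performance(retrieved, gold, database_size):
    """Score a search strategy against an externally validated gold standard set."""
    retrieved, gold = set(retrieved), set(gold)
    tp = len(retrieved & gold)          # relevant records the search found
    fp = len(retrieved - gold)          # irrelevant records it also pulled in
    fn = len(gold - retrieved)          # relevant records it missed
    tn = database_size - tp - fp - fn   # irrelevant records correctly excluded
    return {
        "sensitivity": tp / (tp + fn) if gold else 0.0,        # recall
        "specificity": tn / (tn + fp) if (tn + fp) else 0.0,
        "precision":   tp / (tp + fp) if retrieved else 0.0,
    }
```

Quantifying searches this way is what lets a filter be described as "of known performance effectiveness", and tuned towards sensitivity or specificity as needed.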
14. Usage and Evaluation of the Smart Searching modules
• Direct feedback sought from site
• Web usage statistics (Google Analytics)
• User survey conducted April, 2015
17. User Survey (April, 2015) - 50 responses
Occupation
Would you recommend this site to a colleague?
18. User Survey (April, 2015) - 50 responses
Do you think you would use this approach for testing?
Have you applied any of these techniques in your searching practice?
19. User Survey (April, 2015) - 50 responses
Some Representative Comments
Functioning of the Site
Systematic and methodical approach
Clear and easy to use (13 others similar to this)
Template applicable across disciplines
While the content is useful- the constant arrow moving would not be appealing to busy clinicians or medical librarians
Time constraints
Time is the major factor, followed closely by access to the subject experts
it can possibly save me some time as I spend a lot of time in my job training and assisting health researchers in building effective literature searches
Time restraints, level of information need does not usually require that level of sensitivity (several like this)
Very useful, however not sure if I would have time to test every search in a real life work situation.
20. User Survey (April, 2015) - 50 responses
Some Representative Comments
Value of the Site
Reassessing the way I approach things
The more knowledge/ideas we share about improving search techniques the more beneficial it is to the profession
I can see the value in being able to 'qualify' and measure my searching outcomes
Informative...I bet there are other librarians who, just like me, are not utilizing these techniques properly.
Adding those extra dimensions increases the robustness of our searching and helps to systematise the things we do
I tend to be more intuitive than systematic with my searches […] Reporting would force me to ensure consistency!
Seems great as a refresher for me but will also be really useful for staff training purposes
I don't think that librarians test their search strategy and I feel it is an important tool to argue our competence and relevancy, especially in private enterprise
It gives a measure of effectiveness that speaks for itself...numbers are extremely hard to dispute!
Quality of the information and instruction; Quiz to reinforce what just learnt
21. User Survey (April, 2015) - 50 responses
Two Interesting Comments
Keep this aimed at professional librarians. I've never seen a profession so bent on becoming obsolete. Professional librarians/expert searchers have unique skills that we should be promoting to incoming librarians and teaching the new generation of searchers.
I was not happy to see that you state that librarians are not subject experts. After many years of searching on specialist topics I would consider myself to have more knowledge about the thesaurus terms used by databases than most subject 'experts'.
22. Conclusion
• We developed this resource because we believe that searches are important and should be accorded the respect of a scientific approach.
• The resource appears to be well used (4,484 unique users in first year, from 72 countries)
• 29.55% of total sessions were from Australia in the first year
• Top six countries represent approx. 85% of all users in the first year (Australia, US, UK, Canada, Ireland, NZ)
• Bounce rate lower and session times longer amongst Australian users
• Responses to the survey were positive (81.63% would recommend to a colleague)
• Need more information about reasons for using – can largely only speculate beyond what the survey responses suggested
• We will maintain and aim to improve the resource – all suggestions and feedback welcomed
23. Acknowledgements
Thank you to:
• Colleagues at CareSearch and Flinders Filters (Raechel Damarell, Mikaela Lawrence, Yasmine Shaheem, Jennifer Tieman)
• Members of the Smart Searching Advisory Group
• Health Libraries Australia and Medical Director (formerly HCN) for the Health Informatics Innovation Award 2012 that supported the development of Smart Searching
Sarah Hayman
Flinders Filters, Flinders University
sarah.hayman@flinders.edu.au
Editor's Notes
Thank you
Hope this will fit with Raechel's talk – from the specific to the general in terms of search filters, and an overview of the Smart Searching modules we developed with the aim to improve and test searching effectiveness
Today this is what I aim to cover in the presentation.
Will not have time to go into search filters in detail, nor the content of the Smart Searching modules – however, for those coming to the workshop on Thursday there will be time to go into both of these in more detail.
I am also always happy to be contacted with any questions – my contact details will be at the end of the presentation
Don’t need to convince this audience of any of this – you all are well aware of both the growing importance and challenge of searching effectively. Just aiming to remind ourselves of the context in which we do this work – and why it matters so much.
The importance (in all fields) of finding and using evidence is growing rapidly, with increased recognition that decisions should be based on sound evidence. Also important to remember the potential impact of missing a piece of evidence – inadequate searching can result ultimately in adverse outcomes for the patient – we should all always remember this.
Key to finding this evidence is effective searching.
Alongside this imperative, the searching context is becoming more complex. The number of articles indexed is enormous and increasing. In the medical field, PubMed contains over 24 million citations with over 1 million entered in 2014. Effective searching requires an understanding of database mechanisms and the terminology (including associated thesauri) of each subject. Searchers need an understanding of the requirements of the end user: what is considered relevant and what are the levels of evidence?
Useful tools in the searcher’s armoury are search filters. At CareSearch and its associated project Flinders Filters we have now developed a number of topical or subject-based search filters. These are designed to search the medical literature – indeed all so far have been created to search Medline and PubMed, on particular topics. Looking quickly at our two websites, we can see the topics listed: go to the links and show the names.
These filters are freely available for all to use.
I’ve also shown here some links to international websites with search filters and information about them. If you are interested in search filters and want to learn more, it is well worth going to these sites and exploring. There are many different filters and different types of search filter.
So what is a search filter?
Search strategy – set of terms – that has been tested and its performance measured. The process is transparent and rigorous. It is a scientific approach to searching where the process is documented and defensible.
Important to understand that a search filter is built to perform in a particular database. It may be translated for use in another database but it will not necessarily have the same retrieval effectiveness in a different database.
Search filters can be methodology based – that is, they look for articles using a particular methodology, such as systematic reviews. The PubMed Clinical Queries are search filters looking for particular methodologies.
They can also be topic based, as are the ones we have created at CareSearch and FF. Our filters are on topics such as bereavement, palliative care or primary health care.
Search filters once created can be expressed as a URL if the database allows that. Our PubMed filters can be created this way and delivered as links on a web page to our users so they can simply click on a link to get real time search results. This is a very useful tool that you can actually use for your users too. Show PubMed searches eg bereavement
Just to note you will sometimes see the term Hedge used for search filter
Important to remember a search filter is tweaked to perform in a certain way. They may be highly sensitive (that is retrieve a very high percentage of relevant references) or highly specific, that is, not retrieve a high percentage of irrelevant references. Dialling up the sensitivity in a search will tend to suppress specificity and vice versa. Both types are important in different contexts and you will sometimes see two versions published of the same filter – one sensitive, the other specific. See e.g. the Haynes Nephrology filters. For a comprehensive search such as a systematic review you would want to use the sensitive filter; for a quick search for a busy clinician the specific filter is likely to be more useful.
Nature and purpose: search filter development is guided by an interest group. In the case of our search filters, we use an expert advisory group containing clinicians and researchers. Their needs and/or understanding of the concept shape the filter. It is useful to try to understand the purpose and context of a filter in order to know whether it is what you need to use. What is it designed to retrieve exactly? Is that what you are looking for?
You can critically appraise search filters. There are tools available and often there will be a published paper establishing the existence of the filter and documenting the methodology used to develop it and the validation that was undertaken.
McMaster: The purposes of the search filters are:
1. to enable health care providers to do their own clinical searches effectively and efficiently;
2. to help reviewers of published evidence concerning health care problems to retrieve all relevant citations;
3. to provide resources for librarians to help clinicians to construct their own searches; and
4. to provide input to the database producers about their indexing processes and the organization of their databases.
A simple way to use search filter is to look at it and at least use the terms in it as a starting point for your search. You can always supplement the search filter results with your own.
You can make search filters available to your users - link to them and even create your own topic searches using search filters that you can then publish on a web site or libguide.
Just a reminder again that search filters may not be suitable for a systematic review search. In any case, when doing sys rev searching you would always look in more than one place.
Here is an example of what a search filter looks like. This is the Heart Failure Search Filter, created by CareSearch. It is shown as a URL and also as a PubMed search string which is a little easier to read. It looks in the indexed part of PubMed using the equivalent of the Medline search filter but it also uses text words (natural language terms) in the second part of the filter which looks in the non indexed part. This second part is also tested for relevance when we develop the PubMed translation.
This is an outline of our search filter development process.
Talk briefly through each step.
It is a major piece of work which takes several months.
Several librarians work on these projects, and we found we were often talking to each other about how the techniques we use in this process have enhanced our own general literature searching and could be more widely applied. When we received the HLA/HCN Health Informatics innovation award in 2012, this was an opportunity to develop an online module exploring these ideas and suggesting adaptations of some of this filter development methodology. It is not at all the full methodology we use to develop a search filter, but it takes some elements and suggests how they could be applied in general searching, to enhance searches and provide evidence of their effectiveness. I want to make the important point here too that we know many librarians are already expert searchers, and we suspect you may already be doing many of the things we suggest in your own searching. These are some tips and tricks we think may systematise searching in a useful way, or be handy for training new searchers, and it may be new to consider how you can quantify your searching effectiveness.
The four points we chose to focus on in the search filter development process shown here are: the Expert Advisory Group (EAG), the Gold Standard Set (GSS), term identification and validation.
We have adapted and translated these four points as follows:
The EAG is essential to ensure the clinical usefulness of the search filter and to minimise any bias that we (as searching experts but not necessarily subject experts) might bring to the development process. EAG members advise on the scope of the filter, candidate search terms and possible sources of a representative gold standard set; they also test filter retrievals for relevance as part of the filter validation process. EAG becomes Subject Experts in our training modules.
The Gold Standard Set is a set of references representing the scope of the subject to be retrieved by the search, and externally confirmed as relevant to the topic. This set is divided into three subsets so that term identification, creation and validation can all be done within different sets of data; again, the aim here is to minimise any potential bias from building and testing in the same set. GSS becomes Sample Set in our training modules.
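The three-way division of the gold standard set can be sketched in a few lines of Python. This is a minimal illustration only: the function name, the random shuffle and the equal-thirds split are assumptions for the example, as the source does not specify how the division is done.

```python
import random

def split_gold_standard(references, seed=0):
    """Randomly divide a gold standard set into three subsets:
    one for term identification, one for building the filter,
    and one held back for validation, so that building and
    testing are never done in the same data."""
    refs = list(references)
    random.Random(seed).shuffle(refs)  # fixed seed for reproducibility
    third = len(refs) // 3
    return refs[:third], refs[third:2 * third], refs[2 * third:]

# Example: 90 reference IDs split into three subsets of 30
term_set, build_set, validation_set = split_gold_standard(range(90))
```

Splitting before any term analysis begins is what keeps the validation estimate honest.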
Term identification is the process of analysing the titles, abstracts and index terms of the references to identify textwords and controlled headings (usually MeSH terms) to be tested for their retrieval performance in the gold standard set. We felt term identification was self-explanatory, so it remains the same in our training modules.
Validation encompasses testing the search strategy within a subset of the gold standard set, the entire gold standard set, and often an external validation set, to arrive at a percentage figure for its retrieval performance. Sensitivity is the percentage of records known to be relevant (e.g. those in the gold standard set) that the strategy retrieves; precision is the percentage of the records retrieved by the strategy that are relevant (using relevance assessment by external reviewers). Validation we have simply called Testing in our training modules.
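The two measures can be expressed in a short Python sketch (illustrative only; the record IDs and counts are invented for the example):

```python
def sensitivity(retrieved, known_relevant):
    """Percentage of known-relevant records that the search retrieved."""
    known_relevant = set(known_relevant)
    found = known_relevant & set(retrieved)
    return 100 * len(found) / len(known_relevant)

def precision(retrieved, judged_relevant):
    """Percentage of retrieved records judged relevant by reviewers."""
    retrieved = set(retrieved)
    hits = retrieved & set(judged_relevant)
    return 100 * len(hits) / len(retrieved)

# Example: the search retrieves 50 records (IDs 0-49);
# 18 of the 20 gold standard records are among them,
# and external reviewers judge 30 of the 50 retrievals relevant.
retrieved = set(range(50))
gold = set(range(32, 52))       # 20 known-relevant; 18 overlap retrieved
relevant = set(range(30))       # 30 of the retrievals judged relevant
```

Here sensitivity is 18/20 = 90% and precision is 30/50 = 60%, the kind of figures reported when a filter is published.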
Won’t spend much time on the detail of the modules but just want to show them quickly to show the structure of the site. Each concept has been explained, then is illustrated with a scenario and then a short quiz is provided. It is all optional and you can navigate in and out of each part of the module without having completed the quiz if you wish.
For subject experts, the starting point in this process, we discuss what we mean by subject experts, who they might be and how to use them. Essentially using subject experts aims to capture terms and literature you may not yourself think of when undertaking a search – and like all of this model, is an attempt to minimise any bias that we as searchers might introduce ourselves, making the search less useful.
The sample set is our adaptation of the formal gold standard set. Again, an attempt to minimise bias and introduce objectivity by having a set of known relevance against which to test and then tweak your search.
In term identification we introduce methods we use to suggest candidate terms for your search filter – terms whose performance you can then test. There are tools you can use for frequency analysis which will give you an idea of terminology in the literature. These do throw up interesting terms such as “institutionalised elderly” which emerged when I was developing the residential aged care search filter – not a term I would have thought of myself.
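A minimal sketch of this kind of frequency analysis follows. It is not one of the actual tools used in filter development, and the example titles are invented; it simply shows how counting words and two-word phrases across titles can surface candidate terms such as "institutionalised elderly".

```python
from collections import Counter
import re

def top_terms(records, n=10, min_len=4):
    """Count word and two-word-phrase frequencies across a set of
    titles/abstracts to suggest candidate search terms."""
    counts = Counter()
    for text in records:
        # keep alphabetic words of a minimum length, lowercased
        words = [w for w in re.findall(r"[a-z]+", text.lower())
                 if len(w) >= min_len]
        counts.update(words)
        counts.update(" ".join(pair) for pair in zip(words, words[1:]))
    return counts.most_common(n)

records = [
    "Falls prevention in residential aged care facilities",
    "Quality of life of the institutionalised elderly",
    "Medication review in residential aged care",
]
```

Running `top_terms(records)` ranks "residential aged" and "aged care" highest, exactly the kind of recurring phrase you would then test against the gold standard set.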
Lastly testing is at the heart of this process – we look at why and how we should test our searches and suggest how we can then report on and use the results of our testing.
Have had some direct feedback, but on the whole very little.
Regularly check the web usage statistics and will report on them today. Also publish current Google usage stats on the site in automated link form.
Conducted a user survey in April this year and will report on this today
Note the change in numbers when the survey and then the reminder were sent out in April this year. The bounce rate also rose and session duration dropped, indicating more people are going to look but not necessarily staying long.
84 countries in total, top 18 shown here
Interesting to compare bounce rates
We had 50 unique responses to the user survey. These charts show answers to some of the questions that could be captured in chart form. 80% were librarians and, interestingly, the next highest number were other information professionals.
Have not shown sector breakdown here but can report that 45 were health sector; 1 education, 1 public library, 1 regulatory, 1 social sciences and 1 youth
No one in the survey said they would not recommend the site, though the one direct email I received said they thought they probably would not. This was a very positive response, with over 80% saying they would recommend it.
Over half said they have applied these techniques in their testing (some or all) – this may have been independent of the modules of course.
Have not been able to reproduce all 50 sets of comments here – these are a representative selection. I am very grateful to the people who took the time to send back their comments and suggestions.
I will be writing up the survey in the paper from this conference, to submit to the EBLIP journal, and welcome any further feedback, on the site itself or the survey.
These are some of the comments from users which indicate a strong interest in improving searching practice
I found these two comments very thought provoking. I have my own views about their points, but thought I would show them today to see what others think.
I certainly intended the site to be used by people other than librarians; I think we have a responsibility to ensure that all searches are done as well as they can be and teaching searching to users seems to me to be part of our role – but interested to hear what others think
In relation to the second point, I see what the person is saying; I would say however that we can certainly become experts in searching in a particular subject area but that is different from being an expert in the clinical practice of that area – and that is why when we develop search filters we use expert advisors. Again, interested in your views.
Hope this resource is and will continue to be useful
Can only speculate about the reasons for the uptake, beyond what we have been able to ascertain from the survey
Suspect there is a gap and an appetite for instructions for librarians themselves on testing and improving searches – especially free and online. Most of what I found was written by librarians for library users to help them with their searching.
Would love to know more about why and how people are using it
Please send feedback and comments so we can continuously improve it – and any ideas for further dissemination. Researchers as well as librarians? Happy for people to reuse and modify, with acknowledgement