This presentation was provided by Emma Warren-Jones of Scholarcy, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Presentation slides on Open Science and research reproducibility. Presented by Gareth Knight (LSHTM Research Data Manager) on 18th September 2018, as part of an Open Science event for LSHTM Week 2018.
Identifying and tracking research resources using RRIDs: a practical approach (dkNET)
In this presentation, you will learn (1) why you need to use Research Resource Identifiers (RRIDs), (2) what the Resource Identification Initiative is, (3) how dkNET.org supports RRIDs, and (4) what you can do with RRIDs.
This presentation was provided by Stephanie Roth of Temple University, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Presentation by Dr Steve McEachern, ADA, to the 'Unlocking value from publicly funded Clinical Research Data' workshop, cohosted by ARDC and CSIRO at ANU on 6 March 2019.
This presentation was provided by Bert Carelli of TrendMD, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
An overview of the National Institutes of Health's new rules that aim to improve the rigor and reproducibility of research, especially research involving animals.
This presentation was provided by Vincent Cassidy of The IET during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Introduction to the Research Integrity Advisor Data Management Workshop, Brisbane (ARDC)
Dr Jacobs' introduction to the RIA Data Management Workshop in Brisbane on 31 March 2017. The RIA Data Management Workshop series is a joint collaboration of the Australian Research Council, the National Health and Medical Research Council, the Australasian Research Management Society and the Australian National Data Service.
This presentation was provided by Helen Henderson of Ringgold, during the NISO at NASIG Pre-conference "Metadata in a Digital Age: New Models of Creation, Discovery, and Use," held on June 4, 2008.
This presentation was provided by Clara Llebot of Oregon State University, during the NISO hot topic virtual conference "Effective Data Management," which was held on September 29, 2021.
This presentation was provided by Patricia Payton of Proquest during the NISO webinar, Engineering Access Under the Hood, Part Two, held on November 15, 2017.
In early 2014, we asked science and social science researchers...
• What expectations do the terms publication and peer review raise in reference to data?
• What features would be useful to evaluate the trustworthiness, evaluate the impact, and enhance the prestige of a data publication?
CareSearch creates and publishes search filters so that palliative care clinicians have ready, reliable access to the best palliative care evidence. Presentation by Sarah Hayman and Jennifer Tieman to the Palliative Care Australia Conference, 2013.
June 18, 2014
NISO Virtual Conference: Transforming Assessment: Alternative Metrics and Other Trends
Assessing and Reporting Research Impact – A Role for the Library
- Kristi L. Holmes, Ph.D., Director, Galter Health Sciences Library, Northwestern University, Feinberg School of Medicine
University of Liverpool Library Researcher KnowHow session 2 of 3, presented by Michelle Maden PhD MA FHEA, postdoctoral research associate in evidence synthesis at the University of Liverpool, on 22nd November 2021.
Do you have responses to open-ended questions or want to use qualitative data to evaluate CE/QI interventions? Qualitative Analysis Boot Camp at the ACEHP 2013 meeting in San Francisco on 1 February has tools to get you started.
Palliative Care NSW Biennial State Conference
'Transforming our Landscapes'
13-15 October 2016
Broken Hill, NSW
By Christine Sanderson, Deb Rawlings, Deb Parker, Lauren Miller-Lewis, Jennifer Tieman
The Dying2Learn MOOC participants were invited to create a message that can be shared with the community as part of Dying to Know Day. The message is their statement about what being deathwise is. We have taken a selection of these posters to create a slideshow. We hope you enjoy them.
How CareSearch uses social media to promote palliative care and interact with consumers and health professionals. Originally presented at the CNSA Winter Congress, July 2012.
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers will present on related topics such as vector databases, LLMs, and managing data at scale. The intended audience of this group includes roles like machine learning engineers, data scientists, data engineers, software engineers, and PMs. This meetup was formerly the Milvus Meetup, and is sponsored by Zilliz, maintainers of Milvus.
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2...
M Capital Group (“MCG”) expects demand and the evolution of supply to be shaped by institutional investment rotating out of offices and into work from home (“WFH”), while the need for data storage keeps expanding as global internet usage grows, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as advancing cloud services and edge sites, allowing the industry to expect strong annual growth of 13% over the next 4 years.
While competitive headwinds remain (exemplified by the recent second bankruptcy filing of Sungard, which blames “COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services”), the industry has made key adjustments, and MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment, will drive market momentum forward. The continuous injection of capital by alternative investment firms, as well as growing infrastructure investment from cloud service providers and social media companies, whose revenues are expected to grow more than 3.6x by value by 2026, will likely help propel data center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
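A minimal sketch of the automated data validation idea above, in Python. The field names and rules here are purely illustrative, not a real schema:

```python
# Minimal automated data-quality check: validate incoming records against
# simple rules before they enter the pipeline, so errors are caught at
# the source. Field names ("id", "age") and rules are hypothetical.

def validate_record(record):
    """Return a list of quality issues found in one record."""
    issues = []
    if not record.get("id"):
        issues.append("missing id")
    age = record.get("age")
    if age is not None and not (0 <= age <= 120):
        issues.append("age out of range")
    return issues

def validate_batch(records):
    """Map each problem record (by id, or index if id is missing)
    to its issues; clean records are omitted from the report."""
    report = {}
    for i, rec in enumerate(records):
        issues = validate_record(rec)
        if issues:
            report[rec.get("id") or i] = issues
    return report
```

In practice such checks would be generated from a shared schema or data contract rather than hand-written per field, so the rules stay in sync with the data catalog.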
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. For more details, visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
'Workshop on Smart Searching: Search Filters and Expert Topic Searches', by Sarah Hayman and Raechel Damarell, Flinders University
1. Workshop: Smart Searching: Search Filters and Expert Topic Searches
Sarah Hayman and Raechel Damarell
8th International Evidence Based Library and Information Practice Conference
Brisbane, July 2015
2. Smart Searching
• Introduction to search filters: what are they, where to find them, how to use them
• Development process for a search filter
• Using the Smart Searching modules
• Tips, tricks and tools
3. Searching Well
Why important?
• Need for evidence
• Impact of missing evidence
• Volume of information
• Complexity of sources
4. A search filter …
• is an experimentally developed and validated search strategy of known performance effectiveness
• is designed for a particular bibliographic database
• can be methodology- or subject-based
• may also be called a “hedge”
• may be sensitive or specific
• can be expressed as a URL and embedded in a web page for quick, reliable access to evidence
5. Sensitivity and precision
• Sensitivity (recall) = the proportion of relevant articles retrieved
– 100% sensitivity occurs when all relevant citations in a dataset are retrieved
• Precision = the number of relevant articles retrieved as a proportion of all articles retrieved
– 100% precision occurs when all citations retrieved are relevant
6. Specificity
• Specificity = the proportion of irrelevant citations not retrieved
– 100% specificity occurs when all irrelevant citations in a dataset are excluded from the search results
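The three metrics defined on slides 5 and 6 can be expressed in a few lines of code. A sketch, treating the dataset, the relevant set, and the retrieved set as collections of citation identifiers (the function name and scenario are illustrative):

```python
def search_metrics(dataset, relevant, retrieved):
    """Sensitivity, precision and specificity for one search,
    given sets of citation identifiers."""
    dataset, relevant, retrieved = set(dataset), set(relevant), set(retrieved)
    hits = retrieved & relevant            # relevant citations retrieved
    irrelevant = dataset - relevant
    sensitivity = len(hits) / len(relevant)    # a.k.a. recall
    precision = len(hits) / len(retrieved)
    specificity = len(irrelevant - retrieved) / len(irrelevant)
    return sensitivity, precision, specificity
```

For example, in a dataset of 1,000 citations of which 100 are relevant, a narrow search that retrieves 10 citations, all of them relevant, scores 100% precision but only 10% sensitivity.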
7. Balancing these metrics
• Generally speaking, search filters will attempt to maximise precision without jeopardising sensitivity
• Systematic review searches aim to maximise sensitivity without jeopardising precision
• Often different versions of the same search filter are on offer (e.g. Clinical Queries)
8. Clinical Queries Therapy Search Filter
Filter version | Sensitivity | Specificity | Precision
Therapy (sensitive/broad) | 99% | 70% | 10%
Therapy (specific/narrow) | 93% | 97% | 54%
PubMed Help [Internet]. Bethesda (MD): National Center for Biotechnology Information (US); 2005-. [Updated 2015 Jun 16]. Available from: http://www.ncbi.nlm.nih.gov/books/NBK3827/
9. Using Search Filters
• Nature and purpose of the search filter
• Not designed for systematic review searching
• Use the terms in a search filter as a starting point for your own searches
• Critically appraise the search filter
• Make search filters available to your users as useful tools for accessing evidence reliably
– Embed in webpages/libguides
– Autoalerts
10. Search Filters
• CareSearch
• Flinders Filters
• Some search filters worldwide
– ISSG Search Filters Resource
– BMJ Clinical Evidence
– Canadian Agency for Drugs and Technologies in Health (CADTH)
– HL WIKI International
– McMaster University. Health Information Research Unit
– Scottish Intercollegiate Guideline Network (SIGN)
11. Example: the Heart Failure Search Filter
As a URL:
http://www.ncbi.nlm.nih.gov/pubmed?term=((heart+failure[tw]+OR+ventricular+dysfunction,+left[mh:noexp]+OR+cardiomyopathy[tw]+OR+left+ventricular+ejection+fraction[tw])+AND+Medline[sb])+OR+((heart+failure[tw]+OR+left+ventricular+dysfunction[tw]+OR+cardiomyopathy[tw]+OR+left+ventricular+ejection+fraction[tw]+OR+cardiac+resynchronization[tw]+OR+LV+dysfunction[tw]+OR+left+ventricular+systolic+dysfunction[tw]+OR+left+ventricular+diastolic+dysfunction[tw]+OR+cardiac+failure[tw])+NOT+medline[sb])+AND+english[la]
As a PubMed query:
14. Smart Searching
Expert Advisory Group (EAG) → Subject experts
Gold Standard Set → Sample set
Term identification → Term identification
Validation → Testing
15. The Smart Searching Modules
https://sites.google.com/site/smartsearchinglogical/home
16. Acknowledgements
Thank you to:
Our colleagues at CareSearch and Flinders Filters (Mikaela Lawrence, Yasmine Shaheem, Jennifer Tieman)
Health Libraries Australia and Medical Director (formerly HCN) for the Health Informatics Innovation Award 2012 that supported the development of Smart Searching
Raechel Damarell and Sarah Hayman, Flinders University
raechel.damarell@flinders.edu.au
sarah.hayman@flinders.edu.au
Editor's Notes
Want to be informal and relaxed – we are all colleagues – we are not necessarily any more expert than you in searching – chance for us to share our ideas and experience with you and get some feedback – and hear about your ideas and experience and searching tips.
Please interrupt with any questions – don’t sit there wondering!
Will aim to make it interactive, give you a chance to do some hands on exercises.
Today this is what we aim to cover in the workshop. We want you to ask questions and we will try to work through all the aspects and demonstrate examples of what we mean.
There will be a chance to interact with some data and play with the techniques. Will also give you links to useful tools and websites.
We are also always happy to be contacted with any questions – contact details will be at the end of the presentation and on the workshop website.
Don’t need to convince this audience of any of this – you all are well aware of both the growing importance and challenge of searching effectively. Maybe you can suggest other ideas, examples here?
The importance (in all fields) of finding and using evidence is growing rapidly, with increased recognition that decisions should be based on sound evidence. Also important to remember the potential impact of missing a piece of evidence – inadequate searching can ultimately result in adverse outcomes for the patient – we should all always remember this.
Key to finding this evidence is effective searching.
Alongside this imperative, the searching context is becoming more complex.
The number of articles indexed is enormous and increasing. In the medical field, PubMed contains over 24 million citations, with over 1 million entered in 2014. Effective searching requires an understanding of database mechanisms and the terminology (including associated thesauri) of each subject. Searchers need an understanding of the requirements of the end user: what is considered relevant and what are the levels of evidence?
So what is a search filter? Not PubMed sidebar limits.
Search strategy – set of terms – that has been tested and its performance measured. The process is transparent and rigorous. It is a scientific approach to searching where the process is documented and defensible. Evidence based approach.
Important to understand that a search filter is built to perform in a particular database. It may be translated for use in another database but it will not necessarily have the same retrieval effectiveness in a different database.
Search filters can be methodology-based – that is, they look for articles using a particular methodology, such as systematic reviews. The PubMed Clinical Queries are search filters looking for particular methodologies.
They can also be topic-based, as are the ones we have created at CareSearch and Flinders Filters. Our filters are on topics such as bereavement, palliative care or primary health care.
Search filters once created can be expressed as a URL if the database allows that. Our PubMed filters can be created this way and delivered as links on a web page to our users so they can simply click on a link to get real time search results. This is a very useful tool that you can actually use for your users too. Show PubMed searches eg bereavement
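The note above describes delivering a PubMed filter as a clickable URL on a web page. The encoding step can be sketched in a few lines; the short example query is a fragment for illustration, not a published filter, and the pubmed.ncbi.nlm.nih.gov entry point is an assumption about the current PubMed interface:

```python
from urllib.parse import quote_plus

def pubmed_filter_url(query):
    """Turn a PubMed search string into a shareable URL that runs the
    search when clicked (e.g. embedded in a web page or libguide)."""
    return "https://pubmed.ncbi.nlm.nih.gov/?term=" + quote_plus(query)

# Illustrative fragment only; real filters are much longer.
url = pubmed_filter_url("(heart failure[tw] OR cardiomyopathy[tw]) AND english[la]")
```

The encoding turns spaces into `+` and brackets into percent-escapes, which is why published filter URLs like the Heart Failure example look so dense.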
Just to note you will sometimes see the term Hedge used for search filter
Important to remember a search filter is tweaked to perform in a certain way. They may be highly sensitive (that is, retrieve a very high percentage of relevant references) or highly specific, that is, not retrieve a high percentage of irrelevant references. Dialling up the sensitivity in a search will tend to suppress specificity and vice versa. Both types are important in different contexts and you will sometimes see two versions published of the same filter – one sensitive, the other specific. See e.g. the Haynes Nephrology filters. For a comprehensive search such as a systematic review you would want to use the sensitive filter; for a quick search for a busy clinician the specific filter is likely to be more useful.
Perfect sensitivity = 100 relevant citations in a database and you retrieve all 100.
Precision = Possible to have very high precision yet very low sensitivity. Again there may be 100 relevant citations in a database. Say you retrieve only 10 citations and they are all relevant – you have 100% precision. Sensitivity however is only 10/100 = 10%.
Opposite is true – you would have 100% sensitivity if you retrieve all 100 relevant citations in the database. However, if you retrieve 1000 citations in order to capture the 100 relevant ones though, precision only 10%
Filter methodology makes it possible to measure sensitivity, but real world searching can’t. Don’t know the number of relevant citations in the database so denominator is an unknown. You can, however, measure precision of your own searches. So searching can only ever be a blind attempt to maximise sensitivity while enhancing precision.
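The two scenarios in the notes above can be checked numerically (the counts are the illustrative ones from the notes, not real search results):

```python
# Dataset with 100 relevant citations, as in the notes above.
relevant_in_db = 100

# Narrow search: retrieve 10 citations, all of them relevant.
retrieved, relevant_retrieved = 10, 10
precision = relevant_retrieved / retrieved            # 100% precision
sensitivity = relevant_retrieved / relevant_in_db     # only 10% sensitivity

# Broad search: retrieve 1000 citations to capture all 100 relevant ones.
retrieved, relevant_retrieved = 1000, 100
precision_broad = relevant_retrieved / retrieved        # precision drops to 10%
sensitivity_broad = relevant_retrieved / relevant_in_db # 100% sensitivity
```

Note that both calculations need `relevant_in_db`, the total number of relevant citations in the database – exactly the denominator that, as the note says, is unknown in real-world searching, which is why only precision can be measured for your own searches.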
Just a reminder again that search filters may not be suitable for a systematic review search. In any case, when doing systematic review searching you would always look in more than one place.
Note the term search filter is widely used and does not always mean the specific search filter that we refer to in the bibliographic searching context. PubMed calls the limits on the sidebar “filters” – some are fully developed search filters, others are simply date limits. If you put the term search filter into Google, many of the results coming back will be about search filter work in IT.
Nature and purpose: search filter development is guided by an interest group. In the case of our search filters, we use an expert advisory group containing clinicians and researchers. Their needs and/or understanding of the concept shape the filter. It is useful to try to understand the purpose and context of a filter in order to know whether it is what you need to use. What is it designed to retrieve exactly? Is that what you are looking for?
You can critically appraise search filters. There are tools available and often there will be a published paper establishing the existence of the filter and documenting the methodology used to develop it and the validation that was undertaken.
McMaster: The purposes of the search filters are:
1. to enable health care providers to do their own clinical searches effectively and efficiently;2. to help reviewers of published evidence concerning health care problems to retrieve all relevant citations;3. to provide resources for librarians to help clinicians to construct their own searches; and4. to provide input to the database producers about their indexing processes and the organization of their databases.
A simple way to use a search filter is to look at it and at least use the terms in it as a starting point for your search. You can always supplement the search filter results with your own.
You can make search filters available to your users - link to them and even create your own topic searches using search filters that you can then publish on a web site or libguide.
Useful tools in the searcher’s armoury are search filters. At CareSearch and its associated project Flinders Filters we have now developed a number of topical or subject-based search filters. These are designed to search the medical literature on particular topics – indeed, all so far have been created to search Medline and PubMed. Looking quickly at our two websites, we can see the topics listed: go to the links and show the names.
These filters are freely available for all to use.
I’ve also shown here some links to international websites with search filters and information about them. If you are interested in search filters and want to learn more, it is well worth going to these sites and exploring. There are many different filters and different types of search filter.
Here is an example of what a search filter looks like. This is the Heart Failure Search Filter, created by CareSearch. It is shown as a URL and also as a PubMed search string which is a little easier to read. It looks in the indexed part of PubMed using the equivalent of the Medline search filter but it also uses text words (natural language terms) in the second part of the filter which looks in the non indexed part. This second part is also tested for relevance when we develop the PubMed translation.
This is an outline of our search filter development Process.
Talk briefly though each step.
It is a major piece of work which takes several months.
Several librarians work on these projects, and we found we were often talking to each other about how the techniques we use in this process have enhanced our own general literature searching and could be more widely applied. When we received the HLA/HCN Health Informatics Innovation Award in 2012, this was an opportunity to develop an online module exploring these ideas and suggesting adaptations of some of this filter development methodology. It is not at all the full methodology we use to develop a search filter, but it takes some elements and suggests how they could be applied in general searching, to enhance searching and provide evidence of searches’ effectiveness.
We also want to make the important point here that we know many librarians are already expert searchers, and we suspect you may already be doing many of the things we suggest in your own searching. These are some tips and tricks we think may usefully systematise searching, or be handy for training new searchers – and it may be new to consider how you can quantify your searching effectiveness.
The four points we chose to focus on in the search filter dev process shown here are: the EAG, the GSS, Term identification and Validation.
We have adapted and translated four key elements of the search filter development methodology as follows:
The EAG becomes Subject Experts in our training modules.
The Gold Standard Set becomes Sample Set in our training modules.
Term identification we felt was self explanatory and remains the same in our training modules.
Validation we have simply called Testing in our training modules.
Now go to the modules and work through the examples there.