I’m presenting today about citations in scholarly literature: in particular, how the foundations on which the importance of citation practice is built are shifting in the digital age, and what the impact of this shift is. My subtitle gives a nod to the rich history of “Considered Harmful” papers, but it isn’t just an affectation; it has a serious core, reflecting the seriousness of citation-based metrics.
There is a lot of scholarly literature, and it’s published in lots of places. There are approximately 50,000 journals, containing in excess of 50 million articles; within those there are many magnitudes more citations. Printed out, the list of citations would easily create a paper stack high enough to reach the moon. The reason to make this point is to understand two things. First, keeping track of all this data is not an easy task: the computer used to navigate the Apollo missions to the moon would NOT have managed it. Second, analyzing citations allows the cream of publications to reach the top and be more accessible. It sorts the wheat from the chaff. As such it is a crucial tool for everyone involved in the academic community, and for those outside it as well.
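As a rough back-of-envelope of these magnitudes (the 30-references-per-article figure is my own illustrative assumption; the real average varies widely by field):

```python
# Back-of-envelope scale of the citation graph.
# Figures for journals/articles come from the talk; refs_per_article is assumed.
journals = 50_000          # approximate number of scholarly journals
articles = 50_000_000      # Jinha's (2010) estimate of articles in existence
refs_per_article = 30      # assumed average reference-list length (illustrative)

total_citations = articles * refs_per_article
print(f"~{total_citations:,} citation links")             # ~1,500,000,000 citation links
print(f"~{total_citations // journals:,} per journal")    # ~30,000 per journal
```

Even under conservative assumptions, the citation graph runs to billions of edges, which is why tracking it is non-trivial.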
This is my way of explaining how citations derive their utility. When writing a new article, old text is cited to show where the knowledge derives from, and to uphold intellectual honesty. Then the flow splits between what happens, as a result of publication, for the new material and for the cited material. Finally we see how the citations are used in the real world, and what their impact is: the ripples of the citation. Last but not least, the published work becomes part of the huge repository of knowledge made up by the 50,000 journals and 50 million articles. This diagram is a cycle, and you can assume that it will repeat, with the newly accessible literature probably becoming one of the ‘prior works’ that will be cited in the future… now a little history about the inventor of citation indexing, Garfield.
Eugene Garfield invented the systems from which almost all modern citation indexing practices stem. This, however, is not Garfield….. This is!
Although there are many other scholars of note in the field, Eugene Garfield (87) deserves a special mention (not just because of the moustache). His 1955 work, “Citation Indexes for Science”, is not his most cited work, but it IS the root of modern citation-based bibliometrics. Garfield realized that citations could be used as a tool for meta-reflection on literature, as well as just a way of demonstrating rigor. The quote demonstrates how Garfield conceptualized this, and is really the inspiration behind the diagram I showed previously. The key point is that citations, although seemingly quite ethereal, are actually the building blocks of rigor-based knowledge, and without them most literature would lose value… as well as being important, citations are really cool; you can do some pretty funky stuff with them. Here is one example:
I characterize Garfield’s work as the start of a renaissance in academic publishing. Although the work was hard, manual, and offline, it represented a sea change: for the first time researchers could view detailed indexes of who cited whom. In fact they could request any page from any journal that was in the index, and the ISI would dispatch these within 24 hours. Although not as quick, or as large, the indexes provided the same functionality that we enjoy today with online indexes. Through the 60s, 70s, 80s, and 90s the ISI obviously computerized their operations somewhat, but the work remains based on the theory that Garfield developed. Stephen Lock, editor of the BMJ, identified in the early 90s that fields tend to split every 10 years; with this in mind he felt that the existing paradigm for “journalology” (as he called it) wouldn’t remain viable. Lock hadn’t predicted the world wide web, but within a couple of years of his comments, Tim Berners-Lee had begun to work on web standards, and the web was growing extremely quickly. Although the web facilitated online access to indexes, it also meant that citations became more numerous and more diverse: something that was good for knowledge, but maybe not good for being able to understand and categorize knowledge.
As the web matured, and internet connections became more ubiquitous and faster, the seeds for a digital enlightenment were sown. People realized that, by using digital technology correctly, a vast amount of knowledge could be shared where it otherwise wasn’t. In 1997 Robert Cameron theorized about a comprehensive, universal citation database: “Imagine a universal bibliographic and citation database linking every scholarly work ever written - no matter how published - to every work that it cites and every work that cites it. Imagine that such a citation database was freely available over the Internet and was updated every day with all the new works published that day, including papers in traditional and electronic journals, conference papers, theses, technical reports, working papers, and preprints. Such a database would fundamentally change how scholars locate and keep current with the works of others.” The vision had been had. It was understood that citation indexing could be better, and the implications of doing it better were understood. Because publications were beginning to make their way onto the web as electronic documents, indexing could happen on a much larger scale. CiteSeer came into being in 1998: the first fully autonomous citation indexing system (working in the same way that Google Scholar does).
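To give a flavour of what “autonomous” citation indexing involves, here is a minimal sketch of the reference-string parsing such systems automate. The pattern and the sample string are my own illustrative assumptions, not CiteSeer’s actual algorithm (real indexers use far more robust, machine-learned parsers):

```python
import re

# A toy reference string in a common "Authors, Year. Title." shape.
ref = "Garfield, E., 1955. Citation indexes for science. Science, 122(3159), pp.108-111."

# Illustrative pattern only: lazily match authors up to a 4-digit year,
# then take the title as everything up to the next full stop.
pattern = re.compile(r"(?P<authors>.+?),\s+(?P<year>\d{4})\.\s*(?P<title>[^.]+)\.")
m = pattern.match(ref)
if m:
    print(m.group("authors"))  # Garfield, E.
    print(m.group("year"))     # 1955
    print(m.group("title"))    # Citation indexes for science
```

An autonomous indexer repeats this kind of extraction across millions of documents, then links each parsed reference to the record it points at; no human cataloguer is in the loop.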
The enlightenment didn’t last too long though. I think we’re currently in a period that is more closely aligned to the inherently subjective Romantic period. Cameron’s vision of a unified database didn’t even come close to emerging: in fact things are at their most de-standardized and most complex. Rather like this intersection, the system has been designed to be maximally efficient, but it isn’t easily understood; signposts and satnav are a must. Although Google Scholar is the most comprehensive index we’ve ever known, it has the highest noise-to-signal ratio, but we still use it (subjectively assuming it’s the right thing to do). Other indexes offer high quality, but are more limited in scope (and not open). Crowd-sourced datasets (such as those generated by Zotero and Mendeley) occupy a middle ground, but are by no means complete. Each of these indexes/datasets can be used to generate metrics and to carry out literature searches, but they’re opaque, each only provides one kind of measure, and each one is different. Given that metrics are used to decide careers and affect lives, this is a scary prospect (as is the fact that metrics are quite easily gameable). It seems the academy is happily, and quietly, living subjectively in a world that likes to speak in objective and absolute terms (and for the most part acts as if it is objective). This is really where the title of the presentation comes from.
Over the last 2 or 3 years the field of altmetrics has emerged. The altmetrics movement suggests that, given the shortcomings of citation-based metrics, other factors should be considered. These include data on the web such as social network feeds, download counts, discussions, blog posts and so on. Although these are about as far as you can get from the inherently quantitative citation-based metrics, it is arguable that measures of this kind are essential in order to properly understand research (and impact). Altmetrics is a new idea, and further research is necessary to understand how it and citation-based metrics can be used together.
I’m viewing this landscape through the lens of communities of practice, and theory around interdisciplinary learning. Citations used to be a way of sharing knowledge within quite distinct fields. Technology and the web have facilitated a much richer landscape where learning (and citation) occurs across fields much more dynamically than it used to. Communities of practice literature suggests that all learning takes place in a community in some way, and identifies the types of association that bring about communities (engagement, imagination, alignment). Within these types of association, current paradigms for making connections are quite constrained, and really only operate through one of them: alignment. This is also underlined by work that studied how research in a given country, published in the native language, was very much siloed, and therefore at a disadvantage. By studying and understanding some of the issues with traditional metrics, built upon the Garfield legacy, we can begin to understand how to make interdisciplinary learning more fertile and avoid the silos that traditional citations create, and therefore we can encourage serendipity and richer learning. The key factors here are the shortcomings of traditional citation practices, the burgeoning field of altmetrics, and how communities of practice can benefit from using one to address the other. This leads on to what I think are the key research questions.
Jl high401 special topics presentation (inc notes)
OUT OF SIGHT, OUT OF MIND
Citations considered harmful
The cyclical hierarchy of citations
Cite: show derivation; uphold intellectual honesty
Publish: writing peer-reviewed; work indexed; cited works receive “+1”
Use: impact on author; impact on cited works; impact on venues
Cycle: prior work is cited in new writings → new writing is made “public” → accessible to future literature searches → cited work gets a “+1” → effect on JIF, H-index, i10-index, etc.
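Since the cycle ends at metrics like the h-index and i10-index, here is a minimal sketch of how those two are actually computed from a list of per-paper citation counts (the counts below are invented for illustration):

```python
def h_index(citation_counts):
    """Largest h such that h of the papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
    return h

def i10_index(citation_counts):
    """Number of papers with at least 10 citations (as used by Google Scholar)."""
    return sum(1 for cites in citation_counts if cites >= 10)

papers = [48, 22, 10, 9, 4, 4, 1, 0]  # invented per-paper citation counts
print(h_index(papers))    # 4
print(i10_index(papers))  # 3
```

Note how much the two summaries disagree about the same record; each index collapses the citation graph into a single number in a different way, which is part of the problem the talk describes.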
Theoretical lens for analysis of the landscape
Peer-reviewed literature; citation-based metrics; altmetrics; accessible communities for interdisciplinary learning
Research questions
• Are there barriers to changing practices around citations?
• Can using qualitative inputs and measures to describe research, alongside statistics-based measures, derive extra value in terms of stimulating communities of practice?
• Will further development in the field of (alt)metrics, and a conceptual framework designed specifically for understanding how to measure impact, result in a better HCI experience for both writers and analysts?
References
Blackwell, A. et al., 2010. Creating value across boundaries. Available at: http://www.nesta.org.uk/library/documents/creating_value_across_boundaries_may10.pdf.
Davis, P.M. & Cohen, S.A., 2001. The effect of the Web on undergraduate citation behavior 1996–1999. Journal of the American Society for Information Science and Technology, 52(4), pp.309–314. Available at: http://doi.wiley.com/10.1002/1532-2890%282000%299999%3A9999%3C%3A%3AAID-ASI1069%3E3.0.CO%3B2-P.
EASE, 2007. EASE statement on inappropriate use of impact factors. European Association of Science Editors, pp.1–2. Available at: http://www.ease.org.uk/publications/impact-factor-statement [Accessed March 13, 2013].
Garfield, E., 2012. A Century of Citation Indexing. Collnet Journal of Scientometrics and Information Management, (March). Available at: http://www.tarupublications.com/journals/cjsim/Abstract/CJSIM61_01_Abstract.pdf [Accessed March 12, 2013].
Garfield, E., 1998. The impact factor and using it correctly. Der Unfallchirurg, 101(6), pp.413–414. Available at: http://www.garfield.library.upenn.edu/papers/derunfallchirurg_v101(6)p413y1998.pdf [Accessed March 2, 2013].
Garfield, E., 2006. Citation indexes for science. A new dimension in documentation through association of ideas. 1955. International Journal of Epidemiology, 35(5), pp.1123–7; discussion 1127–8. Available at: http://www.ncbi.nlm.nih.gov/pubmed/16987841.
Garfield, E. & Sher, I.H., 1984. Experimental Citation Indexes to Genetics with Special Emphasis on Human Genetics. Essays of an Information Scientist, 7, pp.515–522.
Jinha, A.E., 2010. Article 50 million: an estimate of the number of scholarly articles in existence. Learned Publishing, 23(3), pp.258–263. Available at: http://openurl.ingenta.com/content/xref?genre=article&issn=0953-1513&volume=23&issue=3&spage=258.
Priem, J. et al., 2010. altmetrics: a manifesto. Available at: http://altmetrics.org/ [Accessed March 13, 2013].
Priem, J. & Hemminger, B.H., 2010. Scientometrics: The state of the art. First Monday, 15(7). Available at: http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2874/2570.
Wenger, E., Communities of Practice: A brief introduction. Available at: http://www.ewenger.com/theory/ [Accessed March 1, 2013].
Wenger, E., 2000. Communities of Practice and social learning systems. Organization, 7(2), pp.225–246. Available at: http://org.sagepub.com/content/7/2/225.short [Accessed March 4, 2013].