The document discusses an institutional policy portfolio approach for scholarly publication and dissemination. It advocates that institutions require faculty to assign limited rights to the institution and deposit works in an institutional repository. Institutions should support open access publishing through direct and indirect funding. A portfolio approach should constrain funding by evaluating how faculty publication choices impact dissemination goals of accessibility, rapid access, discoverability, reusability, cost effectiveness, quality assurance and prestige.
In order to drive innovation in your organization, you have to be laying the tracks and investing in the engine. This presentation is the first in a series of four webinars on building a framework for innovation in your organization.
Finding a goldmine of natural history illustrations within BHL texts: the Ar...Trish Rose-Sandler
The Biodiversity Heritage Library (BHL) has now achieved a critical mass of digitized historic texts – over 41 million pages and counting. The BHL portal can be searched by several access points including title, author, subject, and scientific name. But what is largely hidden and entirely unsearchable are the millions of natural history illustrations found within the BHL books and journals. These visual resources, which include drawings, paintings, photographs, maps and diagrams, represent work by some of the finest botanical and zoological illustrators in the world, including the likes of John James Audubon, Georg Dionysius Ehret, and Pierre Redouté. Many of the illustrations are the first recorded descriptions of much of the world’s biota, providing the scientific foundation for contemporary taxonomic research and conservation assessments. Some of them are the only verifiable record of an organism and its existence on Earth, given changing global climate patterns and the rapid loss of natural habitat for many species. Audiences for these illustrations also cross a variety of disciplines and include: biologists, artists, historians, illustrators, graphic designers, archivists, educators, students, and citizen scientists.
In 2012, the Missouri Botanical Garden was awarded a grant from the National Endowment for the Humanities to support a project called The Art of Life: Data Mining and Crowdsourcing the Identification and Description of Natural History Illustrations from the Biodiversity Heritage Library (BHL). This talk will discuss the Art of Life objectives and current status. It will go into detail about the algorithms and schema designed for finding which pages contain illustrations and describing the subsequent output. Finally the talk will discuss the project’s benefits for the scientific community such as improving access to a significant collection of public domain images related to biodiversity.
Managing a (different) Data Deluge - SPARC OA conferenceCameron Neylon
Presentation from the Implementation Panel of the SPARC OA conference in Kansas City.
The talk discusses the challenges that arise when Open Access publishing rises to be a majority of scholarly publishing. Different systems are required to manage payments, metadata transfer and funder compliance for institutions, researchers, funders and publishers.
From research life cycle to networks: The role of the libraryCameron Neylon
Google for "research life cycle" and you'll find a million images. Everyone has their own cycle, not all of them compatible. In this talk I argue that we need to move from a cycle conception of research information flows towards one based on networks. The library has the skills and values to act as a professional guide to this territory.
A full-day workshop given in Belle Vernon PA on May 1st, 2009. In addition to the formal presentation, there was a time for participants to explore and play with a variety of gadgets that they might want to have in their own libraries.
Making and the Commons, for Europeana's "European Cultural Commons" conferenc...Michael Edson
Keynote given at Europeana's European Cultural Commons conference in Warsaw Poland, October 12, 2011.
A video of this talk from Warsaw is at http://youtu.be/RSaLnHlN4gQ
A full text version of the talk (with footnotes and hyperlinks) is at http://www.slideshare.net/edsonm/museums-and-the-commons-helping-makers-get-stuff-done-6779050
Upcycling a Schol Comm Unit: Building Bridges with Creativity, Reallocations,...NASIG
The Scholarly Communication Unit of the David L. Rice Library at the University of Southern Indiana started not with a bang, but with a lateral transition. Over the following two years, the unit has focused on creatively developing scholarly communication competencies within and outside the library despite limited resources, and this session will highlight ways in which other libraries facing similar limitations can still provide quality services to their institutions. The knowledge and skills necessary to build the scholarly communication programs have been culled from across the library with strategic reconsideration of job lines and descriptions. Utilizing affordable professional development activities has deepened our ability to support scholarly communication activities on campus. The realignment of positions with existing personnel has also enabled us to leverage existing relationships to produce outreach activities that include our faculty advocates. Similarly, the institutional knowledge within the library and our relationships across campus have allowed us to pursue a particularly creative approach to open access funding that does not require a new line of money from the university. Our approach to scholarly communication services has fundamentally been as a public service that requires innovative problem-solving: identifying and enhancing competencies within the library so that we can successfully take programs outside the library, and strategically reallocating resources to build a Scholarly Communication Unit that serves our entire campus.
Peter Whiting
Scholarly Communication Librarian, University of Southern Indiana
Andrea Wright
Assistant Director and Head of Public Services, University of Southern Indiana
Research Excellence is a Neo-Colonial AgendaCameron Neylon
Talk given at the On Think Tanks meeting in Geneva in February 2019. Discusses the way in which research excellence is constructed and tends to 'internationalise' networks. Using the Sabato-Botana triangle as a model it argues for the importance of localism and the need for contextualised conceptions of excellence if research is to deliver value for the communities that support it.
Network Enabled Research: Connectivity, groups and growth in the production o...Cameron Neylon
We often talk about “research networks” for projects. Our measures of research quality are often based on networks of citations. Social media networks are increasingly important in the internal and external communications of research. Usually we think of these as external technologies that have affected how we do things: social technologies of funding intended to drive collaboration, data collection technologies that let us consider not just one link between articles but the characteristics of the whole system, communications technologies with new possibilities. But to think of these as external effects is to miss the fact that the networks have always been there. What has changed is their density and interconnection. We can actually turn the question around. Rather than ask what impact social media networks have had on research, we should ask what changes were occurring that required something like social media to be developed. For science to continue growing, it needs more complex and larger networks to be formed. What are the characteristics of systems that support that? How do we design scholarship as networks so that it can continue to grow?
Excellence is a neo-colonial agenda...and what we can do aboutCameron Neylon
Slides from a keynote at the meeting 'Perspectives of Research Excellence in the Global South' - argues that considering research excellence as a neo-colonial agenda helps to defuse the dangers that a North Atlantic attitude to 'biblio-excellence' creates but also offers opportunities for developing and transitional countries to take a leadership role on the future of research policy
Will we still know ourselves? Identity and Community in a Transforming Knowle...Cameron Neylon
Keynote given at the NFAIS 2018 meeting in Alexandria, Virginia, USA on 28 February 2018
The world of information is transforming at a bewildering pace. The assumptions of yesterday, the stable institutions and cherished practices increasingly seem to be vanishing before our eyes. The first assumption of any new strategy seems to be “what would this look like if we built it from scratch, today”. And yet continuity matters, we don’t build new tools, institutions and practices from scratch, they evolve in a messy and contingent way from what we have available to us in the moment.
In this talk, Neylon unpicks the underlying drivers of change, and how they are coupled to a long history of how we manage information. Neylon will discuss how the different perspectives of important groups—scholars, publishers, funders, platform providers and the myriad of information professionals—lead to a partial focus that can make us simultaneously fearful of the change we see and blind to the shifts that actually matter.
If the arc of history bends towards justice then it follows that the arc of our knowledge and information environment necessarily bends towards greater scale and greater diversity. At the same time it is the values that underpin scholarship and the various ways in which we identify with the project of building knowledge, that drive us forward. If we are to take advantage of change, we need to understand what it is that must stay the same.
Beyond Open: Culture and Scaling in the Making of KnowledgeCameron Neylon
Open Access, Open Science, Open Government, Open Education. We often see these as new movements, set against an old world of broken – and closed – systems of scholarship and education. New technologies, primarily the web, have lifted the veil from our eyes to let us see this new world. If only we could build the right technology...mandate the right behavior then a new utopia of open scholarship will be upon us! The problem with this view is that it sees the disruption of the web as a one-off event that, once worked through, will provide a solution for all time. Framed that way this is obviously not true, but the challenge goes deeper than that. Scholarship, in its western institutionalized forms, has increased in scale continuously for at least 400 and possibly 2000 years. No social or institutional system can scale continuously over several orders of magnitude. Therefore we must expect structural historical breaks.
The question is not how to fix scholarship, but how on earth it has managed to last this long? I will argue that what sits at the core of this survival is a set of normative cultural values that privilege openness. Their application has been far from perfect but the concepts of communication, criticism, civility and inclusion have deep roots in our institutions and communities. At the same time community and identity are critical to scholarship, and both of these imply exclusion and boundary work to define community. My argument is that the culture, forms and values of western scholarship have held these two tendencies in productive tension, allowing the academy to address the ongoing scaling (and consequent inclusion) problem through social, technical and economic innovation. Our challenge is not simply to solve today's problems, but to re-imagine our institutions so that they continuously generate and are able to adopt the innovations necessary to continue to solve the scaling problem into the indefinite future.
Slides for a presentation to the SCONUL conference in 2015. Focusses on the idea that there is an ongoing shift from working within life cycles to networks in the research world
Slides for a talk given at the Centro Cultural de la Ciencia in Buenos Aires on 4 September 2017. The opportunities and challenges for open research practices become more complex as those practices become common. Is open research just “good research” or is it a fundamental change in our approach? I will argue that seeing open research in oppositional terms is not productive, and that thinking of it as a process of opening up practice that has been ongoing for centuries may make implementation and adoption easier and more rapid.
Openness in Scholarship: A return to core values?Cameron Neylon
The debate over the meaning, and value, of open movements has intensified. The fear of co-option of various efforts from Open Access to Open Data is driving a reassessment and re-definition of what is intended by “open”. In this article I apply group-level models from cultural studies and economics to argue that the tension between exclusionary group formation and identity and aspirations towards inclusion and openness are a natural part of knowledge-making. Situating the traditional Western Scientific Knowledge System as a culture-made group, I argue that the institutional forms that support the group act as economic underwriters for the process by which groups creating exclusive knowledge invest in making it more accessible, less exclusive, and more public-good-like, in exchange for receiving excludable goods that sustain the group. A necessary consequence of this is that our institutions will be conservative in their assessment of what knowledge-goods are worthy of consideration and who is allowed within those institutional systems. Nonetheless the inclusion of new perspectives and increasing diversity underpins the production of general knowledge. I suggest that instead of positioning openness as new, and in opposition to traditional closed systems, it may be more productive to adopt a narrative in which efforts to increase inclusion are seen as a very old, core value of the academy, albeit one that is a constant work in progress.
Interpreting Shadows on the Elephant in the RoomCameron Neylon
Talk on the economics of sustainability models for scholarly communication given at the ScienceEurope/LIBER workshop in Antwerp on 27 April 2017. Focuses on fundamental questions of what happens in economic terms with scholarly communication, and how cultural institutions as well as formal institutions play a key role in supporting groups (clubs, in economic terms) that take knowledge and convert it into something more public-good-like.
Sustainable Futures for Research CommunicationCameron Neylon
Slides for a talk given at Duke University on 7 October 2016. The talk focusses on political economics of scholarly publishing and routes forward to finding equitable and affordable ways to shift to Open Access.
How can we invest in future development of scholarly communications. Whose role is it and who is paying? In this talk, given at the UKSG meeting in 2016 I use the lens of culture to ask how scholarly communications needs to change, and where the opportunities lie for researchers and publishers.
No stories without evidence, no evidence without storiesCameron Neylon
Talk given at Sydney University on 4 August 2015.
Across many parts of our lives we are faced with the increasing availability of data to support decision making. With every element of the research process moving online, there are many new sources of data, as well as improved old sources of data, that can provide information on the performance, value and use of research and researchers.
But there is a problem. The proliferation of proxy data, and their naive equation with such weakly defined concepts as “quality” and “excellence”, have led to a reliance on rankings and quantitative measures as institutional targets. More than this, the adoption of these instrumental targets has led us away from a critical discussion of institutional values, indeed of what the institution is for.
I will argue that it is only by moving away from such vague terms as “quality”, “excellence” and “impact” and focussing on institutional values and a well articulated mission that institutions of scholarship will continue to be relevant for the future. It is through interrogating the goals of the institution that the enormous potential resource of data on the research enterprise can be realised. Using the data effectively will allow us a window on how knowledge actually moves and is used. In combination with a clear sense of institutional goals this provides great opportunities for institutions to differentiate themselves from the pack.
7. Policy for preservation
“Each Faculty member grants to the President and Fellows of Harvard College permission to make available his or her scholarly articles and to exercise the copyright in those articles…”
https://osc.hul.harvard.edu/hfaspolicy
https://www.flickr.com/photos/bswise/5008063253 CC BY
8. Policy for preservation
“…a nonexclusive, irrevocable, paid-up, worldwide license to exercise any and all rights under copyright relating to each of his or her scholarly articles, in any medium, and to authorize others to do the same, provided that the articles are not sold for a profit.”
https://osc.hul.harvard.edu/hfaspolicy
https://www.flickr.com/photos/bswise/5008063253 CC BY
9. Policy for preservation
“Unless their publication is in the repository, then it doesn’t exist”
Bernard Rentier, Rector, University of Liege
http://infteam.jiscinvolve.org/wp/2013/11/22/some-reflections-on-the-berlin-11-conference-berlin-november-2013/
https://www.flickr.com/photos/bswise/5008063253 CC BY
13. Dissemination is changing
• Accessible
• Rapid
• Discoverable
• Re-usable
• Value for money
• Quality Assured
• New audiences
• More content
• Greater function
• Larger impact
31. Policy Portfolio
• Require (limited) rights assignment and
local repository deposition
• Support Open Access publishing via
direct and indirect resourcing
• Constrain resource allocation by
managing dissemination as a portfolio
• Observe and Report how faculty
choices affect dissemination
55. Illustrations by Richard Swan – Used under a CC BY-NC-ND license
http://www.themoonunderground.com/
Editor's Notes
Many people involved with, or observing, Open Access have a visceral reaction to the colours on this slide. It's as though we have created opposing teams at war with each other. In this talk I want to argue that this has created a divide which is distracting us from the real issues. To illustrate that I will allude to these colours throughout the talk without ever actually referring to them.
Institutions have real choices to make about how they manage scholarly communications, and limited resources to do that. For those charged with scanning the horizon and planning for the institutional future there are hard choices ahead about what can and cannot be resourced.
Those choices need to be framed in the context of the institutional mission. What are universities FOR? And with that mission comes the responsibility to deliver on it.
That obviously means thoughtful, efficient and effective resource allocation. There is only so much money to go around.
But it also means a responsibility as stewards of a complex and interdependent ecosystem. And long-term stewards. Many of the major universities worldwide are OLDER than the nation states around them. Institutions are in many ways a better steward for the long term than countries. And they can take the long-term view. They can (but do they?) understand their role as one that is not a narrow pursuit of one form of success but that the institution as a whole, the complexity of the ecosystem, is a strength in itself. And something that needs to be preserved.
So let's talk about preservation. That may seem like a strange place to come into a discussion of Open Access but again, I want to emphasize the long-term nature of institutions. And we've reached a strange place. A century ago, a university could reliably preserve the work of its scholars by ensuring it had a physical copy. Those copies could be preserved, curated (weeded when appropriate) and made available for future scholarship. But today we've got to a place where institutions don't even have the legal rights to preserve the work that they support.
This is the Harvard Open Access policy. But I want you to see it, not as an access policy, but as a preservation policy. If an institution takes its long-term stewardship seriously, a policy like this is the absolute MINIMUM required to secure just the legal rights to allow preservation. Again, remember the institution may survive longer than countries, certainly longer than publishers or third parties that might provide preservation services…
…so to exercise that responsibility, that obligation of the institutional role it is crucial that institutions reserve some rights, that they require them from their scholars. Again, think of this, not as an Open Access policy, but as a preservation policy.
Here is another policy. This one from the University of Liege, a very successful OA policy. Here the Rector of the University, Bernard Rentier, is expressing one of its most powerful aspects – that only those objects that are deposited in the repository will count for internal appraisal, promotions and grants. But it is also true within the scope of preservation. If it isn't in the repository, then in the long term it may as well not exist. Who else can the institution trust? When I gave this talk at Harvard it was pointed out that the New England Journal of Medicine (200 years younger than Harvard) had to resort to obtaining paper copies of its own journal from the Harvard library when they wanted to digitize.
Of course it’s not just policy. The infrastructure also needs to be there for preservation. Properly funded, properly supported and strengthened with good policy. A solid institutional preservation framework will include a range of repositories, digital and physical that complement the policy framework, and the policy decisions that need to be made in choosing what to preserve. My point here is that even before we start talking about dissemination or Open Access, institutions need good repository infrastructures and policies that include rights retention to discharge another core responsibility. That of safeguarding scholarly outputs.
Now you might say that there are sufficient exceptions in copyright law to allow institutions to preserve content. This might even be true, but in the digital age preservation alone is next to useless. If it is to be worth preserving then it must be available for use in the future. Format shifting, copying and distribution are central to the USE of digital content. A preserved grain from an Egyptian tomb is a curio unless we can study it, sequence its genome, use it to bake bread or brew beer. Dissemination is part of preservation.
…which brings us to that second part of the mission. Dissemination, and via dissemination to wider access and Open Access.
Clearly the web has changed scholarly communications and has the potential to change it much more. We can reach new audiences, and old audiences better. We can share a wider diversity of research outputs, supporting different kinds of downstream use. There is the potential for much greater functionality and dynamism in the way we interact with research outputs. Ultimately there is the opportunity for research to inform better, find wider application and in general have a greater impact.
And this is not a zero sum game. Those discipline and communities that have used the web effectively to widen dissemination are moving forward. And those that haven’t risk being left behind. Whether through institutional repositories, disciplinary repositories or open access publishing, whole communities are making their work more widely available – and we know it is being read.
So where does this leave the institution? Looking out to the horizon. What choices does the institution have and what resources does it already have in place? How does your institution ensure that it stays out ahead of this change, but also ensures that it preserves the good things that are already found in the ecosystem that it stewards? How can it take a long term view while ensuring it is at the forefront of innovation?
Well if an institution has a good repository infrastructure in place then it also has a good local dissemination infrastructure in place. Those policies that, as we've seen, are necessary to support preservation also support wider and more effective dissemination. Repositories are a cost-effective means (when properly supported and properly resourced – they're not free!) of expanding dissemination without needing to break the existing system. They're also a LOCAL infrastructure which is directly under the control of the institution and can therefore be a locus of rapid innovation without requiring permissions from or collaboration with other players.
We also have an existing shared dissemination infrastructure. One that is not generally under the control of individual institutions but is an important element of the overall ecosystem in which institutions and their scholars sit.
This is of course the infrastructure provided by our existing journal and book publishing system. An infrastructure that is both symbiotic with and sometimes parasitic on the research institution. It remains to most scholars the core of how their work is disseminated.
It is also a complex ecosystem, with venerable old trees and many new shoots. The old trees have prominence, some of them provide crucial elements of the canopy, others may be rotten to the core. Where the old falls the new will grow. For the long term steward the question should be less about individual plants and more about the system. How do you cut back, where do you protect new growth and where from time to time do you fell an old tree?
Think of this as infrastructure. After all, universities pay for it for the most part.
So we have both local and shared infrastructures, some under the control of institutions, some less so. When it comes to policy choices, some would have us believe that this is an either/or choice. That policy must demand improved dissemination through repositories OR through journals.
…Or that there must be a focus on Open Access through publishing, with repositories as a poor backup… This is a false dichotomy.
It's a false dichotomy for at least two reasons. The first is based in the present. Just as a monoculture is an unstable state in an ecosystem, taking a single path to the solution of a complex problem – increasing access to scholarly outputs, and increasing the USEFULNESS of scholarly outputs – is also unstable. Complexity brings complications, but it also brings robustness. Remember again we are looking to the long term, to the health of the system, not of any single piece of it.
As in any investment, a portfolio approach is needed. Or, as in stewarding a forest, it is the preservation of the web of interacting species that is needed. The large, the small, the young, the old. The safe, the risky. And also serving current needs as well as laying the ground for more radical change in the future. No existing offering in scholarly communication offers everything that is needed – so an investment portfolio is required.
Institutional repositories are a great means of providing wider access. They do have some limitations. There can be delays to access where institutions accept publisher mandated embargoes (whether they should is a separate argument, the reality is that in many cases they do). IRs are poor generally at supporting discovery and are generally not set up either legally or technically to make it easy to normalise formats or work with the content. Of course, they are also not generally peer reviewed so quality assurance as we traditionally understand it is an issue.
Disciplinary repositories tend to do a bit better on discovery precisely because the discipline goes to the repo to find stuff. But otherwise there are often delays to access, and licensing and technical support for re-use is often limited. It doesn't need to be, but it hasn't been a priority. There are exceptions of course, with the large disciplinary preprint repositories – arXiv, SSRN and RePEc – being great examples.
By comparison, traditional subscription publishing has different problems and advantages. Access is an issue, but quality assurance in its traditional form is provided. Value for money is an issue – certainly compared to the base cost of dissemination – as is affordability. However if you do have access you get the content immediately, an issue for many disciplines.
Open Access publishing can solve the access problem and many OA publishers are working hard to improve re-usability (not hard or fast enough for some but at least heading in the right direction). Compared to subscription publishing OA publishing is generally a lot cheaper on a per article basis, but if you ignore the costs of subscriptions then it looks expensive compared to putting a copy in a repository.
…some will argue about quality assurance in the OA literature.
I would argue that to the extent that OA has QA issues, subscription publishing has many of the same issues; they just get less reported. But of course I would say that. Let's just say neither is perfect, but quality assurance as we traditionally think of it is applied to the published outputs of serious journals whether they are subscription or open access based.
And of course the real issue isn’t so much quality assurance per se. That really only happens in the long term. It’s the prestige that is conferred by the brand of the journal (or not) that is really at stake here when we compare to “slapping something up on the web”. But ultimately the point here is not that one path is better than another, but that none provide the full range of desirable qualities. A portfolio approach is required and that’s what institutions will have to decide on. Not to support one or the other, but what the balance of resource allocation should be.
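The point that no single channel scores well on every criterion can be made concrete. Below is a minimal sketch in which channels, criteria, and all scores are hypothetical illustrations (none of these numbers come from the talk); it shows how a mixed portfolio covers criteria that no single channel does:

```python
# Illustrative only: the channels, criteria and 0/1/2 scores below are
# hypothetical, invented for this sketch, not data from the talk.
CRITERIA = ["access", "speed", "discovery", "reuse", "cost", "qa", "prestige"]

# 0 = weak, 1 = moderate, 2 = strong on each criterion.
channels = {
    "institutional_repo": [2, 1, 0, 0, 2, 0, 0],
    "disciplinary_repo":  [2, 1, 2, 1, 2, 0, 1],
    "subscription":       [0, 2, 1, 0, 0, 2, 2],
    "oa_publishing":      [2, 2, 1, 2, 1, 2, 1],
}

def portfolio_coverage(selected):
    """Best score achieved on each criterion across the selected channels."""
    return {
        c: max(channels[name][i] for name in selected)
        for i, c in enumerate(CRITERIA)
    }

# No single channel covers everything; a mixed portfolio does better.
print(portfolio_coverage(["institutional_repo"]))
print(portfolio_coverage(["institutional_repo", "oa_publishing"]))
```

The design point is the `max` per criterion: the portfolio is judged by its best provider for each quality, which is why balance across channels matters more than ranking them against each other.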
What should that look like in practice? This would be my institutional policy suggestion (it would be different for funders). Combining the two mission goals, both preservation and dissemination, and with a focus on a future that will be largely Open Access, institutions should ensure that they have the rights for preservation and repository deposition and should, as does Liege, explicitly couple internal assessments to objects that are in the repository. This provides a solid basis for institutional dissemination based on infrastructure and rights that are within the control of the institution.
Institutions should support Open Access publishing – but there are a range of ways to do that. Resourcing is good in the form of institutional funds but choices need to be made about eligibility and the level of payments. Support could include provision for institutional publishing platforms or it could include consortial support for journals making the switch.
Given this investment it needs to be constrained and institutions critically need to constrain overall spending. Expenditure on APCs should be coupled as far as possible to reductions in subscriptions. It may not be a zero sum game and there may be transitional investment but institutions will need to manage the full portfolio of scholarly communications activities and tension them against each other.
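The coupling of APC spend to subscription reductions can be shown with back-of-envelope arithmetic. All figures below are hypothetical, chosen only to illustrate the invariant: when each unit of new APC spend is offset by a subscription reduction, total expenditure stays bounded even as the OA share grows:

```python
# Hypothetical figures: a five-year transition in which spend shifts from
# subscriptions to APCs, plus a fixed transitional fund each year.
subscriptions = 1_000_000   # current annual subscription spend
apc_fund = 0
transition_fund = 50_000    # bounded transitional investment per year

for year in range(1, 6):
    shift = 150_000                       # spend moving from subs to APCs
    subscriptions -= shift
    apc_fund += shift
    total = subscriptions + apc_fund + transition_fund
    print(year, subscriptions, apc_fund, total)
# The total is 1_050_000 every year: APC growth is offset, not additive.
```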
Finally institutions need to be a lot better at tracking what is happening and how it is working. This can be used both to develop evidence-based policy and to develop funding instruments that promote efficiency in the system. At the moment most institutions are able to measure neither what they generate nor what they use, relying on third party providers for that information.
So let's focus a little on that support piece, seeing as that is where most of the controversy lies.
Support for the incumbents to move to an OA future is one piece. Careful planning and judicious arm twisting is required to ensure that, if the institution sees hybrid approaches as a viable transition, they are truly transitional. That means tight coupling between what the institution pays in APCs and reductions in subscriptions. These negotiations are happening in the UK right now. Where APCs are not appropriate institutions can support journals directly, transitioning subscriptions to support funds, perhaps with agreements in place for a number of years to assure publishers acting in good faith that they have the time and support to make the transition.
And some of the old trees will simply need to be felled.
There's a need for fresh approaches. Many of the low cost technologies that will support scholarly communications in the future are being developed by PhD students in your institutions. Many of your scholars are experimenting with new approaches, sometimes with third parties, sometimes on their own. Other players are out there developing new approaches and new systems.
The balance between old and new is a policy choice for the institution, but it's one with consequences. The new investments will require protection and support, possibly for some time. They offer substantial possible return in the long term, but substantial risk in the short term. Where can institutions share that risk? How can it be amortised? What support and protection is appropriate?
Perhaps most importantly, how big a role does the institution want to take? Because this is the central policy question, and the one which blows the artificial distinction between repositories and OA publishing out of the water. Institutions, if they choose, have the resources and the technology to reclaim scholarly communication for their community. Or they can support scholarly communities to do it, or they can step back and let others take the lead, but provide the resourcing and stewardship to ensure that in the long term we get where we need to go.
So, what does that look like? What does the long-term target imply? Or more realistically, what are the medium-term strategic needs to guide resource allocation?
Let's go back to this diagram. Limited as it is…
…what we want is new infrastructures and systems that give us green lights the whole way down…
With the right policy and rights retention in place, and the right technical infrastructures, we could imagine combining the best of OA publishing and IRs. If we start, for argument's sake, with today's OA publishing models we would need to tackle two areas: value for money (bringing costs down) and the provision of prestige. If we can improve OA publishing on these two axes we can make massive progress.
So let's look at costs. There are lots of reasons why the existing publishing system is expensive – too many to go into here – but one is that journals are (generally) separate silos…
…with different journals…
…or at least different publishers replicating a lot of (mostly badly outdated and clunky) technical systems. All of this adds to the cost, but more importantly it makes it harder for new technical innovation to firstly get a toehold and secondly deliver benefits across the system.
So if we take two chunks of this existing system that seem important – communities of practice doing review, creating bodies of curated content – rather than separating these out, each with their own separate systems…
…we could imagine a shared infrastructure. Actually this in some ways already exists, but let's go further…
…let's imagine that the actual underpinnings are shared across the system. That the publication infrastructure is not just shared and consistent, but is cheap and the subject of constant innovation. Some of that innovation might add additional layers that we don't really have today…
…such as general discovery, and perhaps more importantly we could focus the effort of review and curation on those pieces that need it and allow the others to just be surfaced through search. Now that bottom layer of publication infrastructure looks a lot like a repository, perhaps a shared repository across institutions. Or it could be produced by a publisher and run for institutions, or produced by a publisher and given to institutions.
This isn’t a blueprint, it’s a thought experiment. The bottom line is that there really isn’t any difference between a repository and a publisher. The thing we think of as the difference is a thin layer of presentation (mainly missing from repositories for political reasons) and a quality assurance process. There’s no need for presentation to be coupled to storage and no reason for QA to be tightly coupled to either. And once they are separated the question is not Repository Deposit vs OA publishing, but where the infrastructure investment is needed and who is responsible for which bits? What do institutions want to keep in house, and what do they want to out source to service providers?
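The decoupling in this thought experiment can be sketched as a toy data model. Everything below is hypothetical – invented class and method names, not any real repository or publisher API – showing storage, presentation, and QA as separable layers, with a "journal" as a thin overlay on shared storage:

```python
# Hypothetical sketch of the thought experiment above: storage,
# presentation and QA as decoupled layers. Not a real system's API.
from typing import Protocol


class Storage(Protocol):
    def deposit(self, object_id: str, content: bytes) -> None: ...
    def retrieve(self, object_id: str) -> bytes: ...


class SharedRepository:
    """Minimal in-memory storage layer: at this level a repository and a
    publisher's back end look the same."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def deposit(self, object_id: str, content: bytes) -> None:
        self._store[object_id] = content

    def retrieve(self, object_id: str) -> bytes:
        return self._store[object_id]


class JournalOverlay:
    """An overlay 'journal': presentation plus QA layered over shared
    storage, loosely coupled to both."""

    def __init__(self, storage: Storage) -> None:
        self.storage = storage
        self.accepted: set[str] = set()

    def review(self, object_id: str) -> bool:
        # Stand-in for an actual peer review process.
        ok = bool(self.storage.retrieve(object_id))
        if ok:
            self.accepted.add(object_id)
        return ok

    def render(self, object_id: str) -> str:
        body = self.storage.retrieve(object_id).decode()
        badge = "[reviewed] " if object_id in self.accepted else ""
        return badge + body


repo = SharedRepository()
repo.deposit("2024-001", b"A scholarly article.")
journal = JournalOverlay(repo)
journal.review("2024-001")
print(journal.render("2024-001"))  # [reviewed] A scholarly article.
```

The design point is that `JournalOverlay` owns no content: swap in a different storage implementation, or a different QA layer, and nothing else changes – which is the sense in which repository deposit versus OA publishing stops being the question.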
But the technical side is usually the easiest…what about that bottom row? Prestige, the hard one.
Traditionally this is the point where we throw up our hands and say "It's a hard problem", or "it's a collective action issue", or, even more commonly, "that's something for funders to do". But actually I think this is far more in the hands of institutions than of funders. Institutions are where the policy rubber hits the road, and where resources get allocated and jobs awarded, or not. Institutions in the end deliver the wider community prestige through their brands, their names, who they employ and who they support.
Some institutions have more leverage than others. But those with a lot are in a prime position to invest for the future to create a strong position. And that investment will be most wisely made in a portfolio of options. Including, yes, subscriptions, but also quality repository infrastructure, shared investments in disciplinary systems, support for OA publishing and investment in the truly new and radical systems currently being developed. By bringing their brand to the table, by offering resources, and by celebrating new modes of communication, institutions have a real opportunity not only to take back control of the dissemination systems they have ceded to publishers but also to work with like-minded service providers to ensure that the ecosystem as a whole is around for another 500 years.
In a century we won't be arguing about repositories vs OA publishing. We will be arguing about the most effective means of communication and the right policy measures to support it and resource it.
So it’s not about repositories vs publishers
It’s not about new vs old
Or radical vs conservative, or humanities vs STEM
It’s about how to bring all the resources together most effectively to deliver on effective scholarly communications and preservation. To deliver in the long term on the obligations for stewardship that come with being the homes of scholarship. To bring the best capacities of all our systems together to create a complex and robust ecosystem that delivers on the different needs of different communities but ensures the long term strength of all of them.