AGI 2010 Notes
It's All One Big Opportunity
Joanne Cook, Senior IT Support and Development Officer, Oxford Archaeology/OA Digital
Abstract:
Location runs all the way through archaeology, from the position of an individual find to the
distribution of settlements in the landscape. We do mobile GIS, augmented reality, and high-end
spatial analysis, and it's all open source. Going open has not only helped our company survive in
very difficult circumstances, but has also provided new opportunities for innovation, improved
efficiency, and helped us better meet our company goals. We show that it is possible not only to
run a professional company in this way, but also to innovate and succeed in an economic downturn.
Our experiences may be of value to other companies facing a similar situation: can you also turn
adversity into opportunity?
Introduction:
In this paper I'm going to work through a hypothetical archaeological excavation, explaining how
we achieve, or are trying to achieve, the various steps with open source software. I will talk about
the challenges and misconceptions we have overcome, and the opportunities and benefits we
have gained from our approach. I will try to place these challenges, misconceptions and
opportunities in a more general context, and highlight ways in which I believe other companies
can learn from our experiences.
What's this archaeology thing about? Is it like Time Team?
The formal definition of archaeology is the study of past societies through their material remains.
The purpose of archaeology is not only to tell us about our ancestors, but to provide insights into
how we got to where we are now. For example, how did our ancestors deal with (or cause)
warfare, climate change, or epidemics? Archaeology allows us to put our own lives in context: we
are who we are, and live in the society that we live in, because of everything that has gone
before.
A few common misconceptions remain about archaeology. Firstly, we don't (often) encounter
dinosaurs, and secondly, unlike on Time Team, we usually take considerably more than three days
to study a site. We usually require more than a single piece of pottery to reconstruct an entire
Roman villa, too! Thirdly, archaeology is no longer generally the pursuit of elderly retired
gentleman antiquarians or adventurers, but of highly qualified professionals, often with
postgraduate degrees in their chosen specialism.
In the UK, archaeology is generally undertaken as part of the planning process prior to building
work. Some level of archaeological intervention, be that a simple desk-based investigation of
historic maps through to a full-scale open area excavation, has been a legal requirement of the
planning process since 1990. The majority of archaeological work is undertaken by commercial
units, via a highly competitive tendering process overseen by the county councils, although some
university-based research also takes place.
Archaeology therefore forms part of the complex chain of contractors, sub-contractors and
deadlines that make up the average construction job. Though a very small part of the process,
delays in the archaeological phase, such as from the discovery of something important or
unexpected, can lead to massive delays and therefore penalties later in the scheme.

AGI GeoCommunity '10: Opportunities in a Changing World
'Innovate - Connect - Succeed'

Consequently, as a result of the tendering process and of integration in the overall construction
scheme, archaeologists need to work to extremely tight budgets, to strict deadlines, in a harsh
environment, and to liaise and share data with many other contractors. Not only that, but the
type of data we gather and use is extremely varied. As a result, our IT requirements can be quite
demanding, but we are a niche software market, unattractive to traditional software or hardware
manufacturers.
Oxford Archaeology are the largest commercial archaeological unit in Europe. We have 400 staff,
spread over three offices in the UK, and two in France, and are called on to provide expertise at
an international level, in countries as diverse as China, Iraq, Turkey and Tibet.
The requirement for openness
Our data is often the only record that remains of a site, once it has been developed. Our
understanding of the site is necessarily incomplete, due to the limited size of the excavation
trenches, and it is not uncommon for findings to be contradicted by later work, often decades
later. At the very least, later work may place our findings into a wider context. Consequently our
primary data must be kept in perpetuity, so that later researchers can access it, and can repeat
the analytical processes that led to the final conclusions. With a paper archive, this is relatively
straightforward. However, with digital data and devices, the situation is quite different. Either it is
necessary to maintain devices and packages capable of reading the data, and spend time curating
the archive, ensuring that it remains readable, or to choose open, published formats and
programs that will always be readable. The long-term impact of using a non-standard format,
readable only by a particular software package or device, however suitable that is for the
recording task, will be to render our archives useless or inaccessible. Moreover, for the analytical
process to be truly repeatable, at any later point, the algorithms used should be accessible, or the
process is a “black-box”, repeatable only by researchers with the same version of the same
software running on the same operating system.
The Oxford Archaeology “Open Ethos”
Our approach to the requirement for openness outlined above is called “open archaeology”. This
has three parts:
• Open access to our data,
• Open standards for file formats,
• Open source for our software.
As part of this, we are trying to create a work-flow for excavating a site that uses only open
source software.
The archaeological process
In broad terms, the process of excavating an archaeological site has the following phases:
Excavation, Analysis and Publication.
Excavation
Initially, trenches are laid out across the site with professional-grade survey equipment. Within a
trench, buried features manifest themselves as changes in colour and texture of the soil, known
as contexts. Every context is planned (drawn to scale) and recorded before being removed, and
an average excavation may contain several thousand contexts. Artefacts are generally recorded in
relation to the context in which they are found, as that relationship is one of the key ways in
which the development of a site is understood during the analytical phase of the work. Previously
this process would have been almost entirely paper-based, with site records often being
measurable in metres of paper rather than number of sheets.
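To make the context/find relationship concrete, here is a minimal sketch of the kind of relational structure involved (table and column names are purely illustrative, and SQLite stands in for our production database):

```python
import sqlite3

# Illustrative schema: every find is recorded against the context
# (soil deposit) in which it was found, and contexts record their
# stratigraphic relationships to one another.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE contexts (
        context_no  INTEGER PRIMARY KEY,
        description TEXT,
        fill_of     INTEGER REFERENCES contexts(context_no)  -- stratigraphic link
    );
    CREATE TABLE finds (
        find_no    INTEGER PRIMARY KEY,
        context_no INTEGER NOT NULL REFERENCES contexts(context_no),
        material   TEXT,
        count      INTEGER
    );
""")
conn.execute("INSERT INTO contexts VALUES (1001, 'ditch cut', NULL)")
conn.execute("INSERT INTO contexts VALUES (1002, 'ditch fill', 1001)")
conn.execute("INSERT INTO finds VALUES (1, 1002, 'pottery', 14)")

# All finds from a given context: the key relationship used in analysis
rows = conn.execute(
    "SELECT material, count FROM finds WHERE context_no = 1002").fetchall()
print(rows)  # [('pottery', 14)]
```

The point of the structure is that the relationship is queryable: during analysis, "what was found in this deposit?" becomes a single join rather than a search through metres of paper.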
The search for a device for digitally recording data on site, to replace the paper archive, has been
going on for decades. Approaches to the device problem include electronic paper, ruggedised
laptops and PDAs. Devices must be capable of surviving in a dirty, wet environment, preferably be
operated by cold or gloved fingers, and have good battery life. Old favourites include Psion series
5s, which can run for days on standard AA batteries and will work inside a plastic bag, but these
are now difficult to acquire, technologically limited, and only solve part of the site recording
problem.
Our aim is for a digital recording system that collects tabular data in a database, spatially locates
it, and allows photos to be attached. The database needs to synchronise with a server-based
master version, needs to support multiple users, and also allow device-based data storage in
areas where there is little or no data signal. Having trialled a number of devices over the years,
we have discounted single-function devices in favour of multi-function devices such as PDAs and
now smart phones, to keep our overall costs down.
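As an illustration of the "store locally, synchronise when connected" requirement described above, the following toy sketch models a device-local store that pushes unsynced records to a master copy when a connection is available (a deliberate simplification, not the actual Epicollect mechanism; all names are invented):

```python
import sqlite3

def make_db():
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE records (
        uuid       TEXT PRIMARY KEY,  -- device-generated id avoids clashes between users
        context_no INTEGER,
        lat        REAL,
        lon        REAL,
        synced     INTEGER DEFAULT 0)""")
    return db

def record_locally(device, uuid, context_no, lat, lon):
    # Always write to the device first: this works with no data signal at all
    device.execute(
        "INSERT INTO records (uuid, context_no, lat, lon) VALUES (?, ?, ?, ?)",
        (uuid, context_no, lat, lon))

def sync(device, master):
    # When a connection is available, push any unsynced rows to the master copy
    unsynced = device.execute(
        "SELECT uuid, context_no, lat, lon FROM records WHERE synced = 0").fetchall()
    master.executemany(
        "INSERT OR IGNORE INTO records (uuid, context_no, lat, lon, synced) "
        "VALUES (?, ?, ?, ?, 1)", unsynced)
    device.execute("UPDATE records SET synced = 1")
    return len(unsynced)

device, master = make_db(), make_db()
record_locally(device, "a1b2", 1002, 51.75, -1.26)
pushed = sync(device, master)
print(pushed)  # 1 record pushed to the master copy
```

The device-generated unique id is what makes multi-user synchronisation safe: two excavators working offline in different trenches can never create colliding record keys.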
One device that originally seemed very promising was the Openmoko phone. This is a
totally open source phone (both hardware and software), with built-in GPS, accelerometer, USB
host capabilities, a very readable screen, and a robust bump-proof body. Unfortunately for
Openmoko, the revolution in Android phones largely took away its target market, and the
phones are no longer being developed.
Our current target device is the Motorola Milestone phone. We issue these to most of our staff, for
use as their phone and as a digital recording device. Getting good enough battery life from the
Milestone is a challenge, but we are investigating solar charging options and it is relatively easy to
provide spare batteries. Bump proof cases, and waterproof bags protect the phones on site. The
increase in cost of the phones themselves (actually not as bad as it seems as part of a company-
wide contract) is offset by savings in data collection costs, both on site and later in the analytical
phase of the project. There are many additional benefits to this approach. Smart phones mean
staff can check their email on the move, which is important as many of our site staff may never
visit one of the main company offices. This helps them stay engaged and feel part of the
company. They are also a conspicuous sign of company investment in our staff, which again helps
with engagement and morale, and ensures that staff take good care of them, all important factors
in the present economic climate!
A promising data collection program is Epicollect (http://www.epicollect.net/), which meets most, if
not all, of our data collection requirements. Staff can take geospatially enabled photos of all
features and finds, and immediately attach them to their database entry. This package is being
paired with gvSIGmini (http://confluence.prodevelop.es/display/GVMN), which provides a feature-
rich GIS client. The master database is in PostgreSQL, spatially enabled with PostGIS. Full
enterprise level database functionality ensures our data is secure, and structurally robust.
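For illustration, a spatially enabled finds table in a modern PostGIS database looks roughly like this (table names, SRID and coordinates are invented for the example, and the syntax shown is that of current PostGIS releases):

```sql
-- Illustrative only: a spatially enabled finds table
CREATE EXTENSION IF NOT EXISTS postgis;

CREATE TABLE finds (
    find_no    serial PRIMARY KEY,
    context_no integer NOT NULL,
    material   text,
    geom       geometry(Point, 27700)  -- 27700 = British National Grid
);

-- A spatial index keeps "what is near this point?" queries fast
CREATE INDEX finds_geom_idx ON finds USING GIST (geom);

-- Example query: all finds within 5 m of a given point
SELECT find_no, material
FROM finds
WHERE ST_DWithin(geom, ST_SetSRID(ST_MakePoint(451200, 206300), 27700), 5);
```

Because the geometry lives in the same table as the attribute data, the GIS client and the database work from a single master copy rather than parallel spatial and tabular archives.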
Analysis
This phase is often the longest part of an excavation. It begins with the transfer of all the records
gathered on site into a database. When site archives are mainly paper-based, this process can
take months, and often errors are introduced simply due to typographic mistakes. When a site
archive is created digitally in the first place, this phase of work can be much reduced. Specialists
then use the archive to reconstruct the developmental sequence of the site, examine the finds,
and place the site in the context of its wider geographic and historic surroundings. Often this
phase of the project is a high-level analytical process. Examining thousands of sherds of pottery,
or pieces of animal bone for example, requires statistical techniques. Increasingly, geospatial
analytical techniques are used as part of this process. For example, when examining prehistoric
sites where the evidence is mainly artefactual rather than physical, the changes in density of flint
scatters across a site can help pin-point areas of specific activity such as tool manufacture or
animal skinning.
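The flint-scatter analysis can be sketched as a simple grid-density count: bin the find coordinates into square cells and look for concentrations (pure Python, with invented coordinates):

```python
from collections import Counter

def grid_density(points, cell_size):
    """Bin (x, y) find coordinates into square cells and count finds per cell.

    High-count cells flag possible activity areas, such as tool manufacture.
    """
    counts = Counter()
    for x, y in points:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts

# Invented flint find spots, in site-local metres
flints = [(1.2, 0.8), (1.9, 1.1), (1.4, 1.6), (10.2, 9.7), (1.7, 0.3)]
density = grid_density(flints, cell_size=2.0)
print(density.most_common(1))  # [((0, 0), 4)]: four finds concentrated in one 2 m cell
```

Real analyses would use the statistical and density tools built into the GIS packages discussed below, but the principle is the same: the spatial clustering, not the individual find, carries the interpretation.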
Previously, Microsoft Access was used as the site database, with specialists (who are often self-employed
and working remotely) being sent their own copy of the database, leading to problems
keeping the data current, and a total lack of control over the data structure being used. However, replacing
Microsoft Access completely is not advisable as staff often have many years' experience in using it
to analyse data, and are not familiar with alternatives. Our current system is to use Microsoft
Access as the front-end to a PostgreSQL database. In this way, we maintain control and security
over the data, we can ensure all staff are seeing the most up to date version, can control who has
the ability to delete or change records at a very fine-grained level, but also allow staff and
specialists the familiar interface to the data.
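The fine-grained control described above is ordinary PostgreSQL privilege management; sketched roughly, with illustrative role and table names:

```sql
-- Illustrative roles: specialists may read and add records, but not
-- change or delete them; those privileges are reserved for managers.
CREATE ROLE specialist LOGIN;
CREATE ROLE project_manager LOGIN;

GRANT USAGE ON SCHEMA public TO specialist, project_manager;
GRANT SELECT, INSERT ON finds TO specialist;
GRANT SELECT, INSERT, UPDATE, DELETE ON finds TO project_manager;
```

The Access front-end is unaware of any of this: a specialist who attempts a forbidden deletion is simply refused by the server, so the familiar interface and the data security coexist.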
Geospatial analysis was previously done using ArcGIS, but the cost of licensing and training meant
that few staff had regular access and the skills to use it. Until recent versions, it has also not
been possible to connect to PostgreSQL databases without yet another expensive plugin. We have
now migrated almost entirely to Quantum GIS and GvSIG as our desktop GIS packages, both of
which have arguably a more powerful set of geospatial analytical tools (GRASS and Sextante
respectively), and natively connect to PostgreSQL.
Publication
The final phase of a site is to check that all of the conclusions are logically consistent, and bring
everything into a single coherent narrative. The report of the excavation is published either as a
basic site report (commonly known as “grey literature”, held by the county in a Historic
Environment Record (HER)), or as an academic monograph, or even sometimes as a “popular
publication” aimed at the general reader.
The various levels of publication have different requirements. “Grey Literature” reports can be
created in a word processing package, with illustrations generated in the GIS as necessary.
Monographs and “Popular Publications” require a full desktop publishing package. We are in the
process of converting to Open Office for our word processing/spreadsheet needs, and for some
cases, to Inkscape and GIMP for creating illustrations.
Challenges
It would be naïve (and false) to suggest that there haven't been any challenges along the way,
some of which are still to be overcome.
• Changing the work-flows of a company of approximately 400 staff, many of whom have
been doing their job for decades is not easy. It is necessary to win hearts and minds, and
to get engagement at all levels. Simply making a financial argument is not enough, as
people see the process causing them more work, or forcing them to learn new skills, and
abandon old ones (the flip-side of this is that it highlights the difference between real skills
and simply knowing which button to press).
• Misconceptions of staff, clients and other contractors that the open source approach leads
to bad data. We have encountered a complete lack of understanding of the difference
between software and data at all levels of the archaeological and construction process.
Often specific software packages are mandated in tenders for work, when what is actually
required is a specific software format. At the very least this requires us to word our
submissions extremely carefully in order to avoid losing work because we do not use
ArcGIS!
• Problems are often blamed on the open source software where previously they would have
been blamed on something else (or not blamed at all). All software crashes occasionally;
everyone knows this. However, people tend to be a lot more critical of open source software
than its closed-source counterparts. The reason for this is unclear; it may simply be due to
the perception that open source software is somehow not as high quality as proprietary
software (it can't be, otherwise it calls into question why one would pay for the proprietary
alternative). Furthermore with packages such as Open Office, every formatting issue is the
fault of the software. Every previous incident of this kind with Microsoft Office is
conveniently forgotten.
• Changing work-flows does need an investment of time (and therefore money). From
software installation to staff training, this process has taken time, and has been criticised
as a result. However, many of these so-called “additional costs” are comparable to those
that would be entailed by moving from (say) one version of Microsoft Office to another, or
from ArcView to ArcGIS. Many of these costs are also offset by the lack of licensing costs,
and by efficiencies that re-examining the work-flow brings.
• Dead ends. Particularly with the search for a device for digital recording on site, there have
been a number of false starts. The Openmoko phone is an example of this, although since
it is possible to run the latest Android operating system on them, they make excellent
development devices!
• Limitations of the software. Some of the open source packages we use do have rough
edges, as they have been designed to meet a particular need and time has been spent
improving the functionality rather than the polished interface. Unfortunately this leads to
an undeserved impression that the software is somehow not up to scratch. GvSIG is an
example of this. In its native form it is a Spanish-language package, and the translation to English is
not as good as it could be, but functionally it is a superb piece of software. Part of our task
in choosing it as one of our desktop GIS packages has been to improve the translation.
Tools without a GUI, such as ogr2ogr, are shunned because of an innate fear of the
command-line (perhaps a throwback to the DOS years). Quantum GIS and GvSIG both
lack the ability to create really high-quality cartographic output, but we circumvent that by
exporting the basic figure as a PDF or SVG and finishing it in Inkscape. Initially this
process added to the amount of time required to create a figure, but as staff familiarity
with the software has improved this is no longer an issue.
• Lack of open source CAD packages. While a traditional site suits a GIS-based approach, for
some of our work, such as building survey, a CAD-based approach is more suitable.
Furthermore many of the engineers that we work with on road schemes and other large-
scale construction projects use CAD rather than GIS. While there are a number of open
source alternatives, we have yet to find one that fulfils all of our requirements, and keeps up
with the annual change in the DWG format.
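On the command-line fear mentioned above: the ogr2ogr invocation in question is typically a single line. For example, loading a shapefile into a PostGIS database looks something like this (connection details and filenames invented):

```shell
# Load a shapefile of site contexts into the PostGIS master database,
# creating (or replacing) a table called "contexts" - one line, no GUI needed
ogr2ogr -f PostgreSQL PG:"host=localhost dbname=site_db user=surveyor" \
        contexts.shp -nln contexts -overwrite
```

Once written down, a command like this is also trivially scriptable, which is exactly what makes batch conversion of a large site archive practical.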
Opportunities
• Greater staff engagement. The elimination of licensing costs and restrictions has meant
that we can roll the software out to a much larger proportion of our staff, and afford to
train them. Working through issues, learning a new package, and helping to develop
work-flows have caught people's enthusiasm, and made them more committed members of
staff, and considerably better at their jobs.
• Consultancy. The open source geospatial market is still small, and lacks any major players
to dominate the market. Oxford Archaeology's open approach has garnered enough
attention that we are called upon to provide support and advice to other companies looking
to incorporate some open source software into their work-flow. We are of course living
proof that it is possible to work in a high-pressure collaborative environment and to use
open source software, with only a small number of hiccups (see above).
Our consultancy arm doesn't just provide support for open source software, however. For
example, we are using the new technologies that the company is investing in, such as the Android
phones, to investigate augmented reality applications, for doing historical “virtual tours” based on
medieval maps and archaeological evidence.
• Cost savings. There are enormous financial savings to be made with an open source
approach for many companies, although these can be hard to quantify due to the costs of
implementing the new software, training new staff, providing support, and so on.
• More control: no licensing changes. For educational charities, the proprietary software
market is becoming a lot more strict about licensing. Over the last couple of years, many
software providers have changed the terms of their licensing agreements to bar
educational charities from receiving a discount. It is a moot point whether or not this is
fair, but at the very least it leads to a large increase in the software budget, often at very
short notice, simply to continue using the software that they have. Moving to open source
software does at least ensure that this will not happen in the future.
Conclusions
For Oxford Archaeology the open source approach has been, and continues to be a success,
helping us to survive in a very difficult industry during an economic downturn. It has also brought
many benefits, such as more engaged and skilled staff, and opportunities, such as our
flourishing consultancy business. We do not believe that our approach will work for everyone; in
fact, our position, even within the archaeological industry, is unusual. However, we do believe that
many companies in different industries can learn something from our experiences, particularly in
the current economic climate.
For example, a large-scale change of software and work-flows might seem daunting, expensive,
and difficult for staff. However, it is an opportunity for a truthful assessment of what functionality
is actually needed, as prior assumptions about how much of a particular package is used, and
why, may be wrong. Are staff skilled in achieving tasks with a particular software package, or do
they have a true understanding of their area of expertise, with flexibility and transferable skills? If
the former, then even upgrading to a later version of the same software may be costly.
Furthermore, true ownership costs should be taken into account when considering a move to a
different software package, and it's fair to say that these are often under-estimated when open
source software advocates promote their favourite software. However, the cost of remaining with
key proprietary packages whose vendors change file formats every couple of years, or demand
ever-higher minimum computer specifications, should also be considered.
Our key advice is that people should question their software use, licensing, costs and
requirements to ensure that they truly get the best value, both from the software, and their staff.
In many cases, an honest assessment of requirements may lead to a decision to investigate open
source alternatives. Our success over the last few years proves that it is possible to use open
source software in a high-pressure environment, doing advanced analysis, and we would like to
see other companies reap the same rewards.