Text and Image based Digital Humanities: providing access to textual heritage... - Edward Vanhoutte
This document discusses digital humanities and text and image based projects that provide access to textual heritage in Flanders. It introduces Edward Vanhoutte, who will be giving a guest lecture on the topic. Digital humanities uses computational methods and digital tools to study humanities questions in an interdisciplinary, multilingual, and interactive way, while also building community and being self-critical.
This document summarizes a research project exploring digital art and how to connect private and public art collections online. The project aims to 1) understand how private collectors currently organize online databases and if they can connect with public collections, 2) determine the technical, organizational, and art history efforts needed to connect these collections into a richer database, and 3) develop ways to distribute the interrelated content through an interface while respecting intellectual property. The project is looking for experts in metadata, information retrieval, graphical user interfaces, business models, and platform development to help achieve these goals.
1) Historians write about the past through the lens of their own time period and with consideration of dominant ideas for interpreting history.
2) Historians use a variety of sources like written documents, oral accounts, artifacts, and archaeological findings to construct accurate narratives of past events.
3) The study of history can be organized in different ways, such as chronologically, culturally, territorially, or thematically, and historians may focus on very specific topics or on broader universal patterns.
The document discusses linking cultural heritage objects and stories to additional context from sources like Wikipedia, libraries, archives, and museums using linked open data. It provides examples of objects like a church in Kristianstad, the medieval city wall of Visby, and an archaeological site in Falkenberg that could be connected to related stories and information from other cultural institutions through a system called K-samsök.
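The linking described above can be sketched as plain subject-predicate-object triples. A minimal sketch follows; the URIs and the ex:relatedStory predicate are hypothetical stand-ins for illustration, not the actual K-samsök vocabulary.

```python
# Hypothetical triples linking heritage objects to related stories,
# in the spirit of linked open data (URIs and predicates are invented).
triples = [
    ("ex:church_kristianstad", "ex:locatedIn", "ex:kristianstad"),
    ("ex:church_kristianstad", "ex:relatedStory", "ex:wiki_article_churches"),
    ("ex:visby_wall", "ex:relatedStory", "ex:museum_story_medieval_visby"),
]

def related_stories(subject, triples):
    """Return every object linked to `subject` via ex:relatedStory."""
    return [o for s, p, o in triples if s == subject and p == "ex:relatedStory"]
```

In a real system the triples would live in an RDF store queried over SPARQL, but the pattern is the same: the object's URI is the join point between institutions.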
Wikimaps, ICHC 2013, Helsinki, 29 June 2013 - Susanna Ånäs
This document discusses Wikimaps, a proposed project to compile historical maps from around the world on Wikimedia platforms. Wikimaps would work with archives to make high-resolution digital copies of maps openly licensed. Volunteers could then contextualize, georeference, and annotate maps, connecting them to Wikipedia. This would create an open database of historical geospatial data that could be used in new mapping applications and research. The document outlines how tools could be developed to help upload maps, add location data, collaboratively vectorize maps, and integrate maps with Wikidata.
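Georeferencing, as described above, maps pixel coordinates on a scanned map to geographic coordinates. A minimal sketch under simplifying assumptions: two ground control points and a north-up, axis-aligned map. Real workflows fit a transform over many control points and handle rotation and distortion.

```python
# Build a pixel -> (lon, lat) mapping from two ground control points,
# assuming the scanned map is north-up and axis-aligned (a simplification).
def make_georef(gcp1, gcp2):
    (px1, py1), (lon1, lat1) = gcp1
    (px2, py2), (lon2, lat2) = gcp2
    sx = (lon2 - lon1) / (px2 - px1)  # degrees of longitude per pixel
    sy = (lat2 - lat1) / (py2 - py1)  # degrees of latitude per pixel

    def pixel_to_lonlat(px, py):
        return (lon1 + (px - px1) * sx, lat1 + (py - py1) * sy)

    return pixel_to_lonlat

# Hypothetical control points: pixel (0, 0) is 24.0E 61.0N,
# pixel (1000, 800) is 25.0E 60.0N (latitude decreases downward).
to_lonlat = make_georef(((0, 0), (24.0, 61.0)), ((1000, 800), (25.0, 60.0)))
```

Volunteer georeferencing tools collect the control points interactively; the fitted transform is then stored with the map image so any pixel can be placed on a modern basemap.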
The World of Digital Humanities: Digital Humanities in the World - Edward Vanhoutte
Keynote lecture at the Cross Country/Faculty Workshop on Digital Humanities: Prospects and Proposals, North-West University Potchefstroomkampus, South Africa, 13 November 2013
This document discusses plans for developing the Wikimaps project, which aims to make historical maps available on Wikimedia platforms. It describes efforts to digitize and georeference maps from archives, add location data to maps on Wikimedia Commons, extract features from old maps to add to Wikidata and OpenStreetMap, and develop tools to find maps and visualize historical data. The document also outlines upcoming workshops and sprints in Estonia, Hong Kong, and the US to advance this technical and collaboration work around historical maps between Wikimedia and cultural heritage organizations.
DANS is a Dutch institute that provides permanent access to digital research data in the humanities and social sciences. It operates an online archiving system called EASY that encourages researchers to archive and reuse data. DANS also provides access to thousands of datasets through NARCIS.nl and offers training and advice on data management. The presentation discusses challenges in computational history and the need for digital research infrastructures to support collaborative efforts like sharing historical sources and datasets across networks. Infrastructures mentioned include DARIAH and CLARIN, which aim to connect distributed digital materials in the arts and humanities.
Europeana 1914-1918, User-Generated Content and Linked Open DataValentine Charles
This document discusses Europeana 1914-1918, a project that aggregates World War 1 cultural heritage objects from various European countries and provides metadata for the objects. It connects the data using semantic web technologies like the Europeana Data Model and linked open data vocabularies. This allows integration of multilingual and diverse resources about WW1 from different institutions. It developed its own controlled vocabulary translated into several languages and linked to external vocabularies. The project aims to encourage reuse of this linked data by WW1 research communities and hopes to integrate with other datasets to provide a more comprehensive view of WW1.
This document discusses key issues in digital humanities, including the foundations of computing in the humanities. It outlines the early developments of text encoding with Roberto Busa's 1949 project and the later creation of the Text Encoding Initiative (TEI) in 1987. The document also summarizes several case studies that utilize TEI encoding, including projects on the Fine Rolls of Henry III, Herodotus' Histories, Australian texts, and the Transcribe Bentham crowdsourcing initiative.
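TEI encoding marks up textual features in XML so that editorial decisions are explicit and machine-readable. A minimal sketch of that idea, built with Python's standard library: a transcribed line in which an abbreviation and its editorial expansion are both recorded. This is an illustrative fragment, not a complete, schema-valid TEI document.

```python
# Encode a transcribed line where "kg" is an abbreviation the editor
# expands to "king"; TEI's <choice> records both readings side by side.
import xml.etree.ElementTree as ET

line = ET.Element("l")
line.text = "the "
choice = ET.SubElement(line, "choice")
ET.SubElement(choice, "abbr").text = "kg"
ET.SubElement(choice, "expan").text = "king"
choice.tail = " commands"

xml_text = ET.tostring(line, encoding="unicode")
```

Because both readings survive in the markup, a display layer can show either the diplomatic transcription or the edited text from the same source file, which is the core appeal of TEI for projects like Transcribe Bentham.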
Historians write about the past through the lens of their own time and with consideration of dominant interpretations, seeking to produce accurate narratives. The historical record includes all remembered events. History draws on various sources like written documents, oral accounts, archaeology, pictures, and inscriptions. It can be organized chronologically, culturally, territorially, or thematically. Historians have sought practical lessons or pursued knowledge through intellectual curiosity.
This document discusses the field of history. It notes that historians write from the perspective of their own time and aim to interpret the past in a way that provides lessons for their society. The document outlines that the historical record consists of events that are remembered and preserved through written documents, oral accounts, and archaeological artifacts. It also discusses different ways that history can be organized, such as chronologically or thematically. The study of history incorporates methods from both the humanities and social sciences.
DIVE+ is an exploratory search tool for digital humanities research in CLARIAH Media Suite. It provides an event-centric browser for linked historical data that links objects to events and entities and builds automatic storylines. It integrates access to heterogeneous cultural heritage collections, including over 15 million triples from sources like news broadcasts, scans of radio bulletins, images, and metadata. DIVE+ allows exploring media collections, enriching metadata with historical events, and collecting crowd perspectives to support research.
The document announces the launch of Welsh Newspapers Online, a collection of digitized Welsh newspaper articles that will be indexed by the European Library. Over 130 million newspaper pages from around 29,000 titles across Europe have already been digitized and indexed, with 85% available for free. The Welsh Newspapers collection is well-positioned to contribute content and benefit from increased exposure of Welsh language, history, and ideas to an international audience through the European Library's efforts.
The document discusses how museums can reimagine themselves for the future by embracing new technologies and engaging younger audiences. It provides examples of museums using augmented and virtual reality to provide immersive digital experiences that bring visitors closer to the artwork. This allows more people to experience the museums from anywhere. The document also suggests that museums focus on interactive, hands-on, and visual experiences that appeal to younger visitors and keep them engaged by making the exhibits feel relevant and shareable on social media. Reaching out to local youth and reconsidering opening times are also recommended to attract new, younger audiences.
Towards a Graph of Ancient World Data & an Ecosystem of Gazetteers - aboutgeo
Pelagios 3 is a 2-year project funded by the Andrew W. Mellon Foundation to annotate geographic documents predating 1492 by associating places mentioned in them with entries from various gazetteers. It aims to grow a graph of linked ancient world data through annotation; over 39 partners from 6 countries have contributed around 830,000 annotations so far. The project develops tools to annotate documents and link gazetteer entries, and defines profiles for publishing gazetteer links and metadata online to allow cross-searching.
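The annotations described above each tie a place name in a document to a gazetteer entry by URI. A minimal sketch of such a record as a plain dictionary; the field names and URIs here are hypothetical (Pelagios itself follows Open Annotation / Web Annotation conventions).

```python
# Build a hypothetical place annotation: where in the document the name
# occurs (target) and which gazetteer entry it denotes (body).
def annotate_place(doc_uri, quote, char_offset, gazetteer_uri):
    return {
        "target": {"source": doc_uri, "quote": quote, "offset": char_offset},
        "body": gazetteer_uri,
        "motivation": "identifying",
    }

ann = annotate_place(
    "http://example.org/texts/herodotus-1",
    "Halicarnassus",
    42,
    "http://example.org/gazetteer/halicarnassus",
)
```

The graph emerges from aggregation: two documents annotated with the same gazetteer URI become cross-searchable through that shared place, regardless of which partner produced them.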
Oral History, audio-visual materials and Digital Humanities: a new ‘grand cha... - UCL
Oral historians have long recognised that voice and gesture can communicate information, knowledge, emotion and interpretation in ways that text cannot. Indeed, oral history artefacts can be studied not only for the words they contain but also for features like interjections, gestures and silences that can, among other things, contain clues about an interviewee’s emotional state. Nevertheless, it can be argued that this process has not gone as far as is necessary; as Frisch has put it, “Everyone knows that the core audio-visual dimension of oral history is notoriously underutilized” (Frisch 2006, p. 102).
Technological developments---often based on advances made in the Digital Humanities community involving structured and semantic markup---have opened up a plethora of new ways to process audio-visual materials.
However, it is notable that these methods continue to privilege largely text-based approaches to Oral History; after all, what is metadata but natural-language codes inserted into a text in order to make explicit its meaning or constituent parts? Methods being developed in other fields that have as yet seen relatively little take-up in Digital Humanities---for example, image and facial recognition, acoustic approaches to sentiment analysis, 3D imaging and modelling, and digital narratology and storytelling---offer methodologies that could fruitfully be brought to bear on the capture and, especially, the analysis of such sources. Not only might such approaches offer new interpretative strategies for engaging with audio-visual materials---strategies that are neither founded upon nor predominantly focused on text---but they could contribute to a more thorough and sustained reassessment of the dominance of the ‘written’ word in fields like Digital Humanities and Oral History. This paper will explore the possibilities that such developments might open up for Oral History researchers.
Personal bibliography forming a public image of a scientist - Birute Railiene
Information services are experiencing technological change: the possibilities for retrieving and storing data are expanding, and with them the demands users place on those services. Library services have to change to meet the changing needs of users.
Bibliography – the basis of international intellectual cooperation (EC Richardson, 1939) – still
Personal bibliography – instrument to draw a historical portrait of a person, institution, field of science
Personal bibliography – a core for prosopography in the history of science
EAA 2021 476: Ways and capacity in archaeological data management in Serbia - ariadnenetwork
Over the past year, due to the COVID-19 pandemic, the entire world has witnessed inequalities across borders and societies. These include access to archaeological resources, both physical and digital. Archaeological data creators and users alike spent much of their time working from home, away from artefact collections and research data. This was, however, the perfect moment to understand the importance of making data freely and openly available, both nationally and internationally.
This is why the authors of this paper chose a selection of databases from various institutions responsible for the preservation and protection of cultural heritage, in order to understand their policies on the accessibility and use of the data they keep. This will be done simply by visiting the various websites and databases. The authors intend to check the volume, content, and significance of the archaeological heritage on offer, estimate whether that heritage has been adequately classified and described, and check whether the data is available in foreign languages. It needs to be seen whether digital objects (documents and the accompanying metadata) can be accessed, whether access is open to all users or requires some level of authorisation, and what the policies on use, reuse, and distribution are. It also remains to be seen whether public APIs exist and whether data can be collected through them. Where a public API does exist, one needs to check whether its datasets are interoperable or messy, requiring data cleaning.
After visiting a certain number of websites, the authors expect to collect enough data to draw a satisfactory conclusion about the accessibility and use of Serbian archaeological web databases.
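The interoperable-versus-messy check described above can be sketched as a small heuristic: does the endpoint's body parse as JSON, and do all records share the same fields? This is an illustrative proxy for the survey's criteria, not the authors' actual method, and the endpoint it would inspect is hypothetical.

```python
# Classify the raw body of a (hypothetical) collection API endpoint:
# non-JSON means no usable API data; consistent record keys suggest
# interoperable data; inconsistent keys suggest cleaning is needed.
import json

def assess_api_body(body):
    try:
        records = json.loads(body)
    except ValueError:
        return "no usable API data"
    if not isinstance(records, list) or not records:
        return "no usable API data"
    key_sets = {frozenset(r) for r in records if isinstance(r, dict)}
    return "interoperable" if len(key_sets) == 1 else "messy"
```

A real survey would also weigh schema documentation, metadata standards, and licensing, but a field-consistency check like this is a cheap first signal.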
1) DIVE+ is a project that aims to provide interactive exploration and discovery of integrated online multimedia collections using linked open data to connect metadata from various cultural heritage collections.
2) It extracts events, actors, places and other entities from collection metadata using both original thesauri and automated techniques like named entity recognition. These are linked to media objects to support event-centric browsing.
3) Over 350,000 media objects from four collections have been enriched with over 200,000 events and other entities through these techniques. The data is available through a SPARQL endpoint for deep exploration of interconnected entities in the collections.
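A SPARQL endpoint like the one mentioned above answers graph queries over the enriched data. A minimal sketch of the kind of query involved: fetching media objects linked to a named historical event. The property name ex:depictsEvent is a hypothetical stand-in, not DIVE+'s actual data model.

```python
# Build a SPARQL query string for event-centric retrieval: find all
# media objects attached to an event with the given label. (The query
# is only constructed here, not sent to any endpoint.)
def media_for_event_query(event_label):
    return """
    SELECT ?media WHERE {
      ?event rdfs:label "%s" .
      ?media ex:depictsEvent ?event .
    }
    """ % event_label

query = media_for_event_query("Liberation of the Netherlands")
```

Because events are first-class nodes in the graph, the same pattern retrieves news broadcasts, radio bulletin scans, and images together, which is what enables the event-centric browsing described above.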
Edward Vanhoutte - Opening Keynote, TEI 2011 Conference
This document discusses different types of scholarly editions, including maximal editions aimed at expert readers, minimal editions for common readers, and social editions that privilege interpretative changes from many readers through collaborative editing. It also addresses the need to document textual variations, challenges of sustainability for digital editions, and using social media to create a continuously changing knowledge space around a text.
Europeana meeting under Finland’s Presidency of the Council of the EU - Day 1... - Europeana
This document discusses multilingualism in digital cultural heritage. It begins by outlining some of the challenges of multilingual access, including mismatches between user queries and content languages, heterogeneity in queries, and issues with translating metadata. It then discusses some options for bridging the language gap, such as translating queries, content, and metadata; enriching metadata; and adapting systems to better support multilingual exploration. While progress has been made, areas that still need work include improving machine translation for small languages and specialized domains, evaluating solutions, and developing multilingual entity graphs to aid exploration.
State of the Art of the Digital Humanities: Some Research Projects - Gimena Del Rio Riande
Digital humanities projects and research from around the world are summarized. Key points:
- The document discusses the state of digital humanities, including conferences, participants, topics of interest.
- A history of digital humanities and related fields like humanist computing is provided, tracing work from the 1940s through present day.
- Examples of digital humanities centers, projects, resources and debates are outlined to illustrate the breadth and interdisciplinary nature of the field.
Arts and Humanities DH Presentation, October 2016 - Jennifer Dellner
This document summarizes a presentation on digital humanities given by Dr. Jennifer Dellner in October 2016. It defines digital humanities as the intersection of computing and humanities disciplines, involving the investigation and presentation of information in electronic form. It provides examples of digital humanities in practice, including open access textbooks, digital archives and exhibitions, e-literature, student projects, video games, and text analysis tools. The presentation demonstrates how digital tools can be used to study and engage with the humanities.
The LINHD lab at UNED was created in 2014 to use new media and technologies for humanities research and teaching. It has several ongoing projects involving digitizing poetry collections and making them openly accessible online through standards like TEI and linked open data. The lab provides training in digital humanities through a summer school and courses. It also acts as an information hub and collaborates with other national digital humanities centers. The goal is to further scholarship through innovative use of technology.
Designing the Digital Humanities Library Lab @ Leuven (DH3L) – Demmy Verbeke
This document discusses the design of the Library Lab at KU Leuven. It begins by defining digital humanities as involving three groups: programmers, scholars, and libraries/repositories. It then discusses the role of libraries in digital humanities, including preservation, digitization, discovery/dissemination, and managing data. Reasons for having a digital humanities center are given, such as collecting expertise, enabling funding and stability for projects, and fostering collaboration. Digital humanities centers provide training, workshops, collections, tools, and research support, and act as hubs connecting technology and scholars. Some centers are based in libraries. The document concludes by introducing the new Library Lab at KU Leuven.
Digital Humanities as Innovation: ‘constant revolution’ or ‘moving to the suburbs’? – Andrea Scharnhorst
Andrea Scharnhorst & Sally Wyatt
Paper given at the "New Trends in eHumanities" Research Meeting of the eHumanities group, 4 June 2015
Digital Humanities as Innovation: ‘constant revolution’ or ‘moving to the suburbs’?
The CulturePlex Lab at Western University studies complexity in cultural systems through an interdisciplinary approach. The lab applies complexity theory to analyze aspects of the Hispanic Baroque, such as its constitution, religious expressions, and urban aspects. By using computational tools, the lab seeks to understand the flow of cultural ideas across time and space. The lab also develops tools and projects like virtual language learning labs and mobile learning games to advance knowledge mobilization and digital literacy.
Digital public history utilizes new communication technologies like computers and the web to examine and represent the past to wider audiences. It draws on features of the digital realm like databases, hyperlinks, and networks to create and share historical knowledge. Digital public history aims to make history accessible to the public outside of academic environments through collaborative projects and various digital platforms and tools. It can democratize history by incorporating more voices and encouraging public participation. Some key aspects of digital public history include creating digital archives, using crowdsourcing, developing online exhibitions, and engaging communities through shared digital spaces.
The opportunistic librarian (DH2014, Lausanne) – Demmy Verbeke
The opportunistic librarian: A Leuven confession discusses the role of libraries in supporting digital humanities. It provides examples of how KU Leuven University Library supports digital humanities through projects involving digitization, text analysis, and more. The library aims to focus on digitization projects, grant support, collaborating in digital humanities projects, training, and its role in scholarly communication. This allows the library to reinvent its mission and better support research through new opportunities in digital humanities.
Digital cultural heritage as humanities data: a labs approach – Sally Chambers
This presentation was given on 17th April 2020 as part of a #DH Hangout (during the Corona Virus) instigated by Lancaster University Digital Humanities Hub and Co-Organised by the Ghent Centre of Digital Humanities and the Digital Humanities Lab (DH_Lab) associated with NOVA-FCSH of Universidade NOVA de Lisboa.
Du Literary and linguistic computing aux Digital Humanities: retour sur 40 ans... (From Literary and Linguistic Computing to the Digital Humanities: Looking Back on 40 Years...) – OpenEdition
The document discusses the evolution of digital humanities from literary and linguistic computing to humanities computing to digital humanities. Key points include:
1) Digital technologies have transformed humanities scholarship by making objects of study digital and changing research methods.
2) Early work in literary and linguistic computing in the 1960s-1980s used computers to analyze texts but was only accessible to technical experts.
3) Humanities computing from the 1980s-1990s saw institutionalization and standardization through projects like the Text Encoding Initiative (TEI).
4) Digital humanities from the 1990s onward has been shaped by increased digitization, collaboration, and the development of new infrastructures and approaches for linking and analyzing digital resources.
From cultural awareness to cultural heritage – Ana Monteiro
The document discusses building a framework for teaching materials on cultural awareness and cultural heritage. It argues that curricula should prepare students to respect cultural differences and appreciate diverse cultures. Teachers should develop self-awareness of their own culture first before teaching about others. When selecting cultural heritage sites to represent in teaching, it is important to consider which periods, groups and minorities are represented or omitted to avoid an imbalanced emphasis on majority cultures.
The CrossCult project aims to empower the reuse of digital cultural heritage through context-aware connections across European history. Led by Antonis Bikakis at UCL, the project will develop technologies like augmented reality, geolocation, and personalized narratives to facilitate new interpretations of history across borders. Running from 2016-2019 with partners in several countries, CrossCult will explore how facts can be interpreted differently through meta-history research and pilots connecting multiple cultural heritage sites and cities. The goal is to foster changed perspectives on history through technology-enabled experiences.
Cultural heritage: Tradition, Museums and Wikis – Thomas Tunsch
The document discusses knowledge management in museums and their use of wikis. It describes how museums collect objects and documentation, create knowledge, and present information to the public. Wikis also collect data and document discussions to generate articles and build categories. Museums and wikis both involve collaborative communities that research, document, and publish information. The document examines how scholars can be involved in these collaborative activities and how museum documentation and research can benefit wiki communities.
From Catalogue 2.0 to the digital humanities: exploring the future of libraries... – Sally Chambers
This document discusses the evolving role of libraries and librarians in supporting digital scholarship and the digital humanities. It describes how traditional cataloguing tools like MARC are changing to incorporate new metadata standards and linked data. Research libraries' engagement with research infrastructures has been low but is increasing as opportunities arise in areas like research data management, digital repositories, and scholarly communication. The document argues libraries have important roles to play in discovery, data management, and as embedded partners supporting digital humanities researchers and their evolving needs. Collaboration between libraries and digital humanities centers is highlighted as a way to advance both fields.
The MA in Digital Humanities at King's College London looks at how we create and disseminate knowledge in an age where so much of what we do is mobile, networked and mediated by digital culture and technology.
It gives a critical perspective on digital theory and practice in studying human culture, from the perspectives of academic scholarship, cultural heritage and the commercial world.
We study the history and current state of the digital humanities, and their role in modelling, curating, analysing and interpreting digital representations of human culture in all its forms.
For more information: http://www.kcl.ac.uk/artshums/depts/ddh/study/pgt/madh/index.aspx
The document discusses cultural awareness, cultural heritage, and cultural heritage education. It outlines aims to promote cultural awareness through developing abilities such as observing and participating in other cultures. It notes the need to avoid an ethnocentric perspective and instead immerse participants in other cultures. Regarding cultural heritage, it finds an overrepresentation of certain periods, elites, religions, and regions in the European cultural heritage list. It questions whose heritage is represented and which groups may be forgotten. It asks how teachers can incorporate cultural heritage education and empower diversity through their teaching materials and curriculum.
Libraries, research infrastructures and the digital humanities: are we ready... – Sally Chambers
This document discusses libraries and their potential role in supporting digital humanities research infrastructures. It describes how libraries could help manage data, serve as embedded librarians working directly with researchers, assist with digitization and curation efforts, and help with the discovery and dissemination of digital scholarship. The document emphasizes that libraries need to adopt a researcher-centric approach and form truly equitable collaborations in order to meaningfully contribute to digital humanities work.
Data versus Text: 30 years of confrontation – Lou Burnard
The document discusses the evolution of humanities computing and digital humanities from the 1940s to the present. Key points include:
- Early work in literary and linguistic computing in the 1940s-1980s focused on concordances, statistics, and analyzing texts as data.
- Humanities computing from 1980-1994 saw the rise of encoding standards, digital libraries and resources, and debate around whether it was a discipline.
- Digital humanities from 1995 onwards was driven by the rise of the web and mass digitization, requiring new collaborative and open infrastructures and practices.
- Current work focuses on combining text analysis with other data types, moving beyond documents to networked resources, and producing "uncritical editions".
(Un)writing the histories of Humanities Computing(s)
1. (Un)writing the histories of Humanities Computing(s)
Edward Vanhoutte
Director of Research & Publications, Royal Academy of Dutch Language & Literature
Head, Centre for Scholarly Editing and Document Studies
Research Associate, UCL Centre for Digital Humanities
edward.vanhoutte@kantl.be
@evanhoutte
16. At a moment when the academy in general and the humanities in particular are the object of massive and wrenching changes, digital humanities emerges as a rare vector for jujitsu, simultaneously serving to position the humanities at the very forefront of certain value-laden agendas—entrepreneurship, openness and public engagement, future-oriented thinking, collaboration, interdisciplinarity, big data, industry tie-ins, and distance or distributed education—while at the same time allowing for various forms of intra-institutional mobility as new courses are mooted, new colleagues are hired, new resources are allotted, and old resources are reallocated.
Matthew Kirschenbaum
21. ● Discover, confirm and exemplify how computing affects analysis, so that the basic case for humanities computing is clear and persuasive across the disciplines.
● Identify and cultivate kinships with the disciplines, so that humanities computing is informed by the collective ways of knowing they have cultivated.
22. ● Cultivate and exercise the ability to explain the essentials of humanities computing to non-specialist colleagues and to the general public.
23. ● Develop as a prevalent habit and as a serious, essential aspect of work the strongly conversational mode of scholarly publication exemplified by Humanist and other Internet forums.
24. ● Write the ethnography of collaborative engagements to document how successful collaborations happen and how perceptions change in the encounter of the humanities with computing.
● Develop a genuine historiography of humanities computing from existing chronologies; begin writing histories of the field.
28. ●History & computing (Adman, 1987)
● Computing in musicology, 1966-1991 (Hewlett & Selfridge-Field, 1991)
● Statistical analysis of literature in CHum, 1966-1990 (Potter, 1991)
● A Companion to Digital Humanities (2004)
● Archaeology
● Art history
● Classics
● History
● Lexicography
● Linguistics
● Literary studies
● Music
● Multimedia
● Performing arts
● Philosophy and religion
30. Humanities Computing = Semantic Primitive
Historical Knowledge ≈ Definition
Two problems:
1. The chronology of the definition
2. The defining power of the chronology