Kawecki, Barbara, and Michael Levine-Clark, “NISO’s DDA Initiative: Cross-Industry Stakeholders Express PDA to Improve the Landscape for All,” Charleston Conference, Charleston, S.C., November 9, 2012.
Data grids are an emerging technology that enables the formation of sharable collections from data distributed across multiple storage resources. The integrated Rule Oriented Data System (iRODS) is a data grid developed by the DICE Center at UNC-CH. The iRODS data grid enforces management policies that control properties of the collection. Examples of policies include retention, disposition, distribution, replication, metadata extraction, time-dependent access controls, data processing, data redaction, and integrity checking. Policies can be defined that automate administrative functions (file migration and replication) and that validate assessment criteria (authenticity, integrity, chain of custody). iRODS is used to build data sharing environments, digital libraries, and preservation environments. The iRODS data grid is used at UNC-CH to support the Carolina Digital Repository, the LifeTime Library for the School of Information and Library Science, data grids for the Renaissance Computing Institute (RENCI), collaborations within North Carolina, and both national and international data sharing. At RENCI, the TUCASI data grid supports shared collections between UNC-CH, Duke, and NCSU. The RENCI data grid is federated with ten other data grids including the National Climatic Data Center, the Texas Advanced Computing Center data grid, and the Ocean Observatories Initiative data grid. International applications include the CyberSKA Square Kilometer Array for radio astronomy and the French National Institute for Nuclear Physics and Particle Physics. The collections that are assembled may contain hundreds of millions of files, and petabytes of data. A specific goal is the integration of institutional repositories with the national data infrastructure that is being assembled under the NSF DataNet program. The software is available as an open source distribution from http://irods.diceresearch.org.
Reagan Moore, UNC-RENCI; Policy-based Data Management; RDAP11 Summit
The 2nd Research Data Access and Preservation (RDAP) Summit
An ASIS&T Summit
March 31-April 1, 2011 Denver, CO
In cooperation with the Coalition for Networked Information
http://asist.org/Conferences/RDAP11/index.html
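Two of the policy types the abstract lists, integrity checking and replication, can be sketched as plain functions. This is an illustrative sketch only; the names and data model are assumptions, not iRODS's actual rule language.

```python
import hashlib

def integrity_policy(content: bytes, recorded_sha256: str) -> bool:
    """Recompute the checksum and compare it with the recorded fixity value."""
    return hashlib.sha256(content).hexdigest() == recorded_sha256

def replication_policy(name: str, replicas: dict, min_copies: int = 2) -> None:
    """Ensure at least min_copies storage resources hold a copy of `name`.

    replicas maps a resource name to the set of object names stored there.
    """
    holders = [r for r, objs in replicas.items() if name in objs]
    for resource, objs in replicas.items():
        if len(holders) >= min_copies:
            break
        if resource not in holders:
            objs.add(name)            # create the missing replica
            holders.append(resource)
```

In a real grid the engine would fire such rules on events (ingest, periodic audit); the sketch only shows the policy-as-code idea.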
DocuFile enables automatic, policy-based, and revision-safe archiving of files while at the same time relieving file servers by migrating rarely used files to low-cost storage systems. The migrated files are automatically indexed at archiving time and remain permanently under the control of information lifecycle management.
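The migration-plus-indexing behavior described above can be sketched in a few lines. This is a hypothetical illustration of the general technique; the function name, data shapes, and the 180-day threshold are assumptions, not DocuFile's API.

```python
def migrate_cold_files(files: dict, index: dict, now: int,
                       max_idle_days: int = 180) -> dict:
    """Move files idle longer than the cutoff to a low-cost tier.

    files maps name -> (last_access_epoch, tier); each migrated file is
    indexed at archiving time. Returns the updated mapping.
    """
    cutoff = now - max_idle_days * 86400
    result = {}
    for name, (last_access, tier) in files.items():
        if tier == "primary" and last_access < cutoff:
            index[name] = {"archived_at": now}      # index while archiving
            result[name] = (last_access, "low_cost")
        else:
            result[name] = (last_access, tier)
    return result
```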
RDM Roadmap to the Future, or: Lords and Ladies of the Data (Robin Rice)
Story of the new 2017-2020 University of Edinburgh RDM Roadmap, with a Tolkienesque theme for IASSIST-CARTO 2018 in Montreal: "Once upon a data point: sustaining our data storytellers".
Slides from Emily Stambaugh's keynote presentation at the "Looking to the Future of Shared Print" session held at the ALA Annual Conference on June 27, 2014 in Las Vegas, NV.
An analysis and characterization of DMPs in NSF proposals from the University... (Megan O'Donnell)
Beginning in July 2011, the University of Illinois at Urbana-Champaign Library, working in conjunction with the campus Office of Sponsored Programs and Research Administration (OSPRA), began an analysis of Data Management Plans (DMPs) in newly submitted National Science Foundation (NSF) grant proposals. The DMP became a required element in all NSF proposals on January 18, 2011. This analysis was undertaken to provide the Illinois campus and library with detailed information on the DMPs being submitted by Illinois researchers. In particular, the analysis allows us to categorize grant applicants' proposed DMP data storage venues and data reuse mechanisms, and provides us with data on the use of the DMP templates developed by the University of Illinois Library.
RDAP14: An analysis and characterization of DMPs in NSF proposals from the Un... (ASIS&T)
Research Data Access and Preservation Summit, 2014
San Diego, CA
March 26-28, 2014
Lightning Talks
William Mischo, University of Illinois at Urbana-Champaign
• One of the most important decisions a distributed database designer has to make is data placement. Proper data placement is a crucial factor in determining the success of a distributed database system.
• There are four basic alternatives:
– centralized,
– replicated,
– partitioned, and
– hybrid.
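The four alternatives can be illustrated with a toy placement function. This sketch is not tied to any particular DBMS; the round-robin partitioning and the two-copy hybrid scheme are my own simplifications.

```python
def place(records: list, nodes: list, strategy: str) -> dict:
    """Return {node: [records]} under one of the four placement strategies."""
    placement = {n: [] for n in nodes}
    if strategy == "centralized":        # everything at a single site
        placement[nodes[0]] = list(records)
    elif strategy == "replicated":       # a full copy at every site
        for n in nodes:
            placement[n] = list(records)
    elif strategy == "partitioned":      # each record at exactly one site
        for i, r in enumerate(records):
            placement[nodes[i % len(nodes)]].append(r)
    elif strategy == "hybrid":           # partition, then replicate each
        for i, r in enumerate(records):  # fragment onto two sites
            placement[nodes[i % len(nodes)]].append(r)
            placement[nodes[(i + 1) % len(nodes)]].append(r)
    return placement
```

The trade-off the designer weighs: centralized is simple but a bottleneck, replicated favors reads at the cost of update propagation, partitioned spreads load but complicates cross-site queries, and hybrid balances the two.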
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides from Nordic Testing Days, 6.6.2024.
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at a smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Similar to Storage Strategies at the University of Iowa (20)
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
My and Rik Marselis's slides from the DASA Connect conference, 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps means. We also held a lovely workshop in which the participants explored different ways to think about quality and testing in the different parts of the DevOps infinity loop.
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behavior in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux tools: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security-analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns, and DIAR helps you find such seeds.
- These are slides of the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW) 2022.
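The core idea of dropping seed bytes that do not affect program behavior can be sketched with a single greedy pass. This is an illustration of the general seed-trimming technique, not DIAR's actual algorithm; `toy_coverage` is an invented stand-in for real branch coverage.

```python
def trim_seed(seed: bytes, coverage) -> bytes:
    """Greedily drop each byte whose removal leaves the target's
    coverage fingerprint unchanged (one left-to-right pass)."""
    baseline = coverage(seed)
    out = bytearray(seed)
    i = 0
    while i < len(out):
        candidate = out[:i] + out[i + 1:]
        if coverage(bytes(candidate)) == baseline:
            out = candidate          # byte was uninteresting: remove it
        else:
            i += 1                   # byte matters: keep it
    return bytes(out)

def toy_coverage(data: bytes) -> frozenset:
    """Pretend coverage: which branches of a tiny parser this input hits."""
    hits = set()
    if data.startswith(b"<"):
        hits.add("open")
    if b">" in data:
        hits.add("close")
    if b"xml" in data:
        hits.add("kw")
    return frozenset(hits)
```

Running `trim_seed(b"<xml junk>", toy_coverage)` strips the filler and keeps only the bytes the toy parser actually reacts to, the same effect DIAR aims for on real seeds with real coverage feedback.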
Communications Mining Series - Zero to Hero - Session 1 (DianaGray10)
This session provides an introduction to UiPath Communication Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, along with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution-engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! (SOFTTECHHUB)
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
How to Get CNIC Information System with Paksim Ga.pptx (danishmna97)
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Generative AI Deep Dive: Advancing from Proof of Concept to Production (Aggregage)
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
GridMate - End to end testing is a critical piece to ensure quality and avoid... (ThomasParaiso2)
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of those features provide convenience and capability at the expense of security. This best-practices guide outlines steps users can take to better protect personal devices and information.
UiPath Test Automation using UiPath Test Suite series, part 5 (DianaGray10)
Welcome to part 5 of the UiPath Test Automation using UiPath Test Suite series. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
2. Central File Storage
• Base entitlement: 1–5 GB for individuals; 1 GB per FTE for departments
• 4-hour recovery objectives: asynchronous replication to an identical system at a remote data center
• 99.97% uptime: less than 1 hour of downtime in the past year
• 89% participation: 33K of 37K eligible users
3. The Effect
• "One price fits all" network file storage
• Lower-cost network file storage
• Lowest-cost bare server storage
4. Storage Tiers
• High Performance Enterprise SAN: target users: centrally managed servers; funding model: centrally funded; monthly price per gigabyte: N/A
• Central File Storage: target users: end users or distributed servers; funding model: base entitlement, pay for more; monthly price per gigabyte: $0.12 – $0.25
• Low Cost SAN: target users: distributed servers; funding model: pay for use; monthly price per gigabyte: $0.03 – $0.14
http://its.uiowa.edu/spa/storage/
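As a quick worked example of the per-gigabyte price bands above, a hypothetical helper that brackets the monthly bill for a given amount of storage (the tier keys and function name are my own, not from the slide):

```python
# Published price bands, in $ per GB per month, from the tier table.
PRICE_PER_GB = {
    "central_file": (0.12, 0.25),   # Central File Storage
    "low_cost_san": (0.03, 0.14),   # Low Cost SAN
}

def monthly_cost_range(tier: str, gigabytes: float) -> tuple:
    """Return the (low, high) monthly cost in dollars for a tier."""
    low, high = PRICE_PER_GB[tier]
    return (round(gigabytes * low, 2), round(gigabytes * high, 2))
```

For example, 100 GB on Central File Storage falls between $12 and $25 per month, while the same amount on the Low Cost SAN runs $3 to $14.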
7. Campus Storage by Provider
• Information Technology Services: 42%
• College of Engineering: 27%
• Institute for Clinical and Translational Science: 12%
• College of Public Health: 8%
• UI Libraries: 4%
• Other: 7%
8. Storage by University Mission
• Administration & Overhead: 35%
• Research: 51%
• Teaching: 11%
• Public Service: 3%
10. Issues
• Research storage: fragmented; poor data protection
• Backup storage: inefficient; no central service
• Archival storage: few options; shifts to online or backup storage
• Cost
11. Possible Next Steps
• Identify and promote research solutions
• Build a central backup service
• Build and promote archival solutions