Challenges and Best Practices for Storing/Archiving Data in the Petroleum and Gas Exploration and Production Industry
This white paper discusses challenges with storing and archiving data in the petroleum and gas exploration industry and presents solutions from NetApp and Interica. The challenges include rapidly growing storage needs, high costs of periodically remastering tape archives due to data deterioration and system obsolescence, risks of data corruption with tape, and slow data recovery from tape. New disk-based storage technologies like data compression, deduplication, and RAID can help address these challenges by providing better data management, protection, and faster access compared to tape-based solutions. NetApp and Interica provide integrated data protection, management and archiving solutions leveraging these disk technologies.
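To make one of these disk-side technologies concrete, here is a minimal sketch of block-level deduplication: blocks are identified by a content hash and stored only once, so repeated copies of the same survey data consume the space of a single copy. The 4 KB block size and SHA-256 hash are illustrative assumptions, not details of NetApp's implementation.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size; real systems vary


def deduplicate(data: bytes, store: dict) -> list:
    """Split data into fixed-size blocks and keep only unique ones.

    Returns the list of block hashes (the 'recipe' needed to rebuild
    the data); `store` maps hash -> block and is shared across all
    files written to the archive.
    """
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # physically write the block only once
        recipe.append(digest)
    return recipe


def rebuild(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its block recipe."""
    return b"".join(store[h] for h in recipe)
```

Archiving a second copy of the same file adds only a new (small) recipe; no new blocks are stored, which is why deduplication pays off most on archives with many near-identical datasets.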
This white paper provides an overview of EMC's data protection solutions for the data lake - an active repository to manage varied and complex Big Data workloads
This white paper, written by Josh Krischer, is an independent lab validation of ProtecTIER’s Native Replication capabilities; it describes Mr. Krischer’s hands-on experience with the solution and the benefits customers can receive by implementing a replicated deduplication solution.
3-2-1 Action! Running OpenStack Shared File System Service in Production (Sean Cohen)
As OpenStack’s Shared File System Service gains adoption as one of the top emerging projects in OpenStack deployments (according to the latest OpenStack Foundation user survey), we share key customer use cases such as DevOps, containers, and enterprise applications, and review the latest Newton release updates toward delivering production-grade deployments.
Slides from OpenStack Summit Barcelona, October 25, 2016
Session video: https://www.youtube.com/watch?v=F5o-EbESNr8
CyberTexas Job Fair Job Seeker Handbook, August 23, 2016, San Antonio, Texas (ClearedJobs.Net)
Cyber security professionals and students, please join us at the CyberTexas Job Fair to meet 22 employers, network with other cleared professionals, and have your resume professionally reviewed. A security clearance is not required. The Job Seeker Handbook contains a listing of all employers and the positions they will be seeking to fill at the Cyber Job Fair. If you are a cleared cyber security professional, please pre-register at: https://clearedjobs.net/job-fair/fair/50/. If you are a non-cleared cyber security professional or student, please pre-register at: https://cybersecjobs.com/job-fair/fair/50/. The CyberTexas Job Fair is free, and registration is separate from the CyberTexas conference.
NetApp FlexPod Converged Infrastructure solution - Summer 2013 releases. New designs and non-disruptive operations for mid-sized businesses, enterprises, and now Big Data applications. The presentation includes news on the expanded FlexPod family, the latest validated designs, recent awards, and a new iPad app.
Join the discussion with NetApp and CA Technologies storage experts and learn how you can store more, spend less, and increase business agility – yes, it’s possible! Discover how flexible on-premises and cloud innovations are helping organizations lower operating costs and reduce the administrative overhead associated with managing and scaling z Systems storage. Learn how you can accelerate business responsiveness by using the cloud to efficiently store, access, retrieve, and recover z/OS data. Join us for an informative discussion and hear about new features and strategic plans that can help your business reduce the total cost of ownership (TCO) for storing and managing your mainframe data and increase your overall efficiency.
For more information, please visit http://cainc.to/Nv2VOe
On 10 November 2016, SolidFire (a NetApp company) and Carrenza presented the benefits of SLA-based, high-performance cloud storage in Amsterdam, the Netherlands.
Agenda
SolidFire: Andy Roberts, Principal Architect
Carrenza: Matt McGrory, Managing Director
CommuniGator: Aaron Yates, CIO
DISCUSSION THEMES
Guaranteeing IOPS in the cloud
Predictable storage performance - backed by SLAs
Customer reference case CommuniGator
Non-disruptive linear scalability
Economies of scale
Increased efficiencies
Fast and predictable performance
Speaker profiles
Andy Roberts | Principal Architect, SolidFire, Inc. now part of NetApp
Andy is a systems engineer at heart, with a strong understanding of new technologies and the benefits and value they bring to customers, partners, and colleagues.
Aaron Yates | Marketing Automation Solutions - Technical Director CommuniGator
Aaron is responsible for the creation and delivery of CommuniGator's product portfolio to the marketplace. His specialist interests are marketing automation and SaaS delivery, and building a digital marketplace around you in which customers value the benefits of your products.
Matthew McGrory, Managing Director, Carrenza
Carrenza is an IaaS and PaaS focused Cloud Service Provider. We help our customers run their critical business applications and revenue generating systems on our enterprise cloud architecture.
Notable customers include Comic Relief, Government Digital Service, De Bijenkorf, IOVIO, Royal Bank of Scotland and CommuniGator.
----
About SolidFire:
SolidFire was born out of the largest cloud infrastructures in the world and was built to solve the challenge of delivering high-performance applications from a multi-tenant infrastructure. Founded in 2010, SolidFire is a market leader in all-flash storage systems built for the next-generation data center, where simple scaling, set-and-forget management, assured performance and multi-tenancy, and cloud economic models are driving new market growth.
www.solidfire.com
About CommuniGator
Established in 2005, CommuniGator have become one of the leading email marketing software providers in the UK. Owned exclusively by our founders & staff, we have no one else to satisfy but our clients. That’s why we treat email marketing with the attention it deserves. The features & functions of our marketing automation software have been developed to make the marketer’s life both easier and more effective.
www.communigator.co.uk
PROACT SYNC 2013 - Breakout - The benefits of a FlexPod converged datacenter infrastructure (Proact Netherlands B.V.)
Breakout session during Proact's SYNC 2013.
The benefits of a FlexPod converged datacenter infrastructure, according to analyst Forrester
Geert van Teylingen
Strategy and Consulting Benelux
OpenNebulaConf 2016 - Budgeting: the Ugly Duckling of Cloud Computing? by Mat... (OpenNebula Project)
After more than one year since the start of the operational phase, it is time for the Leibniz Supercomputing Centre to reconsider the usage model of its cloud infrastructure based on OpenNebula. Budgeting is the tool of choice to regulate the access to the resources and to translate the diverse access priorities into allocation policies. This talk will focus on the use case of a resource provider for research and education, giving an overview of the current needs together with a proposed solution.
This Fall, FlexPod, the #1 Worldwide Integrated Infrastructure, is releasing new validated designs for large multi-tenant Clouds and enterprise Business Continuity, and is enhancing the ways to automate FlexPod management. Also for the first time since program inception, FlexPod is expanding the Cooperative Support program to include Citrix.
The tape industry began in 1952 and the disk industry in 1956. In 1952, the world’s first successful commercial tape drive was delivered, the IBM 726, with 12,500 bytes of capacity per reel. In 1956, the world’s first disk drive was delivered by IBM, the RAMAC 350, with 5 megabytes of capacity. Though no one knew it at the time, two key and lasting events linking disk and tape for the foreseeable future had just occurred.
Mastering Backup and Disaster Recovery: Ensuring Data Continuity and Resilience (MaryJWilliams2)
Discover the essential strategies and tools for effective backup and disaster recovery. Learn how to safeguard your data against unexpected events and ensure business continuity. Explore the latest technologies and best practices in backup and disaster recovery management. To know more: https://stonefly.com/white-papers/backup-disaster-recovery-solutions-governments/
XD planning guide - storage best practices (Nuno Alves)
The Citrix storage planning guide provides a list of best practices, recommendations, and performance-related tips that cover the most critical areas of storage integration with Citrix XenDesktop. It is not intended as a comprehensive guide for planning and configuring storage infrastructures, nor as a storage training handbook.
Due to scope, this guide provides some device-specific information. For additional device-specific configuration, Citrix suggests reviewing the storage vendor’s documentation and hardware compatibility list, and contacting the vendor’s technical support if necessary.
To some, tape storage may seem like an outdated technology in the era of NAS and object-based storage. But, here’s a surprise: tape today is more relevant than ever. Even the most modern data centers can benefit from its low cost of ownership, scalability, reliability, and security. In our on-demand webinar, Storage Switzerland is joined by Spectra Logic, Fujifilm, and Iron Mountain to discuss why tape use shouldn’t just continue but actually expand, including in hybrid cloud environments.
Achieve Higher Quality Decisions Faster for a Competitive Edge in the Oil and... (Hitachi Vantara)
Hitachi next-generation unified storage solutions meet the challenges of today’s data-intensive oil and gas exploration and production activities. For more information on Hitachi Unified Storage and Hitachi NAS Platform 4000 series please visit: http://www.hds.com/products/file-and-content/network-attached-storage/?WT.ac=us_mg_pro_hnasp
Tape and cloud storage targets have their pros and cons. There are many differences between these two technologies, which we will explore in this paper. These differences can steer the decision process you may have for getting virtual machine (VM) backups offsite with Veeam® Backup & Replication™.
A New Approach to Digital News Archiving (Nicolas Hans)
The move to tapeless production environments creates new challenges for the production and re-use of news archives. Too often, newscasters focus their energy on the choice of a relevant storage system. However, the true challenge lies in the consistent aggregation of descriptive metadata and associated digital rights information. This paper discusses several case studies and suggests a new approach to digital news archiving: one that will get approval and support from both production and management teams.
Managing Information Storage: Trends, Challenges, and Options (2013-2014) (White Paper) (EMC)
How are IT and storage managers coping with the organizational challenges posed by the explosion of data, increasing criticality of digitized information, and rapid introduction of new information infrastructure technologies, virtualization, and cloud?
This updated paper contains the findings of a study based on input from over 1,000 storage professionals and managers across 800+ organizations worldwide.
This research will assist IT/storage managers in comparing and correlating their environment and plans with the overall trends in the industry and the impact of emerging technologies such as storage virtualization and cloud computing.
This short paper discusses the work happening in the Fibre Channel Industry Association's T-11 committee to develop a new low latency protocol for a flash drive world. This paper is an excellent introduction to it.
The Evolution of IP Storage and Its Impact on the Network (EMC)
This white paper addresses the increased stresses on infrastructure resulting from virtualization and the importance of attending to infrastructure design and support, especially with regard to the network.
Asperitas CEO Rolf Brink presented a vision on the datacentre of the future including innovative datacentre design concepts during the Datacentre Transformation event in Manchester on the 11th of July 2017.
DevOps the NetApp Way: 10 Rules for Forming a DevOps Team (NetApp)
Does your enterprise IT organization practice DevOps without a common team approach? To create a standardized way for development and operations teams to work together at NetApp, the IT team differentiates a DevOps team from a regular development team based on these 10 rules.
Spot Lets NetApp Get the Most Out of the Cloud (NetApp)
Prior to NetApp acquiring Spot.io, two of its IT teams had adopted Spot in their operations: Product Engineering for Cloud Volumes ONTAP test automation and NetApp IT for corporate business applications. Check out the results in this infographic.
NetApp has fully embraced tools that allow for seamless, collaborative work from home, and as a result was fully prepared to minimize COVID-19's impact on how we conduct business. Check out this infographic for a look at results from the new remote work reality.
4 Ways FlexPod Forms the Foundation for Cisco and NetApp Success (NetApp)
At Cisco and NetApp, seeing our customers succeed in their digital transformations means that we’ve succeeded too. But that’s only one of the ways we measure our performance. What’s another way? Hearing how our wide-ranging IT support helps Cisco and NetApp thrive. Here’s what makes FlexPod an indispensable part of Cisco’s and NetApp’s IT departments.
With the widespread adoption of hybrid multicloud as the de facto architecture for the enterprise, organizations everywhere are modernizing to deliver tangible business value around data-intensive applications and workloads such as AI-driven IoT and hyperledgers. Shifting from on-premises to public cloud services and private clouds, and moving from disk to flash – sometimes concurrently – opens the door to enormous potential, but also the unintended consequence of IT complexity.
With the widespread adoption of hybrid multicloud as the de facto IT architecture for the enterprise, organizations everywhere are modernizing to deliver tangible business value around data-intensive applications and workloads such as AI-driven IoT and indelible ledgers.
10 Reasons Why Your SAP Applications Belong on NetApp (NetApp)
NetApp has been supporting SAP for 20 years, delivering advanced solutions for SAP applications. Here are 10 reasons why your SAP applications belong on NetApp!
Redefining HCI: How to Go from Hyper Converged to Hybrid Cloud Infrastructure (NetApp)
The hyper converged infrastructure (HCI) market is entering a new phase of maturity. A modern HCI solution requires a private cloud platform that integrates with public clouds to create a consistent hybrid multi-cloud experience.
During this webinar, NetApp and an IDC guest speaker covered what led to the next generation of hyper converged infrastructure and which five capabilities are required to go from hyper converged to hybrid cloud infrastructure.
As we enter 2019, what stands out is how trends in business and technology are connected by common themes. For example, AI is at the heart of trends in development, data management, and delivery of applications and services at the edge, core, and cloud. Also essential are containerization as a critical enabling technology and the increasing intelligence of IoT devices at the edge. Navigating the tempests of transformation are developers, whose requirements are driving the rapid creation of new paradigms and technologies that they must then master in pursuit of long-term competitive advantage. Here are some of our perspectives and predictions for 2019.
Artificial Intelligence Is a Top Management Priority in German Companies (NetApp)
According to a recent survey by NetApp, the leading data management specialist in the hybrid cloud, artificial intelligence (AI) is becoming increasingly relevant in German companies.
Hyperconvergence: How It Improves the Economics of Your IT (NetApp)
In this NetApp webinar we present how NetApp HCI helps improve the economics of IT: accelerating and assuring performance for each application; simplifying your data center and making your architecture more scalable by reducing waste; implementing and expanding your HCI infrastructure quickly and inexpensively; and making management simpler and more intuitive, saving time and using the skills you already have in your company.
NetApp IT’s Tiered Archive Approach for Active IQ (NetApp)
NetApp AutoSupport technology proactively monitors the health of NetApp systems installed at customer’s location and provides 24/7 actionable intelligence to optimize their storage environment. The amount of data received back to NetApp doubles approximately every 16 months. To manage the swelling waves of data to archive, NetApp IT sought a more flexible solution.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
GraphRAG is All You Need? LLM & Knowledge Graph (Guy Korland)
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Generative AI Deep Dive: Advancing from Proof of Concept to Production (Aggregage)
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Smart TV Buyer Insights Survey 2024 (91mobiles)
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Pushing the limits of ePRTC: 100ns holdover for 100 days (Adtran)
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
UiPath Test Automation using UiPath Test Suite series, part 4 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
A tale of scale & speed: How the US Navy is enabling software delivery from l... (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATOs (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
SAP Sapphire 2024 - ASUG301: Building Better Apps with SAP Fiori (Peter Spielvogel)
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
PHP Frameworks: I Want to Break Free (IPC Berlin 2024) (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps means. We also held a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
White Paper
Challenges and Best Practices for Storing/
Archiving Data in the Petroleum and Gas
Exploration and Production Industry
Peter Ferri and Erik Mulder, NetApp
Chris Bearce, Interica
May 2012 | WP-7098
TABLE OF CONTENTS
1 EXECUTIVE SUMMARY............................................................................................................. 3
2 CHALLENGES ............................................................................................................................ 3
STORAGE VOLUME AND RETENTION.................................................................................................................. 3
DATA REMASTERING AND SYSTEM OBSOLESCENCE ...................................................................................... 4
DATA CORRUPTION .............................................................................................................................................. 4
DATA RECOVERY SPEED ..................................................................................................................................... 4
DISTRIBUTED DATA .............................................................................................................................................. 5
3 COST COMPARISON: TAPE VERSUS DISK ........................................................................... 5
4 TECHNOLOGY ENABLERS ...................................................................................................... 5
BETTER MANAGEMENT AND CONTROL OF DATA ............................................................................................. 6
DATA COMPRESSION ............................................................................................................................................ 6
DEDUPLICATION .................................................................................................................................................... 7
METADATA AND DATA MANAGEMENT ............................................................................................................... 7
DATA ENCRYPTION FOR SECURITY .................................................................................................................... 7
IMPROVED RELIABILITY ....................................................................................................................................... 7
DATA CLASS STORAGE MANAGEMENT ............................................................................................................. 7
5 NETAPP AND INTERICA SOLUTIONS ..................................................................................... 8
INTERICA PARS ..................................................................................................................................................... 8
INTERICA SMARTMOVE ........................................................................................................................................ 9
NETAPP STORAGE PLATFORM ............................................................................................................................ 9
NETAPP INTEGRATED DATA PROTECTION ...................................................................................................... 10
NETAPP EXTENDED CAPABILITIES ................................................................................................................... 10
NETAPP SOLUTIONS ........................................................................................................................................... 10
6 POTENTIAL CUSTOMER BENEFITS...................................................................................... 12
7 CONCLUSIONS ........................................................................................................................ 12
8 FOR MORE INFORMATION ..................................................................................................... 12
ABOUT INTERICA ................................................................................................................................................. 12
ABOUT NETAPP ................................................................................................................................................... 12
1 EXECUTIVE SUMMARY
Data managers in information-intensive scientific industries such as petroleum and gas exploration and
production (E&P) understand the challenges that their rapidly growing datastores pose. Poor access to the
high-quality, complete data they need to make informed decisions can cost millions of dollars. Many E&P
geoscientists spend far too much of their time identifying, locating, and obtaining data they need for their
mission-critical analyses. This paper examines these challenges and then explores the technology advances
that now enable disk-based archiving to complement tape-based solutions and help address them. The paper
then examines the range of solutions from NetApp and Interica that enable E&P companies to leverage these
advances. Finally, it summarizes the potential benefits of this approach.
2 CHALLENGES
A large percentage of the data storage budget in E&P operations is devoted to archiving for seismic storage,
processing, and interpretation. Every dollar saved on these critical yet passive requirements could be
redirected to proactive computing, such as improving data access for seismic processing and interpretation
applications for better reservoir characterization. This can facilitate real-time well operations
modeling or global collaboration, which ultimately helps reduce risk and decrease cycle times.
Tape technology has long been the archival mainstay for E&P companies and will continue to be a requisite
part of the storage hierarchy for many organizations for years to come, simply because of the vast amount of
legacy data. Yet today, some forward-thinking companies are looking at active archiving (near-line or online
archiving). They are evolving toward disk-based archiving solutions that can address growing data
management challenges associated with tape-only solutions.
At the core of this trend is the inability of tape to effectively fulfill the active archiving role and the slow
pace of data location, access, and recovery. (Active archives are those created for storage management
reasons whose data may be needed back online within a shorter term of one to two years.) In addition to these
limitations, the tape medium cannot ensure integrity of data over time and requires periodic tape remastering
at high cost, unless a robust archiving solution is in play that automates the latter.
Recent cost analyses between online storage and near-line storage technologies, such as tape-based
systems, show that online storage can be more cost effective than tape-based solutions over the lifetime of
the seismic data. As technological advances in disk storage densities increase and effective deduplication
and compression technologies become available, these new online disk-based solutions are becoming more
cost effective.
STORAGE VOLUME AND RETENTION
Seismic information continues to grow at staggering rates. This digital tidal wave includes new pre- and
poststack volumes as well as derivative attribute volumes. The challenge is magnified by requirements to
support higher-fidelity subsurface imaging and enhanced interpretation and visualization technologies that
can utilize increasingly larger data sets.
Geoscientists may use many different software applications to analyze this data, each of which generates
results for storage. Individual files generated may be as large as multiple GBs, and these tools may generate
thousands or even tens of thousands of files per project. E&P workgroups may juggle dozens of projects,
each with this large set of files. And data retention times for this staggering amount of data are the life of
the assets. Hence, E&P companies face high data storage costs and the need to scale their solutions from
terabytes to petabytes for rapid data growth.
DATA REMASTERING AND SYSTEM OBSOLESCENCE
Over the last decade, tape technology evolutions have included DLT to super DLT, AIT to AIT 1-5, LTO to
LTO 1-4, and 3590 to 3590E, 3592, TS1120, TS1130. During the same period, tape operating systems and
applications have evolved. In addition, tape deteriorates over time and may be subject to physical damage.
For these reasons, companies are forced to periodically remaster or copy these tapes every three to five
years. The deterioration of tapes during storage and the remastering process is not only expensive but also
creates many opportunities for errors to creep into the data. During each remastering project, some of the
tapes are not readable, and therefore some or all of the data from these tapes is lost.
Remastering from large legacy tape holdings can require person-years of exacting work and can cost
millions of dollars. For example, one national oil company completed a project to make prestack data
publicly available. This involved remastering more than 500,000 tapes (9-track, 21-track, and 3480
cartridges), which took over three years to accomplish.
Conversely, over the last decade, methods of accessing networked disk storage via CIFS or NFS have
not changed substantially. Hence, networked disk storage has proven to be more stable than tape-based
storage, with minimal migration complexity or cost. As a result, networked disk storage can reduce or
eliminate remastering costs and data loss.
DATA CORRUPTION
Precise, complete seismic data supports timely, informed decision making. Conversely, corrupted or incomplete
data can impair decision makers. Magnetic tape is a fragile medium that is highly susceptible to the
generation of errors by improper care and handling. More often than not, tape errors go unnoticed until the
data is needed, because there is no practical means to verify the integrity of the tape media and its data as
they age. There are countless stories in which tape recoveries have failed. The tape itself is intact, but the
attempt to read and recover data is unsuccessful.
By comparison, disk storage systems provide protection against failed disks utilizing RAID technology.
These systems safeguard against data corruption through the use of data checksum and can proactively
notify data storage administrators of impending failures.
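The checksum protection described above can be sketched as a periodic integrity scrub. The block names and payloads below are invented for illustration; real storage systems checksum fixed-size disk blocks in the data path.

```python
import hashlib

def checksum(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

# Record a checksum when each block is written...
blocks = {"blk-0": b"seismic payload 0", "blk-1": b"seismic payload 1"}
ledger = {name: checksum(data) for name, data in blocks.items()}

# ...then a periodic scrub re-reads every block and re-verifies it,
# flagging silent corruption now instead of discovering it years later
# at restore time, as commonly happens with tape.
blocks["blk-1"] = b"seismic payload 1 FLIPPED BITS"  # simulate corruption
bad = [name for name, data in blocks.items() if checksum(data) != ledger[name]]
print("corrupt blocks:", bad)
```

The key property is that verification needs no reference copy of the data itself, only the small ledger of checksums.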
DATA RECOVERY SPEED
The pace of recovery of data, either due to corruption, accidental deletion, or disaster, is also an important
consideration. When a geoscientist loses a key data volume due to corruption or accidental deletion, data
should be recoverable in minutes, not days, to maximize productivity and efficient use of field assets. Yet
data recovery from tape is lengthy (hours, days, or even weeks), complex, and costly. Restoring data from
tape can also introduce risk due to the human intervention required.
The inability to locate and restore project files quickly can incur financial losses or lead to missed opportunities
for the company. For example, if a decision that involves archived data analysis is needed for an offshore
drilling rig, indecision can cost a company as much as $500,000 per day in rig costs (typically called
"nonproductive time"). Traditionally, to restore a project, geoscientists were forced to wait until data centrally
off-loaded in warehouse storage was retrieved—a process that could take days or weeks. Hence, decision
makers were faced with the dilemma of waiting too long for the correct data or making the decision with
inadequate data.
Disk-based online storage solutions provide technologies that enable rapid, reliable data access in the case
of catastrophic failure or simply data loss. Already used today in business applications such as e-mail and
financial systems, these technologies are now extended to seismic data.
DISTRIBUTED DATA
As E&P geoscientists gather and analyze data, they produce a variety of data subsets using a variety
of applications. In addition to the industry-specific, sophisticated analysis tools, personnel also store
information in the form of mainstream office productivity tools. Analyzed by different users at different times,
these data and results are often stored on multiple systems and in multiple locations across the enterprise. Subparts
of the data may be archived off site at one time or another. Yet at some time in the future, all of this relevant
project data may need to be assembled in one location for further analysis. This raises the need for a
method of storing the data for rapid, complete access.
3 COST COMPARISON: TAPE VERSUS DISK
A study commissioned by NetApp and conducted by Oliver Wyman in 2006 and updated in 2008¹
revealed that disk storage can be considerably less expensive than tape backup alternatives. While this
study focused on backup solutions, NetApp has seen similar potential benefits to disk-based solutions in
archiving applications.
The study interviewed data managers using tape systems and data managers using disk-to-disk solutions
to determine the difference in costs based on three cost components:
Product acquisition and ongoing vendor costs
Direct internal costs
Cost of lost productivity because of downtime
Overall, the study found that in a system with 30TB of storage, typical backup rotations, and four remote
offices, disk-based storage provided a 39% cost advantage over tape-based systems.
Even in organizations that consider only the up-front capital and ongoing vendor costs such as support and
maintenance for a backup purchase and ignore ongoing IT operation costs, including power, cooling, floor
space, and system administration, the advantage of disk-based solutions over tape was 14%. When all three
cost components were considered, disk-based storage systems were found to be 48–54% less expensive
than a tape-based solution of similar capacity and size over a typical backup product lifecycle.
Data managers focus on the ongoing costs of managing backup environments. Seventy percent of respondents
indicated that they rely on a total cost analysis, either based on internal sources or from a third party, when
making IT purchase decisions.
When costs associated with ongoing labor and business activities are included in the analysis, the overall
total cost story changes significantly: Disk-based storage becomes 70–75% less expensive than tape
solutions. The primary cost advantage is through lower direct internal costs, mainly labor. Respondents
found, on average, a fourfold reduction in systems management time compared to tape environments,
with some participants reporting even higher gains in labor efficiency.
4 TECHNOLOGY ENABLERS
A variety of technologies now enable disk-based storage to complement or, in some instances, replace
tape-based archiving in E&P applications. Not all of these technologies can be used together in all
situations. However, consideration of each enabler allows implementation of a disk archiving system
that meets the needs and policies of each specific business.
¹ "Total Cost Comparison of Back-Up Technologies: IT Decision-Maker Perspectives on NetApp SnapVault Disk to Disk Versus Traditional Tape-Based Backup Solutions."
Oliver Wyman, January 2008.
BETTER MANAGEMENT AND CONTROL OF DATA
Project-based archiving involves grouping E&P business data into "projects." The data can include
structured, semistructured, and unstructured data—all linked by relevance to a particular project. This
approach can address storage, archival, and access of a wide range of data. The key is to preserve the
relationship of distributed data to the project so that the data can be reinstantiated without external data
requirements.
Within this project-based archiving framework, it is useful to examine how data can be better managed and
controlled using disk-based software. This software can provide—or complement features in the disk that
provide—the following capabilities:
Better data compression
Deduplication advancements
Metadata and data management for discovery, deletion, disposition
Data encryption for security
Improved reliability
Data-class storage management
DATA COMPRESSION
Data compression technology for disk storage has come a long way. Today, lossless compression solutions
significantly reduce the storage requirements for both pre- and poststack seismic data, without introducing
data integrity issues. Over the last several years, compression technologies have evolved using industry
standards. They achieve compression rates of 35–45% on SEG-Y data (see Figure 1). By deploying
compression technology in conjunction with disk-based seismic data storage, the density of storage is
greatly improved, providing savings in pure disk space, power, and air conditioning requirements as an
added benefit compared to traditional disk-based storage.
Figure 1) Storage capacity saved using digital storage with lossless compression technology.
Courtesy: Halliburton
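The effect of lossless compression can be sketched with standard DEFLATE on a synthetic waveform. The data, and therefore the ratio, are stand-ins; real SEG-Y rates are the 35–45% cited above.

```python
import math
import zlib

# Synthetic stand-in for a block of seismic samples: a smooth waveform
# quantized to bytes. Real SEG-Y traces and their ratios will differ.
samples = bytes(int(128 + 100 * math.sin(i / 50.0)) for i in range(1 << 20))

compressed = zlib.compress(samples, level=6)   # lossless DEFLATE
saved = 1 - len(compressed) / len(samples)
print(f"{len(samples)} B -> {len(compressed)} B, {saved:.0%} saved")

# "Lossless" means a bit-exact round trip: no data integrity issues.
assert zlib.decompress(compressed) == samples
```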
DEDUPLICATION
Data deduplication reduces the amount of data required to be physically stored by eliminating redundant
information and replacing subsequent iterations of it with a pointer to the original. Deduplication ratios vary
greatly according to the type of data being processed. Data that contains primarily unique information, such
as seismic files, contains very little redundancy and delivers poor deduplication ratios. However, data that
contains significant amounts of repeated information delivers the highest levels of deduplication, typically
increasing effective capacity by 30–70%. For example, geoscientists and other users typically begin data analysis
in E&P applications by copying files from an existing project to start a new "project" and then modifying
the data. Deduplication of files like these, which contain significant repeated information, is especially helpful
for project-based E&P archival.
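The pointer-to-original mechanism described above can be sketched as follows. The block contents are invented, and production deduplication works on fixed-size disk blocks rather than whole byte strings, but the bookkeeping is the same idea.

```python
import hashlib

def deduplicate(blocks):
    """Store each unique block once; repeats become a pointer (its hash)."""
    store = {}      # content hash -> block bytes (stored once)
    pointers = []   # per-logical-block references into the store
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        pointers.append(digest)
    return store, pointers

def rehydrate(store, pointers):
    """Reassemble the logical data by following the pointers."""
    return [store[p] for p in pointers]

# A copied project shares most blocks with the original, so it dedupes well.
original = [b"header", b"trace-0001", b"trace-0002"]
copy     = [b"header", b"trace-0001", b"trace-0002-edited"]
store, ptrs = deduplicate(original + copy)
print(len(original + copy), "logical blocks ->", len(store), "stored blocks")
assert rehydrate(store, ptrs) == original + copy
```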
METADATA AND DATA MANAGEMENT
Over the lifetime of a typical archive, knowledge of who interpreted the project and where it is located, as
well as of the important horizons, faults, and well information that helped define the prospect, fades and may
be lost. This information about the project (metadata) is critical to locating the data again in the
future. A proper archiving system is needed to allow collection of this metadata. Once the metadata is in
place and associated with the archive, the archiving software enables users to search for the archive using
the metadata that describes it. In this way, old and potentially valuable archives can be quantified without
the need to first restore and reinstantiate them. This saves time and production disk space.
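A minimal sketch of metadata-driven search follows. The catalog entries and field names are invented for illustration; a system such as PARS keeps this metadata in a database alongside the archive media.

```python
# Hypothetical in-memory metadata catalog describing archived projects.
archives = [
    {"archive_id": "A-001", "project": "North Field", "interpreter": "J. Doe",
     "basin": "Permian", "archived": "2009-04"},
    {"archive_id": "A-002", "project": "Delta Survey", "interpreter": "A. Lee",
     "basin": "Niger Delta", "archived": "2011-11"},
]

def search(catalog, **criteria):
    """Return archives whose metadata matches every given key/value pair."""
    return [a for a in catalog
            if all(a.get(k) == v for k, v in criteria.items())]

# Old archives can be assessed without restoring and reinstantiating them.
hits = search(archives, basin="Permian")
print([a["archive_id"] for a in hits])
```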
DATA ENCRYPTION FOR SECURITY
Companies must resolve the trade-off of added security versus cost to determine whether to encrypt both
data at rest and data delivered to users. An important consideration is whether encryption is needed to
protect E&P data from internal or external threats, or both. Encryption policies are best specified in corporate
security policies.
IMPROVED RELIABILITY
A RAID 6 implementation prevents data loss when two disk drives fail. This enables use of drives that
provide economical storage density, such as low-cost SATA (Serial ATA) disks, for seismic data storage,
while also providing the needed data protection. SATA drive capacities are expected to increase rapidly in
the future. In addition, the disk storage system actively reports disk failure. No such technology exists to
report a bad data block on tape as it sits in storage.
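To illustrate the parity idea underlying RAID, here is a single-parity (RAID 4/5-style) sketch. RAID 6 implementations, including NetApp's RAID-DP, add a second, independently computed parity so that two simultaneous drive failures are survivable; that second computation is omitted here for brevity.

```python
from functools import reduce
from operator import xor

def parity(blocks):
    """XOR parity across same-sized blocks: any one lost block is
    recoverable by XOR-ing the survivors with the parity block."""
    return bytes(reduce(xor, col) for col in zip(*blocks))

stripe = [b"\x01\x02", b"\x10\x20", b"\xff\x00"]  # data blocks on 3 drives
p = parity(stripe)                                # parity on a 4th drive

# Drive 1 fails: rebuild its block from the survivors plus parity.
rebuilt = parity([stripe[0], stripe[2], p])
print(rebuilt == stripe[1])
```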
DATA CLASS STORAGE MANAGEMENT
Traditionally, E&P companies have archived data according to the type of data (e.g., field seismic data,
prestack data, poststack data, well logs, and so on). A more useful way to archive data is to use a tiered
storage solution in which data is classified and archived according to the need for the data, or patterns of its
usage. This enables access to data as a function of likely use or need. For example, data can be classified
according to classes that respond to milestone events such as divestiture of an asset, tabling of an asset, or
investigation of another area near the asset. The timing of use, number of users, types of uses, and other
parameters can also be used to classify data. This means that recently used data, data used by many, data
used in support of recent events, and so forth can be classified as "hot storage" in an active archive. Data
not used for some time, rarely used, used in support of older events, and so forth can be placed in "cold
storage." And, of course, these classes can be made as granular as needed.
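A toy classification policy along these lines might look like the following. The field names and thresholds are assumptions for illustration, not a NetApp or Interica product feature.

```python
# Recently or widely used data lands in the active ("hot") archive tier;
# everything else goes to "cold" storage. Thresholds are tunable policy.
def classify(dataset, hot_age_days=365, hot_user_count=10):
    if dataset["days_since_use"] <= hot_age_days:
        return "hot"
    if dataset["user_count"] >= hot_user_count:
        return "hot"
    return "cold"

datasets = [
    {"name": "prestack-2023",  "days_since_use": 40,   "user_count": 3},
    {"name": "well-logs-2008", "days_since_use": 2900, "user_count": 1},
]
for d in datasets:
    print(d["name"], "->", classify(d))
```

Additional parameters, such as milestone events tied to an asset, would simply become further clauses in the policy function.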
5 NETAPP AND INTERICA SOLUTIONS
Interica offers two solutions to address these challenges: PARS and SmartMove.
INTERICA PARS
The Project Archive and Retrieval System (PARS) archives and restores structured, semistructured, and
unstructured data from file systems, databases, and applications (see Figure 2). Built specifically to
manage large data sets, the system has been further tailored by Interica to the particular needs of the
E&P industry over its 15 years of use in this vertical. Using the system, users can group relevant data (and
customizable metadata) from any source. They can also specify where PARS is to archive information,
verify modified data against the original, search for archives based on the metadata, and more. A policy
framework plus an Oracle® Database allow the entry and extraction of metadata. The user interface provides
point-and-click access to management features, including, of course, archive and restore.
Administrators can integrate PARS with NIS and Microsoft® Active Directory® (via LDAP) to manage user
authentication and access control. They can also implement PARS without significant change to existing IT
infrastructure, due to the system’s out-of-band architecture. The PARS Migrator extension uses a project-
based approach to automatically move files and data sets based on company policy-based criteria. PARS
can be deeply integrated with mainstream exploration geotechnical applications such as Landmark's
SeisWorks, OpenWorks, and CDA; Schlumberger's GeoFrame and Petrel; Seismic Micro-Technology's
SMT Kingdom; and many others. Similarly, PARS can be integrated with various storage systems.
Figure 2) Interica’s Project Archive and Retrieval System archives and retrieves structured and unstructured data
from file systems and applications. (Graphic provided by Interica)
INTERICA SMARTMOVE
SmartMove is a cross-platform tiered storage management application (see Figure 3). It provides discovery
of data by name, owner, size, and time. The application allows construction and implementation of
automated file-movement policies. The data movements include copy, move, and migrate (that is, move the
file and create a link in the source location pointing to the file on the destination storage). Verification checks
are made during file movement to provide data integrity.
SmartMove enables E&P storage administrators to regain control of spiraling data growth by providing
tools to classify unstructured data and migration policies that can be tailored to the environment. SmartMove
does not attempt to assume control of existing IT environments or insert itself between users and their data.
Instead, SmartMove migrates data to more cost-effective storage using open protocols (CIFS, NFS). It
maintains data accessibility using standard OS links and by retaining file and directory structures. When
these migrations are complete, users access their data without any intervention by SmartMove.
Figure 3) Interica’s SmartMove is a cross-platform tiered storage management application. (Graphic provided by
Interica.)
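The migrate operation (move plus a link left at the source) can be sketched with standard file operations. This is only an illustration of the idea; SmartMove itself uses its own verified movers and policies, and the sketch assumes POSIX symlinks.

```python
import filecmp
import os
import shutil
import tempfile

def migrate(src, dst_dir):
    """Copy a file to cheaper storage, verify it, then replace the
    original with a symlink so users keep their existing paths."""
    dst = os.path.join(dst_dir, os.path.basename(src))
    shutil.copy2(src, dst)                         # copy data + metadata
    if not filecmp.cmp(src, dst, shallow=False):   # verify before removing
        raise IOError("verification failed; source retained")
    os.remove(src)
    os.symlink(dst, src)                           # leave link at old path
    return dst

tier1, tier2 = tempfile.mkdtemp(), tempfile.mkdtemp()
path = os.path.join(tier1, "survey.segy")
with open(path, "wb") as f:
    f.write(b"trace data")
migrate(path, tier2)
print(open(path, "rb").read())  # still readable through the old path
```

Because the link preserves the original path, applications read the migrated file without any further intervention, which is the accessibility property described above.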
NETAPP STORAGE PLATFORM
NetApp’s storage systems provide simultaneous support for all protocols—Fibre Channel SAN, IP SAN,
NFS, and CIFS—in combination with Fibre Channel, SAS, or SATA disk with the ability to selectively utilize
additional capabilities, including data permanence and encryption. The result is that E&P organizations
can standardize on a single architecture for all archive and compliance initiatives that can be tailored to
meet the needs for scale, cost, and performance. Fundamentally, the same architecture that delivers high
performance and reliability for volume interpretation and enterprise-class applications, including Microsoft®
Exchange, Oracle, and SAP®, can be utilized to deliver low-cost, massively scalable archival and compliance
initiative storage solutions for the same applications. Having a single set of software, tools, and processes
simplifies the complex world of enterprise data management. A single process for activities such as
installation, provisioning, mirroring, and upgrading is used throughout the entire NetApp storage product
line, lowering administrative costs and making it easier to deploy new capabilities across all tiers of storage.
Unifying storage and data management software and processes reduces the complexity of data ownership,
enables the ability to adapt to changing business conditions without interruption, and results in lower total
cost of ownership (TCO).
NETAPP INTEGRATED DATA PROTECTION
The NetApp® Unified Storage Architecture also includes an embedded set of data protection services
that simplify the process of protecting against site outages and data loss. Called NetApp Integrated Data
Protection, these services are tightly integrated within the NetApp Data ONTAP® operating system and
can be enabled in minutes for rapid, easy implementation within any E&P environment. Integrated Data
Protection services include:
Snapshot™ copies that provide a local, point-in-time image of the archive that is created in seconds
Compliance locking that prevents data from being modified
Disaster recovery that replicates the archive to a remote location for system and site level recovery;
NetApp leverages embedded network compression to reduce network utilization up to 70%,
accelerating DR and reducing network costs
Replication-based backup that creates a remote, block-level incremental backup of the archive in
minutes and reduces storage requirements up to 90%
Storage clustering that stretches up to 100 km and is designed for zero data loss and zero downtime
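The block-level incremental idea behind replication-based backup can be sketched as hashing fixed-size blocks and shipping only those that changed. The block size and data here are toy values; real replication works on disk-sized blocks with the comparison done by the storage system.

```python
import hashlib

BLOCK = 4  # tiny block size for illustration; real systems use KB-scale blocks

def block_hashes(data):
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def delta(old, new):
    """Blocks of `new` whose hash differs from (or is absent in) `old`.
    Only these need to cross the wire to the remote site."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return {i: new[i * BLOCK:(i + 1) * BLOCK]
            for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]}

def apply_delta(old, changes, new_len):
    """Rebuild the current image at the destination from the last
    replicated image plus the shipped blocks."""
    out = bytearray(old[:new_len].ljust(new_len, b"\x00"))
    for i, data in changes.items():
        out[i * BLOCK:i * BLOCK + len(data)] = data
    return bytes(out)

old = b"AAAABBBBCCCCDDDD"          # last replicated image
new = b"AAAABBXBCCCCDDDDEEEE"      # current image: one edit plus growth
changes = delta(old, new)
print(sorted(changes))             # only the changed block indices ship
assert apply_delta(old, changes, len(new)) == new
```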
NETAPP EXTENDED CAPABILITIES
NetApp offers a single, complete product infrastructure that seamlessly integrates with key enterprise
applications that drive the current generation of mission-critical enterprise data for archive and compliance.
With NetApp, archive and compliance functions such as data classification and migration, data permanence,
data security, and data discovery all integrate with the core platform, extending its capabilities. The ability to
selectively apply these capabilities to any data set provides E&P organizations with the ability to meet any
particular data management requirement (see Figure 5).
NETAPP SOLUTIONS
NetApp solutions include NetApp NearStore® and NetApp DataFort.
NetApp NearStore protection software provides E&P data protection and retention applications such as
disaster recovery, archival, compliant retention, and content storage. A disk-based secondary storage
device for enterprise applications, NearStore complements and improves existing tape archiving and data
protection schemes. NearStore provides economical storage and rapid disk-based access to reference data
to meet business and legal requirements. In addition, with a lower cost of acquisition and TCO than primary
storage and better performance than tape, NearStore is a proven solution for organizations whose multisite
replication programs require an economical, high-capacity storage device located remotely.
NetApp deduplication is a fundamental component of NetApp’s core operating architecture—Data ONTAP.
NetApp deduplication can be used broadly across many E&P applications, including primary and archival
data. By eliminating redundant data objects and referencing just the original object, NetApp deduplication
combines the benefits of granularity, performance, and resiliency to provide a significant advantage in the
race to provide for ever-increasing storage capacity demands. Compression can complement deduplication
and provide additional savings, especially on data that does not deduplicate well, such as geoseismic data
(see Figure 4). The result is 1) reduced initial storage acquisition costs or longer intervals between storage
capacity upgrades, and 2) the ability to store more data per storage unit or retain online data for longer
periods of time.
Figure 4) Deduplication complements compression to reduce storage requirements.
Networked storage speeds access to information, but it can also leave data vulnerable. Firewalls and
intrusion-prevention systems can secure assets at the perimeter, but data at the storage core can be
exposed to both internal and external attacks. NetApp DataFort Security Systems provide security
throughout the entire lifecycle of regulated and sensitive data for multiple heterogeneous storage systems
without disrupting applications, clients, servers, or user workflow. NetApp DataFort systems combine secure
access controls, authentication, storage hardware-based encryption, and secure logging to protect stored
data. E&P administrators can deploy inline or network-attached storage to reap security advantages without
impacting user workflow. NetApp DataFort fits seamlessly into existing storage environments, supporting
CIFS, NFS, IP SAN, FC disk, FC tape, and SCSI protocols. E&P companies can protect data both at rest
and in flight with encryption certified by the National Institute of Standards and Technology, even if IT
management is outsourced.
Figure 5) NetApp comprehensive data protection capabilities in the E&P environment