Data protection and retention are critically important for commercial and public sector organizations alike. In fact, it is said that three things keep a CIO up at night: data failure, time to recovery, and ensuring IT is able to meet recovery point objectives.
Today, if you have a failure, how long the business is without its data can determine whether you need to start looking for a new job…
This presentation will help the attendees:
• Evaluate the business case for targeting your backups to the cloud
• Identify candidate data sets for cloud backup and assess opportunities and readiness
• Mitigate the challenges of backing up to the cloud
• Build a roadmap to your cloud implementation
WWT also provides some real-world experience via use cases and introduces several tools and methodologies used to evaluate, plan and execute your journey to cloud backup in this presentation.
To find out more about WWT’s Data Protection team, log on to wwt.com.
5 questions to ask when considering cloud storage (TwinStrata, Inc)
This slide deck walks through the reasons for the rising interest in the cloud for storage, as well as the five questions that all organizations have (or should have) when considering cloud storage.
Are you actively using or moving to Office 365, G-Suite, or other popular cloud applications? If so, how confident are you that you can keep all of that critical data protected? Attend this session to learn how Veritas can help protect data across all of your different cloud applications--using the same solution you use to protect your existing non-cloud applications. Don't miss this opportunity to explore the advantages of using one unified solution to protect all of your data--across all of your physical, virtual, and cloud environments.
NetBackup CloudCatalyst – efficient, cost-effective deduplication to the cloud (Veritas Technologies LLC)
Is your organization looking for a more efficient, cost-effective way to use public and private cloud storage as a backup target? Attend this session to find out how CloudCatalyst can help, by providing deduplication of backup data to object storage environments in both public and private clouds. You'll learn how you can use CloudCatalyst to achieve petabyte scale with minimal cache storage requirements, transfer data from a NetBackup Dedupe Media Server without going through a rehydration process, and much more. Don't miss this chance to find out exactly how CloudCatalyst provides the most efficient and cost-effective backups from a data center, to the cloud, or in the cloud.
Accelerate your digital business transformation with 360 Data Management (Veritas Technologies LLC)
As infrastructure continues to become more commoditized and abstracted, IT organizations are actively shifting their focus from managing infrastructure to managing information. This session will provide a detailed introduction to 360 Data Management, Veritas' vision for how IT organizations should think about and deploy data management technologies as they work to modernize and embrace this important shift. Veritas experts will reveal the six pillars of 360 Data Management – and talk about how IT leaders can take advantage of the Veritas 360 Data Management Suite to accelerate their digital business transformation.
This session was originally delivered at Veritas' Vision 2017 on Tuesday, Sep 19, 4:30 PM - 5:30 PM.
Shared responsibility - a model for good cloud security (Andy Powell)
An overview of the shared responsibility model that is typically adopted by cloud providers and its impact on the way that Jisc members should build secure solutions in public cloud.
Everyone understands that disk has become the primary target for backups over the last several years. It’s also safe to say that the main type of disk storage used as a backup target is the purpose-built backup appliance, which presents itself to the backup application as an NFS or SMB server and then deduplicates any backups stored on it.
But what about object storage? Object storage vendors tout that their systems are less expensive to buy and less expensive to operate than traditional disk arrays and NAS appliances. So, does it make sense to use them for backups? How much is deduplication a factor and is deduplication even available with object storage? What else can object storage bring to the table that traditional disk backup appliances can’t?
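As a rough illustration of why deduplication matters so much to backup economics, the ratio of raw chunks to unique chunks can be sketched in a few lines. This is a minimal sketch using fixed-size chunking for brevity; real backup appliances typically use variable-size, content-defined chunking, and the numbers below are synthetic:

```python
import hashlib

def dedup_ratio(stream: bytes, chunk_size: int = 4096) -> float:
    """Estimate a deduplication ratio: total chunks / unique chunks.

    Fixed-size chunking keeps the sketch short; production systems use
    content-defined chunking so inserted bytes don't shift every boundary.
    """
    chunks = [stream[i:i + chunk_size] for i in range(0, len(stream), chunk_size)]
    unique = {hashlib.sha256(c).hexdigest() for c in chunks}
    return len(chunks) / len(unique) if unique else 1.0

# Highly repetitive data (e.g. repeated full backups) dedupes very well.
data = b"A" * 4096 * 100 + b"B" * 4096
print(dedup_ratio(data))  # 50.5 for this synthetic stream
```

The same idea explains the "rehydration" question in the abstracts above: once data is stored as unique chunks, moving it to another tier without reassembling (rehydrating) the original stream saves both bandwidth and compute.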
A recovery Service Level Agreement (SLA) represents a commitment between IT and the business, but are you willing to sign up for it? Do you know how quickly you could be up and running after a disaster, and how much data you would lose?
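The "how much data would you lose" question is essentially your recovery point objective (RPO). As a back-of-the-envelope illustration (an assumed scheduling model, not any vendor's formula), the worst-case loss window for a simple periodic backup schedule can be computed directly:

```python
from datetime import timedelta

def worst_case_rpo(backup_interval: timedelta,
                   backup_duration: timedelta) -> timedelta:
    """Worst-case data-loss window for periodic backups.

    A failure just before the next backup completes loses everything
    written since the previous backup finished: one full interval plus
    the time the backup itself takes to complete.
    """
    return backup_interval + backup_duration

# Nightly backups that take 4 hours => up to 28 hours of data at risk.
print(worst_case_rpo(timedelta(hours=24), timedelta(hours=4)))
```

If that number is larger than what the business signed up for in the SLA, the schedule (or the technology) has to change before the SLA is credible.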
Who Is Really Responsible: Public Cloud Misperceptions and the Need for Multi... (Veritas Technologies LLC)
Misperceptions around data ownership and responsibility for data management in the public cloud are plentiful, but not everyone is fully aware of them. We may take things like data backup and application resiliency in the cloud for granted, but perhaps this is false confidence. In this session, we identify the top cloud misperceptions that can undermine your success, and talk about ways to mitigate these risks with solutions you can take advantage of today.
Are you tired of paying the VMware tax? Are you stuck in a Veeam virtual data protection prison? It's time to move beyond these outdated "virtual only" solutions. Attend this session to learn how you can finally trade your limited virtual data protection solution in for a complete Veritas solution that can protect all of your data from a single platform, with a single license, managed from a single console.
After a disaster, how much of your critical infrastructure and data could you recover? And how long would it take? To make sure you can answer these important questions with complete confidence, Veritas is adding machine learning technology to its data protection solutions. Attend this session to find out how combining machine learning and data protection enhances your ability to completely protect and recover critical systems and information more quickly and efficiently – no matter where it lives or what happens to it.
This document discusses 7 common pitfalls of cloud adoption that leadership should avoid: 1) assuming technical relevance, 2) focusing solely on cloud provider controls, 3) delaying incident response testing and planning, 4) ignoring training needs, 5) delaying identity and access management plans, 6) using cloud unaware frameworks and measures, and 7) failing to review the security program context. It emphasizes that technology, personnel, processes, context awareness and capabilities must adapt to the cloud. Netskope provides a "one cloud" security platform to help organizations effectively secure access and protect data across cloud services and web applications.
Typical disaster recovery plans leverage backup and/or replication to move data out of the primary data center and to a secondary site. Historically, the secondary site is another data center that the organization maintains. But now, companies are looking to the cloud to become a secondary site, leveraging it as a backup target and even a place to start their applications in the event of a failure. The problem with this approach is that it merely simulates a legacy design and presents some significant recovery challenges.
OpenStack and Containers have rapidly emerged as the default technologies for developing and delivering new scale out architectures. Initially, these workloads were deployed in test and development environments. But now that they're reaching the production phase, it's time to rethink and update your approach to data protection. Attend this session to examine the best models, mindsets, and approaches for providing complete, effective data protection for today's new scale-out workloads.
This document provides an overview of Veritas NetBackup 8.1, which offers improved data protection capabilities for multi-cloud environments, modern workloads, and resilient infrastructure. Key features highlighted include leveraging multiple cloud platforms for long-term storage, faster backup to the cloud, protection for scale-out workloads and hyperconverged environments, simplified deployment and maintenance, and enhanced data security across networks. NetBackup 8.1 also features integrated support for popular databases and big data platforms through its use of flexible policies and frameworks.
Cloud outages are inevitable. And when they occur, they make headlines. In this session, Veritas cloud experts will discuss best practices for detecting and proactively mitigating uptime risks, meeting strict service-level objectives, and recovering business services quickly and confidently in complex multi-cloud environments.
Implementing a long term data retention strategy that leverages the cloud (Veritas Technologies LLC)
This document discusses implementing a long-term data retention strategy using cloud storage. It outlines challenges with the cost, complexity and visibility of long-term retention. The Veritas solution integrates data retention, visibility and management across on-premises and cloud storage using NetBackup and Access. It provides several use cases, including building a private cloud for retention, moving retention to the public cloud for cost savings, and using Information Map for data visibility and governance. The presentation includes a customer example and a demo of the Veritas tools.
The document provides a checklist for key steps to help ensure a successful move to cloud hosting. It outlines 9 steps: 1) choosing the right cloud model (public, private, hybrid); 2) understanding security requirements; 3) understanding the benefits of cloud hosting; 4) understanding setup and exit costs; 5) ensuring assumptions are realistic; 6) ensuring technical practicality; 7) understanding security risks; 8) considering non-technical factors like data protection; and 9) properly managing cloud vendors and contracts.
Four Reasons Why Your Backup & Recovery Hardware will Break by 2020 (Storage Switzerland)
While backup software vendors continue to innovate, hardware vendors have been resting on their deduplication laurels. In the meantime, the amount of data that organizations store continues to grow at an alarming pace and the backup and disaster recovery expectations of users are higher than ever. Most backup solutions today simply will not be able to keep pace with these realities. If organizations don't act now to address the weaknesses in their backup hardware, they will not be able to meet organizational demands by 2020. In this webinar, Cloudian and Storage Switzerland discuss three areas where IT professionals need to expect more from their backup hardware and where they should demand less.
Technical Best Practices for Veritas and Microsoft Azure Using a Detailed Ref... (Veritas Technologies LLC)
Explore best practices around the following use cases related to the Microsoft Azure platform: long-term retention of data in the cloud, migration of critical workloads including those running in VMware and Hyper-V, and resiliency of business services running in the cloud. Each of these scenarios is part of what Veritas 360 data management in the cloud can provide. Learn the best way to design, deploy, and manage within each of these scenarios on Azure, and gain key insights into how to avoid pitfalls of common practices and how to boost your cloud ROI – demonstrated via a reference architecture.
More data, more backups, more strain on the organisation (UK) (Symantec)
“Our backups are taking longer and longer. The resulting strain on my staff and IT infrastructure is restricting our performance and our ability to embrace new technologies. What should we do?” You need to find a way to simplify and streamline your backups to beat the backup window and meet even the strictest recovery SLAs, without compromising data protection.
Exploring the Benefits of an Integrated Classification Engine: Lessons in Eli... (Veritas Technologies LLC)
This document discusses the benefits of an integrated classification engine that can eliminate dark data across various Veritas products. It describes how classification has traditionally failed due to inconsistent manual tagging and presents an intelligent engine that uses techniques like proximity searching, regular expressions, and checksum validation. The engine provides default policies and patterns that can be customized and tested. Integration in products like Data Insight 6.0 and EnterpriseVault 12.2 allows classification to drive functions like targeted scanning, compliance, and flexible retention policies.
Webinar: Moving the Enterprise Backup to the Cloud – A Step-By-Step Guide (Storage Switzerland)
Making sure everything in the data center is properly protected is a struggle that every IT organization faces. The cloud, specifically cloud backup, seems like an answer to those struggles. But how exactly does IT make the conversion from on-premises backup to cloud backup? In this webinar, join experts from Storage Switzerland, Veeam and KeepItSafe to learn:
* What to look for in a cloud backup solution
* What pitfalls to avoid
* How to actually migrate backup operations to the cloud
Do you want a way to deploy CloudStack management services, including databases and supporting services, into a new environment with ease? Do you need resilience for your environment's management plane?
We've created an appliance that can host all of the components required to manage a CloudStack-based cloud infrastructure, and that can be deployed on various types of hardware with minimal requirements. The project led to the use of a few interesting technologies and methods, including a tested and customized implementation of MariaDB/Galera as the back end for CloudStack. During this session, we will go over the appliance design, and hopefully have a dialogue about similar deployment designs that others have used.
The document discusses backup and data protection solutions from echopath. It provides two case studies of echopath helping clients - an Indianapolis manufacturer and a contract food manufacturer - protect their data through agentless backup, public cloud solutions, and deduplication across multiple sites. It also outlines echopath's deployment models and touts echopath's ability to provide secure, compliant, and disaster recovery backup solutions.
So you have a Cloud-based DRaaS Plan, or you are considering one. GOOD DECISION. But do you have the right plan? Having the wrong Cloud DR can be its own disaster. We’ll review the right way and the horrific way to deploy a Cloud-based DR plan.
Today's unrelenting data growth continues to drive the need for greater storage efficiencies and scalability, and many organizations have embraced object storage as the best approach for providing those efficiencies. However, limitations across multiple object storage solutions have left the full potential of object storage mostly unfulfilled. Attend this session to learn how Veritas is changing this unsatisfying object storage narrative – with a new kind of solution that uses embedded AI and ML to enable greater object storage scalability and lower overall costs from both a CapEx and OpEx perspective.
If You’re Out of Cloud, You’re Out of Work; Key Skills to Move into Cloud (Tristano Vacondio)
What does it mean to be a cloud professional?
We see the term in more and more training and certification programs, and ‘cloud skills’ has increasingly become a requirement on many job descriptions. Yet it remains general and vague what these skills actually translate to across the variety of IT positions, and the non-IT functions, involved with cloud computing.
Last year we shared a series of videos from interviews with the authors of the Cloud Technology Associate certification syllabus to help answer these questions. As the demand for cloud skills continues to grow in 2016, we are revisiting the topic and sitting down with them in a webinar so we can delve deeper into the subject.
vVols and Your Cloud Operating Model with Tristan Todd (Chris Williams)
For almost 10 years now, future-focused datacenter teams have been trying to evolve to a more cloud-like operating model. Some of us have succeeded; some of us have failed. During this fun-filled, example-heavy session, Pure Storage Solutions Architect Tristan Todd will share patterns of failure, patterns of success, practical examples, and recipes for how organizations have succeeded in adapting to a cloud ops model. And, of course, Tristan will highlight how Pure is helping customers with real transformation.
A recovery Service Level Agreement (SLA) represents a commitment between IT and the business, but are you willing to sign up for it? Do you know quickly you could be up and running after a disaster and how much data you would lose?
Who Is Really Responsible: Public Cloud Misperceptions and the Need for Multi...Veritas Technologies LLC
Misperceptions around data ownership and responsibility of data management in the public cloud are plenty, but not everyone is fully aware of them. We may take things like data backup and application resiliency in the cloud for granted, but maybe this is false-confidence? In this session, we identify the top cloud misperceptions that can punish your success, as well as talk about ways to mitigate these risks with solutions you can take advantage of today.
Are you tired of paying the VMware tax? Are you stuck in a Veeam virtual data protection prison? It's time to move beyond these outdated "virtual only" solutions. Attend this session to learn how you can finally trade your limited virtual data protection solution in for a complete Veritas solution that can protect all of your data from a single platform, with a single license, managed from a single console.
After a disaster, how much of your critical infrastructure and data could you recover? And how long would it take? To make sure you can answer these important questions with complete confidence, Veritas is adding machine learning technology to its data protection solutions. Attend this session to find out how combining machine learning and data protection enhances your ability to completely protect and recover critical systems and information more quickly and efficiently – no matter where it lives or what happens to it.
This document discusses 7 common pitfalls of cloud adoption that leadership should avoid: 1) assuming technical relevance, 2) focusing solely on cloud provider controls, 3) delaying incident response testing and planning, 4) ignoring training needs, 5) delaying identity and access management plans, 6) using cloud unaware frameworks and measures, and 7) failing to review the security program context. It emphasizes that technology, personnel, processes, context awareness and capabilities must adapt to the cloud. Netskope provides a "one cloud" security platform to help organizations effectively secure access and protect data across cloud services and web applications.
Typical disaster recovery plans leverage backup and/or replication to move data out of the primary data center and to a secondary site. Historically, the secondary site is another data center that the organization maintains. But now, companies are looking to the cloud to become a secondary site, leveraging it as a backup target and even a place to start their applications in the event of a failure. The problem with this approach is that it merely simulates a legacy design and presents some significant recovery challenges.
OpenStack and Containers have rapidly emerged as the default technologies for developing and delivering new scale out architectures. Initially, these workloads were deployed in test and development environments. But now that they're reaching the production phase, it's time to rethink and update your approach to data protection. Attend this session to examine the best models, mindsets, and approaches for providing complete, effective data protection for today's new scale-out workloads.
This document provides an overview of Veritas NetBackup 8.1, which offers improved data protection capabilities for multi-cloud environments, modern workloads, and resilient infrastructure. Key features highlighted include leveraging multiple cloud platforms for long-term storage, faster backup to the cloud, protection for scale-out workloads and hyperconverged environments, simplified deployment and maintenance, and enhanced data security across networks. NetBackup 8.1 also features integrated support for popular databases and big data platforms through its use of flexible policies and frameworks.
Cloud outages are inevitable. And when they occur, they make headlines. In this session, Veritas cloud experts will discuss best practices for detecting and proactively mitigating uptime risks, meeting strict service-level objectives, and recovering business services quickly and confidently in complex multi-cloud environments.
Implementing a long term data retention strategy that leverages the cloudVeritas Technologies LLC
This document discusses implementing a long-term data retention strategy using cloud storage. It outlines challenges with cost, complexity and visibility of long-term retention. The Veritas solution integrates data retention, visibility and management across on-premise and cloud storage using NetBackup and Access. It provides several use cases including building a private cloud for retention, moving retention to public cloud for cost savings, and using Information Map for data visibility and governance. The presentation includes a customer example and demo of the Veritas tools.
The document provides a checklist for key steps to help ensure a successful move to cloud hosting. It outlines 9 steps: 1) choosing the right cloud model (public, private, hybrid); 2) understanding security requirements; 3) understanding the benefits of cloud hosting; 4) understanding setup and exit costs; 5) ensuring assumptions are realistic; 6) ensuring technical practicality; 7) understanding security risks; 8) considering non-technical factors like data protection; and 9) properly managing cloud vendors and contracts.
Four Reasons Why Your Backup & Recovery Hardware will Break by 2020Storage Switzerland
While backup software vendors continue to innovate, hardware vendors have been resting on their deduplication laurels. In the meantime, the amount of data that organizations store continues to grow at an alarming pace and the backup and disaster recovery expectations of users are higher than ever. Most backup solutions today simply will not be able to keep pace with these realities. If organizations don't act now to address the weaknesses in their backup hardware, they will not be able to meet organizational demands by 2020. In this webinar, Cloudian and Storage Switzerland discuss three areas where IT professionals need to expect more from their backup hardware and where they should demand less.
Technical Best Practices for Veritas and Microsoft Azure Using a Detailed Ref...Veritas Technologies LLC
Explore best practices around the following use cases related to the Microsoft Azure platform: Long-term retention of data in the cloud, migration of critical workloads including those running in VMware and Hyper-V, and resiliency of business services running in the cloud. Each of these scenarios are part of what Veritas 360 data management in the cloud can provide. Learn the best way to design, deploy, and manage within each of these scenarios on Azure, and gain key insights into how to avoid pitfalls of common practices and how to boost your cloud ROI – demonstrated via a reference architecture.
More data, more backups, more strain on the organisation (UK)Symantec
Our backups are taking longer and longer. The resulting strain on my staff and IT infrastructure is restricting our performance and ability to embrace new technologies. What should we do?” You need to find a way to simplify and streamline your backups to beat the backup window and meet even the strictest recovery SLAs, without compromising data protection.
Exploring the Benefits of an Integrated Classification Engine: Lessons in Eli...Veritas Technologies LLC
This document discusses the benefits of an integrated classification engine that can eliminate dark data across various Veritas products. It describes how classification has traditionally failed due to inconsistent manual tagging and presents an intelligent engine that uses techniques like proximity searching, regular expressions, and checksum validation. The engine provides default policies and patterns that can be customized and tested. Integration in products like Data Insight 6.0 and EnterpriseVault 12.2 allows classification to drive functions like targeted scanning, compliance, and flexible retention policies.
Webinar: Moving the Enterprise Backup to the Cloud – A Step-By-Step GuideStorage Switzerland
Making sure everything in the data center is properly protected is a common struggle that all data centers face. The cloud, cloud backup, seems like an answer to all those struggles. But, how exactly does IT make the conversion from on-premises backup to cloud backup? In this webinar, join experts from Storage Switzerland, Veeam and KeepItSafe to learn:
* What to look for in a cloud backup solution
* What pitfalls to avoid
* How to actually migrate backup operations to the cloud
Do you want a way to deploy CloudStack management services, including databases and supporting services, into a new environment with ease? Do you need resilience for your environment's management plane?
We've created a appliance that can host all of the components required to manage a CloudStack-based cloud infrastructure, and can be deployed on various types of hardware, with minimal requirements. The project led to the use of a few interesting technologies and methods, including a tested and customized implementation of MariaDB/Galera to backend CloudStack. During this session, we will go over this appliance design, and hopefully have a dialogue about similar deployment designs that others have used.
The document discusses backup and data protection solutions from echopath. It provides two case studies of echopath helping clients - an Indianapolis manufacturer and a contract food manufacturer - protect their data through agentless backup, public cloud solutions, and deduplication across multiple sites. It also outlines echopath's deployment models and touts echopath's ability to provide secure, compliant, and disaster recovery backup solutions.
So you have a Cloud-based DRaaS Plan, or you are considering one. GOOD DECISION. But do you have the right plan? Having the wrong Cloud DR can be its own disaster. We’ll review the right way and the horrific way to deploy a Cloud-based DR plan.
Today's unrelenting data growth continues to drive the need for greater storage efficiencies and scalability, and many organizations have embraced object storage as the best approach for providing those efficiencies. However, limitations across multiple object storage solutions have left the full potential of object storage mostly unfulfilled. Attend this session to learn how Veritas is changing this unsatisfying object storage narrative – with a new kind of solution that uses embedded AI and ML to enable greater object storage scalability and lower overall costs from both a CapEx and OpEx perspective.
If You’re Out of Cloud, You’re Out of Work; Key Skills to Move into Cloud - Tristano Vacondio
What does it mean to be a cloud professional?
We see the term in more and more training and certification programs, and ‘cloud skills’ has increasingly become a requirement on many job descriptions. Yet it remains general and vague what these skills actually translate to across the variety of IT positions, and some non-IT functions, involved with cloud computing.
Last year we shared a series of videos from interviews with the authors of the Cloud Technology Associate certification syllabus to help answer these questions. As the demand for cloud skills continues to grow in 2016, we are revisiting the topic and sitting down with them in a webinar so we can delve deeper into the subject.
vVols and Your Cloud Operating Model with Tristan Todd - Chris Williams
For almost 10 years now, future-focused datacenter teams have been trying to evolve to a more cloud-like operating model. Some of us have succeeded; some of us have failed. During this fun-filled, example-heavy session, Pure Storage Solutions Architect Tristan Todd will share patterns of failure, patterns of success, practical examples, and recipes for how organizations have succeeded in adapting to a cloud ops model. And, of course, Tristan will highlight how Pure is helping customers with real transformation.
Preventing Cloud Data Breaches:
The Cloud as the New Normal,
The Concept of Cloud Computing,
Why Is the Cloud the New Normal?,
Shared Responsibilities in the Cloud,
Security of the Cloud vs. Security in the Cloud,
Your Cloud Data as Your Most Critical Asset,
Service Level Agreements / Contract Terms,
Securing Your Cloud Data (Data Lifecycle, Data States, Identity & Access Management, Data Obfuscation, Overall Cloud Security),
Combating Cloud Data Threats (STRIDE vs. DREAD),
Putting It All Together
The document discusses cloud computing and security considerations for moving to the cloud. Some key points:
1) It defines cloud computing based on NIST definitions and emphasizes automation, elasticity, and flexible costing as core benefits of the cloud.
2) It notes that while cost savings are often cited, security and privacy are often overlooked but critical considerations for moving to the cloud.
3) It provides an overview of cloud security elements including identity and access management, data security, encryption, network security, and ensuring secure cloud architecture and design.
The document discusses how organizations are using far more public cloud services than they realize. On average, organizations estimate they use around 50 cloud services but audits actually discover over 700 services being used. This represents 15-20 times more cloud services being purchased and used without IT involvement, known as "shadow IT." The document advocates for discovering and monitoring all cloud usage to gain visibility, mitigate risks, and reduce redundant services in order to lower costs. It describes Cisco's Cloud Consumption services which provide a single portal to manage cloud usage across SaaS, PaaS, and IaaS environments.
Whether you're a business owner, IT professional, or anyone interested in cloud migration, this presentation will help you develop a deeper understanding of the benefits and challenges of cloud migration.
Download our presentation about Cloud Migration Strategies for Businesses and learn why cloud migration is crucial for modern businesses.
You can also see our workshop on this topic: https://www.youtube.com/watch?v=b10nxSJ5gS0&ab_channel=ZenBitTech
This workshop will provide you with actionable insights and best practices for successful cloud migration. Our expert speakers will share their knowledge and experience in preparing for cloud migration, choosing the right cloud platform, and developing effective migration strategies. You will also learn about key considerations such as data migration, security and compliance, cost optimization, and integration with existing systems.
See our blog: https://zenbit.tech/blog/cloud-migration-overview/
Designing Cloud Backup to Reduce DR Downtime for IT Professionals - Storage Switzerland
IT professionals know that the ultimate test of the data protection process is performing a recovery, whether of a single server or an entire data center. That said, we are all guilty of focusing too much on the backup process rather than on reducing the amount of downtime following a system failure. In this webinar, George Crump, founder of Storage Switzerland, and Ian McChord from Datto will provide you with the five critical questions you need to answer in order to reduce or even eliminate downtime.
This document summarizes discussions from a #scotcloud event. It includes summaries of presentations on cloud security from Vladimir Jirasek of the Cloud Security Alliance UK chapter, on MaidSafe from David Irvine of MaidSafe, and on cloud strategy considerations from Richard Higgs of brightsolid. Higgs outlined brightsolid's cloud philosophy and emphasized the importance of hybrid cloud solutions. He discussed 7 critical cloud considerations including continuous availability, resource optimization, security, and automation. The event also featured a presentation from Peter Sturrock of Skyscanner on how the company uses the cloud.
Software Defined Storage (SDS) has fundamentally changed the way storage hardware is purchased by abstracting storage services from the physical hardware. But to reach its full potential SDS needs to go back to school. Join us for this live webinar with George Crump, Lead Analyst at Storage Switzerland and Andrew Flint, VP of Marketing at ioFABRIC as they discuss why SDS has more to learn to deliver the next level of efficiency in the data center.
In this webinar you will learn:
•How SDS can drive better storage efficiency
•How SDS solutions can become more automated
•Why SDS does NOT need to be hyper-converged
ADV Slides: Strategies for Transitioning to a Cloud-First Enterprise - DATAVERSITY
A great comfort with cloud deployment has emerged. Businesses are migrating databases to the cloud, or building them there, as a result of scale challenges with the on-premises model, the cloud becoming the “center of gravity”, on-premises databases reaching capacity, or emerging use cases that are specific to the cloud. But not all organizations! And some struggle mightily with the move!
Learn about the factors that impact organizations when shifting data and applications to the cloud. What must you consider as you move significant applications and data to the cloud? This webinar will cover the major decision points that management needs to consider when moving to the cloud.
These include changes to the software model, development and quality assurance, outage recovery and disaster recovery, as well as new concerns about query performance and service levels, data interchange in the cloud, safe harbor and cross-border restrictions, and security and privacy.
We’ll also cover the new models for capacity planning and growth and staff responsibilities, the need for increased organizational change management, and how to pick targets for the journey.
How to Make an Effective Cloud Disaster Recovery Strategy - Sysvoot Antivirus
Problems are inevitable, and a problem that halts a company's operations can be called a disaster. Technical glitches or security breaches can trigger disasters, and once one sets in, the organization can face huge issues.
Disaster recovery, then, is the process of avoiding a disaster or bouncing back after one, helping organizations restore important documents. A cloud disaster recovery system helps the company restore its files using cloud services.
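Two numbers anchor any disaster recovery plan, cloud or otherwise: how much data you can afford to lose (RPO) and how long a restore takes (RTO). Both follow from simple arithmetic; a hedged sketch with illustrative figures:

```python
def worst_case_data_loss_hours(backup_interval_hours: float,
                               replication_lag_hours: float = 0.0) -> float:
    """Worst-case recovery point (hours of data lost) if a disaster strikes
    just before the next backup finishes reaching the cloud."""
    return backup_interval_hours + replication_lag_hours

def restore_time_hours(data_gb: float, bandwidth_mbps: float) -> float:
    """Rough time to pull a backup set back from the cloud at a sustained
    download rate (GB -> megabits; ignores protocol overhead)."""
    seconds = (data_gb * 8_000) / bandwidth_mbps
    return seconds / 3600

# Illustrative: nightly backups with a 2-hour upload window risk up to
# 26 hours of data loss; restoring 4.5 TB over a 1 Gbps link takes ~10 hours.
```

Sizing the link against the restore, not just the backup, is what separates a working cloud DR plan from a paper one.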
The document discusses cloud resilience, provisioning, and asset management.
For cloud resilience, it outlines a structured 4-step approach: 1) Assessing assets and requirements, 2) Planning and designing resilience strategies, 3) Implementing and testing, and 4) Managing and sustaining resilience over time.
Cloud provisioning refers to how, what, and when cloud services are provisioned, including dynamic, user, and post-sales models.
Cloud asset management is about managing cloud applications, platforms, and infrastructure to address challenges like lack of visibility, usage data, and spending controls across cloud services. Effective cloud asset management provides benefits like cost optimization and readiness for cloud migrations.
The Adoption of Cloud Technology by Enterprises - A Whitepaper by RapidValue
If we go back in time, people depended on physical computer storage or servers to run their programs. Now, with the introduction of cloud computing, people, organizations, and enterprises can access their programs through the Internet. Cloud computing is rapidly gaining prominence, and its popularity grows each day. Cloud computing is big business today. According to PC Magazine, it was already generating around $100 billion a year in 2012 and is forecast to grow to $270 billion by 2020.
Enterprises and organizations these days rely heavily on cloud services and platforms to obtain resources on demand, in an automated manner. Organizations can now pay only for the resources they use and relinquish unnecessary resources through a self-service portal. This is highly cost-effective, as it eliminates the need for a huge up-front capital investment.
This paper addresses the primary reasons enterprises migrate to cloud infrastructure, the various cloud deployment and service models (IaaS, PaaS, SaaS; public, private, and hybrid cloud), a feature comparison of three popular cloud platforms (AWS, Microsoft Azure, Google Cloud), and some examples of how enterprises and consumers are using cloud technology.
Espion and SureSkills Presentation - Your Journey To A Secure Cloud - Google
Ross Spelman will show how businesses can confidently evaluate cloud solutions and manage platforms and infrastructure in the cloud. Nigel Tozer will discuss public, private, and hybrid cloud strategies. Ruaidhri McSharry will discuss how cyber security is an organizational issue and resilience is key.
AWS Summit Singapore 2019 | Banking in the Cloud: 10 Lessons Learned - AWS Summits
Speaker: Jonathan Allen, Enterprise Strategist, AWS
Hear why customers adopt AWS, how you can follow, and the positive impact on Financial Services customers choosing the AWS Cloud. This session will be presented by Jonathan Allen, AWS Enterprise Strategist and Evangelist, sharing experience and lessons learned from his time as CTO of Capital One UK, across the paradigms of People, Process, and Technology, and leveraging first-hand knowledge of the AWS Cloud Adoption Framework and mass-migration best practices.
This document discusses Pressmart Media Limited, a cloud solutions provider based in India. It offers cloud services including AWS cloud services, publication cloud, and education cloud. It helps media companies transform to the cloud in 7 ways, such as launching new services fast, embracing new devices, and accessing content without copies. Pressmart can help achieve these transformations through typical deployments, use cases, and managing cloud technology risks for customers. The goal is to lay a foundation for balanced cloud compliance, control, and acceptance.
Presented at ISACA Indonesia Monthly Technical Meeting, 11 Dec 2019 at Telkom Landmark.
Key takeaways from my presentation:
1. Cloud customers have to understand the shared responsibilities between customer and cloud provider
2. Different cloud service models (IaaS, PaaS, SaaS) require different audit methodologies
3. A customer’s IT auditors have to be trained in the skills needed to audit the cloud service
4. Understanding IAM in the cloud is very important; each cloud service provider has a different IAM mechanism
5. Understanding the different types of audit logs in a cloud platform is important for IT auditors
This presentation makes the case for an organization to move its data storage to the cloud. It outlines the benefits of cloud storage, including improved data storage and retrieval, collaboration capabilities, and competitive advantage. It also discusses risks like privacy and security. Financial analysis shows the investment could pay for itself within 3 years, and adopting cloud storage strategically could enable better decision-making and collaboration across the organization.
2014 2nd ME Cloud Conference - Trust in the Cloud v01 - promediakw
This document discusses building trust in the cloud by achieving a secure, trusted, and audit-ready (STAR) cloud environment. It explains that cloud adoption is increasing but many organizations have a gap between expected cloud controls and implemented controls. To close this gap, the document recommends evaluating cloud environments based on the EY Cloud Trust Model, which consists of six control domains: technology, data, organizational, operational, audit and compliance, and governance. Achieving control in these domains can help organizations move applications and data to the cloud in a secure and trusted manner.
Similar to World Wide Technology: Is backing up to the cloud right for you? (20)
As AI technology pushes into IT, I have been asking myself, as an “infrastructure container Kubernetes guy”: how does this fancy AI technology get managed from an infrastructure operations point of view? Is it possible to apply our beloved cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premises strategy we may need to apply it to our own infrastructure and make it work from an enterprise perspective. I will give an overview of infrastructure requirements and technologies and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already gotten working for real.
Keywords: AI, Containers, Kubernetes, Cloud Native
Event Link: https://meine.doag.org/events/cloudland/2024/agenda/#agendaId.4211
Introducing BoxLang: A New JVM Language for Productivity and Modularity! - Ortus Solutions, Corp
Just like life, our code must adapt to the ever changing world we live in. From one day coding for the web, to the next for our tablets or APIs or for running serverless applications. Multi-runtime development is the future of coding, the future is to be dynamic. Let us introduce you to BoxLang.
Dynamic. Modular. Productive.
BoxLang redefines development with its dynamic nature, empowering developers to craft expressive and functional code effortlessly. Its modular architecture prioritizes flexibility, allowing for seamless integration into existing ecosystems.
Interoperability at its Core
With 100% interoperability with Java, BoxLang seamlessly bridges the gap between traditional and modern development paradigms, unlocking new possibilities for innovation and collaboration.
Multi-Runtime
From a tiny ~2 MB operating-system binary to running on our pure Java web server CommandBox, Jakarta EE, AWS Lambda, Microsoft Functions, WebAssembly, Android and more: BoxLang has been designed to enhance and adapt according to its target runtime.
The Fusion of Modernity and Tradition
Experience the fusion of modern features inspired by CFML, Node, Ruby, Kotlin, Java, and Clojure, combined with the familiarity of Java bytecode compilation, making BoxLang a language of choice for forward-thinking developers.
Empowering Transition with Transpiler Support
Transitioning from CFML to BoxLang is seamless with our JIT transpiler, facilitating smooth migration and preserving existing code investments.
Unlocking Creativity with IDE Tools
Unleash your creativity with powerful IDE tools tailored for BoxLang, providing an intuitive development experience and streamlining your workflow. Join us as we embark on a journey to redefine JVM development. Welcome to the era of BoxLang.
The Microsoft 365 Migration Tutorial For Beginners - operationspcvita
This presentation will help you understand the power of Microsoft 365. It walks through every productivity app included in Office 365, outlines common migration scenarios, and explains how we can help you.
You can also read: https://www.systoolsgroup.com/updates/office-365-tenant-to-tenant-migration-step-by-step-complete-guide/
The Department of Veteran Affairs (VA) invited Taylor Paschal, Knowledge & Information Management Consultant at Enterprise Knowledge, to speak at a Knowledge Management Lunch and Learn hosted on June 12, 2024. All Office of Administration staff were invited to attend and received professional development credit for participating in the voluntary event.
The objectives of the Lunch and Learn presentation were to:
- Review what KM ‘is’ and ‘isn’t’
- Understand the value of KM and the benefits of engaging
- Define and reflect on your “what’s in it for me?”
- Share actionable ways you can participate in Knowledge Capture & Transfer
QA or the Highway - Component Testing: Bridging the gap between frontend appl... - zjhamm304
These are the slides for the presentation, "Component Testing: Bridging the gap between frontend applications" that was presented at QA or the Highway 2024 in Columbus, OH by Zachary Hamm.
Must-Know Postgres Extensions for DBAs and Developers during Migration - Mydbops
Mydbops Opensource Database Meetup 16
Topic: Must-Know PostgreSQL Extensions for Developers and DBAs During Migration
Speaker: Deepak Mahto, Founder of DataCloudGaze Consulting
Date & Time: 8th June | 10 AM - 1 PM IST
Venue: Bangalore International Centre, Bangalore
Abstract: Discover how PostgreSQL extensions can be your secret weapon! This talk explores how key extensions enhance database capabilities and streamline the migration process for users moving from other relational databases like Oracle.
Key Takeaways:
* Learn about crucial extensions like oracle_fdw, pgtt, and pg_audit that ease migration complexities.
* Gain valuable strategies for implementing these extensions in PostgreSQL to achieve license freedom.
* Discover how these key extensions can empower both developers and DBAs during the migration process.
* Don't miss this chance to gain practical knowledge from an industry expert and stay updated on the latest open-source database trends.
Mydbops Managed Services specializes in taking the pain out of database management while optimizing performance. Since 2015, we have been providing top-notch support and assistance for the top three open-source databases: MySQL, MongoDB, and PostgreSQL.
Our team offers a wide range of services, including assistance, support, consulting, 24/7 operations, and expertise in all relevant technologies. We help organizations improve their database's performance, scalability, efficiency, and availability.
Contact us: info@mydbops.com
Visit: https://www.mydbops.com/
Follow us on LinkedIn: https://in.linkedin.com/company/mydbops
For more details and updates, please follow up the below links.
Meetup Page : https://www.meetup.com/mydbops-databa...
Twitter: https://twitter.com/mydbopsofficial
Blogs: https://www.mydbops.com/blog/
Facebook(Meta): https://www.facebook.com/mydbops/
Lee Barnes - Path to Becoming an Effective Test Automation Engineer - leebarnesutopia
So… you want to become a Test Automation Engineer (or hire and develop one)? While there’s quite a bit of information available about the important technical and tool skills to master, there’s not enough discussion around the path to becoming an effective Test Automation Engineer who knows how to add VALUE. In my experience this has led to a proliferation of engineers who are proficient with tools and building frameworks but have skill and knowledge gaps, especially in software testing, that reduce the value they deliver with test automation.
In this talk, Lee will share his lessons learned from over 30 years of working with, and mentoring, hundreds of Test Automation Engineers. Whether you’re looking to get started in test automation or just want to improve your trade, this talk will give you a solid foundation and roadmap for ensuring your test automation efforts continuously add value. This talk is equally valuable for both aspiring Test Automation Engineers and those managing them! All attendees will take away a set of key foundational knowledge and a high-level learning path for leveling up test automation skills and ensuring they add value to their organizations.
"NATO Hackathon Winner: AI-Powered Drug Search", Taras Kloba - Fwdays
This is a session that details how PostgreSQL's features and Azure AI Services can be effectively used to significantly enhance the search functionality in any application.
In this session, we'll share insights on how we used PostgreSQL to facilitate precise searches across multiple fields in our mobile application. The techniques include using LIKE and ILIKE operators and integrating a trigram-based search to handle potential misspellings, thereby increasing the search accuracy.
We'll also discuss how the azure_ai extension on PostgreSQL databases in Azure and Azure AI Services were utilized to create vectors from user input, a feature beneficial when users wish to find specific items based on text prompts. While our application's case study involves a drug search, the techniques and principles shared in this session can be adapted to improve search functionality in a wide range of applications. Join us to learn how PostgreSQL and Azure AI can be harnessed to enhance your application's search capability.
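The trigram technique described above can be illustrated without a database: pg_trgm ranks matches by the fraction of three-character substrings two strings share, which is why misspellings still score well. A simplified Python mimic (the real extension also splits text into words and normalizes it more carefully):

```python
def trigrams(word: str) -> set:
    """Approximate pg_trgm's trigram extraction: lowercase, pad with two
    leading and one trailing space, then take every 3-character window."""
    padded = "  " + word.lower() + " "
    return {padded[i:i + 3] for i in range(len(padded) - 2)}

def similarity(a: str, b: str) -> float:
    """pg_trgm-style similarity: shared trigrams over total distinct trigrams."""
    ta, tb = trigrams(a), trigrams(b)
    return len(ta & tb) / len(ta | tb)

# A misspelling of a drug name still scores far above an unrelated word:
# similarity("ibuprofen", "ibuprofin") > similarity("ibuprofen", "aspirin")
```

In PostgreSQL itself the equivalent query would use the pg_trgm extension's `similarity()` function and its `%` operator, backed by a GIN trigram index for speed.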
Essentials of Automations: Exploring Attributes & Automation ParametersSafe Software
Building automations in FME Flow can save time, money, and help businesses scale by eliminating data silos and providing data to stakeholders in real-time. One essential component to orchestrating complex automations is the use of attributes & automation parameters (both formerly known as “keys”). In fact, it’s unlikely you’ll ever build an Automation without using these components, but what exactly are they?
Attributes & automation parameters enable the automation author to pass data values from one automation component to the next. During this webinar, our FME Flow Specialists will cover leveraging the three types of these output attributes & parameters in FME Flow: Event, Custom, and Automation. As a bonus, they’ll also be making use of the Split-Merge Block functionality.
You’ll leave this webinar with a better understanding of how to maximize the potential of automations by making use of attributes & automation parameters, with the ultimate goal of setting your enterprise integration workflows up on autopilot.
GlobalLogic Java Community Webinar #18 “How to Improve Web Application Perfor...” - GlobalLogic Ukraine
In this talk we answer why application performance needs to be improved and which approaches are the most effective. We also discuss what a cache is, what kinds of caches exist, and, most importantly, how to find a performance bottleneck.
Video and event details: https://bit.ly/45tILxj
This talk will cover ScyllaDB Architecture from the cluster-level view and zoom in on data distribution and internal node architecture. In the process, we will learn the secret sauce used to get ScyllaDB's high availability and superior performance. We will also touch on the upcoming changes to ScyllaDB architecture, moving to strongly consistent metadata and tablets.
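The cluster-level data distribution covered here rests on a token ring: each node owns ranges of a hash space, and a partition key's hash determines which node stores it. A toy illustration (not ScyllaDB's actual hash function, vnode count, or replication logic):

```python
import bisect
import hashlib

class HashRing:
    """Toy token ring in the spirit of Scylla/Cassandra data distribution:
    each node owns several tokens; a key lands on the node owning the first
    token clockwise from the key's hash."""

    def __init__(self, nodes, vnodes=8):
        self.ring = sorted(
            (self._hash(f"{n}-{v}"), n) for n in nodes for v in range(vnodes)
        )
        self.tokens = [t for t, _ in self.ring]

    @staticmethod
    def _hash(key: str) -> int:
        return int.from_bytes(hashlib.md5(key.encode()).digest()[:8], "big")

    def node_for(self, key: str) -> str:
        i = bisect.bisect(self.tokens, self._hash(key)) % len(self.ring)
        return self.ring[i][1]

ring = HashRing(["node-a", "node-b", "node-c"])
# Keys map deterministically; adding a node moves only a fraction of keys.
```

The "tablets" work mentioned in the abstract changes how these ownership ranges are assigned and moved, but the routing idea — hash the partition key, find the owner — stays the same.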
"What does it really mean for your system to be available, or how to define w..." - Fwdays
We will talk about system monitoring from a few different angles. We will start by covering the basics, then discuss SLOs, how to define them, and why understanding the business well is crucial for success in this exercise.
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor Ivaniuk - Fwdays
In this talk we will discuss DDoS protection tools and best practices, network architectures, and what AWS has to offer. We will also look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022. We'll see what techniques helped keep web resources available for Ukrainians, and how AWS improved DDoS protection for all customers based on the Ukraine experience.
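Many of the mitigations such talks cover ultimately boil down to rate limiting at some layer, and the token bucket is the classic primitive behind it. A minimal sketch (parameters are arbitrary illustrations, not AWS defaults):

```python
class TokenBucket:
    """Illustrative token-bucket rate limiter: allow bursts up to `capacity`,
    refill at `rate` tokens per second, reject requests when empty."""

    def __init__(self, rate: float, capacity: float):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, 0.0

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=10, capacity=5)  # 10 req/s sustained, bursts of 5
```

Real DDoS defenses apply this per source IP, per path, or per token at the edge, long before traffic reaches the origin.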
How information systems are built or acquired puts information, which is what they should be about, in a secondary place. Our language adapted accordingly: we no longer talk about information systems but about applications. Applications evolved in a way that breaks data into diverse fragments, tightly coupled with applications and expensive to integrate. The result is technical debt, which is repaid by taking even bigger "loans", resulting in ever-increasing technical debt. Software engineering and procurement practices work in sync with market forces to maintain this trend. This talk demonstrates how natural this situation is. The question is: can something be done to reverse the trend?
MySQL InnoDB Storage Engine: Deep Dive - Mydbops
This presentation, titled "MySQL - InnoDB" and delivered by Mayank Prasad at the Mydbops Open Source Database Meetup 16 on June 8th, 2024, covers dynamic configuration of REDO logs and instant ADD/DROP columns in InnoDB.
This presentation dives deep into the world of InnoDB, exploring two ground-breaking features introduced in MySQL 8.0:
• Dynamic Configuration of REDO Logs: Enhance your database's performance and flexibility with on-the-fly adjustments to REDO log capacity. Unleash the power of the snake metaphor to visualize how InnoDB manages REDO log files.
• Instant ADD/DROP Columns: Say goodbye to costly table rebuilds! This presentation unveils how InnoDB now enables seamless addition and removal of columns without compromising data integrity or incurring downtime.
Key Learnings:
• Grasp the concept of REDO logs and their significance in InnoDB's transaction management.
• Discover the advantages of dynamic REDO log configuration and how to leverage it for optimal performance.
• Understand the inner workings of instant ADD/DROP columns and their impact on database operations.
• Gain valuable insights into the row versioning mechanism that empowers instant column modifications.
2. Nick Cellentani has 30+ years of experience in computing and storage infrastructure.
Nick is the senior executive for WWT’s Storage and Data Protection practice at World Wide Technology, which accounts for more than $500 million of storage hardware procured per year.
His background spans operations, software, logistics, and infrastructure.
Nick Cellentani, Storage & Data Protection Practice Leader
Nick.Cellentani@wwt.com
Introductions
3. Steve has 20+ years of experience in computing, storage, and data protection infrastructure.
Steve leads the Data Protection practice at World Wide Technology, covering Data Protection, BC, and DR strategy.
He has spent 2+ years at WWT, learning about the business model and culture of WWT, and is looking forward to another great year in 2016.
Steve Gregory, Data Protection Discipline Lead
Steve.Gregory@wwt.com
Introductions
4.
5. GROWTH · PRODUCTIVITY · CUSTOMER RETENTION · COST CONTROL · RISK MITIGATION
PUBLIC SECTOR · SERVICE PROVIDER · COMMERCIAL ENTERPRISE
ADVISORY · AGILE SOFTWARE DEVELOPMENT · LAB SERVICES · DEPLOYMENT · STAFFING · LIFECYCLE
6. Advanced Technology Center (ATC)
ATC MISSION: To create a collaborative ecosystem to design, build, educate, demonstrate and deploy innovative technology products and integrated architectural solutions for our customers, partners and employees around the globe.
8. The Journey Begins…
Whether You’re:
• Planning future investments in backup infrastructure.
• Interested in the potential value of cloud backup for your organization.
• Wanting to implement the cloud as an off-site backup solution.
• Hoping to reduce the management effort associated with backups.
We’ll share our framework and real-world experience, enabling you to make informed decisions.
We’ll Help You:
• Understand the benefits, risks and value of backing up to the cloud.
• Determine the degree of fit for your organization’s backup data.
• Assess your organization’s readiness to implement cloud backup.
• Quantify which type of cloud is best suited for protecting your data.
9. The Steps of the Journey
• Evaluate your business case(s) for backing up to private, hybrid, and/or the public cloud
• Identify candidate data sets for cloud backup and assess opportunities and readiness
• Mitigate the challenges of backing up to the cloud
• Build a roadmap to your cloud backup implementation
12. Use Case Example 1: Unstructured Data
CHALLENGE
• Long retention policy on fast-growing unstructured data
• Maintaining up to 1 PB
• Using backup software to replicate offsite
• Backup costs are high
• Reliability issues
• Time-consuming management and administration
• Not meeting SLAs
SOLUTION
• Hybrid data protection approach
• Upgrade backup software to support a REST API model
• Remove tape
• Options for scale-out storage with deduplication and object store
• Provide search and index capability
RESULTS
• Modernized backup with minimal added resources or skills
• All-in-one data repository for backup and archive data
• Expected cost savings of “business as usual” versus investment in hybrid cloud
Journey Steps: Identify Business Case for Cloud; Identify Candidate Data Sets and Assess Opportunities and Readiness; Mitigate the Challenges of Backing Up to the Cloud; Build a Roadmap to Your Cloud Backup Implementation
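The "business as usual versus hybrid cloud" comparison in results like these ultimately reduces to arithmetic on stored capacity after deduplication. A sketch with placeholder pricing (the ratio and $/GB figures are illustrative assumptions, not a quote from any provider):

```python
def monthly_object_storage_cost(logical_tb: float, dedup_ratio: float,
                                price_per_gb_month: float) -> float:
    """Cost of holding a deduplicated backup set in object storage.

    dedup_ratio is logical:stored (e.g. 5.0 means a 5:1 reduction).
    """
    stored_gb = (logical_tb * 1000) / dedup_ratio
    return stored_gb * price_per_gb_month

# Illustrative: 1 PB logical at 5:1 dedup and a hypothetical $0.01/GB-month
# works out to 200 TB stored, or $2,000/month.
```

Running the same arithmetic against tape handling, offsite contracts, and administration time is what turns a slide like this into an actual business case.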
13. Use Case Example 2: LTR
CHALLENGE
• Long-term tape retention (LTR)
• Dated storage policy of retain-forever on tape
• Maintaining several PBs of tape at a provider
• Using backup software to copy each night
• While tape is considered “cheap,” not at this scale
• Drive maintenance
SOLUTION
• Maintain operational recovery from disk or PBBA
• New cloud target instead of tape: object store, Azure, AWS, etc.
• Data ingest services from tape to disk
RESULTS
• Terminate offsite tape contract and power down several drives
• Easier and faster regulatory response time and potential elimination of fines
• Leverage data for business value: search and analyze
Journey Steps: Identify Business Case for Cloud; Identify Candidate Data Sets and Assess Opportunities and Readiness; Mitigate the Challenges of Backing Up to the Cloud; Build a Roadmap to Your Cloud Backup Implementation
15. Journey Advisor / Planner
1. Evaluate your business case(s) for targeting private, hybrid, and/or public options
2. Identify candidate data sets for cloud backup and assess opportunities and readiness
3. Mitigate the challenges of backing up to the cloud
4. Build a roadmap to your cloud backup implementation
16. Don’t Worry, We Have Cruise Control
Cloud Backup Readiness Assessment
Bryan Kreutz: Welcome to the TEC37 series. Today’s topic is “Is Backing up to the Cloud Right for You?” Research has found that 37 minutes is the optimal amount of time for webinars, so we’ll end within 37 minutes, but our speakers will stay on for questions and answers.
Introducing Nick Cellentani, WWT’s Storage & Data Protection Practice Leader.
Nick has 30+ years of experience in computing and storage infrastructure.
Nick has a background in operations, software, logistics, and infrastructure.
Next…
Steve Gregory is WWT’s Data Protection Discipline Lead.
Steve has 20+ years of experience in computing and storage infrastructure.
Steve leads the Data Protection practice at World Wide Technology, covering Data Protection, BC, and DR strategy.
He has had a successful two years at WWT, learning about the business and how to fit into its culture, and is looking forward to another great year in 2016.
Before we get into the details related to our unique customer engagement model, let me tell you a little bit about our company. The foundation of World Wide Technology is centered around our vision, mission, and core values. Our vision is to provide revolutionary technology products, services, and supply chain solutions for our customers around the globe. Our mission is to “create a profitable growth company that is also a great place to work.”
Our Core Values are the basis for how we run our business. Known as “The Path” our Core Values combined with our integrated management and leadership approach form the basis of what we refer to as “The World Wide Way”.
I would encourage you to ask each World Wide team member that you meet how the Core Values impact our business.
We have built and commissioned a comprehensive framework to enable the journey of answering the question: is backing up to the cloud right for you?
Slide 8 Notes
In seeking the answer to the question of backing up to the cloud, three things are noteworthy:
Getting the answers requires wading through complexity
Cloud is not one thing, it is many
Leading yourself and your organization on this journey is as important, if not more important than the answers.
Complexity of the Answer
While the question is simple, getting the answer right is complex. To understand the who, what, when, where, and why, we need to take many factors into account:
We feel an organization’s readiness comprises the following attributes: Security and Compliance, Availability, Data Integration, and IT Skills and Roles.
In addition, we think an organization’s value can be derived from the following: Cost Efficiency and Service Agility.
Cloud has many different forms
Our definition of cloud for use in this journey spans private, public, and hybrid, on-premises and off-premises. It can be IaaS, DRaaS, etc. So open your thinking to all the different faces of cloud when discussing the viability of backing up to it.
By the way, backing up to the cloud is not a silver bullet, and targeting backups at the cloud is not a quick win.
Taking your organization on the journey
To arrive at a conclusion, all of the stakeholders need to go on the journey and understand the dimensions of the “why,” so there is consistent buy-in when concluding. This journey will help the stakeholders take ownership of the data protection environment.
So many times we see organizations asking for the shortcut: tell me the answer, give me the best practices. While we will discuss the outcomes of going through this framework and workshop, we feel the organization learns more and becomes more agile by going on the journey. This way, stakeholders are enabled not only for these decisions but are better prepared to make future ones. Backing up to the cloud is not fairy dust; it does not contain any magic.
The purpose of this webinar is to help you simplify the complex by sharing the framework and workshop courseware we have developed, so you don’t have to reinvent the wheel.
Step 1 Develop business cases
The beginning: aligning on and understanding the associated business cases. Here it is important to understand and document the following:
Who are the stakeholders? Lines of business, application owners, IT, backup admins.
What are the goals of those stakeholders, such as reduction in costs, avoidance of internal expansion, or reduced management time spent on administration?
What is the present state or situation? For example: 20 hours per week spent on backup administration, no offsite copies.
What is the desired or goal state? For example: reduce admin time by 10 hours per week; reduce Capex during the refresh by 50%.
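The present-state and goal-state figures above can be turned into a simple savings estimate. A minimal sketch follows; the hourly rate and refresh cost are hypothetical assumptions for illustration, not figures from any engagement.

```python
# Hypothetical figures used only to illustrate quantifying a business case;
# substitute your organization's own present-state and goal-state numbers.

def weekly_admin_savings(present_hours, goal_hours, hourly_rate):
    """Annualized savings from reducing backup-admin time."""
    return (present_hours - goal_hours) * hourly_rate * 52

def capex_savings(refresh_cost, reduction_pct):
    """Capital avoided during a hardware refresh."""
    return refresh_cost * reduction_pct

# Present state: 20 h/week on backup admin; goal: cut it by 10 h/week.
admin = weekly_admin_savings(present_hours=20, goal_hours=10, hourly_rate=75)
# Goal: cut Capex during the refresh by 50% on an assumed $400k refresh.
capex = capex_savings(refresh_cost=400_000, reduction_pct=0.50)

print(f"Annual admin savings: ${admin:,.0f}")  # 10 h * $75 * 52 wk = $39,000
print(f"Capex avoided:        ${capex:,.0f}")  # $200,000
```

Even a back-of-the-envelope calculation like this gives stakeholders a concrete number to align on before deeper analysis.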
Step 2 Identify potential candidates, assess opportunities and readiness.
By data set, document the following: current size, projected growth rate, RTO, and RPO.
Next, analyze each data set for value by documenting and understanding your organization’s desires for cost savings, in the form of operational, capital, personnel, and agility benefits. Assign a value scale to each attribute and sum.
Your readiness can be quantified by documenting and evaluating, by data set: security and compliance, availability, integration effort, IT skills, and enabling technologies. Quantify these attributes and sum them to produce a readiness index.
Plot these two, with value on the y-axis and readiness on the x-axis, to reveal go, no-go, or maybe candidates.
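The scoring and plotting described in this step can be sketched in a few lines. The attribute names follow the framework above; the 1–5 rating scale, equal weighting, and the 60% threshold are illustrative assumptions, and the example ratings are hypothetical.

```python
# A minimal sketch of the value/readiness scoring described above.
VALUE_ATTRS = ["operational", "capital", "personnel", "agility"]
READINESS_ATTRS = ["security_compliance", "availability",
                   "integration", "it_skills", "enabling_tech"]

def score(attrs, ratings):
    """Sum 1-5 ratings for the listed attributes into a single index."""
    return sum(ratings[a] for a in attrs)

def classify(value, readiness, threshold=0.6):
    """Quadrant position: value on the y-axis, readiness on the x-axis."""
    hi_v = value >= threshold * 5 * len(VALUE_ATTRS)
    hi_r = readiness >= threshold * 5 * len(READINESS_ATTRS)
    if hi_v and hi_r:
        return "go"
    if hi_v or hi_r:
        return "maybe"
    return "no go"

# Hypothetical data set: file shares with long retention requirements.
ratings = {"operational": 4, "capital": 5, "personnel": 3, "agility": 4,
           "security_compliance": 4, "availability": 3,
           "integration": 3, "it_skills": 4, "enabling_tech": 4}
v = score(VALUE_ATTRS, ratings)      # 16 of 20
r = score(READINESS_ATTRS, ratings)  # 18 of 25
print(classify(v, r))                # "go"
```

Running each candidate data set through the same scoring makes the go/no-go/maybe plot reproducible rather than a matter of opinion.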
Step 3 Mitigate the challenges for the go candidates
Use the present and proposed state gaps to develop transition requirements for each “good” candidate.
Then determine which cloud model is the best fit for these candidates. For each challenge, measure the business impact and the likelihood that it will occur.
Step 4 Build a roadmap
Nothing magical here: all of the hard work in steps 1, 2, and 3 pays off in developing a roadmap for implementation.
New backup targets
• Public cloud (D2D2C)
• Secondary storage such as object store, software-defined storage, and converged disk technology
New data capture
• Cloud gateways
• CDM instant recovery – as-a-service model via a service provider
• Mount and resume versus restore
By 2017, the number of enterprises using the cloud as a backup destination will double, up from 7% at the beginning of 2015.
By 2018, 50% of applications with high change rates will be backed up directly to deduplication target appliances, bypassing traditional backup software, up from 25% today.
This customer was looking to understand the overarching industry trend toward new architectural models, such as software-defined storage/data centers, as-a-service offerings, disk-based backups, and replication, as well as archiving a PB of unstructured data.
A private cloud-based solution was a potential fit:
• Using current tools
• Workloads with long SLOs
• Replicating active archives
1) Evaluate your business cases for targeting private, hybrid, and/or public options.
Start your project off on the right foot. Discuss the value of the cloud and identify realistic objectives for your backup environment implementation. Determine whether private, hybrid, and/or public cloud is the solution to achieve them.
2) Identify Candidate Data Sets for Cloud Backup and Assess Opportunities and Readiness
Identify candidate data sets for cloud backup. Determine RPOs and RTOs for candidate data sets. Discuss the potential value of the cloud for your candidate data sets. Evaluate organizational readiness for targeting backup at the cloud. Pinpoint best-fit data sets.
3) Mitigate the Challenges of Backing Up to the Cloud
Select the cloud provider model that best fits your needs. Identify probable risks and estimate their likelihood. Develop a contingency plan for likely challenges. Evaluate the financial argument for – or against – cloud backup.
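The risk identification in this step can be captured in a simple register: each challenge gets a business-impact and a likelihood rating, and exposure (impact × likelihood) ranks where contingency planning should start. The risks, ratings, and 1–5 scale below are hypothetical examples, not outputs of the framework.

```python
# Illustrative risk register for the mitigation step.
# Each entry: (challenge, business impact 1-5, likelihood 1-5).
risks = [
    ("WAN bandwidth saturates during full backups", 4, 3),
    ("Restore from cloud misses RTO",               5, 2),
    ("Egress fees exceed estimates",                3, 4),
]

def exposure(impact, likelihood):
    """Simple exposure score used to prioritize contingency planning."""
    return impact * likelihood

# Rank risks so contingency planning starts with the largest exposure.
ranked = sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True)
for name, impact, likelihood in ranked:
    print(f"{exposure(impact, likelihood):>2}  {name}")
```

A register like this also gives the financial argument a concrete anchor: high-exposure risks carry real contingency costs that belong in the for/against comparison.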
4) Build a Roadmap to your Cloud Backup Implementation
Perform a gap analysis to determine implementation initiatives. Construct a roadmap for your cloud backup implementation.
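The gap analysis in this step can be sketched as a comparison of present-state and proposed-state capabilities per data set, with the missing capabilities becoming implementation initiatives on the roadmap. The capability names below are hypothetical examples.

```python
# A minimal gap-analysis sketch: capabilities are booleans per data set.
present  = {"offsite_copy": False, "dedupe": True,  "cloud_gateway": False}
proposed = {"offsite_copy": True,  "dedupe": True,  "cloud_gateway": True}

def gap_analysis(present, proposed):
    """Return capabilities required by the proposed state but absent today."""
    return [cap for cap, needed in proposed.items()
            if needed and not present.get(cap, False)]

initiatives = gap_analysis(present, proposed)
print(initiatives)  # ['offsite_copy', 'cloud_gateway']
```

Each initiative on the resulting list then gets an owner, an estimate, and a slot on the implementation roadmap.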