CommVault Intro SureSkills Simpana 10 Event 2013

  • Q1, for 1 point: Who can tell me the difference between backup and archive? Q2, for 2 points: What is a hypervisor? Q3, for 2 points: What is the difference between source- and target-side deduplication? Q4, for 1 point: Does a replicated copy of data constitute a backup? Q5, for 1 point: Why not? Q6, for 2 points: How and why do databases need to be protected differently than general file system data?
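The deduplication question above (Q3) hinges on where chunk hashing happens: at the client before transfer, or at the storage target after transfer. The following is a minimal, hedged sketch of that distinction, not Simpana's actual implementation; all names here (`Target`, `backup_source_side`, the 4-byte chunk size) are invented for the example.

```python
# Toy contrast of source-side vs. target-side deduplication.
# Names and chunk size are illustrative only, not a real product API.
import hashlib


def chunks(data, size=4):
    """Split data into fixed-size chunks (real systems often use variable sizes)."""
    return [data[i:i + size] for i in range(0, len(data), size)]


class Target:
    """Backing store that keeps each unique chunk exactly once."""
    def __init__(self):
        self.store = {}
        self.bytes_received = 0

    def has(self, digest):
        return digest in self.store

    def put(self, digest, chunk):
        self.bytes_received += len(chunk)   # the "network transfer" happens here
        self.store[digest] = chunk


def backup_target_side(target, data):
    # Client ships every chunk; the target deduplicates after transfer.
    for c in chunks(data):
        target.put(hashlib.sha256(c).hexdigest(), c)


def backup_source_side(target, data):
    # Client asks the target first and only ships chunks it doesn't already
    # have, trading a hash lookup per chunk for network bandwidth.
    for c in chunks(data):
        d = hashlib.sha256(c).hexdigest()
        if not target.has(d):
            target.put(d, c)


data = b"abcdabcdabcd"            # three identical 4-byte chunks
t1, t2 = Target(), Target()
backup_target_side(t1, data)      # 12 bytes cross the wire
backup_source_side(t2, data)      # only 4 bytes cross the wire
print(t1.bytes_received, t2.bytes_received)  # prints "12 4"
```

Both targets end up storing the same single unique chunk; the difference is purely where redundancy is eliminated, which is why source-side deduplication is attractive for bandwidth-constrained remote sites.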
  • Let me start by telling you a little story… Looking back, life in IT used to be simple. Slow, but simple. A leisurely walk through the park in the days of punch cards, reel-to-reel tape, and mainframes. The blood and guts of your data center might live on a tape or two that you could juggle if you were feeling adventurous. Disaster recovery might mean throwing a box of DLT tapes in the back of the family wagon for the weekend. A server was a server, an application was an application that lived on that server, and you could back it all up with tape, oh glorious tape… The good ole days.
  • But life’s gotten a bit more complex these days. The protection of traditional data, while still a challenge for many, is really only one minor piece of the puzzle.
  • Environments are becoming more complex, with the server, application, OS, and storage layers increasingly interwoven. Keeping track of the interdependencies is proving more and more challenging, especially given the lack of robust reporting in most environments. In the same vein, the source of data generation is now spread across geographies and devices, beyond the datacenter to remote, branch, and home offices and edge devices. Regardless of the source, the value and criticality of the data remains, but the complexity of managing it increases.
  • Another area that’s being discussed frequently in the market is the concept of big data, and the challenges that customers have contending with the massive data sets being generated. There are two ways to look at the big data challenge. For most customers, the biggest challenge is dealing with the ever-expanding data sets they need to manage in their environment. When an organization was contending with data sets in the gigabytes, traditional methodologies of protection and management were sufficient. But as the volume of data continues to grow exponentially, from gigabytes to terabytes to, in some cases, petabytes, traditional methods of management and protection are no longer sufficient. In addition, despite storage costs that diminish on a per-terabyte basis year after year, data growth is outpacing those savings, and the cost of storing data continues to grow.
  • The other big driver of the “big data” phenomenon is leveraging analytical tools to gain business value from the mounds of information being collected. That said, the volume of data available, the wide variety of data in environments, and the velocity at which data characteristics change prove to be barriers to deriving true value from an organization’s data. Optional, for reference: Over the last 18 months, the term “big data” has been used in more and more data centers. Depending on the vendor nomenclature, the term is sometimes used to describe the skyrocketing size of the data footprint within the data center and the management challenges associated with it, and sometimes describes processes and methodologies used to derive business value through the analysis of that data. Many businesses, particularly in certain verticals, want to be able to look at the unstructured data within their data center and gain knowledge about their business operations.
  • Another factor that comes into play is the continued growth of virtualization technologies. Whereas just a few years ago virtualization may have been relegated to the back corner of the datacenter and used for non-mission-critical applications, now many organizations are virtualizing their data centers in their entirety. Many customers have had challenges dealing with the new virtualization paradigm in terms of how to sufficiently manage and protect these environments. Many of the traditional data protection vendors were caught off guard by the virtualization boom, and the legacy platforms they provide did not suffice for these new models.
  • Virtualization has also been a catalyst for the migration to cloud-based managed service offerings. Multi-tenancy and IT-as-a-utility have not only blurred the data center’s role but have also driven wider adoption of all virtualization technologies: server, storage, and networking. This move has meant that protection, manageability, and dynamic reaction to change have become high-priority business items that need to be addressed. This is manifesting itself in a number of ways: many customers are migrating some or all of their applications and assets to cloud environments; solution providers are developing new models and services to accommodate those demands; and customers are taking advantage of the technologies that enable more effective provision of services to their internal customers, in an attempt to accommodate demand and maintain their relevance in their organizations.
  • The mobilization trend within the corporate workforce has put new challenges on IT staff in data centers everywhere. Now, mobile users need access to critical data and applications wherever they may be. Even if a company’s workforce is not inherently mobile, bring-your-own-device policies and the continued growth of data residing on smartphones and tablets pose similar issues for data centers across the world. In addition, the enforcement of IT policies and the protection of data residing outside the data center become critical for successful IT operations.
  • The BYOD trend is also exacerbated by users becoming more savvy and being accustomed to having access to their data anytime, anywhere, on demand. Users have had exposure to technologies such as commercial file-sharing and synchronization services, which leads to expectations of higher SLAs from their IT departments.
  • With the growth of data management, the burden of eDiscovery has shifted to IT. For most companies, stopping tape rotation and farming the media out to a third party is no longer an effective means of addressing discovery requests. IT’s role in legal discovery has grown to ensure that the right data is being captured, so that the organization is capable of supplying the necessary information for litigation or compliance purposes without disrupting overall business operations.
  • And last but certainly not least is the fact that IT organizations are being tasked with these increasing burdens while staff has concurrently been reduced due to economic pressures. Folks are continuously being asked to do more with less.
  • These factors, as well as many others, have led to a perfect storm of data management challenges that are causing organizations to rethink their traditional methodologies for protection and management of their critical data assets.
  • The simple fact is that backup in and of itself no longer equates to a comprehensive data management strategy. The good old days are gone, and organizations need to find ways to modernize their data management strategies.
  • When we look at how organizations have addressed data management issues historically, it has been primarily with a silo approach: as data management challenges have emerged, organizations have invested in point solutions to address them, often because their legacy solutions cannot accommodate the new emerging protection and management needs. For example, you see many environments where multiple backup solutions exist: one for data center assets, another for protection of mission-critical databases, and another for virtualized environments. Likewise, differing solutions are often deployed for protection of data in remote sites or data that resides at the edge of an environment on laptops and remote desktops. As another example, when deduplication started to emerge as a viable technology in the market, many organizations invested in appliances to address the long-term capacity management and retention needs that deduplication helps to address. The bottom line is that customers end up with multiple point solutions that have minimal or no integration across them, even though they are dealing with the same data on the back end.
  • Pools of data and infrastructure: In a silo approach, customers are left with islands of infrastructure scattered throughout an organization. Whether it’s servers, networking infrastructure, storage, some combination thereof, or appliances (essentially a server with some storage and proprietary software running on it), you wind up with isolated pockets of infrastructure that cannot be leveraged except for their designated purpose. If you’ve been around the market long enough, you’ll recognize this as very similar to the challenges organizations had with direct-attached storage prior to the emergence of SAN and network-attached shared storage technologies. All of this hardware has costs associated with it, which can contribute to non-optimized and inefficient infrastructure spend.
  • One of the obvious burdens of having a multitude of solutions in a single environment is the operational aspect. Often the solutions are spread across multiple functional groups, and valuable resources are pulled away from their core competencies and value to help with the data management functions.
  • The redundant data in organizations is also highly problematic. Some of this is due to obvious reasons, such as the same files being housed across multiple users, but we also see inefficient use of storage due to application owners’ tendency to over-protect their digital assets. There’s a large customer of ours in the Midwest that was using our software to protect much of their environment, and they were exploring expanding their capabilities to protect more of their advanced databases. We assisted with an analysis for them, and one of the things we found was that, on average, their Oracle databases were being copied 12 times, with all of the dumps sitting on primary spinning disk. In non-optimized, non-managed environments, the redundancy of data can lead to wasted storage and increased costs.
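The 12-copy Oracle story above can be sanity-checked with a toy calculation. This is a hypothetical sketch of content-hash deduplication accounting, not CommVault's implementation; the dump contents and the `dedupe_ratio` helper are invented for illustration.

```python
# Back-of-envelope: identical copies collapse to one stored instance
# under content-hash deduplication. Hypothetical example data.
import hashlib


def dedupe_ratio(copies):
    """Return (logical_bytes, stored_bytes) if each unique blob is kept once."""
    logical = sum(len(c) for c in copies)
    # Map content hash -> size, so duplicates are counted only once.
    unique = {hashlib.sha256(c).hexdigest(): len(c) for c in copies}
    stored = sum(unique.values())
    return logical, stored


dumps = [b"oracle-dump-contents"] * 12      # 12 identical full dumps
logical, stored = dedupe_ratio(dumps)
print(f"{logical} logical bytes, {stored} stored, {logical // stored}:1 ratio")
# prints "240 logical bytes, 20 stored, 12:1 ratio"
```

Real dumps are rarely byte-identical, so production deduplication works on sub-file chunks, but the accounting principle (logical versus stored capacity) is the same.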
  • Legacy data management solutions may be able to accommodate some of the challenges facing their customers with a silo’d point solution approach, but instead of solving problems of complexity, operational efficiency and cost, they are actually exacerbating many of these issues.
  • And if we look at our competitive brethren in the market, they’ve actually propagated this issue. As data management challenges have emerged, many of the organizations in this space have grown their portfolios by acquiring companies whose products address specific needs. For example, when deduplication began to emerge as a viable and effective technology, the major backup and recovery providers went shopping to acquire niche organizations that could facilitate their presence in this arena. You will invariably hear providers talk about buying best-of-breed solutions to build their portfolio. But remember, only one can be best-of-breed; then you are left with the best of whatever is left over. You’ll also hear about the integration of these disparate solutions, but it is usually an iterative process that takes place over multiple releases of the software packages. Much of the development effort is focused on integration at the expense of innovation.
  • The grid depicted here is in stark contrast to the approach that CommVault has taken with our Simpana® data management software suite. CommVault’s Simpana® software combines a multitude of data management functions into a truly integrated software suite that has been built on a common code base without injection of third-party solutions. We call it our Singular Information Management® philosophy.
  • CommVault’s Simpana solution provides the capability to handle a wide variety of data management functions in a truly integrated, holistic package for data management and protection. By adopting Simpana® software, a customer can see a dramatic increase in operational efficiency, as well as a significant decrease in the costs and inefficiencies associated with data management, on both the infrastructure and operations sides. By combining data management functions into a common platform, also known as our Common Technology Engine, we are able to reduce infrastructure burden, drive operational efficiencies, and significantly reduce costs for organizations.
  • The Common Technology Engine continues to be CommVault’s innovative platform. The messaging on this slide depicts the architectural flexibility that the Common Technology Engine continues to provide. This platform provides the infrastructure benefits inherent in previous releases of Simpana software but continues to enhance the enterprise integration of those products. CommVault continues to enhance Simpana® features and functionality, several of which will be addressed in the remainder of this module.
  • Simpana software includes “best-in-class” modules for: Backup, Recovery & DR; Deduplication; Virtualization and Cloud; Archive; eDiscovery; Search.
  • To wrap up, Commvault’s Simpana software for data protection and information management helps organizations manage data growth, cut costs, and reduce risk by simplifying data management functions through a single, unified platform and architecture. Our common code base ensures that you’ve purchased a product that is not only mature, but also stable and proven. The extensibility of the platform to deliver new features and functionality to meet emerging needs provides “future-proofing” of the data center, along with investment protection.

    1. CommVault Introduction. Mark Bell, mbell@commvault.com, 07788 818 589
    2. Level Set: What do Commvault do? Backup & Recovery; Archive; Virtualization; Deduplication; Replication; Content Indexing; Snapshots; Database Protection; Edge Data Protection; Cloud Storage; Storage Resource Management; Encryption; eDiscovery; Storage Infrastructure; Retention
    3. So why is Commvault so different? Data Protection; Data Management; Information Management. What are your customers looking for now?
    4. I want it NOW!!
    5. I want it NOW!!
    6. The Legacy Approach: Point/Silo Approach to Data Management (Windows Protection, Search, VM Protection, Archive, Storage Resource Management, Deduplication, HSM, Snapshot Management, Reporting, Remote Backup, Database Protection, UNIX Protection, Replication)
    7. Infrastructure Silos
    8. Operational Inefficiency
    9. “Over-Protection” and Bloat
    10. Legacy Portfolio Coverage
    11. Your customers? (Slide shows a grid mapping competitor portfolios from Symantec, CA Technologies, IBM, and EMC against categories including Backup & Recovery; Archive, Email, Files, eDiscovery & Compliance; Data Reduction (Deduplication & Compression); Enterprise Reporting; Replication, Management & CDP; and Desktop/Laptop. Products named include NetBackup, Backup Exec™, PureDisk, Enterprise Vault, Command Central SRM, TSM for SharePoint, CommonStore, Princeton Softech, NetWorker, HomeBase, Centera, Xtenders, SourceOne, Control Center, D2D, RecoverPoint, Replistor, DPA, and Replication Manager; some cells are marked NOT AVAILABLE.)
    12. The Simpana® Way: CommVault
    13. Singular Information Management®: Data Center Backup, VM Protection, SRM, Snap Mgmt, Database Protection, Archive, Reporting, Remote Backup, Replication, Search, Dedupe
    14. Single Platform for Data & Info Management, Extensible and Adaptable. Sources: Applications, Databases, Messaging, File Systems, Edge. Data management: Protection Copies, Archive Copies, Snapshot Copies, Many Devices, Many Locations, Intelligent Index. Business insight: Reduce Complexity, Reduce Sprawl, Increase Data Governance, Reduce Cost, Increase Efficiency & Agility, Mitigate Risk. Actions: Automate, Manage, Recover, Retain, Discover, Report, Access, Dispose, Empower.
    15. CommVault’s Mission and Background: Solely dedicated to enterprise data and information management. Simpana® software provides truly integrated data management and protection. Modern data and information management enables companies to efficiently capture, move, retain, find, and recover data from any storage tier. Our customers reduce cost, improve operational efficiency, and derive business value from their stored data.
    16. Market Proven: FY13 Results
    17. Enterprise Disk Backup Magic Quadrant: CommVault is positioned as a Leader in Gartner’s Magic Quadrant for Enterprise Backup/Recovery Software for the third consecutive year. • This Magic Quadrant graphic was published by Gartner, Inc. as part of a larger research note and should be evaluated in the context of the entire report. • The Gartner report is available upon request from CommVault. • The Magic Quadrant is copyrighted by Gartner, Inc. and is reused with permission. • The Magic Quadrant is a graphical representation of a marketplace at and for a specific time period. • It depicts Gartner’s analysis of how certain vendors measure against criteria for that marketplace, as defined by Gartner. • Gartner does not endorse any vendor, product or service depicted in the Magic Quadrant, and does not advise technology users to select only those vendors placed in the “Leaders” quadrant. • The Magic Quadrant is intended solely as a research tool, and is not meant to be a specific guide to action. Gartner disclaims all warranties, express or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. Source: Magic Quadrant for Enterprise Backup/Recovery Software; by David Russell, Sheila Childs, Pushan Rinnen; June 5, 2013. © 2013 Gartner, Inc. and/or its affiliates. All rights reserved. Gartner is a registered trademark of Gartner, Inc. or its affiliates. For more information, e-mail info@gartner.com or visit gartner.com. Used with permission.
    18. A Single Platform for Data & Info Management, Extensible & Adaptable: Backup and Recovery; Deduplication; Self-Service Access; Snapshot Management; Intelligent Archive; Regulatory & Compliance; Virtual Server Protection; Storage Tiering; Content Indexing for Search; Simpana OnePass™ Technology; Workflow; CommVault Edge™ Technology; Edge Data Protection; Reporting, Trending, & Analytics; Data Insight
    19. Simpana® Software: Extensible and Exhaustive Data Management. Commvault’s Simpana software for data protection and information management helps organizations manage data growth, cut costs and reduce risk by simplifying data management functions through a single, unified platform and architecture.
    20. Simpana® Software: Extensible and Exhaustive Data Management. What can we help you with?
