Many organizations are currently struggling with data growth. Traditional backup techniques no longer suffice and cannot solve today's data and information management challenges. Ask yourself: can your current backup software:
- View, manage and access analysis, backup and recovery, replication, archiving, and search across data and information from a single GUI?
- Protect and manage data on any storage device, from data centers to desktops and laptops in the cloud?
See in this presentation how a unified dashboard delivers a fully holistic approach to data management, with a flexible and extensible architecture.
This document provides an introduction to Commvault, a company that offers data protection and information management software. It discusses Commvault's Singular Information Management platform, which provides a single software solution for backup and recovery, archiving, replication, deduplication, search, and other data and storage management functions. This consolidated approach aims to reduce complexity compared to point solutions from multiple vendors. The document also notes that Commvault has been recognized as a leader in Gartner's Magic Quadrant for enterprise backup and recovery software. It positions Commvault as a proven solution that can help organizations manage growing data volumes while lowering costs and risks.
SureSkills - Introducing Simpana 10 Features (Google)
Simpana 10 is a new version of CommVault's data protection and management software that features significant improvements including:
- Backup and archive speeds up to 2x faster, and up to 50% less time spent managing operations, through features like OnePass and IntelliSnap.
- Increased scalability, efficiency and reduced complexity for large enterprises and cloud environments through new virtual machine integration and automation capabilities.
- Enhanced search, reporting, mobile access and self-service features that boost workforce productivity and transform protected data into accessible information assets.
This document summarizes a webinar about maximizing IT for business advantage. The webinar focuses on three key technologies: all-flash systems that accelerate access to information, unified storage solutions that enable processing more workloads in less time, and unified compute solutions that enhance productivity while avoiding over or underprovisioning. Upcoming webinars are listed on optimizing flash storage and unified storage performance, simplified VMware management, and deploying Microsoft private cloud with SQL Server data warehouse on Hitachi solutions.
The document discusses the benefits and trends of modernizing a data warehouse. It outlines how a modern data warehouse can provide deeper business insights at extreme speed and scale while controlling resources and costs. Examples are provided of companies that have improved fraud detection, customer retention, and machine performance by implementing a modern data warehouse that can handle large volumes and varieties of data from many sources.
PCTY 2012, Tivoli Storage Strategy and Portfolio Update by Greg Tevis (IBM Danmark)
The document provides an overview and strategy for IBM Tivoli Storage. It discusses IBM's Tivoli Storage strategy, including moving to a cloud strategy with consumption-based metering and dynamic capacity optimization. It summarizes updates to products in the Tivoli Storage portfolio, including enhancements to IBM TSM for improved data reduction, virtualization support, user experience, disaster recovery, scalability, platform coverage, and licensing.
Big Data – Shining the Light on Enterprise Dark Data (Hitachi Vantara)
Content stored for a business purpose is often without structure or metadata required to determine its original purpose. With Hitachi Data Discovery Suite and Hitachi Content Platform, businesses can uncover dark data that could be leveraged for better business insight and uncover compliance issues that could prevent business risks. View this session and learn: What is enterprise dark data? How can enterprise dark data impact business decisions? How can you augment your underutilized data and deliver more value? How can you decrease the headache and challenges created by dark data? For more information please visit: http://www.hds.com/products/file-and-content/
HDS Influencer Summit 2014: Innovating with Information to Address Business N... (Hitachi Vantara)
Top Executives at HDS share how the company is Innovating with Information to address business needs. Learn how the company is transforming now and into the future. #HDSday.
Transform to Cognitive Healthcare with IBM Software Defined Infrastructure an... (Paula Koziol)
Medical data is exploding. The internet of things is changing how we work and live. The healthcare industry is responding and transforming. In this cognitive and cloud era, IBM is positioned to help healthcare organizations of all sizes transform, thrive and deliver better outcomes. Learn about IBM's cognitive healthcare platform for infrastructure and how it delivers a scalable, secure hybrid cloud for GE Healthcare applications and cloud ecosystems. A review of case studies demonstrates the resiliency, flexibility and cost savings achieved while managing the velocity of enterprise imaging and healthcare data.
Addressing VMware Data Backup and Availability Challenges with IBM Spectrum P... (Paula Koziol)
Whether in the enterprise or small-to-medium sized firms, VMware IT administrators and storage management teams face an increasingly complex set of decisions when it comes to deploying, managing, protecting and supporting storage infrastructure. Hear about the emerging issues in the most rapidly changing part of the IT environment, virtual machine (VM) management and availability. Learn about IBM’s perspective on data availability and how it is addressing future challenges you may not even be thinking about today with the new, highly flexible IBM Spectrum Protect Plus VM backup and availability management solution.
Powering the Creation of Great Work Solution Profile (Hitachi Vantara)
Hitachi Data Systems provides scalable storage solutions to power digital workflows in film, video, and game production. Their solutions deliver high performance, capacity, and modular architecture to handle large data volumes and enable simultaneous access. This removes bottlenecks and constraints, allowing creative teams to focus on their work without storage limitations. Hitachi storage drives improved productivity, accelerated rendering, and reduced production costs for studios.
Data Driven With the Cloudera Modern Data Warehouse 3.19.19 (Cloudera, Inc.)
In this session, we will cover how to move beyond structured, curated reports based on known questions on known data, to an ad-hoc exploration of all data to optimize business processes and into the unknown questions on unknown data, where machine learning and statistically motivated predictive analytics are shaping business strategy.
Hitachi Virtual Infrastructure Integrator (Virtual V2I) is a VMware vCenter plugin plus associated software. It provides data management efficiency for large VM environments. Specifically, the latest release addresses virtual machine backup and recovery and cloning services. Customers want to leverage storage-based snapshots because they are scalable and enable more granular backups, cutting the interval between backups from hours to minutes and improving RPO. VMworld 2015.
Virtualizing SAP HANA with Hitachi Unified Compute Platform Solutions: Bring... (Hitachi Vantara)
Virtualizing SAP HANA with Hitachi Unified Compute Platform Solutions: Bringing Flexibility, Agility and Readiness to the Real-Time Enterprise. VMworld 2015
Accelerate the Business Value of Enterprise Storage (Hitachi Vantara)
When it comes to enterprise storage, IT has always had to choose between features and cost. Ongoing tradeoffs between the best technologies to support business operations and an adequate budget to pay for those technologies generally impede an organization’s ability to be competitive, innovative and cost efficient. The entry-enterprise storage market has opened up new opportunities for storage customers – and eliminated the need for tradeoffs. Join this webinar to understand how to accelerate business value with entry-enterprise storage systems and learn about the new Hitachi Data System offering, Hitachi Unified Storage VM. View this WebTech to: Understand the common tradeoffs and challenges within the entry-enterprise storage market. Understand the business value of new entry-enterprise offerings. Learn how Hitachi Unified Storage VM is bringing enterprise-level features to the midrange. For more information on Hitachi Unified Storage VM please visit: http://www.hds.com/products/storage-systems/hitachi-unified-storage-vm.html?WT.ac=us_mg_pro_husvm
Solve the Top 6 Enterprise Storage Issues White Paper (Hitachi Vantara)
Storage virtualization can help organizations solve common enterprise storage issues by consolidating multiple physical storage systems into a single virtual pool. This allows for increased utilization of existing assets, simplified management across heterogeneous systems, and reduced costs through measures like thin provisioning and automation. Virtualization helps organizations address issues like exponential data growth, low storage utilization, increasing management complexity, and rising capital and operating expenditures on storage infrastructure.
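The thin-provisioning idea mentioned above can be illustrated with a minimal sketch (the class and method names are illustrative, not any vendor's API): a volume advertises a large virtual size, but physical blocks are consumed only when data is first written, which is why a pool can be safely overcommitted while utilization stays visible.

```python
class ThinPool:
    """Minimal sketch of thin provisioning: physical blocks are allocated
    on first write, not at volume creation, so virtual capacity can
    exceed physical capacity (overcommitment)."""

    def __init__(self, physical_blocks):
        self.physical_blocks = physical_blocks   # real capacity in the pool
        self.allocated = 0                       # blocks actually consumed
        self.volumes = {}                        # name -> {size, written blocks}

    def create_volume(self, name, virtual_blocks):
        # Creating a volume consumes no physical capacity.
        self.volumes[name] = {"size": virtual_blocks, "written": set()}

    def write(self, name, block):
        vol = self.volumes[name]
        if block >= vol["size"]:
            raise IndexError("write past end of volume")
        if block not in vol["written"]:
            # First write to this block: allocate from the shared pool.
            if self.allocated >= self.physical_blocks:
                raise RuntimeError("pool exhausted (overcommitted)")
            vol["written"].add(block)
            self.allocated += 1

    def utilization(self):
        return self.allocated / self.physical_blocks
```

For example, a pool with 100 physical blocks can host volumes totaling 300 virtual blocks; only the blocks actually written count against the pool, which is the mechanism behind the higher utilization figures the white paper describes.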
This document discusses the benefits of software defined storage (SDS) in addressing challenges posed by changing business needs, data growth, and complexity. It introduces IBM's Spectrum Storage family which provides a comprehensive set of SDS offerings that can be deployed flexibly on cloud, as an appliance, or software. The solutions aim to securely "unbox" data from hardware constraints and optimize storage costs through analytics-driven management and automatic data placement across systems. Case studies show customers transforming their storage and reducing costs with IBM Spectrum Storage.
A-B-C Strategies for File and Content Brochure (Hitachi Vantara)
Explains each strategy, including archive 1st, back up less, consolidate more, distributed IT efficiency, enable e-discovery and compliance, and facilitate cloud. For more information on Unstructured Data Management Solutions by HDS please visit: http://www.hds.com/solutions/it-strategies/unstructured-data-management.html?WT.ac=us_mg_sol_udm
Hitachi Unified Storage 100 family systems consolidate and manage block, file and object data on a central platform. For more information on our unified storage please visit: http://www.hds.com/products/storage-systems/hitachi-unified-storage-100-family.html?WT.ac=us_mg_pro_hus100
Leveraging the cloud for analytics and machine learning 1.29.19 (Cloudera, Inc.)
Learn how organizations are deriving unique customer insights, improving product and services efficiency, and reducing business risk with a modern big data architecture powered by Cloudera on Azure. In this webinar, you see how fast and easy it is to deploy a modern data management platform—in your cloud, on your terms.
Delivering Modern Data Protection for VMware Environments (Paula Koziol)
The document discusses modern data protection trends for VMware environments and IBM's data protection portfolio. It notes that organizations face challenges around data growth, security threats, and the need for data availability and reuse. IBM's portfolio aims to address these challenges through features like agentless backups, instant recovery, data reuse capabilities, cyber resiliency tools, and integration with multiple clouds. The document provides an overview of why various stakeholders in an organization, like business leaders, lines of business, and IT teams, care about modernizing data protection and what capabilities are important to them.
Storage Analytics: Transform Storage Infrastructure Into a Business Enabler (Hitachi Vantara)
View this webinar session to learn how you can transform your storage infrastructure into a business enabler. You will learn: Tips and tricks to streamline storage performance monitoring across your Hitachi environment. How to define and enforce performance and capacity objectives for key business applications by establishing storage service level management. How to create storage service level management reports that satisfy the needs of multiple IT stakeholders (that is, CIO, architect, administrator). For more information on controlling costs of sprawling storage with storage analytics white paper: http://www.hds.com/assets/pdf/hitachi-white-paper-control-costs-and-sprawling-storage-with-storage-analytics.pdf
In this presentation from GITEX 2018, Virgil Dobos provides his perspective on creating a comprehensive data management strategy with Veritas solutions.
Microsoft SQL Server 2012 Data Warehouse on Hitachi Converged Platform (Hitachi Vantara)
Accelerate breakthrough insights across your organization with Microsoft SQL Server 2012 Data Warehouse running on the mission-critical and ready-to-deploy Hitachi server-storage-networking platform, Hitachi Unified Compute Platform. Amplify infrastructure performance with Hitachi and Microsoft SQL Server 2012 Fast Track Data Warehouse xVelocity in-memory technologies. Learn how your organization can extract 100 million+ records in 2 or 3 seconds versus the 30 minutes required previously. With SQL Server 2012 Fast Track Data Warehouse and Hitachi software, your organization will be able to leverage a data platform that processes any data anywhere. View this webcast and learn: How to reduce deployment time with ready-to-deploy solutions that have been engineered and pre-configured by Hitachi and validated by the Microsoft Fast Track Data Warehouse program. How Hitachi and Microsoft have optimized performance for your data warehouse requirements. How your organization can realize immediate ROI from your data warehouse investment. For more information on Hitachi Unified Compute Platform please visit: http://www.hds.com/products/hitachi-unified-compute-platform/?WT.ac=us_mg_pro_ucp
2020 Cloudera Data Impact Awards Finalists (Cloudera, Inc.)
Cloudera is proud to present the 2020 Data Impact Awards Finalists. This annual program recognizes organizations running the Cloudera platform for the applications they've built and the impact their data projects have on their organizations, their industries, and the world. Nominations were evaluated by a panel of independent thought leaders and expert industry analysts, who then selected the finalists and winners. Winners exemplify the most cutting-edge data projects and represent innovation and leadership in their respective industries.
Introducing Cloudera Data Science Workbench for HDP 2.12.19 (Cloudera, Inc.)
Cloudera’s Data Science Workbench (CDSW) is available for Hortonworks Data Platform (HDP) clusters for secure, collaborative data science at scale. During this webinar, we provide an introductory tour of CDSW and a demonstration of a machine learning workflow using CDSW on HDP.
ADV Slides: The Evolution of the Data Platform and What It Means to Enterpris... (DATAVERSITY)
Thirty years is a long time for a technology foundation to be as active as relational databases. Are their replacements here?
In this webinar, we look at this foundational technology for modern Data Management and show how it evolved to meet the workloads of today, as well as when other platforms make sense for enterprise data.
Smarter Data Protection And Storage Management Solutions (aejaz7)
This document discusses IBM's solutions for data protection, storage management and service management. It highlights IBM Tivoli Storage Manager which provides data protection, recovery and archival. It also discusses IBM TotalStorage Productivity Center which enables end-to-end storage management across the SAN. The document emphasizes that with increasing data growth, organizations need solutions that optimize storage resources, ensure data security and availability, and provide visibility and control over the storage infrastructure.
Caching for Microservices Architectures: Session II - Caching Patterns (VMware Tanzu)
In the first webinar of the series we covered the importance of caching in microservice-based application architectures—in addition to improving performance it also aids in making content available from legacy systems, promotes loose coupling and team autonomy, and provides air gaps that can limit failures from cascading through a system.
To reap these benefits, though, the right caching patterns must be employed. In this webinar, we will examine various caching patterns and shed light on how they deliver the capabilities needed by our microservices. What about rapidly changing data, and concurrent updates to data? What impact do these and other factors have to various use cases and patterns?
Understanding data access patterns, covered in this webinar, will help you make the right decisions for each use case. Beyond the simplest of use cases, caching can be tricky business—join us for this webinar to see how best to use them.
Jagdish Mirani, Cornelia Davis, Michael Stolz, Pulkit Chandra, Pivotal
The document provides an overview of NetBackup and Veritas 360 data management solutions. It discusses challenges with current data management including complex cloud migrations, fragmented protection and rising storage costs. Veritas' approach provides unified data protection across cloud, physical and virtual environments. Key solutions highlighted include NetBackup for data protection, Information Map for data visibility, and Resiliency Platform for business continuity.
This document provides an overview of HP Openview Data Protector including:
- An introduction to key concepts like backup devices, specifications, restoring data, and the internal database
- The benefits of Data Protector like scalability, high performance, and integration with other applications
- Details on the Data Protector architecture including the cell manager, clients, agents, and supported platforms
The document discusses several data storage solutions from HGST, including:
- The Ultrastar Data102 and Data60 hybrid storage platforms, which can store up to 1.2PB and 720TB respectively in a 4U form factor, and feature vibration isolation and efficient cooling technologies.
- The ActiveScale X100 and P100 object storage systems, which provide scalable, durable storage for applications such as analytics, media and entertainment, and backup.
- The IntelliFlash all-flash and hybrid storage arrays, which combine flash performance with data management capabilities to accelerate a wide variety of workloads while maintaining high density and compelling economics.
Backup software is continuously improving. Solutions like Veeam Backup and Replication deliver instant recoveries, enabling virtual machine volumes to instantiate directly on the backup device, without having to wait for data to transfer back to primary storage. These solutions can also move older backups to higher capacity, lower cost object storage or cloud storage systems. To deliver meaningful performance during instant recovery without exceeding the backup storage budget requires IT to re-think its backup storage architecture.
Modern backup architectures need high-performance, low-capacity systems to deliver instant recovery, high-capacity, modest-performance systems to store backup data long term, and software that manages data placement so each recovery gets appropriate performance without breaking the budget.
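A tiering decision of this kind can be sketched as a simple placement policy. The tier names and the 14-day cutoff below are illustrative assumptions, not a vendor recommendation:

```python
from datetime import datetime, timedelta

# Illustrative two-tier policy: recent backups stay on fast, low-capacity
# storage so instant recovery is quick; older backups move to cheap,
# high-capacity object storage.
FAST_TIER_RETENTION = timedelta(days=14)

def placement_tier(backup_time, now=None):
    """Return the storage tier a backup of the given age belongs on."""
    now = now or datetime.utcnow()
    age = now - backup_time
    return "fast-recovery" if age <= FAST_TIER_RETENTION else "object-storage"

now = datetime(2024, 6, 1)
print(placement_tier(datetime(2024, 5, 25), now))  # recent backup
print(placement_tier(datetime(2024, 3, 1), now))   # old backup
```

In a real product the policy would also weigh recovery-time objectives and capacity headroom, but the core trade-off is the same: age determines which tier pays for performance.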
The document provides information about an experienced machine learning solutions architect. It includes details about their experience and qualifications, including 12 AWS certifications and over 6 years of AWS experience. It also discusses their vision for MLOps and experience producing machine learning models at scale. Their role at Inawisdom as a principal solutions architect and head of practice is mentioned.
Mastering Cloud Data Cost Control: A FinOps Approach (Denodo)
Watch full webinar here: https://buff.ly/3uu8dEy
With the rise of cloud-first initiatives and pay-per-use systems, forecasting IT costs has become a challenge. It's easy to start small, but it's equally easy to get skyrocketing bills with little warning. FinOps is a discipline that tries to tackle these issues, by providing the framework to understand and optimize cloud costs in a more controlled manner. The Denodo Platform, being a middleware layer in charge of global data delivery, sits in a privileged position not only to help us understand where costs are coming from, but also to take action, manage, and reduce them.
Attend this session to learn:
- The importance of FinOps in a cloud architecture
- How the Denodo Platform can help you collect and visualize key FinOps metrics to understand where your costs are coming from
- What actions and controls the Denodo Platform offers to keep costs at bay
Lunch and Learn ANZ: Mastering Cloud Data Cost Control: A FinOps Approach (Denodo)
Watch full webinar here: https://buff.ly/4bYOOgb
With the rise of cloud-first initiatives and pay-per-use systems, forecasting IT costs has become a challenge. It's easy to start small, but it's equally easy to get skyrocketing bills with little warning. FinOps is a discipline that tries to tackle these issues, by providing the framework to understand and optimize cloud costs in a more controlled manner. The Denodo Platform, being a middleware layer in charge of global data delivery, sits in a privileged position not only to help us understand where costs are coming from, but also to take action, manage, and reduce them.
Attend this session to learn:
- The importance of FinOps in a cloud architecture.
- How the Denodo Platform can help you collect and visualize key FinOps metrics to understand where your costs are coming from.
- What actions and controls the Denodo Platform offers to keep costs at bay.
The document outlines steps for assessing an organization's application portfolio for migration to the cloud, including conducting discovery workshops, interviewing application owners, implementing discovery tools, profiling applications, identifying complexity and priorities, and developing a migration roadmap. It also discusses criteria for segmenting and prioritizing applications, different application disposition options, and patterns for actual application migration.
Data Virtualization: Challenges, Uses & Benefits (Denodo)
Watch full webinar here: https://bit.ly/3oah4ng
Gartner recently described Data Virtualization as a cornerstone of data integration architectures.
Discover:
- The benefits of a data virtualization platform
- The multiplication of use cases: Lakehouse, Data Science, Big Data, Data Services & IoT
- Creating a unified view of your data estate without compromising on performance
- Building an agile data integration architecture: on-premises, in the cloud or hybrid
Modernize storage infrastructure with hybrid cloud & flash (Craig McKenna)
As we enter the cognitive era, leveraging data (your own institution's, combined with data from other sources) is the only way to survive and thrive in a competitive landscape. Hybrid cloud is the platform, and the right data management strategy (and partner) is essential. Are you ready for the cognitive era?
Accelerate your digital business transformation with 360 Data Management (Veritas Technologies LLC)
As infrastructure continues to become more commoditized and abstracted, IT organizations are actively shifting their focus from managing infrastructure to managing information. This session will provide a detailed introduction to 360 Data Management, Veritas' vision for how IT organizations should think about and deploy data management technologies as they work to modernize and embrace this important shift. Veritas experts will reveal the six pillars of 360 Data Management – and talk about how IT leaders can take advantage of the Veritas 360 Data Management Suite to accelerate their digital business transformation.
This session was originally delivered at Veritas' Vision 2017 on Tuesday, Sep 19, 4:30 PM - 5:30 PM.
Introducing IBM Spectrum Scale 4.2 and Elastic Storage Server 3.5 (Doug O'Flaherty)
The document discusses IBM Spectrum Scale, a software-defined storage product. It provides a unified file and object storage system with integrated analytics support. New features in versions 4.2 and 3.5 include reducing costs through compression and quality of service policies, accelerating analytics with native HDFS support, and simplifying deployment with new graphical user interfaces.
The document discusses HPE's software portfolio, which is organized into several categories: IT Operations Management, Enterprise Security, Information Management & Governance, Big Data Analytics, and Application Delivery Management. It provides overviews and descriptions of the various solutions within each category, including capabilities around automation, orchestration, security, data analysis, application testing and delivery. The portfolio is designed to help customers with challenges such as optimizing hybrid infrastructures, gaining security insights, managing and analyzing data, and delivering quality applications across mobile, traditional and cloud platforms.
InfoSphere Streams is an advanced computing platform that can quickly ingest, analyze and correlate information as it arrives from thousands of real-time sources.
IT professionals are being asked to do more with less, and highly skilled resources are in demand. As streaming applications play a growing role in critical systems, so does the need for simplicity. InfoSphere Streams empowers IT users of all types and skill levels to gain deeper insights into operations and performance. In today's engaged world, a five-minute delay means business goes elsewhere. A new administration console, a Java Management Extensions (JMX) management and monitoring application programming interface (API), simpler security and adoption of Apache ZooKeeper are now available in InfoSphere Streams.
The document discusses data warehousing and the Data Warehouse Network. It provides an overview of the Data Warehouse Network as Europe's premier data warehousing consultancy and membership organization. It then covers various aspects of data warehousing including the differences between operational and data warehouse environments, the conceptual architecture of a data warehouse, and the evolutionary process of planning, building, and managing a data warehouse over time.
This document discusses technology limitations for outsourcing processes and provides an overview of managed cloud services as an alternative. It outlines four fundamental pillars that can go wrong with outsourcing: change, scalability, validation, and development/innovation. It then presents managed cloud services as a way to address these pillars through features like scalability, validation based on best practices, and flexibility. The rest of the document provides examples of case studies and services offered through a managed cloud approach.
The document discusses NetApp private storage solutions for Microsoft Azure that allow enterprises to leverage public cloud benefits while maintaining control over their data. It describes how NetApp storage integrated with Azure via ExpressRoute provides enterprises cloud elasticity with the performance, availability and control of on-premises storage. Example use cases covered include using Azure for peak workloads, disaster recovery, development/testing, and multi-cloud application continuity. The document advocates for a software-defined hybrid cloud data management fabric using NetApp to span on-premises, private clouds and public clouds.
7 criteria for assessing your prospective service provider/outsourcer (Proact Netherlands B.V.)
Now that the benefits of cloud technology are becoming ever more visible, companies are looking for a way to move safely to the cloud without disruptions to the legacy environment. A service provider/outsourcer can help organisations with this. What should you look out for when searching for such a party?
- Outtasking or outsourcing?
- Does this party bring you innovation, even after the signature?
- What does the service provider's portfolio look like?
- What choices do you have regarding the location of your infrastructure, billing models, ownership and management?
The growth of data and applications adds extra complexity to backup & recovery. Is your data still adequately protected?
This presentation shows the differences between keeping backup and recovery in-house and outsourcing them in order to cope with this complexity.
New demands of modern IT:
- Cloud vs. in-house management?
- Access to the most efficient technology?
- What choices are there?
Desktop 2.0: a VDI environment as a managed private cloud, but making use of... (Proact Netherlands B.V.)
Consuming IaaS (Infrastructure-as-a-Service) from Microsoft Azure gives an organisation, among other things, more flexibility in the required infrastructure resources. But do you still have sufficient control over that infrastructure? A combined infrastructure of a private cloud environment and a cloud service such as Microsoft Azure bundles both advantages, creating a powerful and dynamic platform for Citrix XenDesktop 7.5 and Office 365.
- Why will the private cloud not disappear in the future?
- What does it take to offer Citrix XenDesktop 7.5 and Microsoft Office 365 on this hybrid infrastructure environment?
Learn in this presentation how a Citrix XenDesktop 7.5 environment and Microsoft Office 365 can be delivered on a combined infrastructure of Microsoft Azure and a private cloud.
More data, applications, devices and users are driving many organisations to the cloud. After all, cloud computing, with its built-in elasticity, can help companies respond to those challenges. The question, however, is:
- How does an organisation get from its legacy environment to the private cloud?
- How does this create a shift in focus from operations to innovation?
Learn in this session the practical steps of a private cloud implementation: from standardisation, via consolidation and virtualisation, to fully automated IT. A private cloud offers enormous cost advantages, but you only achieve them through far-reaching standardisation and operational automation.
Did you know that an IT department spends, on average, 80% of its time updating and maintaining its infrastructure? Time that cannot be spent on improving applications and processes, the layers where innovation should come from. The question is:
- How can repetitive infrastructure management tasks be automated?
- How do we let end users do more themselves, via self-service portals and automated management?
See in this presentation how orchestration, automation, provisioning and monitoring via a single dashboard provide greatly simplified management for both physical and virtual environments.
The document discusses EMC's ViPR software-defined storage platform. ViPR aims to virtualize storage from multiple vendors into a single pool and automate provisioning to reduce provisioning times from hours to seconds. It also provides data services and tools to help enable hybrid cloud storage capabilities.
Flash is a game-changing technology, at least that is what the market would like you to believe. After all, it makes predictable, consistent performance and I/O efficiency possible. But...
- Microseconds make the difference, yet the rules of the game do not change.
- Not all Flash was created equal.
- Disk is not dead, even if some vendors would have you believe so.
Watch this presentation to get a sober view of Flash and find out what its real impact is on your data center infrastructure.
Infrastructure management: in-house or outsourced? The 4 most important considerations (Proact Netherlands B.V.)
Managing an IT environment places a considerable claim on an organisation's people and resources. That is why more and more organisations leave it to a specialised party.
- Key words in this type of service are security, flexibility and availability.
- But why, and in which situations, should an organisation choose outtasking?
This presentation lists the 4 most important considerations, from safeguarding knowledge to flexible pay-per-use models. Only then does outsourcing infrastructure (management) become an extension of your own IT department.
Connecting remote offices and mobile locations: 5 building blocks explained (Proact Netherlands B.V.)
Organisations that have grown to multiple sites, whether or not through mergers and acquisitions, and that deal with different decentralised solutions per site, face extra ICT challenges:
- How do I manage the ICT in the international branch offices?
- Can the data of the decentralised sites be consolidated in my own data center?
- Can the decentralised applications possibly be centralised?
Learn in this presentation more about the possible solutions for connecting remote offices and mobile locations: WAN optimisation, edge computing, virtual storage appliances and more.
This presentation gives you a clear view of the leading flash solutions on the market. You are probably interested in the power of an All Flash Array (AFA) and have moved past the stage of seeing an AFA as an expensive alternative to spinning disks. But what should you really look out for when purchasing an AFA, and how do the various brands compare?
- Which features are included, and how do they differentiate themselves?
- Scalability: which steps are needed to expand an AFA?
- System management: which management tools are supplied with the AFAs (management WebUIs, GUIs, plug-ins and command-line tools)?
How technical limitations can cause your outtasking or -sourcing project to... (Proact Netherlands B.V.)
"I'm outsourcing it anyway, aren't I?"
Even when infrastructure is consumed as an outtasked service, technology matters. In daily practice, outtasking and outsourcing failures regularly turn out to be caused by wrong technical choices. So keep a grip on the technical aspects of a service. What can you look out for?
- How do I ensure that my provider offers an infrastructure as a solid foundation?
- Can innovation be incorporated easily, or does it require forklift upgrades and extra costs?
- Is it scalable, and where are the unforeseen glass ceilings?
This presentation discusses the most important technical pitfalls of outtasking on the basis of real-world examples.
That VDI environments can bring organisations countless benefits is by now common knowledge. Where the main pitfalls lie for management and performance, however, is not clear to everyone.
Why?
- Lack of insight into the influence of the VDI image on the I/O load
- Dealing with boot and login storms
See in this presentation how performance and storage optimisation within a VDI environment makes even the most demanding desktop virtualisation environments possible.
A virtual server infrastructure, despite its many advantages, creates new challenges due to consolidated datastores, dynamic workloads and the desire for scalable infrastructures. Companies are therefore forced to rethink their data protection strategy.
This requires a solution that, among other things:
- Consolidates data protection for physical and virtual environments and supports this on multiple platforms (VMware® and Microsoft® Hyper-V)
- Can offer a hybrid model spanning multiple hypervisors, or even a mix of private and public cloud environments.
See in this presentation all the advantages of hypervisor and backup integration from one single platform.
All the advantages of FlexPod and EMC VSPEX converged infrastructures at a glance (Proact Netherlands B.V.)
The challenge of an IT manager has, in principle, not changed much over the years: lowering costs, eliminating risk and guaranteeing uptime & performance for the organisation.
This is no simple task, however:
What do you do with the constant whirlpool of new hardware, software upgrades and diversity of brands in your infrastructure? And how do you manage it all?
How quickly can you innovate and respond to demands from the business, such as rolling out new applications?
See in this presentation the advantages of NetApp FlexPod and EMC VSPEX converged infrastructures, consisting of standard building blocks of servers, network and storage controlled from one management interface. Solutions that can also be delivered as a managed service from your own data center or from Proact's.
PROACT SYNC 2013 - Breakout - Strengthen your converged data center infrastructur... (Proact Netherlands B.V.)
Breakout session during Proact's SYNC 2013.
Strengthen your converged data center infrastructure with the power of Commvault backup (as a service)
Pieter Kestelyn
PROACT
PROACT SYNC 2013 - Breakout - The advantages of a FlexPod converged datacent... (Proact Netherlands B.V.)
Breakout session during Proact's SYNC 2013.
The advantages of a FlexPod converged datacenter infrastructure according to analyst Forrester
Geert van Teylingen
Strategy and Consulting Benelux
Breakout session during Proact's SYNC 2013.
VSPEX and vBlock Converged Infrastructure building blocks of hypervisor, server, network and storage.pptx
John Lavallée
Practice Mgr – Cloud Services EMEA
EMC | Global Services Partners
What do a Lego brick and the XZ backdoor have in common? (Speck&Tech)
ABSTRACT: At first glance, a Lego brick and the XZ backdoor might have in common the fact that they are both building blocks, or dependencies of creative and software projects. In reality, a Lego brick and the case of the XZ backdoor have much more than that in common.
Join the presentation to dive into a story of interoperability, standards and open formats, and then discuss the important role that contributors play in a sustainable open source community.
BIO: Advocate of free software and of standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she was involved in several events, migrations and training activities related to LibreOffice. She previously worked on LibreOffice migrations and training courses for several public administrations and private organisations. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when she is not pursuing her passion for computers and for Geeko, she cultivates her curiosity for astronomy (hence her nickname, deneb_alpha).
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
UiPath Test Automation using UiPath Test Suite series, part 5 (DianaGray10)
Welcome to part 5 of the UiPath Test Automation using UiPath Test Suite series. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Full-RAG: A modern architecture for hyper-personalization (Zilliz)
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Building RAG with self-deployed Milvus vector database and Snowpark Container... (Zilliz)
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
How to Get CNIC Information System with Paksim Ga.pptx (danishmna97)
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 (Neo4j)
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
The climate impact and sustainability of software testing are discussed in the talk. ICT and testing must carry their part of global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at smaller scale and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Introducing Milvus Lite: Easy-to-Install, Easy-to-Use vector database for you... (Zilliz)
Join us to introduce Milvus Lite, a vector database that can run on notebooks and laptops, share the same API with Milvus, and integrate with every popular GenAI framework. This webinar is perfect for developers seeking easy-to-use, well-integrated vector databases for their GenAI apps.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! (SOFTTECHHUB)
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
A tale of scale & speed: How the US Navy is enabling software delivery from l... (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Enhancing adoption of Open Source Libraries. A case study on Albumentations.AI (Vladimir Iglovikov, Ph.D.)
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 (Albert Hoitingh)
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer's life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
From backup-only to end-to-end data management from a single central GUI
1. SIMPANA 10
Innovation and an integrated approach to enterprise data and content management
Management break-out: from backup-only to end-to-end data management from a single central GUI
Presenter: Gary W. Cook, Sr. Technology Evangelist, CommVault EMEA
gcook@commvault.com / www.linkedin.com/in/garywcook (V1.5)
2. Agenda
- Data Management: the landscape and data management strategy
- Simpana Data Management Strategy: one evolving platform for "Data Everything"
- Solutions: some key solutions from the Simpana Suite
- Closing Remarks: end of presentation; additional resources; questions if time
3. Current Dynamics in the Industry: "What We Hear"
- Data volume, velocity & variety
- Rising infrastructure complexity
- IT agility: consolidation, virtualization, cloud
- Deriving business & operational value from data
4. What Makes Up "Data Management"?
[Diagram: core DM actions -- Discover, Protect, Access, Manage -- surrounded by Insight, Index, Lifecycle (copy management), Security/Access, and Content, Application, Compute and Storage Awareness; orchestrate & index snaps + replicas]
5. A Data Management Strategy Is Not Easy
A balancing act of business and operations factors in delivering the SLA outcome (Recover / Access):
- Fitting service levels to business needs; managing operational costs; meeting compliance
- Deployment: time-to-value
- Operations: tools vs. platform, training
- Scaling: up and out
- Innovations: change management
- Retention demands (30 days, 2 years, death + 2 years): classification, retention, compliance, audit
- Hardware realities (3-yr HW generation/support, microcode compatibility, migration to the next location): lifecycle, cost, migration, immutability
6. Future-Proofing Scale-Out
How does your plan address your environment 12-24 months into the future?
- Sources: expansion, provisioning, P2V, V2V, x2C?
- Classification: all or filtered? 30%+ growth rate, consolidation impact, tiering
- Movers: transport infrastructure, networks, locations, laws of physics...
- Operations: costs, resources, locations, retention, compliance
7. Agenda -- section divider (same items as slide 2)
8. Innovative Concepts For:
- Data Management: data anywhere, anytime, on any device; strategic archiving; manage and price by data value; document sharing across devices and individuals; tailored solutions; automate everything with workflows
- IT Infrastructure Management: improve operational efficiency; next-generation proactive system design support; new architectural services; enterprise-wide reporting and monitoring
- Business Intelligence: transform data into valuable knowledge; next-gen search; enterprise-wide open virtual data repository; archive file system for easy retrieval; analytics; vertical market solutions
9. Single Platform Provides Expansion Opportunities -- Building from Competency to Competency
[Timeline, 1990s-2014: backup product (v1) on storage hardware; space management; snap management; replication and disaster recovery; laptop protection and sync; content classification, records retention and governance; modern data protection; active archive; IT operations analytics; mobile file sync; data analytics (v10) -- all built on a unique, scalable, distributed data index]
10. SINGULAR INFORMATION MANAGEMENT
One Simpana platform covers: backup & recovery; archive (email, files, eDiscovery & compliance); data reduction (deduplication & compression); enterprise reporting; replication, management & CDP; desktop/laptop protection.
Competitors cover the same functions only with multiple point products (the original table marks some cells NOT AVAILABLE):
- Symantec: NetBackup, Enterprise Vault, PureDisk, OpsCenter, Replication Exec™
- HP: Data Protector, TRIM, IAP, IDOL, StoreOnce, D2D, Storage Essentials, SRM, ZDB
- Computer Associates: no equivalent products listed
- EMC: NetWorker, SourceOne, Centera, Xtenders, Replistor, RecoverPoint, Replication Manager, DPA, Control Center, HomeBase
- IBM: TSM, TSM for SharePoint, CommonStore, Princeton Softech
11. Data Management Efficiency with Simpana
- Protect/Archive: application/content awareness; OnePass (touch/index once); snap & replication orchestration; lifecycle management; secure encrypted copies; efficient storage (dedupe)
- Enterprise Data Store ("ContentStore"): mobility; secure access; monitoring & analytics; review, hold, export; eDiscovery ready; compliance retention; lifecycle data; tiered data
12. Agenda -- section divider (same items as slide 2)
13. Single Platform Evolving to the ContentStore
[Diagram: reusable data -- Edge web console and Edge mobile (access), compliance search & hold, operational analytics, discover/search -- layered over protect and manage]
15. Data Management Efficiency -- Provide a Robust Enterprise Services Platform
Simpana management is based on the premise of minimizing movement and unnecessary redundancy in tasks or data copies. Our platform approach allows automated data flows from recovery images to archive or compliance retention copies to meet your SLAs, including DR/COOP.
- Thousands of TB in scale, billions of indexed items, stored for speed
- Relevant data, lifecycle managed: SEARCH, RETAIN, ANALYZE, RECOVER, DISCOVER, ACCESS
- Key mechanisms: OnePass, content awareness, snapshot tiering, classification-review-retain, source dedupe, recovery tier / archive tier / records-set database
16. ContentStore: Data-on-Demand -- Across All Locations, Times and Media
Annual data retrieval tasks at a global bank (revert, restore, retrieve from search, replicate -- across snaps/clones, backup copies, archives and vaults), by the retention period the data was retrieved from:
- last week: 19,447 tasks
- 1-2 wks: 3,547
- 2-3 wks: 1,716
- 3-4 wks: 1,580
- 1-2 mth: 2,833
- 2-3 mth: 1,319
- 3-6 mth: 992
- 6-9 mth: 562
- 9-12 mth: 466
- 1-2 yrs: 1,384
- 2+ yrs: 2,396
17. Agenda -- section divider (same items as slide 2)
18. The Virtual Server Lifecycle
The right strategy: cloud independent, converged approach, starting in the private or public cloud.
- Create & self-manage recovery points (snaps), with policy-driven expiration
- "Backup & Replication": automated and policy-based; persistent copies via array or off-host; recovery indexed; resource efficient at scale (vs. legacy approaches to smart VM protection)
- "Manage & Archive": managing relevant retention
19. Guided Provisioning
- Create VM workload: select from options prescribed by the provisioning policy; notification email when done; can auto-opt into a backup policy
- Pick a pool: select a policy; can span private/public clouds
20. Automated VM Lifecycle Retirement -- Single Protection & Archive Policies (VMware)
[Diagram: VM protect & manage across production storage (T1) and a cheaper tier (T2)]
1. Protect & recover the VMs
2. Find idle VMs and power them off (reclaiming CPU and memory)
3. Shift the image to a cheaper store; archive the image off production
Outcome: automated reclamation of production space, policy-based retention, reduced TCO
21. VM Lifecycle Management
- Group admin sets the provisioning policy: which clusters/images to pick from
- VM provisioning policy: tenant-oriented config parameters, lifecycle and security access
- End-user self-service: create/manage/snap/expire (extend); user = owner
- Create an instance from the provisioning policy: name, expiration, resource defaults; auto-installs the backup client
- Manage: power up/down, snap/mount -- all your VMs (private: VMware, Hyper-V; public: Azure); extend expiration as needed (auto-delete date)
- Private cloud: VM image protection with granular file retrieval; public cloud: content-aware, indexed file protection of the guest host as a retention safety net
22. Simplified Access and Management
- VM management: start/stop, snap/revert, extend
- VM access: web, Windows Explorer
- VM restore: file level, image level, backup copies
23. Notifications and Alerts
- Tenant-based notification actions
- Supplemented with email and reporting
- Alerts escalate to SCOM
24. Empowering Users at the Edge
Features for end users, specialists, architects and operators -- transparent, self-service, familiar, or vertical:
- Explorer plug-in: check protection state; file browse/restore; recall restore
- VM plug-in: check VM protection state; file/folder-level restore
- Outlook plug-in: email browse/retrieve; preview; recall
- Edge web console: run/manage Edge backups; MyData access; end-user search; folder synchronization; VM lifecycle management
- Secure Edge mobile app: MyData access; user search
- Compliance search: search/audit; group review; preserve; export
- Healthcare & verticals: index DICOM images; extended "patient" search criteria; built-in refinements; image preview
25. Transform Data into Information Assets with Content Indexing for Search
- Deep indexing of data extends powerful role-based search capabilities for increased productivity
- Integrates across the broadest content types for backup & archive to eliminate silos & reduce risk
- Scale-out architecture (physical and virtual search nodes, search cloud) grows along with the business to optimize costs
- "Preview" files and emails with HTML views to complement mobility initiatives
26. Data Mobility -- Access from Anywhere
Edge mobile app: personalized data access (e.g. data owned by "Brian"); MyData access; user search/download; user upload or device sync
27. Data Management Efficiency with Simpana (recap of slide 11)
28. Parallel Deduplication Performance and Scale
- Achieve continuous backup operations with built-in node failover
- Leverage MediaAgent/SSD for the dedupe database to maximize capacity and performance
- Individual nodes (CommVault source/target dedupe): 120 TB store each
- Grid-based, scaling linearly: a 2-node grid yields a 240 TB global dedupe store with 2x dedupe backup performance and scale
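The idea behind source-side dedupe is simple: hash each block of data and send only blocks the store has not seen before. A minimal sketch in Python -- the fixed block size, SHA-256 hashing and in-memory "dedupe database" are illustrative assumptions, not Simpana's actual implementation:

```python
import hashlib

def backup_with_source_dedupe(data: bytes, known_hashes: set, block_size: int = 128 * 1024):
    """Split data into blocks; transfer only blocks whose hash is new to the store."""
    processed = 0
    sent = 0
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        processed += len(block)
        digest = hashlib.sha256(block).hexdigest()
        if digest not in known_hashes:
            known_hashes.add(digest)   # register the block in the dedupe database
            sent += len(block)         # only new blocks cross the network
    return processed, sent

# A second backup of largely unchanged data transfers almost nothing.
store = set()
first = backup_with_source_dedupe(b"A" * 512 * 1024 + b"B" * 128 * 1024, store)
second = backup_with_source_dedupe(b"A" * 512 * 1024 + b"C" * 128 * 1024, store)
```

Because duplicate blocks are detected at the client before they travel, the bandwidth benefit sits at the right end of the pipe -- which is exactly the point made in the "Stalled at the Wall" impact study later in the deck.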
29. Agenda
- Simpana Strategy: why a platform for data and information management changes the game
- Defining a Data Management Strategy: delivering operational, hard-value returns on your investment
- Impact Studies: examples from customer use-cases
30. Impact Study -- Business: Global Hosting Provider
- Scale: 70K+ clients; 2.3M backups per month; 42 PB/month of backups worldwide; failure rate under 0.3%; GigE networks (datacenter centric)
- Backup infrastructure: servers/movers, secondary storage
- SLA: guaranteed backup window/scope/retention; daily expansion
- Challenge: expansion vs. saturation
31. Stalled at the Wall -- Data Movement Overwhelms the Infrastructure
- Backup speeds too slow (jobs by throughput: under 20 GB/h: 2,191; 20-40 GB/h: 25; over 100 GB/h: 2). Workarounds considered: filter down to "critical data"; police users on data; upgrade systems(?)
- Bottleneck, too busy: continuous backlog of jobs waiting for streams throughout the day; missed windows and SLA impact; no room for growth; build out new infrastructure to share the load
- Network saturated and overwhelmed: device deduplication's benefit sits at the wrong end of the pipe; a 10G upgrade is a $20M+ prospect
32. Switch to CommVault Source Dedupe -- Unleashing the Power of Smarter Software
Three weeks after the switch, with the same data and servers -- the impact of working smarter:
- Backup speeds 230% faster (jobs by throughput, before -> after: under 20 GB/h: 2,191 -> 702; 20-40 GB/h: 25 -> 1,299; over 100 GB/h: 2 -> 377)
- SLA met with room for more: no queue of jobs waiting for streams; the window finishes with no overhang; opens incremental room for more clients in the same resource pool
- Reduced movement, a 75% drop: wasteful movement eliminated at the source; 75% reduction in bandwidth on the previously saturated network; the forced 10G investment deferred
33. Innovation -- Email Cloud Consolidation
Recovery-enhanced snapshots matched to user archiving services.
- Use case: managed-services architecture; a 2M-mailbox cloud with a recovery SLA; user mailbox archiving with site mobility; an eDiscovery VIP service
- Solution: Exchange 2010 with HDS snapshots, dedupe protection and COOP; mailbox archive with quota; cross-site user movement with a central archive; eDiscovery-oriented VIP tier
- Outcome: modernized protection and consolidation; modernized email recovery/archive; a big-data services architecture; RTO/RPO SLA; space-efficient retention and COOP (open storage); a 4 GB mailbox archive tier per user
34. Impact Study -- Business: DoD Email Cloud
[Diagram: snapshot + MediaAgent (SNAP/MA) pairs, policy driven, delivering rapid recovery and consolidation/big data]
- Scale: 2M initial Exchange 2010 mailboxes, designed to expand to 12M+
- SLA: recovery, retention and cost control
- Challenge: big data vs. operations; archive retention management
35. Modernizing Data Protection/Recovery -- Short Retention, Restore, Discovery
- Pools of 5-10K mailboxes; 1 TB active and passive DAG production database stores; RPO/RTO <= 4 hrs
- Copy chain: snap copies (MediaAgent) -> protection copies -> DR copies -> long-term deep-retention vault copies
- App-aware, with auto-discovery (DAG/DB); roll back to hours ago to reduce the data-loss gap; reutilize copies for new purposes
- Efficiency levers: snap/offload reduces operations touch and source recovery time; delta blocks reduce the amount processed and transferred; global dedupe and automation reduce the amount stored (DR at 2x cost/effort)
36. Automating Policy-Based Management
- Automate operations: eliminate scripting with a single application-aware protection policy; drive snapshots, dedupe copies, DR-site copies and aging from one place; standardize to reduce operational costs
- Increase performance to meet SLA: snap/revert volumes with 3-5 day retention; offload production impact with off-host/proxy; extract 30-60 day restore copies to a global dedupe pool, including cross-site DR/DASH copies
- Archive messages from stores to cheaper storage resources: deliver user access/retention needs by moving older messages out of production stores; add indexing/search to support the VIP service
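The point of the slide above is that one policy object drives every copy tier and its aging, instead of per-tier scripts. A minimal Python sketch of that pattern -- the class and tier names are hypothetical, not Simpana's API:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class PolicyCopy:
    """One copy tier (snap, dedupe restore copy, DR copy) with its own retention."""
    name: str
    retention_days: int
    jobs: list = field(default_factory=list)  # (run_date, job_id) pairs

@dataclass
class ProtectionPolicy:
    """A single application-aware policy that fans each run out to every tier."""
    copies: list

    def run(self, job_id, today):
        # One policy run drives all copies -- no per-tier scripts to maintain.
        for copy in self.copies:
            copy.jobs.append((today, job_id))

    def age(self, today):
        # Aging prunes each tier against its own retention window automatically.
        for copy in self.copies:
            copy.jobs = [(d, j) for d, j in copy.jobs
                         if (today - d).days <= copy.retention_days]

policy = ProtectionPolicy([
    PolicyCopy("snapshot", retention_days=5),         # snap/revert window
    PolicyCopy("dedupe-restore", retention_days=60),  # 30-60 day restore copies
    PolicyCopy("dr-site", retention_days=60),         # cross-site DR copies
])
for offset in range(10):                              # ten daily runs
    policy.run(f"job-{offset}", date(2014, 1, 1) + timedelta(days=offset))
policy.age(date(2014, 1, 10))
```

After aging, the snapshot tier holds only its 5-day window while the restore and DR tiers keep everything -- the "standardize to reduce operational costs" outcome in miniature.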
37. Innovation -- Content-Aware Retention
Redefine deep archive / compliance SLAs by saving only relevant data.
- Use case: compliance-driven retention/vaulting drives expense and complexity; need a way to cull or extract the right data for retention and save only that; adopt cloud options under the right economics
- Solution: add Reference Copy options; use content rules to select and copy relevant data; reorganize from the RECOVERY set to the selective CONTENT set
- Outcome: compliance storage reduced over 10x; reduced risk; accelerated discovery
38. Content-Aware Retention
Reference Copy policies manage retention of critical OBJECTS. Instead of recovery-oriented backup copies, export jobs and a vaulted mountain of data, grooming rules over file/mail metadata and content produce content "objects" that are relevant and organized, with an independent lifecycle -- "this is what we really need" -- reducing cost and risk.
39. Content-Oriented Retention Vision -- Deep Archive / Reference Copy Keeps What Is "Important"
- Switch from "job" retention to "object"-based retention, but do it at the right level (the backend)
- Source content from existing backup or archive client storage policies (snaps, active backup, archive JOBS) using our catalog
- Extract data instances for deep retention in a new copy: the Reference Copy pool (passive deep-archive OBJECTS)
- Metadata- or content-based rules automate the movement and "trimming"
- Hold, store (and price that data) smartly; content-index the retained objects? Optional.
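The rule-driven extraction described above can be sketched as a filter over a catalog of objects: an object moves from the recovery set into the reference copy only when every metadata or content rule matches. The catalog entries, field names and rules below are toy assumptions for illustration:

```python
def build_reference_copy(catalog, rules):
    """Select individual OBJECTS out of recovery-oriented backup jobs:
    an object joins the reference copy only if every rule matches it."""
    return [obj for obj in catalog if all(rule(obj) for rule in rules)]

# Toy entries standing in for the backup/archive catalog.
catalog = [
    {"path": "/finance/q3-report.xlsx", "owner": "finance", "keywords": ["revenue"]},
    {"path": "/tmp/cache.bin", "owner": "system", "keywords": []},
    {"path": "/legal/contract.pdf", "owner": "legal", "keywords": ["contract"]},
]
rules = [
    lambda o: o["owner"] in {"finance", "legal"},  # metadata-based rule
    lambda o: bool(o["keywords"]),                 # content-based rule
]
reference_copy = build_reference_copy(catalog, rules)
```

Only the finance and legal documents survive the grooming; the cache file stays out of deep retention, which is how "save only the relevant data" translates into a 10x-smaller compliance store.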
40. Retention Strategy Comparisons
- Tape vaulting (monthly tape export for months 1-12, quarterly selective for months 13-36): 18x stored vs. base
- Dedupe store in the cloud (monthly export 1-12M, quarterly selective 13-36M): 8x stored vs. base
- Content / Reference Copy (50% of unique data, monthly export 1-36M): 2x stored vs. base
41. Innovation -- Mobility / Discovery
Endpoint protection, mobile access and discovery: reduce downtime and exposure risk for workplace users.
- Use case: secure protection of endpoint systems; automation of deployment and operations; end-user-driven secure access; eDiscovery ready
- Solution: Simpana Edge protection (DLO); end-user access enabled, with extensions for sharing; discovery-driven search/review/hold
- Outcome: 95%+ SLA; increased user productivity; reduced risk; accelerated discovery
42. CommVault Edge -- Protecting and Enabling the Mobile Workforce
- Easily scale to protect and manage thousands of laptop/desktop backups
- Anytime, anywhere self-service access (Edge web console, Edge mobile) to increase productivity -- up to 50% increased productivity
- Reduce risk with integrated eDiscovery, compliance search & hold
43. Inspiration, Innovation and Partnering
[Chart: protected clients per CommCell, Jun '12 - May '13, growing toward a 26K objective; ~100 Edge clients rolled out per week over 3 years; 36K clients adopted by 3/2014]
- Scale: 10.0 pushes a 3x increase, to 15,000 Edge clients per cell
- SLA: achieves 98+% consistently, vs. 72% with the prior solution
- Efficiency: 40:1 weekly dedupe rate on each cell
44. Data Mobility -- Access from Anywhere (recap of slide 26)
45. Driving Value from Big Data -- Automating Retention through Relevancy Rules
- Determining relevancy is critical to aligning retention strategies with compliance needs, and to exposing assets to broader intelligence and discovery needs.
- Wholesale single-namespace search is an impractical compute challenge at mass scale. The answer is breaking the problem scope into automated, distributed parts that process in parallel and report to the whole.
- The objective is to reduce big data to relevant data, which can be reduced and managed separately (mini-ContentStores): search, mine, classify, retain.
46. Scale-Out Search -- Discovery-Oriented User Features
- Compliance search: search/audit, group review, preserve, export
- Search any cloud; classified, relevant results
- Specialized tasks such as custodian review; targeted by source, time and content
- An index cloud of physical and virtual search nodes
47. Mobile Protect ... Cloud Share
Via the Edge web console and Edge mobile access:
- Compliance search & hold; custodian hold
- Upload: add data into the ContentStore
- Sharing: web folder; share files/folders with others (web); folder sync to the cloud (ContentStore) to collaborate with others
- Lock: user-set access lock; file sync; DLP; secure wipe
48. Enterprise Class -- V9 vs. Today (V10)
- Scale (2x): 5,000 max clients per CommCell -> 10,000 max server clients per CommCell
- Concurrency (10x): 100 simultaneous Java UI connections -> 1,000s of simultaneous Web 2.0 end-user UI connections
- Workload (3x): 40,000 jobs per CommCell -> 120,000 jobs per CommCell
- Performance (2x): two separate backup & archive processes for email -> 100% faster email protection & archive with OnePass
- Capacity (2x): a single dedupe store maxes out at 360 TB* -> almost 2x, to 720 TB, using a node architecture that allows linear scaling**
49. Workflow Automation Enables Enterprise & Cloud Datacenters
- Automate repetitive tasks to eliminate runbooks with "actionable best practices", reducing management effort and errors
- A single job flow controls a sequence of actions that spans methods, sites and users
- All part of the Simpana platform baseline
- Use-case examples: new client provisioning/registration; verify database backup and alert; disaster-recovery verification and automated recovery
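A "single job flow" of this kind is, at its core, an ordered sequence of named actions that halts and alerts on the first failure instead of leaving a half-executed manual runbook. A minimal sketch, with hypothetical step names taken from the use-case examples above:

```python
def run_workflow(steps, alert):
    """Run a named sequence of actions as one job flow; halt and raise an
    alert on the first failure so nothing downstream runs on bad state."""
    completed = []
    for name, action in steps:
        try:
            action()
        except Exception as exc:
            alert(f"workflow halted at '{name}': {exc}")
            break
        completed.append(name)
    return completed

def fail(msg):
    raise RuntimeError(msg)

alerts = []
done = run_workflow(
    [
        ("provision-client", lambda: None),               # succeeds
        ("verify-db-backup", lambda: fail("no backup")),  # fails -> alert fires
        ("dr-recovery-test", lambda: None),               # never reached
    ],
    alerts.append,
)
```

The failed verification stops the flow before the DR test runs, and the alert carries the step name -- the same "verify database backup and alert" behavior the slide lists as a use case.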
50. Web-Based Reporting Provides Insight into Global Operations
- Customized reporting & dashboards provide easier insight into trends and operations
- Manage proactively based on usage and performance, with "health checks"
- A mobile app eases management & reporting for mobile admins
53. Agenda -- section divider (same items as slide 2; next: Closing Remarks)
55. eDiscovery
- Recognized by Gartner for eDiscovery
- "CommVault has one of the most balanced products, with tools for both eDiscovery and storage management." (InfoTech)
- "CommVault, the most complete product at its price point." (InfoTech)
- Advanced functionality such as Simpana OnePass™, Reference Copy, ContentStore and new indexing and search differentiate CommVault Simpana software from competitors.
56. Product Vision
- Securely expose historical information in the ContentStore to end users for greater productivity and access
- API or direct file-protocol (NFS) access from the ContentStore for third parties, as an efficient shared data-services platform (enterprise file sharing, healthcare, third-party applications, email, cloud archive, vertical markets)
- Instrument the infrastructure and report on its utilization, optimization and operation
- Personalization, workflow and adaptation to each customer's exact requirements
57. Product Vision
- Get data sprawl (growth & change) under control with effective, high-performance protection & recovery strategies (backup & restore, snapshot & recovery, replica & clone/vault)
- Consolidate independent data silos into a highly efficient, virtual, secure ContentStore for reduced cost & improved management
- Apply fast, scalable content inspection of data to deliver search-based organization, presentation or retention of information
- Retain information based on its content value to the business or its compliance with governance requirements, organized into neatly groomed, policy-based buckets
58. Achievable Results?
- Disk and server costs: 50% decrease
- Maintenance: reduced by as much as 30%
- Administrative time: 50-80% decrease
- DR preparation: from 4+ hours to 10 minutes
- Backup success: up 90%
- Media infrastructure: 60-70% decrease
- Data transferred over WAN: 90% reduction
- Backup time: 60% reduction
- Overall: significant cost reduction -- a 50% decrease
59. A Better Way to Understand What's New (and Why)
- Books Online reformatted: more context and recommendations vs. step-by-step, book-like documentation, with advanced search and usability
- Short packaged demos/pitches in bite-sized pieces, as YouTube playlists
- http://documentation.commvault.com/commvault/v10/others/new_features/newsletters/newsletter_sp6.pdf
- https://www.youtube.com/user/commvault/playlists