Storage virtualization helps organizations address key challenges: managing storage growth, getting more value from existing assets, and simplifying data movement. By pooling storage resources and thin provisioning capacity, it improves utilization and reduces costs. Controller-based storage virtualization in particular separates the logical view of storage from the physical assets beneath it, so heterogeneous storage systems can be managed as a single pool. The result is reduced complexity, improved flexibility, and lower overall cost.
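The pooling and thin-provisioning ideas above can be illustrated with a minimal sketch. This is not any vendor's API; the `StoragePool` class and its method names are hypothetical, chosen only to show how a virtualization layer can present oversubscribed logical volumes while consuming physical capacity from heterogeneous backend arrays only as data is actually written.

```python
class StoragePool:
    """Illustrative model of a virtualized, thin-provisioned storage pool."""

    def __init__(self, backends):
        # backends: mapping of array name -> physical capacity in GB.
        # Heterogeneous arrays are aggregated into one logical pool.
        self.backends = backends
        self.physical_used = 0
        self.volumes = {}  # volume name -> {"virtual": GB, "written": GB}

    @property
    def physical_capacity(self):
        return sum(self.backends.values())

    def create_volume(self, name, virtual_gb):
        # Thin provisioning: the virtual size may exceed free physical
        # space; no physical capacity is consumed at creation time.
        self.volumes[name] = {"virtual": virtual_gb, "written": 0}

    def write(self, name, gb):
        # Physical capacity is allocated only when data is written.
        vol = self.volumes[name]
        if vol["written"] + gb > vol["virtual"]:
            raise ValueError("write exceeds the volume's virtual size")
        if self.physical_used + gb > self.physical_capacity:
            raise RuntimeError("pool is out of physical capacity")
        vol["written"] += gb
        self.physical_used += gb

    def utilization(self):
        return self.physical_used / self.physical_capacity


pool = StoragePool({"array_a": 1000, "array_b": 500})  # two dissimilar arrays
pool.create_volume("db01", virtual_gb=2000)  # oversubscribed: 2 TB logical
pool.write("db01", 300)  # only 300 GB physically consumed
print(pool.utilization())  # → 0.2
```

The sketch shows why utilization improves: the host sees a 2 TB volume, yet the pool has committed only the 300 GB actually written, and new backend arrays can be added to `backends` without changing the logical view.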
Storage Virtualization: How to Capitalize on Its Economic Benefits
WHITE PAPER
By Hitachi Data Systems
April 2014
Contents

Executive Summary
Introduction: The State of Storage Affairs in Today's Corporate IT
Address Storage Growth Demands
Leverage Existing Assets to Meet Budget
Manage Data Movement Issues
Avoid Single-Vendor Issues
Ensure Reliability Across the Storage Infrastructure
The Tools for Achieving Cost Efficiencies
The Role of Storage Economics for Bettering the Bottom Line
Understand the Value of Capacity Efficiency
The Role of Storage Virtualization in Creating Cost Efficiencies
In the Box and Out
Controller-Based Storage Virtualization
Leveraged Benefits
The Top Use Cases for Virtualization
Consolidate, Simplify and Scale
Leverage Current Assets
Boost Data Mobility and Migrations
Build a Bridge to Easier Online Migration Strategies
Apply a Risk Estimation Strategy
Dual-Vendor Sourcing
Create Rock-Solid Recovery and Compliance
Final Notes of Consideration
Executive Summary
The corporate IT world is diligently trying to solve difficult business issues on a massive scale. From a data growth
perspective, there seem to be more demands and inefficiencies than budget to manage effectively. IT leaders are
wrestling with everything from sluggish, isolated storage systems to disruptive, costly data migrations. They are considering how best to scale and manage the environment without adding more complexity, expense or risk.
The virtues of storage virtualization are many. Storage virtualization has become a mainstay in the enterprise environment, helping companies to modernize and simplify countless storage activities. However, many organizations are not
fully capitalizing on its manageability and economic benefits.
This paper examines 5 key customer problems that storage virtualization can help to solve. It also looks at how to
truly leverage virtualization technologies for significant savings on capital and operating expenditures (capex and
opex). As an industry leader in proven, cost-efficient storage virtualization platforms, Hitachi Data Systems has a long
history of helping organizations to design, deploy and leverage economically superior storage architectures across
the data center.
In this paper, Hitachi provides a closer look at how to create new opportunities for rationalizing, simplifying and scaling storage throughout the enterprise. It also helps storage administrators understand how to minimize the budget
and time overruns of complicated data migrations. It explains how they can apply online migration capabilities that
dramatically reduce the costs and time involved.
With its highly acclaimed Storage Economics program, Hitachi Data Systems enables enterprise organizations to
explore the best methods for lowering the total storage cost structure. At the same time they can optimize existing
storage resources.
Storage virtualization is the foundation of many of today's IT success stories. Read this paper to learn how some
enterprises benefit from capacity efficiency gains, advanced storage virtualization designs and dual-vendor storage
strategies.
Introduction: The State of Storage Affairs in Today's Corporate IT
Storage virtualization seems to be a silver bullet squarely aimed at intractable cost-efficiency issues prevalent across
most data centers today. Amid the ever-rising complexities, risks and expenses of managing data storage resides the
promise of a real remedy that achieves business benefits through a more economical IT architecture.
Efficient, cost-effective information management is one of the top measures of business health. As IT leaders persistently pursue healthier data infrastructures, storage virtualization has evolved into a proven method for getting there
faster.
Achieving better business health through cost-effective storage architectures requires more than status-quo technolo-
gies and thinking. For most organizations, there is now an urgency to manage and scale the business with economic
exactitude, while still meeting new standards for compliance and data growth. In IT, this urgency means getting in
front of complicated, often convoluted data environments that are continually shifting, changing and expanding. The
only part of the data center that does not appear to be growing is the budget. The imperative for meeting the myriad
of IT challenges is to identify and implement simpler, more reliable and affordable ways to manage it all. Some of
these key challenges include the need to:
Address Storage Growth Demands
Data storage is growing relentlessly. Lengthening retention regulations for compliance and e-discovery purposes, coupled with a lack of aggressive management policies, have created a glut of both structured and unstructured content.
New enterprise infrastructure configurations associated with virtual servers have compounded the problems even
more. To keep up with data growth, IT departments are buying endless amounts of storage capacity, trying to sweat
inefficient assets, and preparing for big data applications that involve massive volumes.
Data center inefficiencies can grow faster than the data. Disruptions, downtime, low storage utilization rates and misalignment of data to the appropriate storage media all contribute. Opex can quickly spiral out of control, as efforts to
manage the data infrastructure exceed opportunities for new storage capital purchases.
IT administrators are focusing on consolidation and simplification to meet storage growth demands and lower the
total cost structure.
Leverage Existing Assets to Meet Budget
Information has quickly become a highly valuable business commodity. Existing storage infrastructure is probably
not driving up the value of the data, only the cost of it. The prevalence of older technologies and equipment makes
it more difficult to optimize the storage environment. It is estimated that the annual cost to manage storage is about
60% of all enterprise storage-related spending, including software, power, cooling and administrative time.
IT organizations are looking to optimize current investments, as well as any future purchases. Inefficient technologies
magnify costs and exponentially worsen with data growth. Reducing disparate systems and isolated data islands in
favor of centralized pooled storage is essential to streamlining both operational and energy efficiencies. Storage virtualization helps to aggregate systems, so fewer storage systems, including older, less efficient ones, are needed to store data, which saves on floor space, power and cooling costs in the data center. Making better use of capacity and
foregoing traditional provisioning methods allow deferral of short-term upgrade costs. Efficient architectures reduce storage requirements, increase storage utilization, lower energy-related expenses and improve administrative productivity for a greater return on assets and investments.
Manage Data Movement Issues
Data migrations can be painful in terms of cost, risk and complications. Most enterprise IT shops have to plan for
storage platform end of life and migration to new storage. They must thoroughly consider the potential impacts on
production environments and ensure ample support and expertise.
Storage system and data migration costs can escalate to as much as US$15,000 per terabyte, and can take months
to complete. Many businesses are already grappling with petabyte-size migration challenges. Limited maintenance
windows, complicated SAN rezoning, orchestrating required host reboot activities, and administrative or support
inexperience all quickly contribute to budget overruns and downtime durations.
IT is looking to minimize manual efforts, create more transparent migration activity and take advantage of flexible data
mobility capabilities. In these ways, the price tag and hazards of moving data are diminished.
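As a rough illustration of the budget stakes described above, the quoted per-terabyte figure can be turned into a simple planning estimate. This is a hedged sketch only: the function name is invented, and the default rate simply reuses the "as much as US$15,000 per terabyte" figure cited above.

```python
# Hedged sketch: ballpark budget for a traditional (non-virtualized)
# migration, using the up-to-US$15,000-per-terabyte figure cited above.
# The rate and capacity values are illustrative assumptions.

def migration_cost(capacity_tb: float, cost_per_tb: float = 15_000.0) -> float:
    """Worst-case migration budget: capacity moved times cost per TB."""
    return capacity_tb * cost_per_tb

# A 1PB (1024TB) migration at the high end of the quoted range:
print(f"${migration_cost(1024):,.0f}")  # $15,360,000
```

At petabyte scale the worst-case figure runs into eight digits, which is why the nondisruptive, virtualization-based migration approaches discussed later in this paper matter economically.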
Avoid Single-Vendor Issues
The all-in-one-basket approach to storage often leads to unintended outcomes. Some organizations have accumulated a jumble of vendor products or employ a best-in-industry strategy, and both of these approaches present challenges in fast-growing environments. Likewise, relying on only one storage vendor carries its own set of risks: enterprises set themselves up for poor pricing economics, less competition and less innovation. For multitiered environments, the opportunity to save on mid- and lower-tier storage through competition is lost. A dual-vendor strategy for storage preserves an organization's ability to create a more balanced, nimble and cost-efficient data environment.
Ensure Reliability Across the Storage Infrastructure
Reliability is fundamental to designing a more manageable, and therefore more cost-efficient, IT environment.
Outages, data loss or corruption, and difficulty accessing information or applications collectively impact a company's
ability to meet customer, regulatory and financial obligations.
Building resiliency and stability into existing and future scenarios starts with understanding what technologies are
available and can fit together in a clean way. Assessing reliability factors across the data center is critical. Are there
risks of outage or noncompliance? Can legacy systems be consolidated and rationalized to simplify backup, archiving
and disaster recovery?
When reliability is intentionally built into the storage environment, organizations are better prepared to protect data
and the business.
The Tools for Achieving Cost Efficiencies
Making strides toward cost-effective data storage that maps to anticipated growth demands, as well as unforeseen
changes, means equipping IT leaders with a new set of tools. What are the tools of change that can unify visibility
and manageability across storage, and how can businesses capitalize on them for greater cost savings and business
health?
The Role of Storage Economics for Bettering the Bottom Line
IT leaders know that they can only improve what can be measured. While dollars-per-megabyte formulas and
hardware-only equipment justifications do help to quantify costs, they are inadequate for accurately gauging the total
cost of storage ownership (TCO).
A working knowledge of the TCO is essential to assessing the true price of storage. The purchase of hardware or
infrastructure, known as capex, typically accounts for only 25% of TCO. The remaining 75% is allocated to opex: the less tangible, softer costs related to administration, energy, support and so on.
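The 25%/75% split above supports a quick back-of-the-envelope check: given only a hardware purchase price, the implied TCO and opex can be backed out. This is a hedged sketch for illustration; the function name and the sample purchase price are assumptions, not an HDS costing tool.

```python
# Hedged sketch: back out an implied TCO from a hardware purchase price,
# using the 25% capex / 75% opex split cited above. Illustrative only.

def estimate_tco(capex_usd: float, capex_share: float = 0.25) -> dict:
    """Given a purchase price and the share of TCO it represents,
    return the implied total cost of ownership and opex."""
    tco = capex_usd / capex_share
    return {"capex": capex_usd, "opex": tco - capex_usd, "tco": tco}

print(estimate_tco(500_000))
# A $500,000 hardware purchase implies roughly $2,000,000 in TCO,
# of which $1,500,000 is opex.
```

The point of the arithmetic is simply that a hardware-only justification captures a quarter of the real cost, which is why the Storage Economics framework below focuses on the opex side.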
Storage Economics provides a proven framework for bettering the bottom line. Storage Economics has cataloged
nearly 3 dozen different types of capex and opex costs. The program aids IT in pinpointing and classifying these
insidious budget busters, for a precise picture of what storage is actually costing in the environment. Using comprehensive cost-to-solution mapping and extensive expertise in financial modeling, Storage Economics applies a
systematic approach to help IT leaders reduce costs over time.
Understand the Value of Capacity Efficiency
Some industry experts have predicted the perfect storm for creating a storage efficiency crisis. Growth in the amount of data and content being kept and stored online continues to accelerate. This growth is driving demand for unprecedented amounts of disk storage, while for a number of reasons rapid price declines have slowed or stopped. The old standby challenges of low utilization and stranded storage only add to the calamity. Thus, organizations must become smarter about storage utilization and reclamation across current storage assets. Capacity efficiency as a whole seeks to accomplish the following:
■ Increase storage utilization to be able to use more of existing and new assets.
■ Reduce or defer capacity purchases in step with business growth.
■ Reclaim capacity and free up high-performance capacity for future use.
■ Repurpose and extend the useful life of existing assets by managing heterogeneous storage capacity as a single pool that inherits the benefits of parent storage systems.
■ Take advantage of cost savings features, such as thin provisioning and dynamic data mobility, with existing storage systems, not just new ones.
Hitachi Data Systems employs best practices, assessment services and more than a dozen proven technologies to
help organizations proactively address capacity efficiency through reduction, reclamation and deferral.
The Role of Storage Virtualization in Creating Cost Efficiencies
At the core of capacity efficiency reside storage virtualization and storage pooling. Virtualization and pooling help organizations reclaim, utilize and optimize storage to craft an economically and ecologically superior data center. The
important characteristics of virtualization are reduced complexity when compared to managing devices discretely
and greater capability to improve services. When storage is virtualized, logically pooled and centrally managed, costs
for hardware, SAN infrastructure and environmental controls go down. Flexibility, scalability and cost efficiencies
accelerate.
In the Box and Out
A key storage virtualization technology performed internally, or inside the box, is thin provisioning, also known as
dynamic provisioning. Unlike traditional provisioning, which requires the projected growth of a volume to be fully allocated, dynamic provisioning presents only the capacity actually used, freeing up other capacity until needed. In this
way, a virtual pool of shared capacity can be larger than the actual amount of physical storage available. This sizing
enables administrators to deliver capacity on demand from a common pool of storage. What differentiates it in the
virtualized storage environment is the ability to dynamically provision from both internal and external storage. Thus, it
allows advanced virtualization capabilities on older or lower-tier storage systems.
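The thin provisioning behavior described above can be sketched in a few lines of code. This is a hypothetical model for illustration only: the class, page size and method names are invented, not Hitachi Dynamic Provisioning internals.

```python
# Hedged sketch of the thin (dynamic) provisioning idea: volumes advertise
# their full logical size, but the pool consumes physical pages only as
# data is actually written. All names and sizes are illustrative.

PAGE_MB = 42  # assumed pool page size, for illustration only

class ThinPool:
    def __init__(self, physical_mb: int):
        self.physical_mb = physical_mb   # real capacity behind the pool
        self.used_mb = 0                 # pages actually consumed
        self.advertised_mb = 0           # sum of logical volume sizes

    def create_volume(self, logical_mb: int) -> None:
        # Creation allocates nothing physical: capacity is presented,
        # not consumed, so the pool can be oversubscribed.
        self.advertised_mb += logical_mb

    def write(self, mb: int) -> None:
        pages = -(-mb // PAGE_MB)        # round up to whole pages
        needed = pages * PAGE_MB
        if self.used_mb + needed > self.physical_mb:
            raise RuntimeError("pool exhausted: add capacity or reclaim")
        self.used_mb += needed

pool = ThinPool(physical_mb=10_000)
for _ in range(5):
    pool.create_volume(logical_mb=8_000)       # 40,000MB advertised
pool.write(500)
print(pool.advertised_mb / pool.physical_mb)   # 4.0 (oversubscribed)
print(pool.used_mb)                            # 504MB actually consumed
```

The sketch shows the core economic point: administrators can present far more capacity than they buy, then purchase physical disk only as writes approach the pool's real limit.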
LEARN MORE: Evaluator Group: Capacity Efficiency
Thin provisioning reduces waste of allocated capacity, improves performance with wide striping, and enables storage reclamation. Extending thin provisioning through storage virtualization and logically pooling internal and external resources make opportunities for greater efficiency possible. Virtualization outside the box enables seamless integration of new and future capabilities, including data migration, lifecycle management, media migration and integration of data across applications.
Storage virtualization can bridge to more advanced efficiency technologies. It can support virtualized multitiered
storage architectures and dynamic tiering, as well as multitenancy and cloud services. With this support, the cost efficiencies of storage virtualization can be exponential.
Controller-Based Storage Virtualization
Storage virtualization is performed in several ways, including host-based, controller-based and appliance-based.
Hitachi Data Systems uses storage controller-based virtualization, which separates the storage controller from the
disk storage system. This approach separates logical views from physical assets and allows heterogeneous multivendor storage systems to be directly connected to the controllers.
Using a single management interface, Hitachi virtualization permits the external systems to be "discovered" so they
can be managed along with internal disks as a single pool of storage. Pooling these resources logically helps remove physical barriers and optimize the resources to reach their full potential. The complexity associated with all the components required to deliver storage is buffered behind intuitive virtualization software that simplifies management of the environment. More importantly, the virtualization software delivers greater flexibility to
meet the needs of applications.
This block-based virtualization technique permits the externally attached storage to leverage capabilities of internal
storage. For example, it allows nondisruptive data migration, replication, partitioning and thin provisioning, without extra hardware, latency issues or new complexity. In a recent engagement, Hitachi Data Systems was able to
improve storage utilization by over 30%. In addition, in a recent survey of IT organizations using Hitachi enterprise
storage, more than 70% rated Hitachi support for externally virtualized storage as very good or better.1
As a result, organizations can achieve the right cost, performance, reliability and availability characteristics of storage,
as needed, to match with application requirements. In turn, storage infrastructure is flexibly aligned with business
requirements.
Leveraged Benefits
Virtualization has become a mainstream technology and is used to increase hardware utilization as well as lower
operating costs in the data center. Virtualization capabilities can be extended to other forms of storage and platforms, such as NAS, iSCSI, active archive and virtual tape libraries. Even mainframes can be connected to reduce
islands of stranded storage and provide greater data protection, search and management functions across the entire
environment.
With virtualization embedded in the controller, an enterprise storage system has up to 10 times the cache size, 6 times the IOPS performance and more port connectivity than a SAN-based virtualization appliance. These benefits extend scalability well beyond SAN-based appliances. For modular attached storage, using controller-based virtualization can enhance performance by as much as 30%.2
1. Source: TechValidate survey, TVID: F21-DA2-763 and TVID: BA6-25B-E4B
2. Storage Virtualization for Dummies: Hitachi Data Systems Edition, 2010, Wiley Publishing, Inc., http://media.wiley.com/assets/2112/24/9780470597705.pdf
The right virtualization environment creates a synergy for doing more with less complexity, less risk and lower costs.
Storage services are now aggregated for agile, scalable and service-level-based delivery. Improvements in capacity,
manageability, integrity and reliability promote a better return on assets (ROA). The bottom line for storage virtualization's command of capacity efficiency is that it becomes cheaper to virtualize and reclaim existing storage than to
purchase new capacity. The areas where virtualization can affect cost reductions include:
■ Lower capex.
■ Reclaimed capacity.
■ Longer asset usefulness.
■ Lower cost of growth with overall utilization improvements.
■ Cost of migration.
■ Labor, tools, outage, ROA.
■ Cost of software licenses.
■ Common pool, one source for control and replication.
■ Software maintenance costs.
■ Hardware maintenance costs.
■ Support rates commensurate with the tier of data.
■ Demotion of older assets (as appropriate) to lower tiers and different maintenance rates.
THE ECONOMICS AT ATOS
“The customer is always right” has taken on a whole new meaning at Atos, a highly successful systems
integrator and outsourcing company based in Europe and Asia Pacific. When its largest customer, an
international bank, wanted to overhaul its hosted IT environment at Atos, the focus was on creating a
superior storage architecture. The plan was to drive efficiency and resiliency up and costs down. The
bank requested that Atos re-engineer its infrastructure to include virtualization and tiered storage, with
the goal of pushing utilization rates to near 80%. At the same time, it wanted to lower the overall total
cost of storage ownership.
With the help of Hitachi Data Systems, Atos built a highly nimble and cost-efficient multitiered ecosystem that answered the bank's call. Atos initiated a systematic unified approach to managing future
growth for all its customers. Atos won a 7-year contract renewal with its big customer, along with a
model and path for its future corporate growth.
We have grown from 1PB to 1.8PB, and have converted from a single, Tier 1 structure to very well-organized multiple tiers. We have already improved storage utilization from its original 25% to 35%, to now utilizing at 66% and counting.
– Stephen Ko, Director of Operations for Atos Managed Services in the APAC Region
Atos Storage Virtualization Benefits
■ Simpler data migration and mobility.
■ Development of a service catalog.
■ Ability to monitor capacity utilization and performance.
■ Significant cost savings.
We have achieved a blended overall rate of 30% cost reduction in storage, by distributing
Tier 1-only storage across the new hierarchy of tiers. With the Hitachi technology in place, we
have been able to reduce overall costs of managing, storing and safeguarding large amounts
of data. I expect that this type of trending will likely increase, which is fantastic from a financial
perspective, as well as for customer satisfaction and loyalty. We can invest in advanced technology and can still earn from our investments.
– Stephen Ko, Director of Operations for Atos Managed Services in the APAC Region
The Top Use Cases for Virtualization
Consider the merits of storage virtualization for meeting greater capacity efficiency, demanding data growth and
better business value. Examine the following use cases.
Consolidate, Simplify and Scale
How does storage virtualization help the corporate IT department overcome the complexity and cost of disparate
data growth and access? As the numbers of interfaces, tools, systems and software proliferate across a fragmented storage enterprise, so does the amount of time and money to manage it. A lack of flexibility hinders the
storage administrator's opportunity to assess problems, provide access, provision services and attempt to optimize
resources. In a survey of more than 200 Hitachi virtualization customers, 70% rated consolidation and simplification
as a 4 in importance on a scale of 1 to 5, with 5 being most important.3
IT organizations can slow the need to purchase extra assets by using storage virtualization. They can ensure that
the technologies in place maximize value from current storage investments while handling data growth needs. These
technologies include dynamic provisioning, dynamically tiered storage, data deduplication, storage consolidation and
integrated archiving.
By consolidating storage through virtualization, administrators can rapidly reduce the clutter of disparate or stranded
systems and thereby begin to simplify the storage infrastructure. As external assets become virtualized and centralized into a common storage pool, IT can better align storage tiers with business and application requirements. Higher
performance disks can be saved for true Tier 1 mission-critical data, while less-active or less-critical data can reside
on lower-cost tiers. In these ways, service level agreements (SLAs) are more easily accommodated, too.
When data storage infrastructure is unified with a single management interface, IT is able to take advantage of higher-level features that further efficiency. For example, using a single pane to manage all the storage assets allows external systems to be optimized with virtualization tools as if they were internal to the system. Now, all storage, regardless of where it sits in the virtualized framework, has access to online data migration tools, dynamic provisioning and replication activity. Unified storage is able to deliver a shared storage services model for multiple applications across the
SAN for increased storage utilization, flexible scalability, better performance and better data availability.
Leverage Current Assets
By using storage virtualization, organizations can reclaim capacity and increase utilization on their current storage
systems. These benefits are a boon to IT departments trying to save on the soaring cost of hard disk drives and
extend the useful life of existing assets. Nearly 90% of IT organizations using Hitachi enterprise storage systems were
able to increase storage utilization by 11% to 25% with virtualized storage. Financial services companies with database- and online-intensive applications saw increases between 26% and 40%.4
Hitachi Data Systems recognizes several important technologies that, in concert with storage virtualization, further
leverage current assets and promote greater cost efficiencies (see Table 1). Technologies such as Hitachi Dynamic
Provisioning create pools of pages that are spread across many RAID groups. Therefore, when a volume is created, it is written a page at a time across the width of the pool, automatically generating wide-stripe performance.
To increase the stripe or make changes to the RAID group, Dynamic Provisioning performs these tasks automatically
and rebalances the stripe transparently, for greater operational efficiency over manual striping.
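The page-at-a-time allocation described above can be sketched as a round-robin placement across RAID groups, which is where the wide-striping effect comes from. This is a hedged illustration; the data structures and names are assumptions, not the internals of Hitachi Dynamic Provisioning.

```python
# Hedged sketch: spread a volume's pages round-robin across every RAID
# group in the pool, so no single group absorbs the whole workload.
# Structures are illustrative, not Hitachi Dynamic Provisioning internals.

from itertools import cycle

def allocate_pages(num_pages: int, raid_groups: list) -> dict:
    """Place each page on the next RAID group in turn; return how many
    pages each group received."""
    placement = {rg: 0 for rg in raid_groups}
    rg_iter = cycle(raid_groups)
    for _ in range(num_pages):
        placement[next(rg_iter)] += 1
    return placement

# A 10-page volume lands evenly on a 5-group pool: 2 pages per group.
print(allocate_pages(10, ["RG1", "RG2", "RG3", "RG4", "RG5"]))
```

Because every volume is spread across the full width of the pool, I/O load balances automatically, which is the manual-striping work the paragraph above says is eliminated.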
Greater flexibility optimizes disks across various scenarios, or when considering higher-density drives. Architectures
that use technologies such as Hitachi Dynamic Tiering eliminate the need for time-consuming manual data classification and movement, to optimize usage of tiered storage and management of data lifecycles. Data is automatically
moved on fine-grained pages within the virtual volumes to the most appropriate media, according to workload
requirements.
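The automated tiering idea above amounts to mapping pages to media by observed workload. The following is a hedged sketch of that policy; the threshold, tier names and function name are invented for illustration and do not represent Hitachi Dynamic Tiering's actual algorithm.

```python
# Hedged sketch: assign each page to a tier based on its I/O count over
# a monitoring cycle. Threshold and tier names are assumptions.

def retier(page_io_counts: dict, hot_threshold: int = 100) -> dict:
    """Promote busy pages to fast media; demote cold pages automatically."""
    return {
        page: "tier1-flash" if hits >= hot_threshold else "tier2-sas"
        for page, hits in page_io_counts.items()
    }

io_counts = {"page-a": 950, "page-b": 12, "page-c": 140}
print(retier(io_counts))
# page-a and page-c land on fast media; page-b is demoted, with no
# manual data classification required.
```

Running such a policy per fine-grained page, rather than per volume, is what keeps expensive Tier 1 media reserved for the data that actually needs it.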
Another effective technology is storage reclamation. Reclaimable storage is often the most easily realized benefit available through storage virtualization. Reclamation allows untouched storage space to be folded back into the useful pool of storage capacity, by moving "thick" volumes (those volumes that are provisioned traditionally) to centralized "thin" volumes. This process helps reduce the need for storage at current rates and defers disk purchases while optimizing current resources. Many companies have a low storage utilization rate but are unable to reclaim a portion of their unused allocated storage because of archaic infrastructure and processes. In fact, companies with inefficient storage use often continue to spend on new storage capacity despite the significant amounts of reclaimable storage available within their existing infrastructures. This practice results in new capex, which in turn increases opex because of the need to manage the associated new storage.

3. TechValidate Survey, TVID: EC4-04B-275
4. TechValidate Survey, TVID: B87-0D5-DB5 and TVID: BEF-53F-6F8
TABLE 1. TECHNOLOGIES THAT LEVERAGE CURRENT ASSETS AND PROMOTE COST EFFICIENCIES

Tiered storage with Hitachi Dynamic Tiering: Integrating automated tiered storage architectures fosters better performance, greater simplicity and higher availability. These improvements equate to lower acquisition and operating costs for any production environment.

Disk configuration optimization: Proper setup of physical and logical drives is essential for optimizing drive performance of existing and new storage resources. Thin provisioning and automation of wide striping across disk drives increase capacity utilization and enable greater flexibility and optimal drive performance of existing and new storage.

Zero page reclaim and write same: These storage reclamation technologies return unused storage to the virtualized pool as free space, helping to recoup capacity on existing storage resources and reduce the opex costs of managing utilization.

Thin replication and copy-on-write: Replicating data stored on a thin-provisioned volume, and copying only the portion of the volume that is actually being used, saves on communications bandwidth and costs for greater replication efficiency.

Service catalogs and chargeback information: Standardizing storage configurations and clearly defining service level metrics, unit costs and storage services allows organizations to segregate and meter capacity consumption of user groups in multitenancy situations. It supports accurate chargeback capabilities and reporting.

Compression, deduplication and single instancing: These intelligent archiving technologies are key in efficiently using high-performance, high-cost storage. They eliminate unnecessary duplicate data, saving space and automating the movement of stale data to lower-cost tiers. They also reduce backup costs, allowing for faster and more granular recovery, reduced backup times and fewer data volumes.
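To make the zero page reclaim entry in Table 1 concrete, here is a minimal sketch of the idea: pages containing only zeroes are identified and returned to the pool as free space. The page representation and function name are illustrative assumptions, not the actual on-array mechanism.

```python
# Hedged sketch of zero page reclaim: keep only pages that hold real
# data and fold all-zero pages back into the usable pool.
# Page representation is illustrative.

def reclaim_zero_pages(pages: list) -> tuple:
    """Return (pages kept, number of all-zero pages reclaimed)."""
    kept = [p for p in pages if any(p)]  # any() is False for all-zero bytes
    return kept, len(pages) - len(kept)

volume = [b"\x00" * 8, b"data....", b"\x00" * 8, b"more...."]
kept, freed = reclaim_zero_pages(volume)
print(freed)  # 2 pages folded back into the usable pool
```

Even this toy version shows why reclamation is often the quickest win: the capacity recovered requires no new hardware at all.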
Boost Data Mobility and Migrations
Data mobility is more than the ability to move data, volumes and applications: It is about moving them in ways that bolster business continuity, reduce risk and lower costs for the organization. One of the most recognizable use cases
for storage virtualization is better data mobility, especially when it means less painful data migration activity.
Organizations are always looking for ways to minimize the burdens associated with data migration. Two of the biggest
concerns of migration projects are downtime and budget overruns. According to a TechValidate survey, the average
enterprise storage migration project runs 4 to 6 hours per host. This range includes the time to plan and execute, and
70% of customers reported schedule overruns.
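The survey figure above (4 to 6 hours of plan-and-execute time per host) can be turned into a rough project-scoping estimate. This is a hedged sketch; the host count and the idea of parallel migration streams are assumptions added for illustration.

```python
# Hedged sketch: best- and worst-case elapsed hours for a migration
# project, from the 4-to-6-hours-per-host survey figure cited above.
# Host count and parallelism are illustrative assumptions.

def migration_hours(hosts: int, hours_per_host: tuple = (4, 6),
                    parallel_streams: int = 1) -> tuple:
    """Return (best-case, worst-case) elapsed hours for the project."""
    low, high = hours_per_host
    return (hosts * low / parallel_streams, hosts * high / parallel_streams)

# 200 hosts migrated by two teams working in parallel:
print(migration_hours(200, parallel_streams=2))  # (400.0, 600.0)
```

Estimates like this make the schedule-overrun risk reported by 70% of customers easy to see: at hundreds of hosts, even the best case consumes weeks of maintenance windows.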
Build a Bridge to Easier Online Migration Strategies
Organizations spend considerable resources and assume more risk than necessary to conduct data migrations. The use of heterogeneous storage virtualization technologies minimizes risks and costs. By connecting the old storage systems with new storage via the SAN, no outage is required. Migration activities such as host discovery, configuration, testing and data copy are transparently redirected to the new system. Once all the data is on the new storage, the legacy system can be removed or decommissioned.
Online migration is essential for scaling large amounts of data, running higher data throughputs and ensuring flexibility
for various outage windows. Automating data migration enables seamless movement between tiers and subsystems
to amplify application availability and reduce IT and business risk. Additionally, administrators are able to initiate and
enforce policy management and reduce the costs for meeting SLAs.
Apply a Risk Estimation Strategy
Online migration and storage virtualization greatly decrease the costs and perils of long, drawn-out migrations that
require manual intervention. But any migration has inherent risks, and being able to avoid them requires a strategic
approach to risk estimation.
Organizations can begin by leveraging a vendor who understands how to successfully migrate data across various
platforms and multiple technologies. Evaluating business continuity plans requires expertise that few organizations
have in house.
Hitachi Data Systems Global Solution Services (GSS) offers a Risk Analysis Workshop for just this
reason. The Risk Analysis Workshop provides a unique and proven approach to helping IT assess
the resilience of their environment and prioritize plans for how to resolve gaps. For organizations
trying to address data protection, payback, justification and risk management issues, the Risk
Analysis Workshop consultants give guidance. They provide a systematic evaluation to identify the
gaps in the company's risk management. Following the workshop, Hitachi also offers Quantitative
Operational Risk Assessment services, during which GSS consultants design the optimal solution to close material
gaps identified in the workshop. This design includes both technical and economic analysis. As a result, organizations
are prepared to make more informed decisions about their business continuity investment and how best to improve
the quality of service they provide.
REVIEW: Storage Readiness Checklist
EXAMINE THE ECONOMICS AT OVERSTOCK.COM
Overstock.com used storage virtualization to significantly reduce both capital and operating costs.
We’ve reduced data-migration-related downtime from several hours to less than 30 minutes.
Overall, by using Hitachi virtualization, dynamic provisioning and tiered storage, we’ve reduced
our capital and operating costs for an improved return on our storage investment.
– Carter Lee, VP Technology Operations, Overstock.com
The Storage Solution
■■ Hitachi enterprise storage with Hitachi Dynamic Provisioning software.
■■ Seamless and nondisruptive data movement between storage systems.
■■ Consolidation and unification of all heterogeneous storage into a single virtualized pool that can scale up to 247PB.
Capacity Efficiency Benefits
■■ Reduced technology-refresh time by up to 90%.
■■ Reduced provisioning tasks by up to 80%.
■■ Initial reclamation yielded savings of approximately 50%.
■■ Improved utilization rate to about 80%.
Hitachi Data Systems has quickly simplified the process of nondisruptively provisioning storage. With Hitachi virtualization technologies, we’ve seen storage capacity savings of 50% on some arrays, can now provision storage in 25% of the time, and have increased utilization rates by over 30%.
– Carter Lee, VP Technology Operations, Overstock.com
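The utilization figures above translate directly into raw capacity that no longer needs to be purchased. The arithmetic below echoes the case study's ~80% utilization rate; the 40% starting point is an assumed illustration, not a figure from the text.

```python
# Back-of-the-envelope effect of higher utilization on raw capacity.
# The 80% rate comes from the case study above; the 40% pre-virtualization
# rate and the 100 TB workload are assumed for illustration only.

def raw_capacity_needed(used_tb: float, utilization: float) -> float:
    """Raw TB that must be purchased to hold `used_tb` at a utilization rate."""
    return used_tb / utilization


used = 100.0                               # TB of data actually stored
before = raw_capacity_needed(used, 0.40)   # assumed pre-virtualization rate
after = raw_capacity_needed(used, 0.80)    # ~80% per the case study

print(before)                  # 250.0 TB purchased
print(after)                   # 125.0 TB purchased
print(1 - after / before)      # 0.5 -> 50% less raw capacity to buy
```

Under these assumptions, doubling utilization halves the raw capacity an organization must procure for the same working set, which is where the capital-cost savings in the quote originate.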
Dual-Vendor Sourcing
Organizations that operate under a dual-vendor strategy are looking to overcome the inherent costs and risks of having only 1 provider and 1 infrastructure. The value of deploying a dual-vendor strategy comes in acquiring greater infrastructure flexibility and cost savings.
By balancing the storage environment across a blend of 2 or more storage vendors, administrators are better able to achieve cost savings in the critical high-growth area of 2nd-, 3rd- and lower-tier storage. The key drivers for deploying a dual-vendor strategy include reducing costs while driving up innovation, flexibility and choice.
It is critical to ensure that vendors are chosen to meet certain criteria, such as open standards, unified interfaces and flexibility. These standards enable the greatest interoperability and availability, and the ability to seamlessly virtualize storage. By virtualizing all the applicable capacity across the storage environment, there is greater opportunity to retain operational efficiencies and simplify management.
According to a Gartner analysis, dual-vendor tactics can lower storage acquisition costs by 25% or more, compared with organizations maintaining a single-source storage infrastructure.5 Organizations considering dual-vendor sourcing have the opportunity to commoditize and attain a lower cost of procurement. Competition among vendors results in better pricing for potential solutions. The requesting organization is able to choose the cheapest solution that meets business needs. The new system, no matter how comparatively inexpensive, inherits the best features of the Hitachi storage controller.
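The cited Gartner figure is easy to translate into an annual dollar impact. The spend amount below is a hypothetical illustration; only the 25% minimum savings rate comes from the analysis referenced above.

```python
# Applying the Gartner figure cited above (25% or more lower acquisition
# cost) to a hypothetical annual storage spend. The US$10M spend is an
# assumed illustration, not a figure from the paper.

def dual_vendor_savings(annual_spend: float, savings_rate: float = 0.25) -> float:
    """Dollars saved per year at the cited minimum savings rate."""
    return annual_spend * savings_rate


spend = 10_000_000.0               # assumed single-source annual spend (US$)
print(dual_vendor_savings(spend))  # 2500000.0 -> US$2.5M/year at 25%
```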
EXAMINE THE ECONOMICS OF DUAL-VENDOR WINNERS
Consider these examples of the dramatically lower total cost structure possible with implementation of a
dual-vendor strategy.
Global 10 Telco Operator - European Headquarters
Hitachi Data Systems submitted a proposal to consolidate a number of international data centers as a
competitive approach to the incumbent vendor.
Results:
■■ The incumbent reduced its proposal price by 36% from US$19.5 million to US$12.5 million.
■■ The telco adopted a dual-vendor approach to storage infrastructure for an annual spend of US$100
million.
Global 5 Technology Company - US Headquarters
A large global technology company that had predominantly single-sourced its storage saw more than US$10 million in savings just by engaging Hitachi Data Systems. These savings were apparent even before any purchase.
Results:
■■ These savings were in 1 department.
■■ Hitachi Data Systems has since saved this customer an additional US$5 million by competing in requests for quotes (RFQs). Subsequently, it also won business by helping to create a dual-vendor strategy within the organization.
Global 5 Technology Company - US Headquarters
A recent auction for global dual-sourcing generated the following savings:
■■ Costs reduced by 46% for SAN storage.
■■ Costs reduced by 47% for NAS storage.
Create Rock-Solid Recovery and Compliance
A true measure of successful storage environments can be found by reviewing recovery and compliance strategy. It
is assumed that the majority of enterprise data centers feature storage systems and storage network infrastructures
5. Gartner, Inc., “Toolkit Decision Framework: Viability of Pursuing a Dual-vendor Disk Strategy,” by Stanley Zaffos, Adam W. Couture, and Stewart Buchanan, April 2007.
from multiple vendors. Regardless of how the storage environment is managed, organizations must be able to ensure
system-wide backup, archival and disaster recovery capabilities, and compliance with numerous legal and regulatory
requirements. If the storage landscape is highly complex, fragmented and difficult to manage, reliability can suffer,
leading to myriad problems. Figuring out how to achieve reliability cost-efficiently, in a single clean way rather than in
many different ways, is both more effective and economical.
Virtualization technologies provide the base for a simplified storage architecture that ensures cost-effective reliability and system-wide compliance. Achieving reliability by design allows an organization to put in place an economically superior storage architecture. Such an architecture is capable of increasing reliability, reducing the risk of outages, minimizing the impact on business operations and lowering TCO.
As strengthening resilience across the storage environment is essential for supporting business continuity requirements, unified disaster recovery and long-term archiving are called for. One mainstay for accomplishing these goals is rationalization of storage. Rationalization is the ability to obtain a single or cohesive view of data across all systems while maintaining different aspects of the data to leverage information and better manage operational processes.
Rationalizing and consolidating these assets into an optimal portfolio provides a much more effective, responsive and
flexible model for the future. Rationalization and consolidation rapidly help to lower costs while improving IT's ability to
safeguard data.
Rationalizing new storage is fairly straightforward, while rationalizing existing storage, which is being accessed by
multiple applications and involves multiple layers, is more challenging. The right storage virtualization solution goes
a long way in helping to ease the processes associated with rationalization and to do so with as little disruption to
assets as possible.
An important tenet of economically superior Hitachi storage architectures is creating a foundation for an economical
but rock-solid recovery and compliance design.
Final Notes of Consideration
By observing the cost benefits that storage virtualization affords, IT leaders are better equipped to make informed
choices for how to proceed with data center improvements. When storage services are collectively orchestrated,
capacity efficiencies flourish. The integrative properties of storage virtualization allow for pervasive consolidation, reclamation and increased utilization of stranded assets, while simplifying data movement activities.
Storage virtualization technologies help to extend the useful life of externalized storage, to optimize pooled storage
resources and to create new cost-effective opportunities for flexibly managing data growth demands. For companies
operating a dual-vendor strategy, folding in the right storage virtualization technologies can further advance both
capex and opex savings while promoting greater stability and innovation across the enterprise. By procuring an elegant, reliable and unified storage environment, organizations end up with a lower total cost structure, less complexity
and risk, and greater agility to meet future challenges.
For more information on Hitachi storage virtualization technologies and services, please visit www.HDS.com.