Software-defined Storage
solves performance problems
Optimising your various existing storage devices allows
your infrastructure to operate at peak performance levels
Published by
2	Storage-Insider.de | Software-defined Storage: Performance
Content

Software-defined storage: The devil is in the detail
SDS is the framework of the future

Software-defined storage solves performance problems
Making optimal use of existing storage media
DataCore Software GmbH
Bahnhofstr. 18, 85774 Unterföhring
Phone	 +49 89 4613570-0
E-mail	infoGermany@datacore.com
Website	 www.datacore.com/de
Vogel IT-Medien GmbH
August-Wessels-Str. 27, 86156 Augsburg,
Germany
Phone +49 (0) 821/2177-0
E-mail redaktion@storage-insider.de
Website www.Storage-Insider.de
General manager: Werner Nieberle
Editor in chief: Rainer Graefen, responsible
as per press laws, rainer.graefen@vogel-it.de
Publication date: September 2014
Title image: vege - Fotolia.com
Liability: Should any articles or information be inaccurate,
the publisher will only be liable in the event of proven gross
negligence. Where articles identify the author by name, the
author himself will be responsible.
Copyright: Vogel IT-Medien GmbH. All rights reserved.
Reprints, digital use of any kind and/or duplication hereof are
only permitted with the written consent of the editorial staff.
Reprints and electronic use: If you would like to use any
articles from this eBook for your own publications, such as
special prints, websites, other electronic media or customer
newsletters, you can obtain the necessary information and the
required licences online at www.mycontentfactory.de, phone
+49 (0) 931/418-2786.
Software-defined storage: The devil is in the detail

SDS is the framework of the future

The term "software-defined" is now regularly used to describe storage. You would be hard-pressed to find any major supplier who today does not use this keyword to describe their products. Yet what exactly does this concept imply, what are the advantages for businesses when they implement it, and at what point is it worthwhile considering software-defined storage?

Ask any market analyst and they will agree: software-defined storage is the platform of the future.

IDC, Forrester or Gartner: regardless of analyst house, market experts universally agree that software-defined storage will become the de facto platform for storage provision in the future. The main reason, they say, is that businesses of any size will always need more capacity to store their data. Moreover, the requirements for performance and availability tend to increase in importance, depending on the applications in use. Yet companies only have limited funds available to invest in storage. Based on the most recent research by leading analyst company 451 Research, the percentage of funds invested in storage compared to the overall IT budget has, in fact, decreased over the last two years.

To satisfy this purse tightening, there is demand for solutions that can be scaled to fit exact needs, that offer businesses a higher degree of flexibility and that promise to save costs. This is where software-defined storage (SDS) comes in.

SDS: simply a marketing claim?

What exactly is SDS? The explanation provided by IDC analysts can be used as an initial reference: they interpret the term software-defined storage as "a storage software stack installed on shared resources (x86 hardware, hypervisors or in the cloud) or on commercially available computer hardware". This provides the basis for "allowing the bundling of existing storage resources, the improvement of their utilization and the capability to structure a service-based infrastructure".

By contrast, manufacturers are still having a hard time finding a generally applicable definition or even a willingness to agree on standards. This is understandable, because storage hardware suppliers are, of course, primarily interested in continuing to sell their own systems successfully. In the meantime, however, they continue to deliver models carrying an "SDS" label. More often than not, deployment does not bring about any change, because the required functions are still tied to their specific storage platforms, typically proprietary software. Thus the system's own set of features can interact neither with new components nor with other manufacturers' systems.

Needless to say, this contradicts the principles of SDS, where the software determines the functions of the storage and does so entirely independently of the underlying devices or selected topologies.

Storage virtualization serves as an SDS vehicle

Generally, manufacturers revert to storage virtualization techniques as a means to an end, typically integrating an abstraction level between the application server and the storage component. The result is that storage is no longer defined by physical limits, but instead can be distributed more flexibly, thus becoming logically accessible. A division between the physical and the logical brings several advantages: existing resources can be used more efficiently, expansions are easier to implement, data can be migrated without interruption, management can be centralised and new functions can be introduced at all levels.

In practice, the choice between the numerous technical options primarily depends on which direction each manufacturer has decided to follow. In a SAN, for example, virtualization can take place by means of an in-band, out-of-band or split-path process, either in the host or in the storage controller of the storage system. Generally, with technology inherently tied to specific devices or models, we must accept that it only works properly with the systems offered by its particular manufacturer.

Thus, for a long time now, one tried, tested and therefore effective solution has been to revert to software-based solutions. They can bundle every resource at a software level that is valid for all devices. The fewer products that are bound to specific platforms and/or manufacturers, the better. The result is that all performance criteria can be made available at all levels, irrespective of the existing hardware; access to the storage systems can be controlled at a central level; and the entire storage infrastructure can be uniformly managed from a single console.

SDS solutions bundle all resources into a software layer used by all devices. This allows all performance criteria to be made generally available to all devices, irrespective of the existing hardware. (Image: DataCore)

Approaches that focus on hardware suffer from limitations

There is much to be said for integrating "cookie-cutter" functions and management solutions at software level and replacing classic hardware-focused architectures with non-proprietary virtual and
software-defined approaches. There are
quite a few reasons for doing so.
Firstly, data volumes will continue to
increase, making it difficult to determine
just how much storage space must be
reserved in the medium term. Applications - sophisticated tier 1 applications in particular - place ever greater demands on the storage infrastructure as workloads increase.
Yet classic systems are not designed for
this and are not flexible enough to adjust
to these changing conditions. What makes
things even more difficult is the limited
useful life of the hardware, which for
storage arrays is on average around five or
at most seven years. Frequently businesses
purchase oversized storage space so that
they are equipped for any scenario during
this period. This approach does not allow
the appropriate flexibility to react to new
requirements.
However, if capacity and performance
are not sufficient in the day-to-day work
environment, expansions will be required,
combined with the need to procure
additional devices that more often than not
have to be managed separately or at worst,
require a complete change of architecture.
This, in turn, creates even more problems.
The result is a highly complicated jumbled
mess of storage environments, requiring a
great deal of effort to operate and manage.
Additional hardware takes up more space, expenses for power and cooling increase and maintenance costs rise in equal measure.
SDS frees businesses from
technical constraints
Both from a technical as well as an
economic perspective, old-fashioned
hardware-based storage architectures
will eventually reach their limits in the
short or long term. With this in mind,
software-defined storage represents a
future-orientated conceptual approach
that may be interesting for both small and
medium-sized businesses alike.
It is worthwhile to put some detailed
thought into SDS, especially when it
becomes necessary to purchase new
storage hardware, when the use of flash/SSDs is planned, or when server and desktop virtualization projects are on the agenda. This is just as
important when business continuity is a key
topic of discussion, requiring a fail-safe,
high performance and high availability
IT infrastructure as the basis for running
business processes without interruptions.
No matter which of these scenarios applies: by separating the storage services and functions from the devices, businesses are given the freedom to make use of standard software, irrespective of its type, for their own purposes and to manage all their storage needs with software. In this way, existing traditional hard drive storage can be combined with flash media and hybrid systems in storage architectures tailored to their own individual requirements.

This is the key to replacing existing island solutions and to finally being able to say goodbye to parallel, block-orientated SAN, file-based NAS and separate backup and disaster recovery systems, various hypervisors or flash solutions.

The future of SDS from the point of view of analysts

IDC
Based on a survey conducted by IDC, a majority of European businesses do in fact deal with SDS as a topic, yet so far only eight percent of them have implemented relevant solutions. Despite this, software-defined storage represents an attractive approach: 42 percent of the IT decision makers questioned in the survey consider software to be a key engine for innovation in the field of storage.

Gartner
Market researcher Gartner considers SDS to be a concept still in the making, but one that businesses should already be discussing now. From an analyst's point of view, one of the greatest benefits of SDS is the integration of hardware infrastructures that are not manufacturer dependent, that are operated based on SLAs and that can provide solutions to problems that once posed challenges to conventional data storage. Based on estimates by Gartner analysts, however, it will take at least another ten years before SDS becomes prevalent on a large scale.

Forrester
According to Forrester, storage budgets can no longer keep up with the demand for storage. As a result, IT administrators are being challenged and are seeking solutions that will allow them to make storage capacities and performance available as needed, preferably automatically. The analysts do not think that integrating additional platforms would be the best response available today, because in their opinion this would increase the silo mentality even more, thus making the storage environment even more complicated. Instead, they are convinced that the weak points of the conventional approach only serve to accelerate the introduction of SDS.
SDS: A performance turbine for
critical business applications
The classic storage systems of the past
are no longer capable of satisfying the
performance needs of critical data or
transaction-focused business applications.
This is why, over the years, the added use
of flash media or solid state disks (SSDs)
has become a common option to increase
overall performance. However, integrating
Flash efficiently into existing environments
still poses a challenge to IT managers.
On the other hand, SDS-based
architectures are able to solve integration
problems because fast storage can be
integrated quickly, with no complications
and almost no interruptions, allowing
the use of existing components. Indeed,
businesses can benefit from the large
number of cross-platform functions and
services designed to speed up and optimise
performance; in addition to automatic
tiering and load balancing, there are also
functions offered such as sophisticated
caching methods. If the primary objective
is to eliminate performance bottlenecks,
SDS may thus prove to be the best
approach. We explore the do’s and don’ts
in the following article. Tina Billo
The optimal use of various
storage media allows
storage infrastructure
to operate at peak
performance levels.
Software-defined storage solves performance problems

Making optimal use of existing storage media

Big Data, Cloud Computing, Social Media, Mobile Business: these trends have spurred on exponential volumes of data that now need to be recorded, processed and analysed. High-performance applications are required to assist, but storage systems also need to offer sufficient performance to efficiently store and back up the data they generate, while guaranteeing high availability and ease of administration. Traditional hard drive storage arrays quickly reach their limits here. As a result, alternative paths are sought to work around these limits. This paper provides an overview of storage technologies available today and their advantages and disadvantages.

Virtual work stations are on the rise. They demand better performance from the storage infrastructure. (Image: IDC)

While the amount of computing power and networking speed that can be achieved has rapidly multiplied over the last decade, as far as storage systems are concerned, the only radical changes that have occurred are in disk density and disk capacity, and these do not really affect overall performance. Since 2000, the speed of traditional mechanical hard drives has topped out at 15,000 rpm, for example; due to physical limitations, we can hardly expect any further developments in this regard. And so opens the glaringly obvious performance gap between the CPU and attached storage.

This has been posing a problem for businesses for quite some time. One reason for this is that the data volumes that need to be processed have increased beyond proportion; the average annual rate of growth is between 40 and 45 percent. The use of mobile devices has exploded in recent years; this tendency will continue in the future, with social networks and cloud services propagating even greater volumes of data on the go. Simply recording and storing this data is only one small part of the puzzle. Evaluation and management present an ever greater ongoing challenge.
Decision makers hope to obtain important
insights from their data for their own
businesses’ benefit so that they can
stay ahead of the competition. Such
levels of understanding are only gained
through data mining using sophisticated applications, which come with demanding performance requirements: high throughput and low latency.
Traditional storage systems cannot keep up, and why would they? They are based on 20-year-old architectural models and are not designed for such extreme workloads. The resulting gap between processor and storage speed creates a bottleneck.
Bottleneck storage: Server and
desktop virtualization require
more powerful systems
At the same time, virtualization technologies
are also being integrated into businesses
with increasing momentum. Based on
a survey conducted among IT decision
makers, analysts at Forrester estimate that
by now 77 percent of all companies around
the world are working with virtualised
servers.
At the same time, virtual desktop
infrastructures (VDI) are on the rise. A
2013 IDC study showed that 27 percent of
European companies already have virtual
work stations set up, with a further 20
percent discussing their implementation
and another 27 percent trialling
introduction.
However, simultaneously executing
applications in virtual machines (VMs)
creates a pattern of mixed workloads and
arbitrarily distributed data access points.
Classic disk storage options prove to be
more of a stumbling block, because they
do not have sufficient I/O performance
to read and write the data fast enough.
Even though the IOPS performance can
be increased by integrating additional
disks, many businesses feel that costly
undertakings of this kind are no longer an
approach suitable for this day and age.
The end justifies the means
More recent concepts and solutions are
defining what is yet to come. This also
includes, among other things, creating
more storage space and/or increasing
performance by adding additional devices
(“scaling out”) or by upgrading existing
systems by adding more components (“scaling up”). In the latter case, IT managers
are generally leaning more towards solid
state storage, which uses NAND flash as a storage medium in the form of solid state disks (SSDs) and flash modules.
Where performance is concerned, they are
far superior to HDDs and therefore do very
well, even given a large number of arbitrary
read and write operations, such as those
common in virtualised environments.
Another option to achieve improved application performance is to consider convergent systems, which combine server, storage and networking technologies.

Convergent systems combine server, storage and network technologies. These “out-of-the-box” data processing centres come with the promise of improved application performance. (Image: IDC)
According to IDC, these virtualised “out-of-the-box” processing centres are becoming
increasingly appealing. About 16 percent
of the companies queried in a 2013 survey
had already implemented a converged
approach with an additional 53 percent
considering implementation.
Companies also expected improved
utilisation of the existing systems and a
higher storage performance gained through
storage virtualization, either implemented
with systems that are already in operation
or via a software solution.
Software-defined storage (SDS) is deemed
to be the next logical step. Once again the
focus is on introducing an abstraction level
between the applications and the physical
devices, with the aim of logically combining
resources for shared access.
There are already so many options
out there that promise to improve the
performance of storage infrastructures and
companies will need to give due thought
to which approach is ultimately the most
appropriate.
SDS: Boosted performance for
storage infrastructures
If the decision is to set up a software-
defined storage environment, companies
have two options they can choose from.
They can either revert to the solutions
offered by hardware manufacturers for
their platforms or they can choose a
purely software-based/device-independent
approach.
The first option poses the very real risk that
functions are partially or entirely tied to
the components of each hardware brand
and cannot be made available across the
board.
Taking the pure software-defined storage option will consolidate all storage
resources, services and management
processes, and offset any proprietary
limitations and incompatibilities. The
intelligence and functions are moved to the
software providing an autonomous virtual
intermediate layer, detached from the
physical hardware restrictions. This offers
the distinct advantage that all storage
media, irrespective of format, becomes
available via a standardised centralised
platform. This includes, for example,
automated storage tiering, caching or load balancing processes, all of which serve one single purpose: to make the best use of the performance potential of the different resources and to speed up applications.
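To make the idea of such a standardised layer concrete, here is a minimal sketch (all names and the placement policy are invented for illustration, not any vendor's implementation) of a pool that presents heterogeneous devices as a single logical capacity and hides which physical device backs each volume:

```python
from dataclasses import dataclass

@dataclass
class Device:
    """One physical storage device of any vendor or media type."""
    name: str
    capacity_gb: int
    used_gb: int = 0

    @property
    def free_gb(self) -> int:
        return self.capacity_gb - self.used_gb

class StoragePool:
    """Presents many heterogeneous devices as one logical pool."""
    def __init__(self, devices):
        self.devices = devices
        self.volumes = {}  # volume name -> backing device

    @property
    def free_gb(self) -> int:
        # Applications only ever see the pool's total free space.
        return sum(d.free_gb for d in self.devices)

    def create_volume(self, name: str, size_gb: int) -> str:
        # Toy policy: place the volume on the device with the most
        # free space, irrespective of vendor or model.
        dev = max(self.devices, key=lambda d: d.free_gb)
        if dev.free_gb < size_gb:
            raise RuntimeError("pool exhausted")
        dev.used_gb += size_gb
        self.volumes[name] = dev
        return dev.name

pool = StoragePool([Device("vendorA-ssd", 100), Device("vendorB-hdd", 400)])
backing = pool.create_volume("vm01-data", 150)
print(backing, pool.free_gb)  # vendorB-hdd 350
```

The key point of the sketch is that callers work only against `StoragePool`; swapping or adding a device of another make changes nothing in the application-facing interface.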
Using auto-tiering to meet
application requirements perfectly
Thanks to high data transfer rates and
extremely short access times, solid state
disks and flash technologies are the
ideal option to counteract the increased
performance requirements for critical
business applications.
Yet performance does not come cheaply:
Purchasing fast storage is still much
more expensive than classic hard
drives. This is why businesses rely on
solutions that allow them to make the
best possible and economically feasible
use of costly storage space. This can be achieved by using software-controlled auto-tiering.

Data blocks with high access ratios are automatically migrated to faster SSDs and less active ones to slower mass storage, using the software-controlled automatic tiering process. (Image: DataCore)

For this purpose, storage media are consolidated into virtual storage pools, which are organised and profiled into separate storage classes, or "tiers", according to their price-to-performance characteristics. Using pre-defined criteria, such as access date and degree of utilisation, intelligent mechanisms ensure seamless placement on the most suitable storage type at block level, based on cost/performance aspects.
Data blocks with high access rates can
be automatically migrated to faster SSDs
and less active ones relegated to slower
mass storage, based on pre-defined
rules. With consistent monitoring of the
I/O performance and accounting for all
competing I/O requirements, the software
automatically allocates demanding,
latency-sensitive workloads to fast storage
media, while allocating the workloads that
are not time-critical to slower, more cost
effective ones.
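The promotion rule described above can be sketched as a toy model (the block IDs, tier names and slot count are assumptions for illustration):

```python
def retier(access_counts, ssd_slots):
    """Assign the most frequently accessed blocks to the fast 'ssd'
    tier and relegate the rest to the slower 'hdd' tier."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    hot = set(ranked[:ssd_slots])  # only the hottest blocks fit on SSD
    return {blk: ("ssd" if blk in hot else "hdd") for blk in access_counts}

# Hypothetical I/O monitoring result: accesses per block this interval.
access = {"blk1": 120, "blk2": 3, "blk3": 57, "blk4": 1}
placement = retier(access, ssd_slots=2)
print(placement["blk1"], placement["blk4"])  # ssd hdd
```

A real tiering engine would of course migrate the data itself and re-evaluate continuously; the sketch only shows the ranking decision that drives those migrations.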
In order to cushion foreseeable peak loads that recur at predictable times, virtual hard drives can also be statically assigned to a high-performance tier. As soon as its capacity is exhausted, the loads can be switched to a lower tier. This makes it possible to meet the performance and availability requirements of critical application workloads, speed up response times and significantly accelerate the processing of business-critical tier 1 applications across the entire infrastructure, irrespective of the underlying hardware.
Faster data reading and writing
using caching
More technically sophisticated storage
virtualization solutions take this a stage
further by making use of the strengths
of caching to increase access speeds.
If the selected software runs on x86-64
standard servers, the devices connected in
a storage pool can make use of the DRAM
working memory and the I/O resources of
each node as a high-performance “mega
cache”. A part of the physical server RAM
is available to respond directly to incoming
application queries.
Frequently read blocks remain in
intermediate storage, relieving the load on
the back-end data carriers and reducing
I/O latency.
Moreover, established caching processes such as read-ahead, write-behind and the consolidation of scattered random writes into sequential disk I/O ("write coalescing") can be applied across the board.
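A minimal sketch of write-behind caching with write coalescing (a toy model; the class and field names are invented): repeated writes to the same block are merged in RAM, so the back-end data carrier sees only one write per block when the cache is flushed.

```python
from collections import OrderedDict

class CachedDisk:
    """Write-behind cache with write coalescing and an LRU read cache."""
    def __init__(self, backend, cache_size=2):
        self.backend = backend        # block id -> data on the slow device
        self.cache = OrderedDict()    # LRU read cache held in RAM
        self.cache_size = cache_size
        self.dirty = {}               # pending writes, coalesced per block
        self.backend_writes = 0

    def write(self, block, data):
        # A later write to the same block simply replaces the earlier one.
        self.dirty[block] = data

    def flush(self):
        # Destage the coalesced writes: one back-end write per block.
        for block, data in self.dirty.items():
            self.backend[block] = data
            self.backend_writes += 1
        self.dirty.clear()

    def read(self, block):
        if block in self.dirty:                # newest data still in RAM
            return self.dirty[block]
        if block in self.cache:                # cache hit: refresh LRU order
            self.cache.move_to_end(block)
            return self.cache[block]
        data = self.backend[block]             # miss: fetch from the device
        self.cache[block] = data
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)     # evict least recently used
        return data

disk = CachedDisk({"a": "old"})
disk.write("a", "v1")
disk.write("a", "v2")   # coalesced with the write above
disk.flush()
print(disk.backend_writes, disk.read("a"))  # 1 v2
```

Two writes to block "a" cost only one back-end write, which is exactly the effect that reduces I/O latency and spares the SSD's write cycles.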
In consequence, applications can be executed more quickly, increasing the performance of disk storage by a factor of three to five. Caching write operations also increases the life of SSDs, because the drives only need to run a lower number of write and read cycles.

Data caches increase the performance of disk storage three- to five-fold and also extend the life span of SSDs. (Image: DataCore)
If companies prefer to keep the shared
storage close to the applications, setting up
a central virtual SAN is an interesting option.
With this option, in addition to storage for
the application servers, VMs also have
access to resources of the virtualization
nodes and to all other connected physical
storage systems, including components
such as DRAM caches, flash- or cloud-
based solutions from a single source. This
improves the scalability of the entire
infrastructure with regard to capacity and
performance even more.
Furthermore, businesses also benefit
from the fact that they can access
company-wide storage functions which
in the past were reserved for classic SAN
infrastructures and can automate and
manage them centrally from a console.
This includes storage pooling, auto-tiering, adaptive read/write caching and load balancing, in addition to a large number of other services.
Utilisation of the installed storage capacity can be improved with thin provisioning, while snapshots and continuous data protection (CDP) guarantee comprehensive protection of critical company data. Additionally, technologies
such as synchronous mirroring and
asynchronous replication ensure that
invaluable information for day-to-day
business operations is available to all
locations without the fear of downtime.
Installing a virtual SAN is an ideal solution if a medium-sized company is interested in moving towards software-defined storage without a heavy investment overhead.
Load balancing improves data flow
rates and response times
Load balancing is yet another component
used to prevent typical storage bottlenecks
such as the "blender effect". This term describes the recurring problem of many applications competing for shared storage resources at the same time in virtualised environments.
Classic hard-drive-based storage arrays
simply cannot handle this rush, nor the high
number of I/O-intensive access operations
and application performance suffers as a
result.
Automatic load balancing is an option to
correct this problem, which in conjunction
with auto tiering and caching, forms the
cornerstone of high performance.
Generally, we distinguish between two
methods. One option is to distribute the
load on the available front-end connections
between the application servers and the
storage virtualization node(s). The other is
to distribute the data load between various
physical hard drives within the pool.
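The first method can be sketched as least-queue-depth path selection, one common balancing policy (the path names and request counts here are invented):

```python
def pick_path(outstanding):
    """Return the front-end path with the fewest outstanding I/O requests."""
    return min(outstanding, key=outstanding.get)

# Hypothetical outstanding request counts per path between the
# application servers and the storage virtualization node.
queues = {"path-A": 7, "path-B": 2, "path-C": 5}
nxt = pick_path(queues)
print(nxt)  # path-B
queues[nxt] += 1  # dispatching the I/O deepens that path's queue
```

Because every new I/O is steered to the least-loaded path, no single connection becomes the bottleneck even when many virtual machines issue requests simultaneously; the same ranking idea applies to the second method, spreading data across the physical drives in the pool.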
Summary
Companies primarily interested in
finding practical solutions to increase
the performance of their overall storage
infrastructure should take a closer look at
SDS. If a fast, cost-effective entry in line
with the IT budget is required, this can be
realized by using a virtual SAN.
Because the software defines the functions, performance improvements can be gained across all storage options, completely independently of the manufacturer or the technology.

A virtual SAN improves the scalability of the entire infrastructure in terms of capacity and performance. (Image: DataCore)

In addition, they also have
the flexibility to integrate components
based on current developments into their
existing infrastructure at any time. As a
result, they can react to changes in the
performance requirements and speed up
the performance of critical tier 1 business
applications.
A recent global study conducted by TechValidate Research shows just how large the gains can be. The study found that 72 percent of companies that already rely on software-defined storage reported a three- to ten-fold performance gain. As far as capacity optimisation is concerned, they also verified good results: some 64 percent of the companies queried were able to reclaim over half of their over-provisioned, wasted storage space. The companies questioned also reported up to fourfold better utilisation of existing total capacity, and that they were able to use existing hardware for longer with no further investment in additional storage space. Typically, they demonstrated savings of between 25 and 75 percent. In practice, SDS is therefore a worthwhile investment for companies of any size. Tina Billo
Load balancing is an
additional component used
to prevent typical storage
bottlenecks.
(Image: DataCore)

DataCore At VMworld 2016DataCore At VMworld 2016
DataCore At VMworld 2016
 
Integrating Hyper-converged Systems with Existing SANs
Integrating Hyper-converged Systems with Existing SANs Integrating Hyper-converged Systems with Existing SANs
Integrating Hyper-converged Systems with Existing SANs
 
Fighting the Hidden Costs of Data Storage
Fighting the Hidden Costs of Data StorageFighting the Hidden Costs of Data Storage
Fighting the Hidden Costs of Data Storage
 
Can $0.08 Change your View of Storage?
Can $0.08 Change your View of Storage?Can $0.08 Change your View of Storage?
Can $0.08 Change your View of Storage?
 
The Need for Speed: Parallel I/O and the New Tick-Tock in Computing
The Need for Speed: Parallel I/O and the New Tick-Tock in ComputingThe Need for Speed: Parallel I/O and the New Tick-Tock in Computing
The Need for Speed: Parallel I/O and the New Tick-Tock in Computing
 
Solutions for Healthcare IT
Solutions for Healthcare ITSolutions for Healthcare IT
Solutions for Healthcare IT
 
Delivering First Class performance and Availability for Virtualized Tier 1 Apps
Delivering First Class performance and Availability for Virtualized Tier 1 Apps Delivering First Class performance and Availability for Virtualized Tier 1 Apps
Delivering First Class performance and Availability for Virtualized Tier 1 Apps
 
Optimizing The Economics of Storage: It's All About the Benjamins
Optimizing The Economics of Storage: It's All About the BenjaminsOptimizing The Economics of Storage: It's All About the Benjamins
Optimizing The Economics of Storage: It's All About the Benjamins
 
ESG Datacore SANsymphony-V Whitepaper
ESG Datacore SANsymphony-V WhitepaperESG Datacore SANsymphony-V Whitepaper
ESG Datacore SANsymphony-V Whitepaper
 
Uninterrupted access to Cluster Shared volumes (CSVs) Synchronously Mirrored ...
Uninterrupted access to Cluster Shared volumes (CSVs) Synchronously Mirrored ...Uninterrupted access to Cluster Shared volumes (CSVs) Synchronously Mirrored ...
Uninterrupted access to Cluster Shared volumes (CSVs) Synchronously Mirrored ...
 
Dynamic Storage Mobility for Hyper V
Dynamic Storage Mobility for Hyper VDynamic Storage Mobility for Hyper V
Dynamic Storage Mobility for Hyper V
 
How to Sell Storage Virtualization to The CIO
How to Sell Storage Virtualization to The CIOHow to Sell Storage Virtualization to The CIO
How to Sell Storage Virtualization to The CIO
 

Recently uploaded

why an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfwhy an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfjoe51371421
 
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected WorkerHow To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected WorkerThousandEyes
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...gurkirankumar98700
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software DevelopersVinodh Ram
 
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsUnveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsAlberto González Trastoy
 
TECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service providerTECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service providermohitmore19
 
Test Automation Strategy for Frontend and Backend
Test Automation Strategy for Frontend and BackendTest Automation Strategy for Frontend and Backend
Test Automation Strategy for Frontend and BackendArshad QA
 
Software Quality Assurance Interview Questions
Software Quality Assurance Interview QuestionsSoftware Quality Assurance Interview Questions
Software Quality Assurance Interview QuestionsArshad QA
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...kellynguyen01
 
Diamond Application Development Crafting Solutions with Precision
Diamond Application Development Crafting Solutions with PrecisionDiamond Application Development Crafting Solutions with Precision
Diamond Application Development Crafting Solutions with PrecisionSolGuruz
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...MyIntelliSource, Inc.
 
DNT_Corporate presentation know about us
DNT_Corporate presentation know about usDNT_Corporate presentation know about us
DNT_Corporate presentation know about usDynamic Netsoft
 
How To Use Server-Side Rendering with Nuxt.js
How To Use Server-Side Rendering with Nuxt.jsHow To Use Server-Side Rendering with Nuxt.js
How To Use Server-Side Rendering with Nuxt.jsAndolasoft Inc
 
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfThe Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfkalichargn70th171
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVshikhaohhpro
 
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...harshavardhanraghave
 
Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...OnePlan Solutions
 
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...MyIntelliSource, Inc.
 
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...ICS
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfkalichargn70th171
 

Recently uploaded (20)

why an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfwhy an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdf
 
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected WorkerHow To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑🏽‍❤️‍🧑🏻 89...
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software Developers
 
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time ApplicationsUnveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
Unveiling the Tech Salsa of LAMs with Janus in Real-Time Applications
 
TECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service providerTECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service provider
 
Test Automation Strategy for Frontend and Backend
Test Automation Strategy for Frontend and BackendTest Automation Strategy for Frontend and Backend
Test Automation Strategy for Frontend and Backend
 
Software Quality Assurance Interview Questions
Software Quality Assurance Interview QuestionsSoftware Quality Assurance Interview Questions
Software Quality Assurance Interview Questions
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
 
Diamond Application Development Crafting Solutions with Precision
Diamond Application Development Crafting Solutions with PrecisionDiamond Application Development Crafting Solutions with Precision
Diamond Application Development Crafting Solutions with Precision
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
 
DNT_Corporate presentation know about us
DNT_Corporate presentation know about usDNT_Corporate presentation know about us
DNT_Corporate presentation know about us
 
How To Use Server-Side Rendering with Nuxt.js
How To Use Server-Side Rendering with Nuxt.jsHow To Use Server-Side Rendering with Nuxt.js
How To Use Server-Side Rendering with Nuxt.js
 
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfThe Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTV
 
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
Reassessing the Bedrock of Clinical Function Models: An Examination of Large ...
 
Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...
 
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
 
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
 

Software Defined Storage Solves Performance Problems

Software-defined storage: The devil is in the detail
SDS is the framework of the future

Ask any market analyst and they will agree: software-defined storage is the platform of the future. The term "software-defined" is now used routinely to describe storage; you would be hard-pressed to find a major supplier who does not attach this keyword to its products. Yet what exactly does the concept imply, what advantages does it offer businesses that implement it, and at what point is it worth considering software-defined storage?

IDC, Forrester or Gartner: regardless of the analyst house, market experts universally agree that software-defined storage will become the de-facto platform for storage provision in the future. The main reason, they say, is that businesses of any size will always need more capacity to store their data. Moreover, requirements for performance and availability tend to grow in importance, depending on the applications in use. Yet companies only have limited funds available to invest in storage: according to the most recent research by leading analyst company 451 Research, the percentage of the overall IT budget invested in storage has in fact decreased over the last two years. To accommodate this purse-tightening, there is demand for solutions that can be scaled to fit exact needs, that offer businesses a higher degree of flexibility and that promise to save costs. This is where software-defined storage (SDS) comes in.

SDS: simply a marketing claim?

What exactly is SDS? The explanation provided by IDC's analysts can serve as an initial reference: they define software-defined storage as "a storage software stack installed on shared resources (x86 hardware, hypervisors or in the cloud) or on commercially available computer hardware". This provides the basis for "allowing the bundling of existing storage resources, the improvement of their utilization and the capability to structure a service-based infrastructure".

Manufacturers, by contrast, are still having a hard time finding a generally applicable definition, or even a willingness to agree on standards. This is understandable: storage hardware suppliers are, of course, primarily interested in continuing to sell their own systems successfully. In
the meantime, however, they continue to deliver models carrying an "SDS" label. More often than not, deployment does not bring about any change, because the required functions are still tied to a specific storage platform, typically through proprietary software. The system's own set of features can thus interact neither with new components nor with other manufacturers' systems. Needless to say, this contradicts the principles of SDS, where the software determines the functions of the storage entirely independently of the underlying devices or the chosen topology.

Storage virtualization serves as an SDS vehicle

Generally, manufacturers use storage virtualization techniques as a means to this end, typically integrating an abstraction level between the application server and the storage component. The result is that storage is no longer defined by physical limits but can instead be distributed more flexibly and accessed logically. This division between the physical and the logical brings several advantages: existing resources can be used more efficiently, expansions are easier to implement, data can be migrated without interruption, management can be centralised and new functions can be introduced at all levels.

Image caption: SDS solutions bundle all resources into a software layer used by all devices. This allows all performance criteria to be made generally available to all devices, irrespective of the existing hardware. (Image: DataCore)

Which of the numerous technical options a solution chooses depends primarily on the direction each manufacturer has decided to follow. In a SAN, for example, virtualization can take place by means of an in-band, out-of-band or split-path process, either in the host or in the storage controller of the storage system. With technology inherently tied to specific devices or models, we must generally accept that it only works properly with the systems offered by its particular manufacturer. For a long time now, one tried, tested and therefore effective alternative has been to revert to software-based solutions, which bundle every resource at a software level valid for all devices. The fewer products are bound to specific platforms and/or manufacturers, the better: all performance criteria can then be made available at all levels irrespective of the existing hardware, access to the storage systems can be controlled centrally, and the entire storage infrastructure can be managed uniformly from a single console.

Approaches that focus on hardware suffer from limitations

There is much to be said for integrating "cookie-cutter" functions and management solutions at the software level and replacing classic hardware-focused architectures with non-proprietary virtual and
software-defined approaches. There are quite a few reasons for doing so. Firstly, data volumes will continue to increase, making it difficult to determine how much storage space must be reserved in the medium term. Applications, sophisticated tier-1 applications in particular, place ever more demanding requirements on the storage infrastructure as workloads increase. Classic systems are not designed for this and are not flexible enough to adjust to changing conditions.

What makes things even more difficult is the limited useful life of the hardware, which for storage arrays averages around five, or at most seven, years. Businesses frequently purchase oversized storage so that they are equipped for any scenario during this period, an approach that does not allow the flexibility to react to new requirements. If capacity and performance then prove insufficient in day-to-day operation, expansions become necessary, combined with the procurement of additional devices that more often than not have to be managed separately or, at worst, require a complete change of architecture. This in turn creates even more problems. The result is a highly complicated, jumbled mess of storage environments that requires a great deal of effort to operate and manage. Additional hardware takes up more space, expenses for power and cooling rise, and maintenance costs increase in equal measure.

SDS frees businesses from technical constraints

From a technical as well as an economic perspective, old-fashioned hardware-based storage architectures will sooner or later reach their limits. With this in mind, software-defined storage represents a future-oriented conceptual approach that can be interesting for small and medium-sized businesses alike.
The future of SDS from the point of view of analysts

IDC: Based on a survey conducted by IDC, a majority of European businesses do in fact engage with SDS as a topic, yet so far only eight percent of them have implemented relevant solutions. Despite this, software-defined storage represents an attractive approach: 42 percent of the IT decision makers questioned in the survey consider software to be a key engine for innovation in the field of storage.

Gartner: Market researcher Gartner considers SDS a concept still in the making, but one that businesses should already be discussing now. From the analysts' point of view, one of the greatest benefits of SDS is the integration of hardware infrastructures that are not manufacturer-dependent, that are operated on the basis of SLAs and that can solve problems that once posed challenges to conventional data storage. Based on estimates by Gartner's analysts, however, it will take at least another ten years before SDS becomes prevalent on a large scale.

Forrester: According to Forrester, storage budgets can no longer keep up with the demand for storage. As a result, IT administrators are being challenged and are seeking solutions that allow them to make storage capacity and performance available as needed, preferably automatically. The analysts do not consider integrating additional platforms the best response available today, because in their opinion this would reinforce the silo mentality even further, making the storage environment still more complicated. Instead, they are convinced that the weak points of the conventional approach will only accelerate the introduction of SDS.

It is worthwhile to put some detailed thought into SDS, especially when it becomes necessary to purchase new storage hardware, or when the use of flash/SSDs or server and desktop virtualization projects are on the agenda. It is just as important when business continuity is a key topic of discussion, requiring a fail-safe, high-performance and highly available IT infrastructure as the basis for running business processes without interruption. Whichever of these scenarios applies: by separating storage services and functions from the devices, businesses are given the freedom to use standard software, irrespective of its type, for their
own purposes and to manage all their storage needs in software. In this way, existing traditional hard-drive storage can be combined with flash media and hybrid systems in storage architectures tailored to individual requirements. This is the key to replacing existing island solutions and finally being able to say goodbye to parallel block-oriented SANs, file-based NAS, separate backup and disaster-recovery systems, multiple hypervisors and isolated flash solutions.

SDS: A performance turbine for critical business applications

The classic storage systems of the past are no longer capable of satisfying the performance needs of critical, data- or transaction-focused business applications. This is why, over the years, adding flash media or solid-state disks (SSDs) has become a common option for increasing overall performance. However, integrating flash efficiently into existing environments still poses a challenge to IT managers. SDS-based architectures are able to solve these integration problems because fast storage can be integrated quickly, without complications and with almost no interruption, while existing components remain in use. Indeed, businesses can benefit from a large number of cross-platform functions and services designed to speed up and optimise performance: in addition to automatic tiering and load balancing, there are functions such as sophisticated caching methods. If the primary objective is to eliminate performance bottlenecks, SDS may thus prove to be the best approach. We explore the dos and don'ts in the following article.

Tina Billo

Image caption: The optimal use of various storage media allows storage infrastructure to operate at peak performance levels. (© vege - Fotolia.com)
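The automatic tiering mentioned above can be illustrated with a short sketch. This is a generic toy model of the technique, not any vendor's implementation: the software layer counts accesses per block and keeps the hottest blocks on the limited fast tier, regardless of which hardware supplies each tier.

```python
# Toy sketch of automatic tiering: count accesses per block and keep
# the hottest blocks on the (limited) fast tier. Illustrative only.
from collections import Counter

class AutoTiering:
    def __init__(self, fast_capacity):
        self.fast_capacity = fast_capacity  # blocks that fit on flash
        self.hits = Counter()               # block id -> access count
        self.fast_tier = set()              # blocks currently on flash

    def access(self, block):
        self.hits[block] += 1
        self._rebalance()

    def _rebalance(self):
        # The hottest blocks, up to flash capacity, belong on the fast
        # tier; everything else implicitly stays on (or returns to) disk.
        self.fast_tier = {b for b, _ in self.hits.most_common(self.fast_capacity)}

tiering = AutoTiering(fast_capacity=2)
for block in [1, 1, 1, 2, 2, 3]:
    tiering.access(block)
print(sorted(tiering.fast_tier))  # [1, 2]: the two hottest blocks sit on flash
```

Real products refine this with asynchronous data movement and per-tier I/O statistics, but the policy decision, hot data up, cold data down, is the same.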
Software-defined storage solves performance problems
Making optimal use of existing storage media

Big data, cloud computing, social media, mobile business: these trends have spurred exponential volumes of data that need to be recorded, processed and analysed. High-performance applications are required to assist, but storage systems also need to offer sufficient performance to store and back up the data they generate efficiently, guaranteeing high availability and manageable administration. Traditional hard-drive storage arrays quickly reach their limits here, so alternative paths are sought to work around them. This article provides an overview of the storage technologies available today, along with their advantages and disadvantages.

Image caption: Virtual workstations are on the rise. They demand better performance from the storage infrastructure. (Image: IDC)

While the computing power and networking speed that can be achieved have multiplied rapidly over the last decade, the only radical changes in storage systems have been in disk density and capacity; overall performance has hardly moved. The rotational speed of traditional mechanical hard drives, for example, has been stuck at 15,000 rpm since 2000, and due to physical limitations we can hardly expect further developments in this regard. This opens a glaring performance gap between the CPU and the attached storage, which has been posing a problem for businesses for quite some time.

One reason is that the data volumes to be processed have grown out of all proportion: the average annual growth rate is between 40 and 45 percent. The use of mobile devices has exploded in recent years, and this tendency will continue, with social networks and cloud services propagating ever greater volumes of data on the go. Simply recording and storing this data is only one small part of the puzzle; evaluating and managing it presents an ever greater ongoing challenge.
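The growth figure above is worth translating into a timescale. A quick calculation, using the article's own 40-45 percent range, shows why static capacity planning falls behind so quickly:

```python
# At an annual growth rate r, stored volume satisfies (1 + r)^t = 2
# after t years; solving for t gives the doubling time.
import math

def doubling_time_years(annual_growth_rate):
    return math.log(2) / math.log(1 + annual_growth_rate)

print(round(doubling_time_years(0.40), 1))  # 2.1
print(round(doubling_time_years(0.45), 1))  # 1.9
```

In other words, at this pace the data volume roughly doubles every two years, well within the five-to-seven-year lifetime of a typical storage array.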
Decision makers hope to obtain important insights from their data for their own business's benefit, so that they can stay ahead of the competition. Such levels of understanding are only gained through data mining with sophisticated applications, which come at the price of high performance needs: high throughput and low latency. Traditional storage systems cannot keep up, and why would they? They are based on architectural models that are twenty years old and were never designed for such extreme workloads. The gap between processor speed and storage speed creates a bottleneck here.

Bottleneck storage: Server and desktop virtualization require more powerful systems

At the same time, virtualization technologies are being adopted by businesses with increasing momentum. Based on a survey conducted among IT decision makers, analysts at Forrester estimate that by now 77 percent of all companies around the world are working with virtualised servers. Virtual desktop infrastructures (VDI) are on the rise as well: a 2013 IDC study showed that 27 percent of European companies already have virtual workstations set up, with a further 20 percent discussing their implementation and another 27 percent trialling an introduction.

However, simultaneously executing applications in virtual machines (VMs) creates a pattern of mixed workloads and randomly distributed data accesses. Classic disk storage proves to be a stumbling block here, because it does not deliver sufficient I/O performance to read and write the data fast enough. Even though IOPS performance can be increased by integrating additional disks, many businesses feel that costly undertakings of this kind are no longer appropriate in this day and age.

The end justifies the means

More recent concepts and solutions are defining what is yet to come.
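The spindle problem described above is easy to quantify. The per-device IOPS figures below are typical ballpark values for random I/O, not measurements of any specific product:

```python
# How many devices does a random-I/O workload need?
import math

def devices_needed(required_iops, per_device_iops):
    return math.ceil(required_iops / per_device_iops)

workload_iops = 20_000        # e.g. a VDI morning "boot storm"
hdd_15k_iops = 180            # a single 15,000 rpm drive, random I/O
ssd_iops = 50_000             # a single enterprise SSD

print(devices_needed(workload_iops, hdd_15k_iops))   # 112 spindles
print(devices_needed(workload_iops, ssd_iops))       # 1 SSD
```

Buying a hundred spindles purely for IOPS, with most of their capacity idle, is exactly the kind of costly undertaking the article refers to.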
This includes, among other things, creating more storage space and/or increasing performance by adding further devices ("scaling out") or by upgrading existing systems with additional components ("scaling up"). In the latter case, IT managers generally lean towards solid-state storage, which uses NAND flash as the storage medium in the form of solid-state disks (SSDs) and flash modules. Where performance is concerned, these are far superior to HDDs and therefore cope well even with a large number of random read and write operations, such as those common in virtualised environments.

[Figure: Convergent systems combine server, storage and network technologies. These "out-of-the-box" data processing centres come with the promise of improved application performance. (Image: IDC)]

Another option to achieve improved application performance is to consider
convergent systems, which combine server, storage and networking technologies. According to IDC, these virtualised "out-of-the-box" processing centres are becoming increasingly appealing: about 16 percent of the companies queried in a 2013 survey had already implemented a converged approach, and a further 53 percent were considering it. Companies also expect better utilisation of existing systems and higher storage performance through storage virtualization, implemented either on systems already in operation or via a software solution.

Software-defined storage (SDS) is deemed to be the next logical step. Once again the focus is on introducing an abstraction level between the applications and the physical devices, with the aim of logically combining resources for shared access. With so many options on the market promising to improve the performance of storage infrastructures, companies need to give due thought to which approach is ultimately the most appropriate.

SDS: Boosted performance for storage infrastructures

If the decision is to set up a software-defined storage environment, companies can choose between two options. They can either use the solutions hardware manufacturers offer for their own platforms, or they can choose a purely software-based, device-independent approach. The first option carries the very real risk that functions are partially or entirely tied to the components of one hardware brand and cannot be made available across the board. A pure software-defined storage approach, by contrast, consolidates all storage resources, services and management processes and offsets proprietary limitations and incompatibilities. Intelligence and functions move into the software, which provides an autonomous virtual intermediate layer detached from physical hardware restrictions.
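The role of such a virtual intermediate layer can be illustrated with a minimal sketch. The class names (`BlockDevice`, `VirtualStoragePool`) and the naive placement policy are hypothetical and not taken from any vendor's product; the point is only that applications address one pool interface while software decides which physical device actually holds each block.

```python
# Minimal sketch of a software-defined abstraction layer: applications
# talk to one virtual pool; the pool routes block I/O to heterogeneous
# physical devices. Names and policy are illustrative, not a real API.

class BlockDevice:
    """A physical storage device, e.g. an SSD or an HDD shelf."""
    def __init__(self, name, capacity_blocks):
        self.name = name
        self.capacity = capacity_blocks
        self.blocks = {}              # block id -> data

    def write(self, block_id, data):
        self.blocks[block_id] = data

    def read(self, block_id):
        return self.blocks[block_id]

class VirtualStoragePool:
    """Presents all devices as one logical address space."""
    def __init__(self, devices):
        self.devices = devices
        self.mapping = {}             # logical block -> device

    def write(self, block_id, data):
        device = self.mapping.get(block_id)
        if device is None:
            # Simplest placement policy: first device with free space.
            device = next(d for d in self.devices
                          if len(d.blocks) < d.capacity)
            self.mapping[block_id] = device
        device.write(block_id, data)

    def read(self, block_id):
        return self.mapping[block_id].read(block_id)

pool = VirtualStoragePool([BlockDevice("ssd-1", 2), BlockDevice("hdd-1", 100)])
pool.write(0, b"hot data")
pool.write(1, b"warm data")
pool.write(2, b"cold data")       # ssd-1 is full, so this lands on hdd-1
print(pool.read(2))               # b'cold data'
print(pool.mapping[2].name)       # hdd-1
```

The application never sees the device boundary; swapping an HDD for an SSD, or changing the placement policy, requires no change on the application side, which is exactly the decoupling the text describes.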
This offers the distinct advantage that all storage media, irrespective of format, become available via a standardised, centralised platform. That platform provides, for example, automated storage tiering, caching and load-balancing processes, all of which serve a single purpose: to exploit the performance potential of the different resources as fully as possible and to speed up applications.

Using auto-tiering to meet application requirements perfectly

Thanks to high data transfer rates and extremely short access times, solid-state disks and flash technologies are the ideal way to counteract the increased performance requirements of business-critical applications. Yet performance does not come cheaply: fast storage is still much more expensive to purchase than classic hard drives. This is why businesses rely on solutions that let them use costly storage space in the most effective and economically feasible way.

[Figure: Data blocks with high access rates are automatically migrated to faster SSDs, and less active ones to slower mass storage, using the software-controlled automatic tiering process. (Image: DataCore)]

This can be achieved by using software-controlled
auto-tiering. For this purpose, storage media are consolidated into virtual storage pools, which are organised and profiled into separate storage classes, or "tiers", according to their price-to-performance characteristics. Using pre-defined criteria, such as the age of the data and its degree of utilisation, intelligent mechanisms then ensure seamless placement on the most suitable storage type at block level, based on cost and performance. Data blocks with high access rates are automatically migrated to faster SSDs, while less active ones are relegated to slower mass storage, following pre-defined rules. By consistently monitoring I/O performance and accounting for all competing I/O requests, the software automatically assigns demanding, latency-sensitive workloads to fast storage media and directs workloads that are not time-critical to slower, more cost-effective ones.

To cushion foreseeable peak loads that recur at busy times, virtual hard drives can also be statically assigned to a high-performance tier. As soon as its capacity is exhausted, the loads can be switched to a lower level. This makes it possible to meet the performance and availability requirements of critical application workloads, to speed up response times and to greatly accelerate business-critical tier 1 applications across the entire infrastructure, irrespective of the underlying hardware.

Faster data reading and writing using caching

More technically sophisticated storage virtualization solutions take this a stage further by exploiting the strengths of caching to increase access speeds. If the selected software runs on x86-64 standard servers, the devices connected in a storage pool can use the DRAM main memory and the I/O resources of each node as a high-performance "mega cache".
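The effect of such a RAM cache can be sketched in a few lines. This is a deliberate simplification, assuming a plain LRU cache (built on Python's `collections.OrderedDict`) in front of a slow backend; real products combine it with read-ahead and write coalescing.

```python
from collections import OrderedDict

# Minimal sketch of a DRAM read cache in front of slow back-end disks:
# frequently read blocks are served from memory, relieving the disks.
# Illustrative only; not modelled on any specific product.

class CachedPool:
    def __init__(self, backend_read, cache_blocks):
        self.backend_read = backend_read   # function: block_id -> data
        self.cache = OrderedDict()         # LRU order: oldest entry first
        self.capacity = cache_blocks
        self.hits = self.misses = 0

    def read(self, block_id):
        if block_id in self.cache:
            self.cache.move_to_end(block_id)   # mark most recently used
            self.hits += 1
            return self.cache[block_id]
        self.misses += 1
        data = self.backend_read(block_id)     # slow disk access
        self.cache[block_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)     # evict least recently used
        return data

disk = {n: f"block-{n}" for n in range(1000)}
pool = CachedPool(disk.__getitem__, cache_blocks=64)
for _ in range(5):                 # a hot working set, read repeatedly
    for block in range(32):
        pool.read(block)
print(pool.hits, pool.misses)      # 128 32: only the first pass hits disk
```

Because the working set fits in the cache, four of the five passes are answered entirely from memory, which is the latency-reduction effect described above.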
Part of the physical server's RAM is made available to answer incoming application requests directly. Frequently read blocks remain in this intermediate store, which relieves the load on the back-end data carriers and reduces I/O latency. Moreover, established caching techniques such as read-ahead, write-behind and the consolidation of random writes into sequential disk I/O ("write coalescing") can be applied across the board. As a consequence, applications execute more quickly, increasing the performance of disk storage by a factor of three to five.

[Figure: Data caches increase the performance of disk storage three- to five-fold and also extend the life span of SSDs. (Image: DataCore)]

Caching write operations also extends the life of SSDs, because they
only need to perform a lower number of write and read cycles.

If companies prefer to keep shared storage close to the applications, setting up a central virtual SAN is an interesting option. In this setup, in addition to the storage of the application servers, VMs also gain access to the resources of the virtualization nodes and to all other connected physical storage systems, including components such as DRAM caches and flash- or cloud-based solutions, from a single source. This improves the scalability of the entire infrastructure, in terms of both capacity and performance, even further. Businesses also benefit from company-wide storage functions which in the past were reserved for classic SAN infrastructures, and can automate and manage them centrally from a single console. These functions include storage pooling, auto-tiering, adaptive read/write caching and load balancing, along with a large number of other services. Utilisation of the installed storage capacity can be improved with thin provisioning, while snapshots and continuous data protection (CDP) guarantee comprehensive protection of critical company data. Additionally, technologies such as synchronous mirroring and asynchronous replication ensure that the information vital to day-to-day business operations is available at all locations without fear of downtime.

Installing a virtual SAN is the perfect solution for a medium-sized company that wants to move towards software-defined storage without a heavy investment overhead.

Load balancing improves data flow rates and response times

Load balancing is yet another component used to prevent typical storage bottlenecks such as the "blender effect". This term describes the recurring problem in virtualised environments of many applications competing for shared storage resources at the same time.
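Automatic load balancing counters this contention by spreading I/O over several front-end paths between the servers and the storage nodes. A minimal sketch, with hypothetical path names and a simple least-outstanding-requests policy chosen purely for illustration:

```python
# Minimal sketch of front-end load balancing: each I/O request is sent
# down the path with the fewest outstanding requests, so no single
# connection between servers and storage nodes becomes a bottleneck.
# Path names and the queue model are illustrative assumptions.

class Path:
    def __init__(self, name):
        self.name = name
        self.outstanding = 0          # requests currently in flight

class LoadBalancer:
    def __init__(self, paths):
        self.paths = paths

    def submit(self, request):
        # Pick the least busy path for this request.
        path = min(self.paths, key=lambda p: p.outstanding)
        path.outstanding += 1
        return path.name              # where this request was routed

    def complete(self, name):
        # A response arrived; free up capacity on that path.
        next(p for p in self.paths if p.name == name).outstanding -= 1

lb = LoadBalancer([Path("fc-1"), Path("fc-2"), Path("fc-3")])
routes = [lb.submit(f"io-{n}") for n in range(6)]
print(routes)   # requests spread evenly: each path receives two
```

The same least-busy idea applies to the second method the text mentions, distributing data across the physical drives of a pool, with drives taking the place of paths.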
Classic hard-drive-based storage arrays simply cannot handle this rush or the high number of I/O-intensive access operations, and application performance suffers as a result. Automatic load balancing, which together with auto-tiering and caching forms a cornerstone of high performance, is one way to correct this. Generally, two methods are distinguished. One is to distribute the load across the available front-end connections between the application servers and the storage virtualization node(s). The other is to distribute the data load across the various physical hard drives within the pool.

[Figure: A virtual SAN improves the scalability of the entire infrastructure in terms of capacity and performance. (Image: DataCore)]

Summary

Companies primarily interested in practical ways of increasing the performance of their overall storage infrastructure should take a closer look at SDS. If a fast, cost-effective entry in line with the IT budget is required, this can be realised with a virtual SAN. Because the software defines the functions, performance improvements can be gained across all storage options, completely independently of the manufacturer or the
technology. In addition, companies have the flexibility to integrate components into their existing infrastructure at any time, in line with current developments. As a result, they can react to changing performance requirements and speed up business-critical tier 1 applications.

A recent global study conducted by TechValidate Research shows just how large the gain can be. In the study, 72 percent of the companies already relying on software-defined storage reported a three- to ten-fold performance increase. They reported similarly good results for capacity optimisation: some 64 percent of the companies queried were able to reclaim over half of their over-provisioned, wasted storage space. In doing so, the companies questioned improved the utilisation of their existing total capacity by up to four-fold and were able to use their existing hardware for longer, with no further investment in additional storage space required. Typically, they demonstrated savings of between 25 and 75 percent. SDS in practice is therefore a worthwhile investment for companies of any size.

[Figure: Load balancing is an additional component used to prevent typical storage bottlenecks. (Image: DataCore)]

Tina Billo