This supplement is an independent publication from Raconteur Media
For smart CIOs, the current economic climate offers an opportunity to shine, writes …

Where next for efficient IT? It's a question plenty of CIOs are asking right now. After all, those who have done their jobs well in the last few years have already done much to drive out cost from bloated IT infrastructures and deliver more efficient, high-quality IT services to the business. For many, that has involved fundamental shifts in their approaches to architecture, to sourcing and to organising available talent to best effect.

As a result, many CIOs may now feel that they have little room left to move on the cost side of the equation. What the situation calls for is a whole new, 'super-charged' approach to efficient IT. It's not enough for CIOs to simply maintain their current focus on driving costs out of the current infrastructure. They need to radically rethink how IT resources are sourced and utilised, in order to make a quantum leap in efficiency.

For many, that will mean delving deeper into three technologies. First, virtualisation needs to become an overarching data centre design principle, rather than a handy way to address immediate tactical issues. Second, automation needs to become …

continued on page three
Growing pains
As digital information explodes, how are smart companies using information management to get the most out of storage capacity? page 4

A cloud of our own
Organisations reluctant to release private data onto public IT infrastructures can still reap the benefits of cloud computing. page 8

Walls come tumbling down
When data lives in the cloud, traditional approaches to information security no longer offer adequate protection from threats. page 14
Three’s a cloud
EMC, Cisco and VMware come together to shape the future of computing - centre pages
4 EFFICIENT IT
As digital information explodes, how are smart companies preparing to take the strain? Gareth Kershaw reports

In 2007, the 'digital world', consisting of all the data produced and replicated across the globe, was estimated at 281 exabytes, or 281 billion gigabytes (GB), in size by analysts at IT market research company IDC. That's around 45GB of information for every person on the planet. By 2011, they predict it to grow to over ten times its current size.

The technology industry is no stranger to unrest and upheaval - but for many organisations, keeping pace with that kind of growth has become more onerous in recent years than ever before, putting skills in information and storage management to the test and pushing demand for storage capacity to new heights.

In fact, the explosion of digital information is just one of the problems with which IT professionals are having to contend. At the same time, they are also wrestling with new legal and regulatory demands that dictate what data must be kept, for how long and how quickly it must be retrieved. And they must find smarter, more innovative tools and technologies to address these challenges, against a backdrop of widespread IT spending cuts.

As a result, current pressures demand a complete reassessment of information management strategies at many organisations. And that will be no easy task - particularly because it will require a complete break from recent practice when it comes to making storage and information management investments.

The decade preceding the recent economic slowdown was defined by untrammelled spending on storage. During those years, the cheaper storage capacity became, the more companies seemed compelled to buy. Now that the financial outlook is less positive, it's time for businesses to be more realistic.

TIME FOR A RETHINK
In fact, it's a case of "back to the future" for storage and information management, says Dr Graham Oakes, an independent technology consultant who has provided advice on storage strategies to organisations including Oxfam, government-owned savings bank National Savings & Investments (NS&I) and the Office of the Deputy Prime Minister. Organisations must revert to the more stringent levels of scrutiny that were applied to purchases back when storage media and systems were still relatively costly, he says.

In fact, those organisations that don't take a step back from rampant storage acquisition and haphazard information management practices may flounder sooner than they expect. Many data centres are rapidly running out of space and power, so while UK firms need to grow their storage capacity, many can't expand beyond their current physical or energy footprint.

As a result, they will be forced to get more not only from existing storage systems, by boosting utilisation rates and jettisoning redundant and duplicate information, but also from available data centre space, by consolidating storage capacity into fewer, more efficiently utilised systems.

A THREE-PRONGED ATTACK
In the drive for greater efficiency, say industry watchers, organisations will battle that storm on three fronts: complexity, cost and automation.

In terms of complexity, for example, there is a notion that high-profile initiatives like storage consolidation and virtualisation have greatly simplified and demystified storage infrastructures.

The growth in corporate information is putting both skills and storage capacity to the test at many …

Gareth Meatyard (EMC): "Users need seamless access to archived content and proactive information management"

HITTING THE MAIL ON THE HEAD

One of the most visible symbols of the explosion in digital information, email can also be one of the trickiest to manage. As a communications medium, its availability is now taken entirely for granted by users at most companies; but for hard-pressed IT departments, the storage and management of emails pose a number of significant challenges.

For a start, emails need to be auditable, searchable and easily retrievable for operational and compliance purposes. Email systems, meanwhile, are expected to be up and running 24 hours a day, 7 days a week and 365 days a year. Business users now find it almost impossible to do their jobs without email, so stringent business continuity plans are paramount.

That's perhaps why many organisations' email management strategies have moved beyond basic back-up to a more holistic approach, based on wider information management principles.

For a start, strong archiving practices are a must-have, says David Parkin, director of sales for EMEA at security specialist Sunbelt Software. "Approximately 80 per cent of businesses now use email for closing orders and performing transactions, making them subject to statutory records retention requirements. But exactly what should be stored and for how long is poorly understood by most businesses," he says.

In fact, robust information management policies should be applied long before the archiving stage. For example, it's particularly important to eliminate duplicate copies of emails before they are archived, which is why organisations are increasingly deploying de-duplication technology, which scans each email, assigns it a unique tag (much like a fingerprint), indexes and retains it. Redundant or duplicate copies are simply deleted.

"Now, more than ever, cost containment is a key concern. Data reduction technologies for primary data and secondary copies of data (backup and disaster recovery copies) can help drive down costs by using less storage, and can, perhaps, extend the useful life of currently deployed solutions," says a recent Gartner report.

Just as vital is the ability to retrieve email rapidly - especially important if a regulator or potential opponent in a law case comes knocking. That's why many companies, particularly those working on e-discovery projects, are prioritising access and availability of stored emails, says Gareth Meatyard (pictured above), product specialist for EMC's recently announced SourceOne products.

"Users need seamless access to archived content and proactive information management to help with litigation readiness, including a central archive to accelerate large-volume discovery searches and enable secure legal hold," he explains.

The EMC SourceOne product family is a suite of information governance and integrated content archiving solutions that share a common goal: to help organisations manage their information resources intelligently - for the highest return on investment, at the lowest risk, and for maximum competitive advantage.

Within this family, EMC SourceOne Email Management aims to support proactive e-discovery, email retention policies and cost-efficient tiered storage in high-volume email environments. It provides all core email archiving capabilities for Microsoft Exchange and IBM Lotus Notes/Domino environments, as well as instant messaging.

"Many of the email archiving solutions written ten or more years ago have been challenged to meet very large mailbox environments (that is, 50,000 or more mailboxes)," says Laura DuBois, an analyst at IT market research company IDC. "The larger the environment, the more strain the system architecture faces from ingestion performance, indexing speed, database scalability, index integrity, as well as search and policy management." EMC SourceOne Email Management, she adds, "offers a next-generation archiving architecture to meet these challenges."
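The de-duplication approach described above - scan each email, assign it a unique fingerprint-like tag, index and retain one copy, and delete the redundant ones - can be sketched in a few lines. This is an illustrative sketch only; the class and function names below are invented for the example, not taken from any product:

```python
import hashlib

def fingerprint(message: bytes) -> str:
    """Assign a unique tag to a message, much like a fingerprint."""
    return hashlib.sha256(message).hexdigest()

class SingleInstanceArchive:
    """Retain one copy of each unique message; duplicates become references."""

    def __init__(self):
        self.store = {}   # tag -> message body, kept exactly once
        self.index = []   # one tag per archived message, in arrival order

    def archive(self, message: bytes) -> str:
        tag = fingerprint(message)
        if tag not in self.store:
            self.store[tag] = message   # first copy: retain it
        self.index.append(tag)          # duplicates add only a reference
        return tag
```

Archiving the same message twice stores its body once but records two index entries, so every mailbox that held a copy can still retrieve it.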
Chris Gabriel, director of solutions marketing at systems integration company Logicalis, notes that, without the right "mindset", such technologies can spawn more information management problems than they solve. "The thing is, virtualisation ain't new and - shock, horror - it's actually not that clever," he says. "Yes, it allows you to put more [data] onto less [storage], and in today's increasingly data-based lifestyle, that's undoubtedly a benefit. However, in having access to seemingly endless storage capacity, it's easy to get lazy and slip back into bad habits."

The point, he says, is that it isn't how a company stores its data, it's how it manages it that makes the difference.

If companies wish to tackle the second issue - cost - then it's time they stopped treating their storage systems like a garage, "a place they chuck things because they don't want to throw anything away", says Darren Thomas, global vice president and general manager for enterprise storage at systems company Dell. Instead of old bicycles and boxes, he says, enterprise storage systems are full of junk data that will never be needed again.

Here, data deduplication technology can be a big help, according to Dennis Ryan, partner sales development manager at EMC in EMEA. This works to detect and eliminate information that is already stored elsewhere in an organisation's storage infrastructure.

Today's deduplication can work at a number of different levels, he says. With file-level deduplication, for example, one copy of a file is retained as a reference and all other copies of the file are replaced with a unique identifier, or 'pointer', to the file. "This approach lends itself well to data retention policies, where retention requirements are applied to the reference copy and adopted by all applications using the file," says Ryan.

Object-level de-duplication, meanwhile, can be applied not just to a single file, but also to collections of files. "This type of de-duplication is usually associated with compliance projects," he says.

Finally, block-level de-duplication breaks data into small blocks, or 'chunks', and assigns a unique identifier to each chunk, says Ryan. This kind of de-duplication, he says, is largely relevant in back-up and restore environments today.

The cost issue can also be tackled by using storage disks of varying performance and capacity to store information according to its business value. In 'storage tiering' scenarios, says Ryan, "current data can be stored on high-performance disk drives and older data can be archived to very large, lower-performing drives."

When it comes to implementing storage tiering, organisations can choose a static approach or an active approach - or a combination of the two.

In a static scenario, different types (or 'volumes') of data, relating to the same application, are stored on different disk drive types. So the log files, index files and tables that make up a database - but which are accessed with different degrees of frequency - all reside in different places. "We refer to this as 'static', because the different volume types will always reside on their respective disk types, rather than move between tiers," he explains.

In active scenarios, by contrast, data regularly moves between tiers of storage, depending on a number of factors, with "age being the most common", says Ryan.

"Many users implement the static approach and combine this with the most common element of active tiering - archiving," says Ryan. "Traditionally, data was archived to tape and optical media, but current legislative requirements make these unsatisfactory for rapid information retrieval, e-discovery and other modern boobytraps," he says.

Which brings us to the third key target that CIOs are looking to address: automation. Darren Thomas of Dell calls it "the factor that's truly driving today's market", taking its place alongside more established information management drivers of "scale, capacity and performance".

Today's storage automation technology aims to take day-to-day storage decisions and tasks out of the hands of hard-pressed IT staff and automatically allocate data and information to different storage tiers, according to pre-defined rules relating to their business value. Automation tools for storage virtualisation, thin provisioning and tiered storage are three hot tickets in this space.

A more intelligent approach to information management and storage needs to take all three 'storm fronts' into account. But wider economic conditions notwithstanding, it's a great time for companies to be thinking about re-architecting their storage infrastructures, because the efficiency gains and increased value that can be achieved will make maximum impact on efficiency-focused businesses.

Enterprise storage systems are full of junk data that will never be needed again
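The age-based tiering rules described here - current data on high-performance drives, older data moved down to larger, slower ones - amount to a simple lookup against age thresholds. The tier names and thresholds below are invented for illustration; real automation products let administrators define their own rules:

```python
from datetime import datetime, timedelta

# Invented tier names and age thresholds, for illustration only.
TIER_RULES = [
    (timedelta(days=30),  "high-performance disk"),
    (timedelta(days=365), "high-capacity disk"),
]
ARCHIVE_TIER = "online archive"

def assign_tier(last_accessed: datetime, now: datetime) -> str:
    """Choose a storage tier from the data's age, checking rules youngest-first."""
    age = now - last_accessed
    for threshold, tier in TIER_RULES:
        if age <= threshold:
            return tier
    return ARCHIVE_TIER
```

Run periodically over a catalogue of files or volumes, a rule table like this is what lets automation tools move data between tiers without day-to-day intervention from IT staff.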
Virtualise to capitalise

The benefits of virtualisation are an issue that no organisation with an eye on efficiency can afford to ignore, says Guy Clapperton

Rotterdam, where city officials are keen advocates of virtualisation
It is a medical fact that only a relatively small part of the human brain is used during the average lifetime. Nobody knows what purpose the rest is meant to serve.

It would be wrong to say the same rule applies to computers, but it's true that the tasks of an individual PC - or an array of servers, or any other combination - may be better served by 'virtualising' a system of software onto another physical system.

So, on a corporate level, a massive amount of storage might sit on physical systems at a remote location that are shared with another enterprise, making one set of servers work as two 'virtual' sets.

On a much smaller level - the smallest possible - your correspondent is writing on an Apple computer that has a virtual PC running in one window, so one computer is acting as two.

That idea is catching on at many businesses, but the main benefits have been noted at the higher end so far, says Rene Millman, senior analyst with Gartner. "This has made a beachhead in the large enterprise, where server utilisation has traditionally been low and organisations are looking to extract maximum usage from their infrastructure," he says. "As usual, it is the financial sector that has embraced this model of computing [first]."

The IT industry is now eyeing the mid-market as the next likely area of massive growth. "Virtualisation has come to the fore over the last five years as computing power far outstrips what the operating systems and the applications that run atop them are capable of," says Millman.

BOOSTING UTILISATION RATES
It's happened because the technology to multitask without harming core tasks has become available, making IT systems far more efficient. "Servers run with many more processing cores than before and thus utilisation of these resources has been low. The need to run one application per operating system so it doesn't interfere with other apps means that, for the most part, the server is idling. Virtualisation solves that problem, with many operating systems running on the same server without each operating system instance impacting on the other."

Vendors of virtualised solutions concur. Serguei Beloussov, chief executive of Parallels, which also works in the related cloud computing arena, points to the many downsides of having a dedicated server for every mission-critical application in every department: "This approach has led to organisations accumulating a number of servers, each needing power to run and cool them, while most are significantly under-utilised. Virtualisation addresses this problem by enabling businesses to simultaneously run multiple, isolated workloads on one physical server, so several applications can be safely hosted on one box. This leads to better utilisation of the hardware, reducing the amount of boxes needed, resulting in far greater energy efficiency. The more virtual servers you can run on one box, the more energy saved, so density and efficiency is key."

Solid business benefits start to accrue quickly when this sort of technology model is in use. Take, for example, the City of Rotterdam, which employs some 1,000 people in its Dienst Stedenbouwe en Volkshuisvesting (DS+V) department and is responsible for town planning, housing and traffic.

In 2004, the department implemented the open-source operating system, Red Hat Enterprise Linux 3, for a few applications, but stuck with a combination of Windows and Unix for the main platform, using 40 servers to run the combination of Microsoft, Unix and Linux. When it came to putting a new administration and registration application in for the City's real estate activities, however, the Council started looking at a virtualised system based exclusively on Linux.

After a period trying the system on a pilot basis, the organisation migrated its servers to Red Hat Enterprise Linux 5 and was able to use virtualisation to run 10 virtual machines across just six servers. The council used Red Hat Global File System for storage virtualisation and Red Hat Satellite Server for faster deployment of both physical and virtual systems on the network.

"One of the key benefits of Red Hat Enterprise Linux 5 with virtualisation is that we can install and roll out a new application in 60 minutes to all our systems, compared to four hours per system previously," says Hennie Stam, senior system administrator at DS+V.

WHO OWNS THE CLOUD?
Others see further potential benefits. Fredrik Sjostedt, director of product marketing EMEA at virtualisation company VMware, believes this is a staging point on the way to cloud computing. "One of the big issues with cloud computing so far has been the use of proprietary cloud platforms, which make it very difficult for an IT department to move their workloads into the cloud," he says. "Then you have the problem of vendor lock-in, whereby a customer that has gone to the effort of re-writing an application for the cloud finds it too difficult to change providers."

Because virtual machines are hardware independent and portable, he says, virtualisation can help customers to move their applications between their own data centres - or the internal cloud - and external clouds. "This idea of federation between internal and external clouds based on virtualisation is where we are focusing a great deal of our development efforts," he adds.

Inevitably, there are a handful of caveats. Chris Coulter is a partner at City lawyers Morrison & Foerster and, although he recognises the considerable benefits of virtualisation, he has concerns. "Depending upon how [virtualised systems] are used, there may be issues for either the user or the vendor or both regarding data security, privacy and other legal compliance," he says. "Since virtualisation depends upon moving data around the world, perhaps splitting it up and sending it to different locations, depending on capacity, use and bandwidth, then it's much more difficult for the user to know where the data is held."

It gets even more complicated when the data holder is regulated by the financial authorities, he adds, in which case, different regulations will apply in different territories.

In a period when increased efficiency is a major corporate goal across the board, however, the benefits of virtualisation remain clear. On the face of it, it's a no-brainer. Does your enterprise want to buy hundreds of servers and systems - or tens that can behave like hundreds, with all the savings in time and energy that go with that? For many CIOs, it's not an issue that requires much thought.

From virtualisation to the Cloud

Having embraced virtualisation, some organisations are using early wins in the area to start exploring cloud computing in more depth. One example is the Pensions Regulator, the UK government body charged with overseeing work-based pensions schemes. There, virtualisation technology from VMware has allowed IT staff to decommission over 40 physical servers and cut power and cooling costs by 30 per cent.

It's a great start, but the journey is far from over, says Ray Heffer, technical infrastructure manager at the Pensions Regulator. "Perhaps the most critical service we support is a pensions web portal, 'Exchange', for pensions scheme administrators across the country, and we have already taken steps towards a cloud approach to support this, using VMware."

The physical infrastructure and virtualisation technology needed to support the 24x7 portal is provided by a hosting provider, he explains, "but crucially, we can monitor and maintain the virtual machines running on that infrastructure centrally, as if they were within our own data centre."

"This has been such a success that we are now looking at using this hosting facility for offsite disaster recovery purposes in the future," he adds.

The virtualisation promise

With a virtualised environment, organisations have the opportunity to get their entire IT infrastructure running as a single pool of highly efficient computing resources. One application per operating system no longer applies. Source: VMware
First, it enables organisations to
dramatically increase the utilisation of
that physical infrastructure. “In many
companies, utilisation rates for servers
and storage systems hover at around
10 per cent. The virtualisation capa-
bilities of a private cloud infrastruc-
ture can push these up to 70 per cent
plus. So straight away, you’re reducing
the costs associated with information
management and storage.”
Second, with the appropriate informa-
tion management tools in place, he says,
organisations can move that informa-
tion around the infrastructure, accord-
ing to that information’s overall value
to the business. “In a private cloud, data
can be moved and manipulated more
freely - that could involve, for example,
the migration of data from operational
systems to a data warehouse. An organi-
sation that can achieve more agility with
its information is in a better position to
analyst and interpret it. In other words,
they can more easily turn data into in-
formation.” (For more on information
management and storage challenges, see
article on page 4, ‘Growing pains’.)
Naturally, any talk of cloud comput-
ing raises inevitable questions about
data security. How do you lock down
data when it resides not behind a tra-
ditional firewall and subject to stand-
ard network security approaches, but
somewhere out there in in the cloud?
Approaches are emerging that aim to Thomas Bittman, Gartner
tackle this issue head-on, but for now,
it’s sufficient to say that organisations
need to move to a model whereby the
security of data and information takes
WhAT The AnAlysTs sAy
priority over infrastructure-centric
approaches. (For more on security in “The business wants cloud computing, because it wants fast time-
ontrol over corporate information virtualised environments, see article on to-market and to pay only for what it consumes; that requires IT
page 14, ‘Walls come tumbling down’.) resources to organically adapt to the business and deliver com-
ing companies taking their first step organisations have this already and Either way, the private cloud trend mensurate economics. An internal cloud provides businesses with
with experimenting with both and then are achieving huge efficiency gains as is clearly one that no organisation can the same assurance that the specific safeguards and processes that
looking to tie the two together, so that a result (for more on this, see article afford to ignore. The advice from ex- govern the business are being applied. Before this, you could get
content and information that is com- on page 7, ‘Virtualise to capitalise’). perts is to start small. “In this economy, one or the other, but not both. An internal cloud accelerates the
mercially sensitive stays private, but “Architecturally, an internal cloud few companies can afford to invest in a evolution of your virtual infrastructure to a true utility model and
commodity services can be purchased isn’t that different from a virtualised massive internal cloud. You will likely your IT department to an internal service provider. So embrace this
from a provider and delivered at a scale-out infrastructure in today’s en- limit your cloud to a small set of sys- trend and leverage it to transform your organisation.”
changeable rate, according to need.” terprise. Both are composed of a col- tems, since the cloud’s frequency of use James Staten, Forrester Research
Virtualisation lies at the heart of lection of servers, topped with either a and total capacity needed won’t entire-
any private cloud architecture. Many grid engine or a virtual infrastructure ly be known and every IT investment “I believe that enterprises will spend more money building pri-
based on hypervisors,” says Staten of needs a clear business case today,” says vate cloud computing services over the next three years than buying
Forrester Research. Staten of Forrester Research. services from cloud computing providers. But those investments will
But private clouds differ in two Many organisations will need help also make them better cloud computing customers in future. Build-
key respects. First, in a private cloud, with that, says Aad Dekkers, chief ing a private cloud computing environment is not just a technology
developers deploy new applications marketing officer at MTI Europe. thing - it also changes management processes, organisational cul-
es a data centre that offers more efficiency to the cloud via a self-service portal, “Virtualisation that covers servers, ture and relationships with business customers. And these changes
control and choice. That element of choice without needing the help of systems storage, networking and desktops will make it easier for an IT organisation and its customers to make
vital element in a private cloud environ- administrators to configure a server involves a range of skills that can good ‘cloudsourcing’ decisions and transitions in future.”
nt, because if organisations are going to for them. They simply configure a test the in-house resources at even Thomas Bittman, Gartner
n external services too, then the industry ‘virtual machine’ themselves. large organisations, which is where
to work together to ensure that public and Second, and arguably more impor- a trusted partner can help.” “The cloud is at its core nothing more than flexible hosting. It has
ate clouds can work together and are built tantly, private clouds offer a hefty dose of The most important thing, howev- three core attributes: cost, control and performance. If it doesn’t have
…industry standards. Nobody wants to move to a cloud environment that locks them into a particular infrastructure or provider, or forces them to re-write applications. That defeats the object and purpose of cloud computing.

…automation, that frees systems administrators from manual administrative tasks, "such as determining the best placement of new workloads and optimising the virtual pool to make room for more applications," says Staten. (For more on data centre automation, see article on page 13, 'Just keeping the lights on'.)

What this amounts to is an IT infrastructure primed to manage and store burgeoning volumes of corporate information in a more efficient way, says Adrian McDonald, vice president of EMC's UK & Ireland operations. "What we're ultimately talking about is freeing up information from its physical infrastructure," he says. That has two important benefits.

…is that organisations get that start under their belts as soon as possible. After all, says Staten, the economic value of an internal cloud "rises with its use, which normally means inviting as many applications as possible."

At his company, analysts are increasingly seeing a cross-over between data centre virtualisation and automation, and the multi-tenant, scale-out infrastructures of cloud computing. "There's a good reason there's so much hype around cloud computing right now – it's the fulfilment of an architecture we have all been seeking for many years, a shared pool of infrastructure resources that flexibly accommodates business services."

WEBINAR
Private Cloud – The future shape of cloud computing
Wednesday 29 July 2009
Join VMware, Cisco and EMC for an overview and discussion of the "Private Cloud" vision, including a full review of how the technologies of today provide the building blocks for the cloud computing of tomorrow.
Registration and full agenda at www.emc.co.uk/vce-webinar

"…cost advantages, there is no point in doing it. If control isn't adequate, it can't be secured (creating an inexpensive way to get folks fired). And if performance drops, the cost savings can't be justified. Given that the cloud is based on dynamically shifting loads across wide distances and locations, it would seem that the network is, in fact, the central critical path. You can't forget the servers, any more than you can forget the structure in a new house, but you focus on optimising the network so that your cost, control and performance needs are met. Other parts come to mind, and they are the virtualisation and storage layers. Information from all three - the virtualisation platform, the storage platform and the network - needs to be optimised to assure that the resulting cloud system performs to specification."
Rob Enderle, The Enderle Group

For insight from IDC market analyst Chris Ingle on how cloud computing and virtualisation can boost business continuity, please see article on page 11, 'A better way to defeat downtime'.
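Staten's "best placement of new workloads" is, at heart, a scheduling problem: fit each new workload onto the host in the virtual pool with the most headroom. The sketch below is illustrative only — the host and workload names are hypothetical, and real platforms use far richer policies than this first-fit-decreasing heuristic.

```python
def place_workloads(hosts, workloads):
    """Assign each workload to the host with the most free capacity.

    hosts: dict of host name -> free capacity (arbitrary units)
    workloads: dict of workload name -> required capacity
    Returns a dict of workload name -> host name (None if the pool is full).
    """
    free = dict(hosts)
    placement = {}
    # Place the biggest workloads first, so small ones can fill the gaps.
    for name, need in sorted(workloads.items(), key=lambda kv: -kv[1]):
        best = max(free, key=free.get)
        if free[best] >= need:
            free[best] -= need
            placement[name] = best
        else:
            placement[name] = None  # no room: time to optimise the pool
    return placement

# Example: two hosts, three workloads
print(place_workloads({"host-a": 8, "host-b": 4},
                      {"web": 3, "db": 6, "cache": 2}))
```

The point of automating this decision, as Staten notes, is that it is exactly the kind of repetitive judgement that otherwise eats administrators' time.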
WORKING WITH LEADING ORGANISATIONS TO REDUCE THE IT COST BURDEN

Fujitsu is helping private and public sector organisations find new ways to reduce their cost base and enable greater operational flexibility.

With our standardised services, we're substantially reducing the cost, complexity and lead time commonly involved in implementing and managing IT services. Instead, our clients benefit from lower cost, enterprise-class IT services, pre-built to perform. It offers them a viable alternative to owning IT infrastructure, reduces capital expenditure and provides real flexibility moving forward.

Find out more
Tel: +44 (0) 870 242 7998
EFFICIENT IT 11
A better way to defeat downtime
Virtualisation is helping companies to achieve the goal of ‘business as usual’ more
efficiently than ever before. Chris Ingle of IDC explains
If one was asked to summarise the benefits of virtualisation in a single sentence, the increasingly popular mantra of "do more with less" would get pretty close. In fact, in the early stages of virtualisation at least, most organisations seem happy to "do the same with less" – content with the apparent "immediate benefits" of significant cost reductions through server consolidation.

However, as we look beyond server consolidation, and consider the impact of virtualisation on wider business processes, the promise of "doing more with less" starts to become clearer. And as many organisations are starting to realise, business continuity is a critical IT discipline where the opportunity to actually achieve the feat is particularly strong.

Many observers still equate business continuity with disaster recovery – ensuring operations keep going when the entire data centre burns down. Essential as it is to plan for extremes, almost all the cost and risk addressed by business continuity is in fact the result of more common instances of application downtime. Mundane as it may sound, anyone who has experienced a prolonged shutdown of their email system, or simply slow network performance, will immediately understand the damage to productivity (and morale) that even short episodes of downtime can cause.

Virtualisation holds out the promise of big reductions in the cost of minimising downtime, whilst simultaneously enabling very significant increases in levels of application performance and availability. (For more on virtualisation, see article on page 7, 'Virtualise to capitalise'.)

Let's have a look at four different business continuity situations which illustrate how virtualisation can deliver on this "breakthrough" promise when compared with traditional, physical IT environments.

Server maintenance
Most downtime is planned. It's a simple requirement to shut down many operating environments when maintenance or implementation of new features are planned.
In a virtualised environment, virtual machines with applications running can be moved in real time to other servers, without any disruption to service levels. This means that server maintenance tasks can be performed at any time of day (or night), with zero scheduled downtime.

Server failure
Unplanned downtime arising from hardware or software failure can be very costly to a business.
In a physical environment, an operating system and the applications it supports are highly dependent on the server on which they are hosted. If the server fails, and there is no failover provision, the application will be taken out of operation until the server can be restored.
In a virtual environment, virtual machines (VMs) can be created in pairs that run in lockstep, but on different physical servers – a passive machine essentially mirroring the active one. In the event of an unexpected hardware failure that causes the active, primary VM to fail, the secondary, formerly passive VM immediately picks up where the primary left off, and continues to run, uninterrupted, and without loss of network connections or transactions.

SEAMLESS TRANSITION
In the event of unexpected hardware failure that causes an active, primary virtual machine (VM) to fail, the secondary, formerly passive VM picks up where the primary left off. Source: VMware

Workload management
Less visible than server failure, but often just as costly over time, is the issue of workload management. High levels of utilisation can significantly affect the ability of an application to perform, with costly consequences where the application is performing a mission-critical function.
In a physical environment, the application is entirely dependent on the resources of the server on which it operates. Either server capacity has to be provisioned at all times to cope with peaks in activity – meaning wasted resources at all other times – or the business has to incur the cost of degraded performance or even failure at times of peak activity.
In a virtual environment, VMs that require extra processing capacity can be redeployed, with no interruption to other hosts, affording them greater performance. This process is typically automated, with pre-set business rules dictating at what point and to which destinations overloaded VMs will be redeployed. This reduces both the cost of the outage itself and the cost of staff required to manage the rebalancing of workloads.

Large-scale disaster
Although large-scale disaster recovery is rarely invoked, the potential impact of such a disaster may be enough to risk putting the company out of business. Virtualisation can dramatically reduce the costs associated with provisioning for, and executing, disaster recovery processes in two main areas.
The first is the cost of the recovery environment. In a physical environment, a full mirror site has to be maintained in the event of partial or total failure of the main production site. This can be prohibitively expensive. In a virtual environment, the seamless portability of VMs means that different (physically separate) parts of the production environment can be used to host mirror copies of VMs from other parts. This makes the provisioning of the recovery environment more cost-effective in terms of both cost of infrastructure and cost of management. This is obviously best practised across multiple sites.
The second area is the cost of the recovery process: by providing a fully automated framework for disaster recovery, the DR process in a virtualised environment is far faster and easier than in a physical environment, where installation, reconfiguration and testing of restored OS and applications is a laborious process.
A virtual environment can potentially recover a system in hours rather than days. A traditional system can take many hours to rebuild the operating system and application configuration. In the case of virtualised systems, the back-up of the complete virtual machine can be directly restored without operating system or application reinstall, requiring much less testing.

Adequate preparations
From the discussion so far, it will hopefully be clear why virtualisation can bring about significant improvements to business continuity performance, optimising levels of assurance to the business within the limits of acceptable spend.
However, the benefits of server virtualisation with regard to high availability and data protection may be severely compromised if the wider information infrastructure is not adequately prepared.
Data back-up offers a prime example. A defining benefit of virtualisation is the significant increase in server utilisation rates. But this means that if, for example, a virtualised server increases its average utilisation from 20% to 70%, its spare processing capacity is no longer available as a performance buffer. Back-up and replication processes of databases and application files still demand time and processing bandwidth, which should not impede the performance of a highly utilised server.
In a virtualised environment, therefore, critical back-up processes will need to be re-examined and changed where necessary. Back-up to disk and data deduplication are increasingly coming to be perceived as essential components of a high-performance information infrastructure environment. The processes themselves will be highly automated and intelligent; scheduling and provisioning will be driven dynamically in real time by predefined business rules; and applications are restored in an order that reflects their importance to the business.
The overall end-game of combining server virtualisation with a highly efficient and automated data management infrastructure is to enable the IT function to move beyond the ineffective and wasteful policy of equal business continuity provisioning for all applications. In this scenario, provisioning can fall short of real requirements for highly critical applications (creating risk) and conversely can be unnecessarily high for non-critical applications (creating waste).
It is this fundamental shift from static, "highest common denominator" provisioning to dynamic, intelligent provisioning that enables virtualisation to truly deliver on its promise to help organisations "do more with less".

ABOUT THE AUTHOR
Chris Ingle is consulting director, systems research with IDC, and co-author of the 2009 white paper, "Virtualisation and Business Continuity". A full copy of this document is available from www.emc.co.uk/bc
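The active/passive VM pairing described in the 'Server failure' section can be caricatured in a few lines of Python. This is a single-process illustration of the lockstep idea only, not VMware's actual fault-tolerance mechanism; the class and "transaction" names are invented for the sketch.

```python
class MirroredPair:
    """Toy model of an active VM mirrored in lockstep by a passive VM."""

    def __init__(self):
        self.primary_alive = True
        self.primary_state = []    # e.g. committed transactions
        self.secondary_state = []  # kept identical to the primary

    def commit(self, txn):
        """Apply a transaction; mirroring keeps the passive copy in lockstep."""
        if self.primary_alive:
            self.primary_state.append(txn)
            self.secondary_state.append(txn)  # lockstep mirror
        else:
            self.secondary_state.append(txn)  # secondary has taken over

    def hardware_failure(self):
        """Primary host dies; return the state the secondary resumes from."""
        self.primary_alive = False
        return list(self.secondary_state)

pair = MirroredPair()
pair.commit("txn-1")
pair.commit("txn-2")
resumed = pair.hardware_failure()  # secondary picks up where the primary left off
pair.commit("txn-3")               # service continues uninterrupted
```

Because the mirror is updated on every commit, nothing is lost at the moment of failure — which is the whole point of running the pair on different physical servers.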
EFFICIENT IT 13
Just keeping the lights on
If IT staff are too busy working on routine, day-to-day tasks, there’s
no time left for them to explore the strategic projects that could
deliver real business value. That’s where data centre automation
comes in, industry experts tell Sally Whittle
The typical IT department spends 70 per cent of each day 'just keeping the lights on'. That means that the vast majority of staff time and resources are dedicated to routine, day-to-day administrative and troubleshooting tasks, rather than investigating opportunities to support overall company strategy.

That's clearly not efficient, and in recent years, the situation has prompted widespread investment in data centre automation tools. These are designed to take on tasks that would otherwise need to be performed manually, automating the processes of managing enterprise applications based on policy and priority and maximising the use of available hardware resources.

We asked four IT industry executives for their advice on using automation to relieve the data centre management burden. Our commentators are: Chris Ingle, a consulting director, systems research with IDC; Andy Waterhouse, technology services director for EMEA at information management company EMC; Robert Schwegler, chief technical architect at Betwin, the world's largest listed gaming company; and Luca Lazzaron, vice president and general manager for EMEA at systems and service management specialist BMC.

In your experience, how much time do IT departments spend 'just keeping the lights on'?
CI: Our research suggests that large UK companies are only devoting 13 per cent of their information and communication technology (ICT) resources to new developments, a figure that hasn't changed in five years. Operational issues are important, of course, but it's an even bigger problem if you can't launch new projects.
AW: Absolutely. If you want to improve business performance, you've got to devote more time to strategic projects. Saving time in the data centre means you've got time to start looking at things like virtualisation and cloud computing, which can really drive value.

What are the key technologies available for data centre automation and which do you think are most interesting? Why?
AW: There are four key stages in data centre automation. First, there's discovery - finding out what you've got and the relationships between those components. Next, you have IT service management, which governs how services are delivered. Then, there's root cause analysis, which is about understanding the causes of problems. Finally, there's change and configuration management. Any or all of those processes can be improved and monitored with automation tools.
CI: There are lots of automation tools available, and what you need will vary depending on your company's IT infrastructure. But automation will only be effective if you invest time in rationalising and virtualising the data centre first – that's essential, or you're simply automating chaos.

Can any data centre task be automated, or are some tasks more difficult than others?
CI: Personally, I'd say you can automate pretty much any administrative task by either having software perform the task, or having the user request a service, and automatically provisioning that service. The less convincing arguments are around complex work such as orchestration and analysing different sequences of changes to minimise downtime.
LL: Repeatable, standardised processes are the most suitable candidates for automation. Tasks that involve subjective decisions make less sense, but you can still partially automate these processes, if you can supply the decision-maker with the necessary information in a single, automated view. For example, we have an incident management tool that automatically collects and presents the data needed for operators to identify changes that may be causing a problem, thereby reducing time to diagnose and repair.
RS: At Betwin, there's no single answer about what we should and shouldn't automate – it's all about return on investment. For high-volume ecommerce applications, it makes sense to automate by building up a blank server with the target application stack and configuring it automatically. This isn't hard to do and can be done with relatively simple tools.

When selecting a data centre automation product or supplier, what are the most important questions to ask?
AW: I think it's key to ask [suppliers] not just what they're doing today, but also how they will support future data centres. Whether you like it or not, the world is going virtual, and from an automation point of view, you want to be sure you're looking at tools which can automate that kind of infrastructure. So you should ask if they can manage both physical and virtual environments and integrate the two.
RS: It's critical to ask how an upgrade of an automation tool is likely to impact the test and production environment. I know that […] this can easily become horribly complex. The great thing about automation is you don't get the high-pressure, very risky changes rolled out manually overnight. It's often easier to produce a version of a virtual computer, make a snapshot, deploy a new version and, if it doesn't work, fall back to the last known 'good state', instead of defining the delta change and saving only that before updating. Without a doubt, the tools are getting there to help to support complex tasks in a data centre, but one needs clear support contracts [from suppliers]. If something doesn't work, the experts are sitting outside of your company and may be hard to reach.

How will data centre automation help as organisations increasingly look to adopt newer infrastructure models, such as utility computing, service oriented architectures (SOAs), software as a service (SaaS), virtualisation and so on?
LL: These new architectures bring increased complexity. Automation will be critical in freeing up resources to manage this increased complexity. As we've seen within many organisations, virtualisation offers dramatic savings on capital expenditure, at the cost of increased operational expense. With a data centre automation solution, companies can achieve the best of both worlds, with lowered capital and operational costs.
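Schwegler's snapshot-then-deploy pattern — capture a known good state, roll out the new version, and fall back automatically if a health check fails — can be sketched as follows. The configuration fields and the health check here are hypothetical, and real tools operate on whole VM images rather than dictionaries.

```python
import copy

def deploy_with_rollback(server, new_stack, healthy):
    """Deploy new_stack onto server; revert to the snapshot if unhealthy.

    server: dict describing the current (known good) configuration
    new_stack: dict of configuration changes for the new version
    healthy: callable taking the server dict, True if it passes checks
    Returns the configuration actually left running.
    """
    snapshot = copy.deepcopy(server)   # the last known 'good state'
    server.update(new_stack)           # the risky change
    if not healthy(server):
        server.clear()
        server.update(snapshot)        # fall back wholesale, no delta-tracking
    return server

# A failed deployment leaves the original configuration untouched:
srv = {"app": "shop", "version": 1}
deploy_with_rollback(srv, {"version": 2}, healthy=lambda s: False)
```

The design point Schwegler makes is visible in the code: restoring the whole snapshot is simpler and safer than computing and reversing a delta.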
TOP CANDIDATES FOR AUTOMATION
Which data centre tasks are ripe for automation, taking them out of the hands of qualified IT professionals, who then get to spend their time on more strategic work?

Discovery and dependency mapping
Many organisations have no real knowledge of how many servers and other hardware resources they have in their environment, what operating system version and patches are running on them, and how these assets depend on each other to run smoothly. Data centre automation tools can gather this information by searching corporate networks, so that systems are maintained at necessary operating levels.

Service provisioning
If you need to get a new service up and running, it can easily take six weeks to provision a new server to run a new application. Using automation, data centre staff can automate processes such as configuring a new server and applying patches and security policies to it. Combined with virtualisation technologies, these tools can help IT staff to provision a new server – in the form of a 'virtual machine' – in less than one hour.

Change management
Changes to systems and the applications that run on them need to be managed and monitored. With 80% of data centre downtime caused by changes to the infrastructure, why not remove human error from this process by automating change management?

Resource management
Automation tools can be used to continually manage physical (and sometimes virtual) infrastructure in the data centre to reduce operational risk and improve efficiency, by monitoring and controlling equipment, power, cooling, network infrastructure and storage.

Patch management
Compared to a manual approach, automated patch management can reduce the annual cost of installing software patches from £150 per computer to £25 per computer, according to researchers within Novell's Zenworks business.
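The Zenworks figures above translate into a simple per-estate saving. The per-computer costs below are the ones quoted in the sidebar; the fleet size in the example is invented for illustration.

```python
MANUAL_COST_PER_PC = 150     # £ per computer per year, manual patching
AUTOMATED_COST_PER_PC = 25   # £ per computer per year, automated patching

def annual_patching_saving(computers: int) -> int:
    """Annual saving, in pounds, from automating patch management."""
    return computers * (MANUAL_COST_PER_PC - AUTOMATED_COST_PER_PC)

# A hypothetical 1,000-PC estate:
print(annual_patching_saving(1000))
```

At £125 saved per computer per year, even a modest estate recovers the cost of an automation tool quickly — which is why patch management tops most automation shortlists.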
14 EFFICIENT IT
Walls come tumbling down
When data lives in a virtual environment, perimeter security is no longer enough to safeguard it. New approaches are needed, as Guy Kewney reports

Any management team concerned about the security of their corporate data might imagine that the best way to stop strangers gaining access would be to keep it on a protected corporate server, in a top-security building - preferably one with bars on the windows. On that basis, they might be horrified by the concept of 'virtualising' that server and moving it out into the cloud.

Some security specialists say they'd be wrong; the opposite, in fact, is true. Their argument goes like this: it can be more secure to 'de-perimeterise' data security and move an IT infrastructure to a truly virtual one. The way to do this successfully is to make the data itself secure, not the box on which it runs.

As Eric Baize, senior director of the product security office at EMC, puts it: "If you look at history, the whole security industry has been chasing infrastructure innovations; as a consequence, we came to a situation where security was not manageable. But with virtualisation, we are making security inherently part of the infrastructure."

The message that data should take precedence over infrastructure in security terms is one that the Jericho Forum, an industry think-tank, has been preaching for four years. But many security professionals find this new, "walls come tumbling down" theory alarming, and continue to insist on robust firewalls and intrusion detection and prevention technology to guard their companies' perimeters.

Paul Simmonds from the Jericho Forum believes security professionals need to catch up with current thinking on de-perimeterisation.

Real or not?
Virtual servers, however, don't work like 'real' computers. They are imaginary machines. The first time people hear this, their reaction is incredulity; but in fact, the concept of virtualisation is pretty straightforward. You take a standard personal computer or server, but you don't load the normal Word or Excel programme. Instead, you load a virtualisation package, which sits in a corner of the main computer, and runs various operating partitions. One partition might be Windows, another might be Mac OS or Linux; and each copy of the operating system runs different programmes.

A virtual desktop is simpler still: you sit down at a computer, but the software is running elsewhere. All you see is a copy of what would appear on the remote machine's display. None of the data is on the client device.

Today's virtual desktop technology includes tools for hiding the data from malware. Today's virtual network can lock down data, even if the hardware is somewhere outside the company campus.

Well, that's the concept – but many inside the data centre still see dangers in the approach and are unsure about the solution. For example, a virtual machine that shares a physical system with four other virtual machines could (at least in theory) still hold sensitive data in its memory – data left over when a different virtual machine is closed down. Can intruders get at that data? Nobody can give a firm answer, until a hacker tries and declares it impossible.

There's also the 'multiple vulnerability' problem. The virtual servers all run on one real machine. That machine has an Internet address. A denial of service attack launched at that IP address could lock the machine up; instead of losing just one server, the corporation loses all the virtual machines together. Is it a real threat? Possibly not; but it's still a worry for many organisations.

The fact is that a lot of corporations still rely heavily on what EMC and the Jericho Forum might regard as obsolete security. The usual way of protecting data is to monitor traffic on the local area network (LAN); if one machine sends data to another, it has to give it to a router, which handles security. If it spots a threat, the data isn't transmitted. End of story.

But on a virtual server, the virtual machine receiving that data might be on the same piece of hardware. The sending virtual machine need not talk to the LAN at all; it just transfers the data internally. Network security isn't being used. Yes, there are virtual routers, but many CIOs doubt that they are as rigorous as network-based intrusion detection and prevention devices.

Those same CIOs, however, are under pressure to adopt virtualisation, if they haven't already, and to explore its potential further. Already, companies that provide wide area networking services are hoping to persuade them to move whole data centres. Why run a huge battery of power-hungry servers in the heart of the City of London where electricity supply is at its limits, they say, when you can move them to cheaper premises in Slough? And beyond that, CIOs must consider cloud computing, where server hardware might not even be in the same country. It might not even belong to them.

Eric Baize of EMC says it's time that security was made an inherent part of the virtual infrastructure.

Jericho Forum board member Paul Simmonds reckons security professionals just have to catch up. To assist them in this, the think-tank has published a set of principles that describe how best to handle security in a de-perimeterised world, its own version of the Ten Commandments. "My favourite tenet is Commandment No 2: Unnecessary complexity is a threat to security," he says. "I've yet to find a security person who understands this in a virtualisation context."
HOW SAFE IS THE CLOUD?
When an organisation operates a secure corporate network environment, its data stays on that network. When it migrates corporate systems to the cloud, it's a different story.

In the cloud scenario, the organisation may not own the computers or the network. All it is given are login details to access the data, which resides on the cloud provider's infrastructure. There's almost no way of knowing where the data might be – it may not even be in the same country. And if the data isn't self-secured, then there's no security other than that offered by the cloud provider itself.

So how can managers at an organisation that uses cloud computing convince their customers and partners that data is safe? The Jericho Forum approach is designed to make cloud computing easier for data security specialists. If data isn't important, you don't bother securing it at all. If it's important that nobody changes it, then you impose digital rights management (DRM), or simply store it as a read-only document. And if it is really sensitive, you lock it down with new 'de-perimeterisation tools' that the Jericho Forum has been developing.

That doesn't cover all security threats, of course. Simon Young, general manager for server security in EMEA at Trend Micro, argues that not all problems are data-centric. Some are hardware-centric, he points out: "A credit card approval server should never be compromised." Young claims that cloud providers like Trend Micro, who understand virtualisation, are further ahead in data security terms than end-user organisations, who focus on securing perimeters. "We live in the virtual network, working from inside virtualisation, and monitor traffic, deciding whether this is a rogue or a good guy. This is a new boundary."

And according to Eric Baize, senior director of the product security office at EMC, even thorny issues such as data privacy compliance have been figured out by cloud operators – some of them, at least. European regulations state that certain types of personal data may not be stored outside the country in which the people concerned live. "So the EMC cloud storage offering can decide where in the cloud data can go, based on security you define for the data," he says.

That means, for example, that companies in countries where privacy rules forbid them to send employee data abroad can decide which server is used in the private cloud (either external or internal), or they can directly define policies so that data on employees is only stored in its country of origin.
It creates understanding, where once
there were walls. It connects a kid to a
scientist to a CEO to save a glacier.
It brings ideas together.
And people together.
It’s the human network effect.
The effect that is changing the world.
When technology meets
humanity on the human network,
the way we work changes.
The way we live changes.
human network effect