IT in Europe
July 2012, Volume 3

Gain a complete overview of European IT in today's marketplace.

Innovations with BI

Advanced Analytics Tools Slow to Catch Fire
Securing NoSQL Applications: Best Practices for Big Data Security
SIPCOM Cut Costs With Storage and Server Upgrade
Unified Storage Goes Mainstream
Rural Broadband—Should Residents Pay?
Additional European Resources
EDITOR'S LETTER

Innovations With Business Intelligence

Managing a business is tough even at the best of times, but even more so in an economy as difficult and unpredictable as the one most of Europe now faces.

The Eurozone's troubles bring new challenges to senior executives every day. The situation puts a premium, more than ever, on timely, accurate, insightful decision-making. It's no exaggeration to say that one bad decision in a down economy could make the difference between survival and failure.

Making good decisions means having the right information to hand, when you need it. For IT leaders, that means delivering business intelligence (BI) and analytics tools into the hands of decision-makers—and increasingly that extends beyond the boardroom and onto the desks and smartphones of employees making important decisions as part of their everyday work and in every interaction with customers.

Any organisation that deals with consumers or a large customer base needs demographic data about its clients; it needs behavioural data about what they are doing, and why and where they are doing it; and it often needs that information in real time to react faster than the competition.

Today we also have the emergence of so-called "big data"—tools to handle and analyse enormous datasets, searching for the needle in the haystack that yields the extra piece of insight that leads to a vital business decision.

In this month's IT in Europe, we examine the issues of turning data into business decisions, the changing demands on IT managers to deliver the latest tools, and the technologies that are available. Understanding best practice in BI and analytics might just turn out to be the best decision your business ever makes.

BRYAN GLICK
Computer Weekly Editor in Chief
Advanced Analytics Tools Slow to Catch Fire

CIOs are avid for the advanced capabilities offered by in-memory and in-database analytics and open source big data technologies like Hadoop. But take-up of advanced analytics remains low. By Stephen Pritchard

Surveys of chief information officers regularly place data analytics among the top priorities for IT investment. For example, in Gartner's 2012 CIO Agenda Survey, conducted worldwide late last year, the 2,335 respondents ranked analytics and business intelligence (BI) tools as their No. 1 technology priority for this year.

But things aren't as simple as they used to be for CIOs or their BI and analytics teams. Over the last few years, the spotlight has shifted from "conventional" BI tools to predictive analytics and other advanced analytics technologies.

New demands from business users—such as the need to process ever-larger volumes of data more rapidly, particularly in "big data" environments—are also forcing companies to look again at their BI and analytics capabilities. Most BI systems can plausibly be called backward-looking, with a focus on systems of record and historical data. The range of analytical technologies has expanded rapidly, offering organisations powerful new tools for mining data and modelling future business scenarios, but also stretching their IT resources and analytics skills bases. And so companies often face a considerable challenge in updating their capacity to support advanced analytics applications.

As a result, the take-up of advanced analytics software remains relatively low, in part because of its complexity and cost and in part because of the immaturity of some of the technologies on offer.

Gartner expects advanced analytics to eventually become widely adopted. But a survey carried out by the IT research and consulting company in March 2011 found only a quarter of the respondents had made "significant progress" towards deploying predictive analytics systems. Those that had done so typically started with a small project and built out their capabilities from there; none of the projects Gartner surveyed was IT-led. Recent comments from Gartner analysts indicate they haven't seen a significant increase in usage levels since the survey was done.
Neil Miller, head of Accenture's analytics consulting practice in the UK and Ireland, suggests the advanced analytics field is even less mature than Gartner's data makes it out to be. "Very few companies really compete on analytics—there are maybe 15 to 20 globally that use sophisticated analytics and have a strong BI and leadership ethos," Miller said.

However, he also thinks interest in advanced analytics is likely to pick up as businesses become able to invest more in IT than the current economic climate allows, and as more examples emerge of companies driving higher revenue and profits from analytics projects. Miller said IT and BI teams "need to deliver business value" on analytics investments to justify additional spending.
Analytics Data Blowing in the Wind

One company that has bought into advanced analytics is Vestas Wind Systems, a Denmark-based maker of wind turbines. Vestas is using IBM's Hadoop-based InfoSphere BigInsights software and other analytics tools to help prospective customers evaluate possible locations for its turbines, based on wind and other weather-related data reaching from today back to the year 2000.

"Our analysis tools make it possible to answer the question of whether a location is a profitable site to develop," said Lars Christian Christensen, a vice president at Vestas. "The data we have is the digital equivalent of a map of gold mines."
Predictive analytics and data mining software are the most prominent advanced analytics technologies, but others are gaining footholds in organisations as well.
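To make the idea of "predictive" concrete, the simplest possible predictive model is a least-squares trend line fitted to historical figures and extrapolated forward. The sketch below is illustrative only, with made-up sales data; commercial predictive analytics products fit far richer models, but the principle of learning from history to score the future is the same.

```python
# Minimal predictive-analytics sketch: fit a least-squares trend line
# to six months of (made-up) sales figures and forecast month seven.
months = [1, 2, 3, 4, 5, 6]
sales = [5.0, 8.0, 11.0, 14.0, 17.0, 20.0]  # illustrative data only

n = len(months)
mean_x = sum(months) / n
mean_y = sum(sales) / n

# Ordinary least squares: slope = cov(x, y) / var(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, sales)) \
        / sum((x - mean_x) ** 2 for x in months)
intercept = mean_y - slope * mean_x

forecast = slope * 7 + intercept  # predicted sales for month 7
print(f"trend: y = {slope:.2f}x + {intercept:.2f}, month-7 forecast = {forecast:.1f}")
```

With this toy data the fitted line is y = 3x + 2, giving a month-seven forecast of 23; the value of real tooling lies in doing this kind of fitting, at scale, over noisy data.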
In-memory analytics tools claim to offer significant speed improvements over conventional BI and analytics systems that need to pull information from data warehouses for analysis. In-memory products are available at various price points and performance levels, and large BI and enterprise applications vendors such as SAP, Oracle and, most recently, Microsoft have all introduced high-end offerings.

Analytical databases and in-database analytics tools offer alternative approaches that can be less costly than in-memory technology, but faster than systems that rely on migrating data to a separate analysis system or data warehouse.
And then there are open source technologies such as Hadoop, MapReduce and NoSQL databases, which target the challenges posed by big data. They're finding favour in cloud-based IT environments and at organisations that mine vast amounts of unstructured data, such as Web server logs or information gleaned from social media. Also available to aid in that process is text analytics software that can be used to look for keywords and language patterns in social media posts and other forms of text data.
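The MapReduce programming model behind Hadoop can be demonstrated in a few lines. The sketch below mimics the map, shuffle and reduce phases in plain Python over a handful of invented social media posts, counting keyword mentions; Hadoop applies exactly this pattern, but distributed across a cluster and against terabytes of data.

```python
from collections import defaultdict

# Made-up social media posts standing in for a large unstructured data set.
posts = [
    "great service from acme today",
    "acme delivery late again",
    "switching to acme next month",
]

# Map phase: emit a (word, 1) pair for every word in every post.
mapped = [(word, 1) for post in posts for word in post.split()]

# Shuffle phase: group the emitted pairs by key (the word).
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: sum the counts for each word.
counts = {word: sum(values) for word, values in grouped.items()}

print(counts["acme"])  # the brand appears in all three posts
```

Because the map and reduce steps are independent per word, the work can be split across many machines, which is what makes the model suit big data.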
Currently, advanced analytics tools are available from three main sources: large BI and data warehouse vendors; niche players and startups specialising in particular fields; and companies set up to commercialise open source technologies, especially Hadoop.
New Analytics Options Add Complications

Because of all the different options, businesses have to familiarise themselves with new vendors as well as new tools, said Steve Gallagher, director of the BI practice at PA Consulting Group. "There's a set of existing tools for day-to-day BI and historical reporting, but the companies that were the big hitters [on those products] are no longer," he said. "The growth is in databases that are more attuned to high-speed analytics. Hadoop, for example, is being used more by big organisations."
Although enterprise BI vendors are moving into most, if not all, of the emerging advanced analytics technology markets, by no means are they all there yet. This situation, analysts warn, risks leading to a jumble of point products in organisations, resulting in rising costs and a possible breakdown of enterprise master data management strategies.

"There is a quick bang for the buck in data visibility from advanced analytics," said Eddie Short, a partner specialising in data analytics at consultancy KPMG. "But the big challenge is a proliferation of niche solutions."
In some cases, though, advanced analytics tools can be easier to deploy than conventional BI and data warehouse systems. One reason, although it might seem illogical at first sight, is that data quality is often less of an issue for analytics professionals than performance is, according to Gallagher.

In real- or near-real-time applications, for example, "you don't have the luxury of extracting data to a data warehouse overnight," he said. "In the traditional data warehouse, you have time to cleanse the data, but in real-time [systems] with information fed from satellite feeds or market data, you cannot."

To assess the required levels of data quality, "you have to ask what type of questions you are trying to answer," Gallagher said. For some organisations, he added, "the quality of the data is not as important as the trends and patterns that come out of it."

Stephen Pritchard is a journalist and broadcaster based in London. He has covered the technology and IT industries since the mid-1990s and has contributed to publications including Computer Weekly, The Independent, The Financial Times, CNBC Business and SearchDataManagement.co.UK.
Securing NoSQL Applications: Best Practices for Big Data Security

NoSQL is great for big data, but security is often lacking in NoSQL applications. By Davey Winder

NoSQL database systems are designed to provide real-time performance while managing large volumes of data. That performance, coupled with the no-cost philosophy behind many NoSQL products, has led many companies to take a look at NoSQL.

However, companies should not rush to implement NoSQL without first evaluating the security implications of switching from a relational database management system (RDBMS) model to a NoSQL model. So what are the security implications for any company thinking of deploying a NoSQL database?

Who Uses NoSQL?

NoSQL can be an important tool for any company that has big data. Big data is simply any data set that has grown too big to be worked on efficiently in real time with traditional database tools.

NoSQL is a broad class of database management systems that are not traditional relational database management systems. They do not use SQL as the primary query language, nor do they typically require fixed table schemas. Also, NoSQL is not a single-vendor product (many NoSQL implementations are open source), but rather an umbrella term that can be applied to any of the non-RDBMS big data alternatives.

Currently, NoSQL databases are in the evolutionary stage of their lifecycle and, unlike RDBMS counterparts such as DB2, MySQL, Oracle and SQL Server, the attack vectors for NoSQL databases are not well mapped out. And it's likely new attack vectors will emerge that target NoSQL data stores in new ways.
Data breaches caused by a NoSQL injection are probably not far away. With some NoSQL implementations being, essentially, authentication-free JavaScript processing engines, this is all but inevitable. Indeed, the basics of just such a vulnerability were exposed at Black Hat USA last year, when Bryan Sullivan demonstrated a server-side JavaScript injection attack against one NoSQL implementation that could discover database contents and run basic commands.
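The shape of such an attack is easy to show without naming any particular product. The hypothetical sketch below builds a JavaScript-style query predicate by string concatenation, the way a naive Web application might; attacker-supplied input then rewrites the query's logic. The predicate format, field name and validation rule are all invented for illustration.

```python
import re

def build_where_clause(username):
    # VULNERABLE: user input is pasted straight into a server-side
    # JavaScript predicate, the pattern some NoSQL stores evaluate.
    return f"this.username == '{username}'"

# A benign request matches one user...
print(build_where_clause("alice"))   # this.username == 'alice'

# ...but a crafted value changes the query's structure entirely,
# turning it into an always-true predicate that matches every record.
payload = "' || '1'=='1"
injected = build_where_clause(payload)
print(injected)                      # this.username == '' || '1'=='1'

# Mitigation: treat input as data, never as code. Here, a whitelist
# check rejects anything that isn't a plain alphanumeric username.
def is_valid_username(value):
    return re.fullmatch(r"[A-Za-z0-9_]{1,32}", value) is not None

assert is_valid_username("alice")
assert not is_valid_username(payload)
```

The same lesson carries over from SQL injection: never splice user input into anything the database will interpret as code.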
Is NoSQL Secure?

NoSQL has not been designed with security as a priority, so developers or security teams must add a security layer to their organisations' NoSQL applications.

Over the last couple of years, many small businesses have been moving into big data territory, struggling to manage ever-increasing volumes of business data. So it should come as no surprise that threat-tracking firms have reported increased security researcher and hacker activity targeting the NoSQL database sector. Some of this is driven by confusion amongst small businesses about how NoSQL databases can be securely implemented. Too often, these companies ignore NoSQL security measures that would have been implemented by default with traditional RDBMS installations.

For example, many NoSQL products allow, and even recommend, the use of a "trusted environment" with no additional security or authentication measures in place. These modes assume that only trusted machines can access the database's TCP ports. But relying on the network to protect data in an Internet-enabled world is a sure-fire way of inviting a breach of any sensitive information held there.
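What a hardened configuration looks like varies by product, but the general moves are the same: bind the database to an internal interface and switch authentication on, rather than relying on the "trusted environment" default. Taking MongoDB as one example from this generation of products, that means something like the following mongod.conf fragment (values illustrative):

```ini
# Illustrative mongod.conf fragment: do not trust the network
bind_ip = 127.0.0.1   # listen only on the loopback/internal interface
port = 27017
auth = true           # require credentials instead of the open default
```

Equivalent switches exist, under different names, in most NoSQL stores; the point is that they are usually off by default.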
Kerberos authentication modules are now becoming available for some NoSQL databases, which should provide access control capabilities equivalent to the current Kerberos or NTLM (Microsoft Windows NT LAN Manager) approaches to user authentication.
Securing NoSQL Databases

Because most of the popular NoSQL databases are open source, IT staff would be wise to devote some time to contributing stronger authentication and encryption systems to their NoSQL implementations, rather than waiting for the publisher of a proprietary database to make changes.

NoSQL data stores are basically vulnerable to the same security risks as traditional RDBMS data stores, so the usual best practices for storing sensitive data should be applied when developing a NoSQL-based application. These include:

• Encrypting sensitive database fields;
• Keeping unencrypted values in a sandboxed environment;
• Using sufficient input validation;
• Applying strong user authentication policies.
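Two of those practices, input validation and strong user authentication, can be sketched with nothing but the Python standard library. The field names and parameters below are illustrative; field-level encryption is deliberately left out of the sketch because it should come from a vetted cryptography library rather than hand-rolled code.

```python
import hashlib
import hmac
import os
import re

# Input validation: accept only values matching an explicit whitelist
# pattern before they go anywhere near a query or a stored document.
def validate_email(value):
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[A-Za-z]{2,}", value) is not None

# Strong user authentication: never store the password itself; store a
# salted PBKDF2 hash and compare digests in constant time.
def hash_password(password, salt=None, rounds=100_000):
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return salt, digest

def verify_password(password, salt, expected, rounds=100_000):
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("s3cret-passphrase")
assert verify_password("s3cret-passphrase", salt, stored)
assert not verify_password("wrong-guess", salt, stored)
```

Checks like these belong in the application or middleware layer precisely because, as noted above, the NoSQL store itself may not enforce them.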
Of course, it would be ideal if there were an accepted standard for authentication, authorisation and encryption in the yet-to-mature NoSQL space. Until such a standardised consensus can be reached, the best approach is to look at security in the middleware layer, rather than at the cluster level, as most middleware software comes with ready-made support for authentication, authorisation and access control. For example, if Java is being used, then the JAAS, Oracle J2EE or SpringSource (a division of VMware) Spring Security frameworks are available to provide authentication, authorisation and access control for NoSQL database implementations.
In closing, the most important tip to take from this brief exploration of NoSQL security is this: Beware of jumping on the NoSQL bandwagon until you have made sure the wheels won't fall off. Recognise that NoSQL databases are inherently insecure. If you decide to proceed, apply your own encryption and authentication controls to safeguard the big data in your NoSQL databases.

Davey Winder is a UK-based freelance writer and former 'Technology Journalist of the Year' who has spent the best part of two decades writing about IT security issues. A three-time winner of the 'Information Security Journalist of the Year' title, in 2011 Davey was honoured to receive the Enigma Award from BT in recognition of his lifetime contribution to information security journalism. Winder is a contributor to SearchSecurity.co.UK.
DATA CENTRE

SIPCOM Cut Costs With Storage and Server Upgrades

Hosting provider SIPCOM cut its licensing costs and improved SaaS delivery for customers, thanks to these infrastructure upgrades. By Archana Venkatraman
Hosting firms searching for ways to cut costs should look no further than the recent moves by SIPCOM. By upgrading its storage and server infrastructure, SIPCOM said it improved software as a service (SaaS) delivery for customers and cut its own licensing costs in half.

SIPCOM hosts Microsoft Lync environments for its customers, including system integrators, telecom providers and resellers. But the existing IT infrastructure used to host the unified communications platform was not capable of supporting its customers' growing needs. In addition, the distributed IT systems meant high Microsoft licensing costs.

"The architecture we were using didn't have the density or the cores that can help our business scale," said Daniel Allen, SIPCOM's chief executive. "Neither was it flexible and standardised."

SIPCOM's IT team chose Dell PowerEdge R815 servers and Dell Compellent virtualised storage platforms to host the Microsoft Lync environment. It virtualised with Microsoft Hyper-V.

"We approached three of the top vendors, including HP, but only Dell was offering the server combined with the storage capabilities we wanted," Allen said.

Selecting Hyper-V was a straightforward choice, Allen said. SIPCOM hosts Microsoft platforms and knows the products well, he added.
Upgrading Cuts Licensing Costs in Half

In addition to helping customers with better SaaS capabilities, the strategic project has brought benefits to SIPCOM's own bottom line. The IT department has reduced data centre rack space, saving on power costs and minimising operating expenditure.

Most importantly, the new architecture has helped SIPCOM reduce costs for Microsoft licensing by 50%. This is because Microsoft software is typically licensed per CPU, Allen said.

"The 12 cores—now 16 cores in the latest hardware revisions—in the R815s have enabled us to get high compute power from a relatively small number of CPUs," Allen said.
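The arithmetic behind that 50% figure is worth spelling out. The numbers below are illustrative rather than SIPCOM's actual counts, but they show the mechanism: if licences are sold per CPU socket, packing more cores into each socket means the same total compute needs fewer sockets, and therefore fewer licences.

```python
import math

CORES_NEEDED = 192  # illustrative total compute requirement

def cpu_licences(sockets_per_server, cores_per_cpu):
    """Per-CPU licences needed to reach CORES_NEEDED cores in total."""
    servers = math.ceil(CORES_NEEDED / (sockets_per_server * cores_per_cpu))
    return servers * sockets_per_server

# Older estate: 2-socket servers with 6-core CPUs (hypothetical).
old = cpu_licences(sockets_per_server=2, cores_per_cpu=6)

# New estate: 4-socket, 12-core R815-class servers (hypothetical).
new = cpu_licences(sockets_per_server=4, cores_per_cpu=12)

print(old, new, f"{(1 - new / old):.0%} fewer licences")
```

With these invented figures the same 192 cores drop from 32 per-CPU licences to 16, a 50% reduction, which is the effect Allen describes.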
With a standardised and scalable infrastructure in place, SIPCOM's IT team plans to add backup as a service and desktop as a service to its customer offerings.

Seeking Thin Provisioning, Automation and a Multi-tiered Platform

SIPCOM's IT team wanted high-density servers and a virtualisation platform that provide higher average hardware utilisation and repeatable deployment with more automation, and therefore higher uniformity—all key aspects of delivering better services to its customers.

"Dell Compellent storage, with its thin-provisioning virtualised storage and its snapshotting capabilities, enabled us to deploy template Hyper-V VMs in seconds," Allen said.
The IT team started the project in November 2011 and now, five months later, has an infrastructure with the power and density to support more than 200,000 users with Microsoft Lync services, as well as host other SaaS environments, including Microsoft Exchange 2010, Microsoft SharePoint 2010 and backup services—all on a single platform.

SIPCOM's technology infrastructure project has given the company a versatile storage architecture including automated tiering and thin provisioning. These capabilities helped SIPCOM meet the needs of customers with various service requirements, Allen said. Some customers prefer to buy storage in bulk, while others buy storage per gigabyte. The multi-tiered and scalable IT environment allows SIPCOM's customers to benefit from eliminating wasted capacity and using variable payment models.

Archana Venkatraman is the Site Editor of SearchVirtualDataCentre.co.UK.
STORAGE

Unified Storage Goes Mainstream

Unified storage has gone mainstream, with full protocol support from the top five vendors. We survey the options available, including single platforms and NAS gateways. By Chris Evans

Unified storage has come to mean the delivery of block and file storage from within a single platform. The initial incarnation of these devices consisted of a block storage array with the addition of a file gateway, enabling both file and block protocols to be supported in a single configuration.
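That layered architecture, a block store with a file gateway on top, can be sketched as a data structure. Everything below is a toy model with invented names: a pool that only understands numbered blocks, and a file gateway that translates paths into block addresses, so both a block client and a file client end up sharing one pool of capacity.

```python
BLOCK_SIZE = 4  # bytes per block; deliberately tiny for readability

class BlockPool:
    """The underlying array: addressable blocks, no notion of files."""
    def __init__(self):
        self.blocks = {}

    def write_block(self, lba, data):
        self.blocks[lba] = data

    def read_block(self, lba):
        return self.blocks.get(lba, b"\x00" * BLOCK_SIZE)

class FileGateway:
    """File-protocol head: maps each path to the blocks that hold it."""
    def __init__(self, pool):
        self.pool = pool
        self.index = {}     # path -> list of block addresses
        self.next_lba = 0

    def write_file(self, path, data):
        lbas = []
        for i in range(0, len(data), BLOCK_SIZE):
            self.pool.write_block(self.next_lba, data[i:i + BLOCK_SIZE])
            lbas.append(self.next_lba)
            self.next_lba += 1
        self.index[path] = lbas

    def read_file(self, path):
        return b"".join(self.pool.read_block(lba) for lba in self.index[path])

pool = BlockPool()
nas = FileGateway(pool)
nas.write_file("/exports/report.txt", b"unified!")
print(nas.read_file("/exports/report.txt"))  # both heads share one pool
```

The engineering debate the article describes is essentially where this translation layer lives: in separate gateway hardware, or natively inside the array software.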
Today, that's still a common approach among storage vendors; the only major vendor to offer a truly integrated unified platform is NetApp. Its FAS series of devices does not require the addition of separate hardware to provide for file or block. Most other vendors have avoided this route, either by design or because it has been easier to retain separate components following acquisitions.

But having a single piece of hardware from which any protocol can be enabled through software does have its benefits, in terms of cost and flexibility. However, unless unified technology can manage the different workload profiles of file and block (and, crucially, at the same time), performance problems can be encountered.
Over the last 12 months or so, we have seen the major storage vendors delivering or enhancing unified storage products that offer the range of typical file and block protocol support, including iSCSI, Fibre Channel, CIFS/SMB and NFS. Fibre Channel over Ethernet (FCoE) support is also available from NetApp and EMC.

In some cases these unified storage products are truly integrated, both physically and at the software level. But many solutions on offer are hybrid devices that follow the path of adding a file gateway to a block storage product. Through acquisition, Dell and IBM have added file protocol technologies to existing platforms. Although these are hybrid by nature, being a mixture of file and block hardware, they are packaged as unified (and therefore fully supported) solutions.

EMC replaced its separate Clariion and Celerra products in early 2011 with a single product line that combined both former products into one platform, the VNX. Meanwhile, HP has partnered with Microsoft to use Windows Storage Server (WSS) as the basis for its gateway products.
Feature support among the vendors is extensive, with typical options including thin provisioning, policy-based file management and, increasingly, data deduplication. Solid-state drives (SSDs) are available from most vendors and can be exploited fully using thin provisioning and automated storage tiering technology.
Finally, we should mention that there are a number of newcomers to the unified storage marketplace, including startups Starboard Storage Systems and Tegile Systems. Both companies have developed technology that specifically targets the I/O profile of a mixed file and block workload. Such devices are easier to deliver today because of advances in processor speeds and the integration of I/O protocols further into processor chipsets. Over time, we will likely see more of these solutions emerge that truly bring multiprotocol support together in a single device. Now let's examine the major vendors' updates to their unified storage lines over the past year.
• EMC. Early in 2011 EMC announced its unified platform, the VNX, as a replacement for the previous Clariion and Celerra product lines. This converged product family combines block and file protocols in a single platform, managed by the new Unisphere management software.

VNX hardware combines file and block protocols (delivered in separate enclosures), providing file I/O using X-Blades—hardware blades that were known as data movers in the previous Celerra solution. The VNX models start at the entry-level VNX5300, which supports a maximum of 125 drives, and run up to the VNX7500, which supports 1,000 drives, including flash, SAS and nearline SAS devices. Protocol support includes NFS, CIFS, MPFS, pNFS, FC, iSCSI and FCoE.

There is a range of additional functionality available with the VNX, including Fully Automated Storage Tiering for Virtual Pools (FAST VP) auto-tiering technology, FAST Cache (enhanced cache performance for random I/O workloads), data compression and thin provisioning.
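Auto-tiering features such as FAST VP rest on one underlying idea: track how often each extent of data is touched, then periodically promote the busiest extents to flash and demote the quietest to cheaper disk. A toy version of that promotion pass, with invented names and sizes:

```python
from collections import Counter

SSD_SLOTS = 2  # toy flash tier: room for the two hottest extents

access_counts = Counter()

def read_extent(extent):
    access_counts[extent] += 1   # the array tracks I/O per extent

# Simulated workload: extents "a" and "c" are hot, "b" and "d" are cold.
for extent in ["a", "c", "a", "c", "b", "a", "c", "d", "a"]:
    read_extent(extent)

# Periodic tiering pass: promote the most-accessed extents to flash,
# leaving everything else on the nearline tier.
hot = [extent for extent, _ in access_counts.most_common(SSD_SLOTS)]
tiers = {extent: ("ssd" if extent in hot else "nearline")
         for extent in access_counts}
print(tiers)
```

Production implementations differ in extent size, sampling window and how aggressively they move data, but the hot/cold promotion loop is the common core.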
EMC also offers the VNXe, which is aimed at small businesses. The VNXe combines block and file support into a single unit, scaling to a maximum of 120 drives on the VNXe3300 model. Although the VNXe is targeted at SMBs, the device still provides compression and thin provisioning technologies.

The VNX range has been upgraded over the last 12 months with faster processors and additional memory. SAS drives are now the standard disk connection.
• NetApp. NetApp FAS devices were among the first unified storage devices on the market, supporting file and block protocols in a single platform. As discussed earlier, the company's technology was also unique in supporting the protocol range within a single hardware device, rather than using block storage with a NAS gateway. In September 2011, NetApp upgraded its FAS2000 series with the release of the FAS2240. This provides support for up to 8 Gbps Fibre Channel and 10 Gbps Ethernet IP connectivity, covering NFS, CIFS and iSCSI protocols. The higher-end FAS arrays now support FCoE.

NetApp unified solutions are only available with Data Ontap 8 running in 7-Mode, which retains backward compatibility with previous platforms. This limits the size of file shares to 100 TB. Scalable NAS is available using Data Ontap 8 Cluster-Mode, but this doesn't provide unified capabilities.
• Hewlett-Packard. HP has a range of unified devices across its portfolio. The new X3000 series is based on Microsoft WSS 2008 R2 and runs on standard ProLiant hardware. Microsoft's WSS platform has a number of additional features, including single-instance storage (SIS) file deduplication technology, and file classification, which delivers policy-based management of files. The X3000 range provides only minimal internal storage and can be connected to external HP arrays, including the P4000, EVA and XP, but is not as fully integrated as solutions from other vendors.

The HP P4000 G2 Unified NAS Gateway offers similar functionality to the X3000 series in that it uses WSS 2008 R2 as a gateway, in this case to the LeftHand P4000 G2 array. The NAS gateway has no usable storage of its own and so has to be deployed in conjunction with the P4000 platform.
■ IBM. In 2010 IBM announced the
Storwize V7000, a new storage
platform based on its SVC stor-
age virtualisation technology. The
V7000 was enhanced in Octo-
ber 2011 to provide file and block
protocols through the addition of
a separate file module, built on
IBM’s existing Scale Out Network
Attached Storage (SONAS) tech-
nology. The V7000 offers a range
of additional functionality, includ-
ing support for SSD and thin provi-
sioning. Data tiering is implement-
ed with IBM’s Easy Tiering feature.
The IBM Active Cloud Engine tech-
nology implements policy-based
file management, automating
the movement of less frequently
accessed data to lower tiers of
storage and, where required, the
deletion of files.
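Policy engines of this kind typically compare a file's last-access time against age thresholds, demoting stale files to a cheaper tier and eventually deleting them. A rough sketch of the concept (an illustration only, not IBM's Active Cloud Engine code; the thresholds and names are hypothetical):

```python
import os
import shutil
import time


def apply_tiering_policy(primary, archive, demote_days=90, delete_days=365):
    """Move files unused for demote_days from the primary tier to the
    archive tier; delete archived files unused for delete_days."""
    now = time.time()
    for name in os.listdir(primary):
        path = os.path.join(primary, name)
        if os.path.isfile(path) and now - os.path.getatime(path) > demote_days * 86400:
            shutil.move(path, os.path.join(archive, name))
    for name in os.listdir(archive):
        path = os.path.join(archive, name)
        if os.path.isfile(path) and now - os.path.getatime(path) > delete_days * 86400:
            os.remove(path)
```

Run periodically, a policy like this keeps frequently accessed data on the fast tier while cold data migrates downward automatically.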
The Storwize V7000 supports
Fibre Channel up to 8 Gbps, 1
Gbps and 10 Gbps Ethernet for
iSCSI and NAS and a maximum of
240 drives, expandable in 12- or
24-drive bay enclosures for a total
capacity of 720 TB in a clustered
configuration.
■ Dell. In 2010 Dell acquired the
assets of Exanet, which provided
it with access to clustered NAS
technology. This technology was
released in June 2011 as the Dell
EqualLogic FS7500 Scale-out Uni-
fied Storage. This combines Equal-
Logic storage and the ExaStore file
system, rebranded as the Dell Fluid
File System. The FS7500 supports
iSCSI, CIFS and NFS protocols and
is scalable to a maximum system
size of 509 TB, with no restriction
on the size of a single file share. ■
Chris Evans is a UK-based storage consultant.
He maintains The Storage Architect blog and is
a contributor to SearchStorage.co.UK.
Networking and communications
rural broadband—
should residents pay?
With government funding and com-
mercial plans only reaching 90% of
the population, is it the right move
for residents to take their broadband
connections into their own hands?
By Jennifer Scott
jeremy hunt’s promise in 2010 that
the UK would lead Europe with its
broadband infrastructure within
five years raised a few eyebrows.
No one was against the senti-
ment of rolling out broadband to
every home in the UK by 2015—
even if they weren’t so happy with
the minimum speed of 2 Mbps—but
it seemed like a mammoth task for
the government to achieve alone.
Hunt expected the private sector
to get on board, and while the likes
of BT and Virgin Media were happy
to continue their deployments of
fibre infrastructure across the more
profitable areas of the UK, acres of
rural locations on this green and
pleasant land were left wanting
when it came to Internet connec-
tions.
The past two years have been
spent debating what the answer
should be to fill in these dead spots
across the British landscape and
trialling schemes to try to fix the
problem, but this week has been
about reporting back to the Houses
of Parliament on what has or hasn’t
been a success.
Rory Stewart, Conservative MP
for Penrith and the Borders, has
been one of the cheerleaders of
rural broadband, and as such has
worked with a number of com-
munities to find the best answer
for getting connections to remote
areas.
Current government schemes
have focused on a central pro-
curement process by local coun-
cils from large telecoms com-
panies and have cost millions of
pounds. However, Stewart has
been involved in trials which cost
significantly less by using more
innovative technologies and get-
ting residents to invest in their own
infrastructure.
“The assessments vary, but it
costs between £10bn and £40bn
for centralised procurement to
deliver fibre to every home,” he
said. “The government has decided
to spend £550m.
“Communities such as Great
Asby [a village in Cumbria which
has run its own broadband
scheme] have dropped this cost
dramatically to £60,000.”
“The very sad thing here is while
enormous progress has been
made…to get broadband [that
was] unimaginable to these com-
munities a few years ago…it is still
a problem getting this tiny sum of
money from government to fund
it.”
Funding Rural Broadband
Connections
It seems the only way these
schemes have got off the ground
is by local residents digging into
their own pockets to pay
for the infrastructure. But is it right
for them to pay their own way
when it comes to broadband, or
should both the public and private
sector be providing funding for
rural parts of the country?
Chris Conder, founding mem-
ber of B4RN (Broadband 4 Rural
North), admitted there was little
money left in government coffers
to increase its contribution and that
these areas were not commercially
viable for companies such as BT.
Conder said people in rural areas
should have the same rights of
access to the Internet as the rest of
the population. However, rather than
residents paying up front to make
the broadband roll-out happen, she
backed a suggestion raised by
Stewart of a “soft loan” from the
public sector to get the ball rolling.
The idea is that the government
could lend money to communities
to get set up, and each household
that signs up to the scheme can
pay it back over a long period at
low interest. This removes the risk
from spending taxpayers’ money
on unproven technical innovations,
but enables rural areas to go ahead
with plans that could otherwise be
delayed indefinitely while money is
raised.
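As a back-of-envelope illustration of how such a soft loan might be repaid, the standard amortised-loan formula spreads a community's cost across households. The £60,000 figure comes from the Great Asby example above; the household count, interest rate and term are assumptions for illustration only:

```python
def monthly_repayment(principal, annual_rate, years):
    """Standard amortised-loan payment:
    principal * r / (1 - (1 + r) ** -n), with r the monthly rate."""
    n = years * 12
    if annual_rate == 0:
        return principal / n
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -n)


# The £60,000 scheme cost is from the article; splitting it across
# an assumed 100 households, repaid at an assumed 2% over 10 years.
share = 60_000 / 100
payment = monthly_repayment(share, 0.02, 10)
print(f"£{payment:.2f} per household per month")
```

On these assumed figures the repayment comes to a few pounds per household per month, which is the sense in which a soft loan "spreads the load over many years".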
“There would have to be some
community participation to keep
the costs down to an affordable
level,” Conder said. “Communities
will do it if they believe in the proj-
ect; they don’t expect anything for
nothing and they will contribute,
but there is a limit.
“Soft loans that spread the
load over many years would be a
great incentive and very ‘big soci-
ety’. Some members of rural soci-
ety wouldn’t be able to contribute,
either health-wise or [due to] pov-
erty, but others will, and it gives a
great feeling of satisfaction to help
each other.”
Equal Rights to Internet Access
Fellow B4RN committee member
Martyn Dews said it should not
be down to citizens to pay for this,
saying that “people in rural areas
should have the same rights of
access to the Internet as the rest of
the population”.
However, he accepted that with
the lack of commercial viability and
the relatively small amount of gov-
ernment funding, it was a case of
“putting up with what is provided
or doing it themselves”.
“If it’s done right and there is
enough support, the community
will get the payback in the medium
to long term,” said Dews. “Imagine
a deeply rural community with a
1,000 Mbps connection—one of
the best in the world. That is what
B4RN is delivering now. Think of
the possibilities that will open
up.”
BT is keen on communities
getting on board with their own
deployments but said residents
should be more active in seeking
set-up funding from govern-
ment.
“We’ve always been very clear
that we believe that local com-
munities can make a substantial
contribution to bring fibre to their
area,” a BT spokesman said. “We
have publicly encouraged them to
engage with their local authority
and central government in lobbying
for funds.”
He said, however, that the funds
on offer from both private and
public sector contributors would
only cover 90% of the popula-
tion, leaving the final 10% on their
own. BT will put up more of its
own investment, however, only if it
is the chosen provider
for government schemes. “We
have indicated we would be willing
to invest further funds—of up to
£1bn—should we win many of the
public funds on offer,” added the
spokesman.
A number of residents from
Alston, another Cumbrian village
which has taken on its own broad-
band infrastructure, have pledged
their support for paying their own
way.
However, it was made clear by
both residents and campaign-
ers that it would not be their first
choice. All believed, as Dews
stated, that they should be given
the same provisions as the rest
of the country, but rather than
waiting for more money from gov-
ernment and the private sector,
it was more beneficial to get on
with the task at hand and put in
their own money.
Neither government, commercial
operators nor campaigners will
state it is the right thing
for residents to pay for their own
broadband, but all accept it may be
the only way to get everyone con-
nected while purse strings remain
tight. ■
Jennifer Scott is the networking editor for
ComputerWeekly.com and a contributor to
SearchNetworking.co.UK.