Examination Case Study
Module Code: 2BBC601
Module Title: Global Business Computing Management
Year: Summer 2009
Instructions to candidates:
You may bring up to four (4) sides of pre-prepared notes to the examination.
You should hand in your pre-prepared notes at the end of the examination.
No other books, notes or material may be brought into the examination.
Dictionaries are NOT permitted.
You will be given a clean copy of the case study at the start of the examination.
You should acknowledge the sources of your information (including diagrams) and
include a list of references.
You should answer all questions which are based on the case study.
(You may bring a calculator if you wish but you are unlikely to need it).
The duration of the examination is 2 hours and 15 minutes.
Please note: you should read around the subject of the case study; do not assume the
case study alone will be enough.
Case Study Part I:
No Man Is an Island: The Promise of Cloud Computing
Published: April 01, 2009 in Knowledge@Wharton
"Cloud computing" promises myriad benefits -- including cost savings on technology
infrastructure and faster software upgrades -- for users ranging from small startups to
large corporations. That's an auspicious future considering that not everyone agrees
on exactly what cloud computing is or what it can do.
Despite the ethereal name, in its broadest terms, the concept of cloud computing is
fairly simple. Rather than running software on its own computers -- "on premises" as
the terminology goes -- a company buys access to software on computers operated by
a third party. Typically, the software is accessed over the Internet using only a web
browser. As long as the software performs properly, it doesn't matter where the
systems that run it are located. They are "out there somewhere" -- in "the cloud" of
the Internet. Since companies tend to purchase access to this remote software on a
subscription basis, cloud computing is also often termed "software as a service."
"Cloud computing refers to a number of trends related to pushing computing
resources -- hardware, software, data -- further into the network," said Kartik
Hosanagar, a Wharton professor of operations and information management who
moderated a panel discussion on cloud computing at the 2009 Wharton Business Technology Conference.
These days, no computer user is an island. A recent study determined that 80% of the
data used by business comes from outside the company. Cloud computing "is the
technical response to this reality," said panel participant Anthony Arott of anti-virus
software company Trend Micro, based in Cupertino, Calif.
A somewhat broader definition of cloud computing comes from another expert on
that panel, Barry X. Lynn, CEO of "cloud platform" provider 3tera of Aliso Viejo,
Calif. "A lot of people define the cloud as having the computers be someplace else.
And that's not true," he said. "People have run IT in data centers they didn't own for
years. In the 1970s, we called that 'remote job entry.' In the 1990s, it was 'outsourced
data centers.' It's not a new concept."
Lynn suggested that true cloud computing isn't simply about adding physical distance
between the user and the computer that's doing the grunt work. What's new is "when
you abstract the computer from the physical resources." In other words, you no longer
have specific machines -- no matter where they are located -- dedicated to specific
functions or software applications. Instead, you have a piece of software running
across a pool of machines, making optimal use of all the available hardware.
In between these explanations of cloud computing lies a variety of products and
services, all of which claim to offer a number of advantages -- lowered investment in
hardware, more efficient use of computing systems in existing data centers, easier
scale-up of the applications and services. These approaches are now possible due to
faster and more pervasive communications. As bandwidth has become cheap and
readily available, and transmission speed is no longer an impediment, it's possible to
store data and run software anywhere for users to access from wherever they want.
Backing (Up) Consumers
According to Prasanna Krishnan, an associate at Menlo Park, Calif.-based venture
capital firm Draper Fisher Jurvetson and also a panelist at the Wharton conference,
the easiest examples for most people to grasp may be consumer web applications
such as Microsoft's Hotmail, Google's Gmail and YouTube, and Yahoo's Flickr
photo-sharing service. Consumers run only their browsers on local computers. The
rest of the software -- along with users' email messages, photos or videos -- is on
remote machines the user can't see and doesn't have to know anything about -- as if
hidden in the clouds.
Another conference panelist, Vance Checketts, general manager of Decho, based in
Pleasant Grove, Utah, described his company's service, Mozy, as a "cloud" offering.
Mozy lets users back up their home computer data online. "We have 18 petabytes [18
million gigabytes] backed up now across about a million users. It's cloud technology,
but Mozy got started with just a bunch of cheap off-the-shelf disks."
Google extended its successful webmail model by introducing Google Docs -- online
versions of word processor and spreadsheet applications, software that traditionally
runs on users' PCs. It is joined in that market by others, including Zoho, of
Pleasanton, Calif., which offers a suite of online collaboration and business
applications. These convenient online tools have helped to fuel the market for
netbooks -- lightweight portable computers which contain minimal data storage and
computing capacity, and carry price tags usually under $400. By taking advantage of
online applications and storage, users have the option to spend less money on hardware.
Reducing -- or eliminating -- hardware and other operating costs naturally also
appeals to corporate users, many of whom are moving toward subscription-based
"software as a service" (abbreviated SaaS). Online business applications offered by
companies such as Salesforce.com (for customer relationship management) and
Workday (for human resources and financial software) can not only replace
expensive programs that would run on companies' premises, they can reduce the need
for corporate computer servers and the related costs of maintaining them. With SaaS,
companies pay subscription fees for usage rather than licensing costly enterprise
software. SaaS is a growth industry: A new study by Forrester Research concludes
that even in the current recession, software-as-a-service providers are seeing double-
digit growth in their subscription revenue. Ariba, a Sunnyvale, Calif.-based
procurement software-as-a-service company that had been left for dead after the
Internet bubble burst, saw a 73% jump in its subscription revenue, from $18.8 million
in the third quarter of 2007 to $32.6 million in the same period in 2008.
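As a quick sanity check on the growth figure quoted above (a simple calculation, not part of the original article):

```python
# Ariba's subscription revenue rose from $18.8 million (Q3 2007)
# to $32.6 million (Q3 2008), per the Forrester figures cited above.
q3_2007 = 18.8  # millions of dollars
q3_2008 = 32.6

growth = (q3_2008 - q3_2007) / q3_2007
print(f"Year-over-year growth: {growth:.0%}")  # → 73%
```

The result matches the 73% jump reported in the article.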
Other companies have expanded into the cloud by offering data-center resources as
more generic "computing as a service." Google, which maintains vast warehouses of
servers to run its own software applications, also offers a service called Google
AppEngine that allows businesses to develop and run their own programs on Google's
servers. Amazon has a similar offering called the Elastic Compute Cloud, or EC2.
These services offer companies a place to host applications and data under a pay-for-
usage model -- called "utility computing" because it is ready on demand, just like
turning on the lights or the water faucet. Customers pay by unit of consumption,
whether it's storage space or computing time, and can scale usage up or down
quickly. These computing services are particularly attractive when companies want to
develop and test new applications without interfering with existing systems, and they
can offer "hot," or ready-to-use, backups of the applications in use.
Back to the Future
The notion that a company has a "private cloud" on its premises might seem contrary
to the concept of cloud computing, but cloud-like features can also have advantages
in corporate data centers. Lynn from 3tera gave a historical analysis of how
computing architectures evolved. Decades ago, he said, "you had a giant mainframe,
and everything ran on it. If you ran out of capacity, you would either make it bigger
or get another giant mainframe." Then, client/server systems came along to distribute
processing between central computers or servers and the PCs at users' desks. Still,
however, every machine in the data center had to be dedicated to a specific software
function or application.
The newer technology of virtualization permits one piece of hardware to act as
multiple "virtual machines" and be dedicated to multiple functions. This makes more
efficient use of hardware, but each virtual machine still must be dedicated to a
specific software function. "What cloud computing really changes is [that] now you
don't have specific machines, or virtual machines, dedicated to specific functions.
You have a pool of machines. Anything can run anywhere" -- even in a company's
private data center, Lynn said.
Traditional corporate data centers can be inefficient. Businesses equipped for peak
workloads may have servers that are underutilized much of the time. In a private
cloud, a group of a company's existing computers can be brought together as a
computing pool -- and an application "can just grab any available hardware and then
give it back," said Lynn. "The term we use is 'disposable information technology
infrastructure.'" The software from 3tera acts as a conductor, parceling out
components of an application to different computers in a cloud like a taxi dispatcher.
"There is no architectural reason why you can't have 20 different machine types
involved," Lynn noted, although performance is optimized if the machines are similar.
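The pooling idea Lynn describes can be sketched in a few lines of code. This is an illustrative toy model (not 3tera's actual software): a dispatcher hands any free machine to the next task and reclaims it when the task finishes.

```python
class CloudPool:
    """Toy model of a compute pool: any task can run on any free machine."""

    def __init__(self, machines):
        self.free = set(machines)   # machines currently available
        self.busy = {}              # task name -> machine running it

    def dispatch(self, task):
        """Grab any available machine for the task, like a taxi dispatcher."""
        if not self.free:
            raise RuntimeError("pool exhausted -- no free machines")
        machine = self.free.pop()
        self.busy[task] = machine
        return machine

    def release(self, task):
        """Give the hardware back to the pool when the task finishes."""
        self.free.add(self.busy.pop(task))

pool = CloudPool(["m1", "m2", "m3"])
m = pool.dispatch("web-frontend")   # runs on whichever machine is free
pool.release("web-frontend")        # hardware returns to the pool
```

The point of the sketch is that no machine is dedicated to any function: the application simply grabs available hardware and gives it back, which is what makes the infrastructure "disposable."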
For some corporate users, keeping the cloud in-house alleviates the security and
privacy concerns that can come with running key applications and data outside the
company. However, cloud providers insist that data is safer and less vulnerable with
them. Companies that provide storage and computing services maintain state-of-the-
art facilities and implement security updates immediately.
Lynn believes that eventually IT "will evolve to an almost completely external
cloud," and he sees it as a natural progression. "If you're in the health care business,
or financial services or manufacturing, why would you ultimately be spending
hundreds of millions or billions of dollars on IT infrastructure? And the answer is,
you've had no choice," he said. "If you woke up this morning and read in The Wall
Street Journal that, say, Overstock.com has stopped using UPS and FedEx and the
U.S. mail, and had bought fleets of trucks and started leasing airport hubs and
delivering products themselves, you would say they were out of their minds. Why is
that much more insane than a health care company spending $2 billion a year on IT?"
Panelists at the Wharton conference encouraged students in the audience to take
advantage of cloud computing as entrepreneurs. Those thinking of offering innovative
online services -- in the hopes of becoming the next Facebook or Twitter -- will need
a way to ramp up their capacity quickly if all goes well. With a cloud-based service,
expansion capacity is as close as you can get to unlimited, panelists noted.
Money is a factor, too, of course. A startup of any type can get the bulk of its
computing resources on a pay-as-you-go basis, said Wharton's Hosanagar. "You don't
have to worry about these big up-front fixed costs. I've had student startup companies
on minuscule budgets." Added panelist Jonathan Appavoo, a research scientist at
IBM: "Startups are the killer app for the cloud. As students with a good idea, this is a
playground. You can be the driver of all this stuff. Computation now is so accessible,
and we have an opportunity to dramatically change how things work."
Case Study Part II:
Cloud Thinking: Amazon, Microsoft, and Google
Posted By: Michael J. Miller
Wednesday November 12, 2008
A number of people have asked me how Microsoft's new Windows Azure platform
compares with Amazon's EC2 platform or what Google is doing with cloud services.
It seems to me that, at least for now, these so-called "cloud platforms" aren't really as
competitive with one another as it seems. Below is what the typical user needs to
know about these platforms, and what they may mean for the future of applications.
First, let's clarify some terms. In this piece, I want to talk about the major platforms
for cloud computing - in other words, the type of software a developer would use to
host an application. End-users won't use any of these tools directly - instead, they
might very well use applications deployed over the cloud, ranging from business
applications like Salesforce.com or NetSuite to consumer applications such as online
mail, Google Apps, or Picnik. I am a big fan of such "software as a service" solutions,
but I'll write about them some other time. Rather, these platforms would be used by
developers to build online applications ranging from stand-alone web businesses to
internal corporate applications.
Also, let me give a personal caveat. Other than a few Excel macros, I haven't done
any real programming in years. And it's been even longer since I've been a
professional programmer. So I haven't written an application on any of these
platforms - just sat in on a number of technical sessions, and talked to a few friends
who do program, create applications, or run web sites.
Clearly the cloud platform that has gotten the most attention is Amazon Web
Services, which is a collection of a variety of tools, mostly at a very low level. The
one that gets the most attention is called the Amazon Elastic Compute Cloud (EC2), a
web service that lets you assign your application to as many "compute units" as you
would like, whenever you need them. To give you an idea, a single default instance
includes 1 "virtual core," 1.7 GB of memory, and 160 GB of instance storage (which
is storage that is used only for that session) for 10 cents an hour. On top of this, you
might want to use the company's Simple Storage Service (S3), which costs 15 cents
per GB per month for the first 50 terabytes [50 thousand gigabytes] of data, and goes
down from there, plus charges for certain transactions. You might also want to use
the company's "Simple DB" database, or its queue for storing messages, which have
their own pricing.
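Using the prices quoted above, a rough monthly bill is easy to estimate. The workload figures in this sketch (one instance running continuously, 200 GB in S3) are hypothetical, chosen only to illustrate the pay-per-use arithmetic:

```python
# Rough monthly bill at the 2008 rates quoted in the article:
#   EC2 default instance: $0.10 per hour
#   S3 storage:           $0.15 per GB per month (first-50-TB tier)
ec2_hours = 24 * 30   # one instance running all month (hypothetical)
storage_gb = 200      # hypothetical amount of data kept in S3

ec2_cost = ec2_hours * 0.10
s3_cost = storage_gb * 0.15

print(f"EC2: ${ec2_cost:.2f}, S3: ${s3_cost:.2f}, "
      f"total: ${ec2_cost + s3_cost:.2f}")
# → EC2: $72.00, S3: $30.00, total: $102.00
```

At roughly $100 a month for a continuously running server plus storage, the appeal to startups with no capital budget is easy to see.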
The Amazon platform's basic advantage is simple: you can just use the amount of
storage you want, when you want it. The platform itself seems to be very low-level
and very flexible. There are now lots of "machine images" with various operating
systems, databases, application development environments, etc. Most of the people I
know who have used it have done so with various open source tools, often Ruby on
Rails, but Amazon has recently rolled out a beta version running Windows Server and
Microsoft SQL Server. In the future, Amazon says it plans to add features like load
balancing, automatic scaling of compute capacity, and more monitoring and management features.
Most of the people I know using it fall into two groups. Either they are starting web
businesses, and want to have access to the computing power they need, but don't want
to spend money or hire staff to buy and maintain equipment; or they have an existing
environment that has very inconsistent computing needs, such as a web site that
occasionally gets huge spikes in traffic, or a business that has a particular application
it only needs to run sporadically. Most of the companies I know that start out on EC2
eventually do end up with their own servers (typically in a hosted or co-location
environment) once the businesses start going, because that turns out to be more cost
effective, but many also continue to use EC2 for handling spikes in traffic.
Google's App Engine is newer. It is still in a free beta stage, and the tools so far are a
bit more restrictive. As I understand it, where Amazon gives you a virtual machine
you can install pretty much any software on, Google pretty much gives you a fixed
environment, based around the Python language, the Django development framework,
Google's BigTable database/storage system and Google File System (GFS). For now,
developers get 500 MB of storage and compute power for up to about 5 million
pageviews per month for free, and the company has announced pricing for more
active sites. For instance, the company says developers should expect to pay 10 to 12
cents per CPU core-hour.
[Note: Google has now introduced an additional cloud computing offering which
supports the Java language].
Because this has been built so closely around Google's own operating environment, it
should be relatively easy for developers who know those frameworks to get started.
But some developers have shied away from it because it seems more restrictive than
Amazon's solutions. Still, there are plenty of developers using it, and lately, I've heard
a few other companies talking about how you might port App Engine applications to other platforms.
Microsoft formally entered this space only at its Professional Developers Conference
last month [October 2008], with the announcement of its Windows Azure platform
for cloud service development. Azure is still in beta, and Microsoft hasn't given a
timetable for making it a released platform, but developers have just started to be able
to access what the company calls its Community Technology Preview.
Azure has gotten a lot of attention mainly because it looks like a natural extension of
Microsoft's development environment, which is widely used, especially within
corporate development shops [IT departments].
Like Amazon Web Services, Azure actually consists of a variety of different services
on top of a common platform. In fact, Microsoft used the phrase "Azure Services
Platform" to include everything from developer-focused tools such as .NET services
and SQL services, to corporate applications, such as SharePoint Services and
Dynamic CRM Services, to Live Services, a term that has included the company's consumer offerings.
But the .NET services are the ones getting the most attention now, because that is
how developers would write for the platform initially. Indeed, from the sessions I
attended, it looks like applications written for the .NET framework and developed in
Visual Studio can be moved to "the cloud" relatively easily. Initially, Azure is
designed for such "managed applications" though Microsoft has talked about opening
the platform to "unmanaged applications" and other frameworks in the next year. The
basic version of Azure includes a simple database, but developers will also be able to
connect with a cloud-based version of SQL Server.
One big difference with Azure is that while Microsoft intends to offer its own hosted
Azure services, the platform is designed to be able to run on local computers or
corporate servers as well. This should make it easier to test the application, and also to
allow corporate applications to be run within a company's network as well as outside.
This could make it easier to build internal applications that can scale to the cloud to
meet peak demand; or to take existing .NET applications and modify them for the
cloud. The demonstrations of converting applications and adding server instances all
looked good, but the code shown at PDC (Professional Developers Conference) was
still unfinished, and how easy this will be in the real world remains an open question.
So looking at these three players, you see each of them playing to their strengths.
Amazon was early in the market, and has leveraged Internet standards and open
source platforms to create a very flexible platform. Google is leveraging the work it
has done with big databases and its internal development methods to create a
powerful but more restrictive environment. And Microsoft is leveraging its traditional
strength with developers and the breadth of its tools to create perhaps the largest array
of services. Over time, my guess is that we'll see all of them start to converge -
presaged perhaps by Amazon's introduction of Windows Server instances.
There are, of course, lots more companies in this field, ranging from various makers
of Software-as-a-Service (notably Salesforce.com) to many smaller vendors offering
hosted applications or co-location facilities, or tools for managing cloud applications.
But that's a topic for next time.
Glossary of Terms
Sources: SAP Community Network and HighScalability.com
The Microsoft .NET Framework is a software component that can be added to or is
included with the Microsoft Windows operating system. It provides a large body of pre-
coded solutions to common program requirements, and manages the execution of
programs written specifically for the framework. The .NET Framework is a key Microsoft
offering, and is intended to be used by most new applications created for the Windows platform.
Bigtable is a distributed storage system for managing structured data that is designed to
scale to a very large size: petabytes [millions of gigabytes] of data across thousands of
commodity servers. Many projects at Google store data in Bigtable, including web
indexing, Google Earth, and Google Finance. SimpleDB is Amazon's equivalent of Bigtable.
A colocation centre (collocation center) ("colo") or carrier hotel is a type of data center
where multiple customers locate network, server and storage facilities.
Data Centre
A facility used to house mission-critical computer systems and associated components.
Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides resizable
compute capacity in the cloud. It is designed to make web-scale computing easier for developers.
Horizontal Scaling
Scaling out by incrementally adding more machines to handle more work. The new era of
cheap yet powerful computers has made horizontal scaling possible for virtually anyone.
Hybrid Model
This is a software deployment model that combines on-premise software with on-demand
services.
Infrastructure as a Service (IaaS)
The Infrastructure Layer encapsulates and virtualizes datacenter facilities, including
physical servers and disks and the network fabric, and exposes these to the higher layers.
Java is a programming language originally developed by Sun Microsystems and released in 1995.
Load Balancing is a scalability solution where work is allocated amongst multiple servers.
Web servers can allocate requests amongst machines and processes.
Multi-tenancy
The ability for a public cloud to support multiple tenants.
Platform as a Service (PaaS)
PaaS refers to the full lifecycle by which applications can be built, deployed and run on a
cloud platform.
Private Cloud
A Cloud that is only available for use by internal customers.
Public Cloud
A Cloud that has multiple external customers.
Ruby on Rails (Django)
Ruby on Rails is a framework for developing database-backed web applications. Django is
a similar framework.
Amazon S3 is storage for the Internet. It is designed to make web-scale computing easier
for developers. http://aws.amazon.com/
Scalability is the ability to keep solving a problem as the size of the problem increases.
Scale is measured relative to your requirements. As long as you can scale enough to solve
your problem then you have scale. If you can handle the number of objects and events
required for your application then you can scale. It doesn't really matter what the numbers are.
Software as a Service (SaaS)
SaaS vendors provide applications to end users on demand. Instead of purchasing software
for on-premise use, customers license use of applications as a service over the web, and
may pay based on number of users accessing applications or by utilization. The term SaaS
has become the industry preferred term, generally replacing the earlier terms Application
Service Provider (ASP), On-Demand and "Utility computing".
Microsoft SQL Server is a relational database management system (RDBMS) produced by Microsoft.
Amazon Simple Queue Service (Amazon SQS) offers a reliable, highly scalable hosted
queue for storing messages as they travel between computers.
Tenants
Different customers on a public cloud that share the same application.
Virtualization is an abstraction layer that decouples the physical hardware from the
operating system to deliver greater IT resource utilization and flexibility. Virtualization
allows multiple virtual machines to run in isolation, side-by-side on the same physical machine.
Web Framework (e.g. Ruby on Rails, Django)
A web application framework is a software framework that is designed to support the
development of dynamic websites, Web applications and Web services. The framework
aims to alleviate the overhead associated with common activities used in Web
development. Ruby on Rails and Django are examples of web frameworks.
The term Web server can mean one of two things: 1. A computer program that is
responsible for accepting requests from clients, which are known as Web browsers, and
serving them responses, which usually are Web pages. 2. A computer that runs a computer
program which provides the functionality described in the first sense of the term.