Amazon Elastic Compute Cloud (Amazon EC2) is one of the Amazon Web Services
(AWS) offered to developers so that they can leverage Amazon's data centers and robust
infrastructure to build web applications that are scalable, reliable, and
cost-effective. Before Amazon EC2, this kind of capability required a large upfront
investment in dedicated hardware, and the management process was complex: an
administrator had to look after the infrastructure full-time.
Amazon EC2 is a web service designed to make web-scale computing easier for
developers by providing resizable compute capacity in the cloud. Shifting computing
from individual application servers to a "cloud" of computers enables computer
resources to be pooled and managed through software. This simplifies IT
management and allows resources to be used efficiently.
About the “Cloud Computing” Method
“Cloud” is industry slang for data centers around the world. Cloud computing is a new
method of sharing computing resources: a process is divided into smaller, simpler
tasks, and multiple servers share those tasks. This allows many management tasks to
be automated, unlike an individual data center, which requires human administrators
to allocate resources. In terms of data storage, “cloud storage” creates virtual
servers for third parties and lets them manage their own data. In other words, the
companies own large servers and clients can purchase server capacity from them.
The benefits of cloud computing are the following:
1. Lower cost of space and electricity.
2. Faster processing, since tasks are shared across many virtual servers.
3. Easier maintenance of internet infrastructure.
This method has been applied to many internet applications, such as search engines,
webmail, and all kinds of web hosting.
Main Functions and Advantages
Amazon EC2 allows one to access a virtual computing environment in which
applications run on a virtual machine with, at a minimum, 1.7 GB of RAM, 160 GB of
instance storage, a 32-bit platform, 250 Mb/s of network bandwidth, and 1 EC2
Compute Unit (1 virtual core with 1 EC2 Compute Unit). This is a powerful machine
that can be deployed easily, and instances of an image can be started and stopped
remotely through web service calls. When a more powerful virtual machine is needed,
one can opt for the Large Instance or Extra Large Instance, which provide 7.5 GB and
15 GB of memory, respectively. Users remain in full control of their instances, and
many instance types are available so users can choose the one that suits them best.
Amazon EC2 works in terms of Amazon Machine Images (AMIs). An AMI serves as the
operating system image and is stored in Amazon Simple Storage Service (S3). Web
service calls create instances from these images and assign them to virtual machines
to start running an application. One can create a distinct AMI for each tier of an
application and then spawn one or more instances of that tier based on the load,
making requests to the Amazon EC2 service to start, stop, and monitor the processes.
What the service basically does is reduce the time required to obtain and boot a new
server instance to just minutes. This, in turn, allows one to quickly scale capacity
up or down as computing requirements change.
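The load-based scaling decision described above can be sketched as a small piece of
logic. This is an illustrative example only, not an Amazon API: the function names and
the assumption of a fixed capacity per instance are ours.

```python
import math

def instances_needed(load_units: float, units_per_instance: float = 1.0) -> int:
    """How many instances of a tier to run for the expected load.

    Keeps at least one instance alive and rounds up, so capacity
    never falls short of demand.
    """
    return max(1, math.ceil(load_units / units_per_instance))

def scaling_action(running: int, load_units: float,
                   units_per_instance: float = 1.0) -> int:
    """Positive result: start that many new instances; negative: stop that many."""
    return instances_needed(load_units, units_per_instance) - running
```

For example, with 2 instances running and a load requiring 5 units at 2 units per
instance, the sketch would call for starting 1 more instance; the actual start and
stop would then be issued as EC2 web service calls.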
The service is particularly useful to start-ups that need massive compute resources
to handle traffic at launch. Prior to Amazon EC2, the cost of setting up was
prohibitive, given that so much horsepower was needed for only a short period of
time. Small start-ups and developers do not have the capital to acquire all the
resources needed to guarantee that they can handle unexpected spikes in load.
Without Amazon EC2, developers must acquire and pay in advance for hardware,
storage, and network bandwidth in anticipation of high traffic. The main beauty of
Amazon EC2 lies in the concept of paying only for the capacity that is actually
used, so there is no upfront fee. That way, the whole process is inexpensive whether
the business succeeds or fails. Another great feature is that Amazon EC2 lets users
scale capacity up or down within minutes as resource demand changes; there is no
need to contact a dedicated hosting provider and wait.
Amazon EC2 introduces a low-cost, pay-per-use pricing model. It charges 10 cents per
clock hour, which allows one to run as many virtual machines as necessary while
being charged only for the time the servers are running and for the bandwidth used.
Given the pricing of 10 cents per clock hour for the Small Instance, 40 cents for
the Large Instance, and 80 cents for the Extra Large Instance, developers can plan
ahead, determine the best option for their business, and decide whether it is time
to switch to virtual machines. Businesses, especially small start-ups, benefit most
at the onset because they get access to powerful resources inexpensively.
The data transfer charge is 10 cents per GB for data transferred in and up to 18
cents per GB for data transferred out. Data transfer between EC2 instances is free
of charge, so partner organizations that require a lot of internal data exchange can
use this server environment to reduce their costs. The built-in flexible payment
system gives Amazon.com a competitive advantage.
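Putting the quoted 2008 rates together, a rough bill estimate can be sketched as
follows. The rate table reflects the prices quoted above; the function itself is our
own illustration. Note that EC2 bills each partial clock hour as a full hour.

```python
import math

# 2008 list prices quoted in the text, in USD
HOURLY_RATE = {"small": 0.10, "large": 0.40, "xlarge": 0.80}  # per instance-hour
TRANSFER_IN = 0.10    # per GB transferred in
TRANSFER_OUT = 0.18   # per GB transferred out (top tier)

def estimate_cost(instance_type: str, runtime_hours: float,
                  gb_in: float = 0.0, gb_out: float = 0.0) -> float:
    """Estimate the bill for one instance plus its data transfer."""
    billed_hours = math.ceil(runtime_hours)  # partial hours billed as full hours
    return (HOURLY_RATE[instance_type] * billed_hours
            + TRANSFER_IN * gb_in + TRANSFER_OUT * gb_out)
```

For instance, a Small Instance run for only 10 minutes still costs a full hour
(10 cents), while an Extra Large Instance running around the clock for a 30-day
month comes to 0.80 × 720 = 576 dollars before bandwidth.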
Amazon EC2 is well suited to businesses or sites that tend to experience surges in
traffic, mainly those built on user-submitted data (e.g., sites similar to
Youtube.com, Digg.com, and Slashdot.com). GigaVox Media, for example, has built the
infrastructure for its podcast and videocast technology on Amazon Web Services.
Although no official critique has been published since its inception (it is still a
Beta version), many users have contributed significant feedback on this new service.
There are several issues and concerns that users have raised:
1. Billing is per instance-hour: if a server runs for less than an hour, a full
hour is charged. New developers who simply try out the service by running many
small programs that take only a few minutes each (as they could have done on
their own private server) can end up paying a significant amount.
2. Institutions that use huge amounts of storage (and correspondingly large data
transfer in and out of Amazon EC2) may find the pricing structure costly and may
be better off with other options. This may be true for academia, high-end
commercial research, and similar fields.
3. There is no persistent storage. This means everything written during runtime
is lost: if the machine fails, the runtime work has to be redone by reloading
the original AMI.
4. Lack of an SLA (Service Level Agreement): the AWS terms of agreement have been
the subject of considerable dispute. This is a quote from AWS:
“We further reserve the right to discontinue Amazon Web Services, any Services, or
any portion or feature thereof for any reason and at any time in our sole discretion.
Upon any termination or notice of any discontinuance, you must immediately stop
your use of the applicable Service(s), and delete all Amazon Properties in your
possession or control (including from your Application and your servers). Sections
3, 5, 8 - 12, any definitions that are necessary to give effect to the foregoing
provisions, and any payment obligations will survive any termination of this
Agreement and will continue to bind you and us in accordance with their terms.”
This suggests that customers who fully depend on AWS services risk losing all
their data. Large organizations would rather purchase their own infrastructure
than entrust data to AWS, since AWS offers no guarantee of data safety. Customers
who have read the licensing agreements of other software or web services may have
seen similar wording and treat it as typical terms protecting the service
provider; nevertheless, many businesses refuse to use Amazon EC2 because it is
still a Beta version and they fear the service may be terminated in the future.
Especially for businesses that have implemented a disaster recovery plan, AWS is
probably not the first choice.
Although Amazon EC2 offers an autonomous, scalable, and cost-effective computing
environment, it is a relatively new technology, and its competitors are companies
that offer dedicated hosting services, Virtual Private Server (VPS) hosting, and a
handful of online storage companies. The main selling points of Amazon EC2 are its
inexpensive pricing and Amazon's robust infrastructure. One disadvantage of Amazon
EC2 is its poor data recovery compared to VPS hosting, which means long recovery
times for businesses with huge databases. Google provides services similar to
Amazon's, such as Google BigTable and Google Base, which are other current
competitors in this market: super servers that provide large amounts of storage.
Other potential competitors are Sun Microsystems Inc. and IBM Corp., which tend to
sell pricier offerings aimed at large-scale organizations. Among these competitors,
what makes Amazon EC2 unique is its utility-style pricing: unlike its competitors,
it requires no contract or minimum payment.
The technology holds promise and currently stands as the solution to most start-up
problems. It will not be long before other companies get into the game and take a
piece of Amazon EC2's market share. Future competitors will surely work on better
data storage and recovery systems, or perhaps offer cheaper pricing schemes that
benefit those who use huge amounts of data. But as it stands, Amazon EC2 has a head
start and enough time to improve the technology.
A managed-service provider (MSP) is a company that delivers information technology,
such as software and technical support, to other organizations over the internet.
Although MSPs do not compete directly with Amazon EC2, they are planning to expand
their services to provide a basic data computing environment at more competitive
prices. Microsoft, for example, is aiming at value-added resellers (VARs) to attract
smaller businesses, with prices ranging from free to $39.95 per month.
Amazon seems to have made good progress: as of late 2007, more than 290,000
developers had registered for the EC2 service since its inception. Many competitors
in this field threaten Amazon's position, so it is difficult to tell whether Amazon
can turn web services into a sustainable business. Although several issues may
affect the efficiency of Amazon EC2, it still has large potential, as it is
currently still in Beta. The suggestions for this new service are the following:
1. Enhance the EC2 server with persistent storage: this is the major headache for
current users. Although the system is fast, it is not persistent. The purpose of
this enhancement is to prevent data from being lost due to sudden incidents such as
a power failure. The more secure the service, the more attractive it is to
customers, who will be willing to pay more for a strong, stable server.
2. Promote the use of Amazon EC2 in combination with Amazon S3: Amazon S3 provides
storage on the internet. Since both services are from AWS and there is no data
transfer fee between them, S3 can provide instant storage for EC2 while EC2 is
processing. The integration of the two services will make AWS a more powerful web
service platform.
3. Establish a stronger warranty for the services: it is necessary to regain
customers' confidence in Amazon EC2. Since many developers hesitate over the
sustainability of the service due to the lack of an SLA, the stability of the
Amazon EC2 web application environment needs to be enhanced to make a better
impression on customers.
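Suggestion 2 above — persisting EC2 data into S3 — works through S3's authenticated
REST calls. The sketch below shows how an application on an EC2 instance would sign
an S3 PUT request using S3's documented HMAC-SHA1 scheme; the bucket name, object
key, and credentials are hypothetical values for illustration only.

```python
import base64
import hashlib
import hmac

def sign_s3_put(secret_key: str, bucket: str, key: str,
                content_type: str, date: str) -> str:
    """Compute the S3 'Signature' value for a PUT request.

    S3 authenticates REST requests with base64(HMAC-SHA1(secret_key,
    string_to_sign)), where the string to sign joins the HTTP verb, the
    Content-MD5 and Content-Type headers, the date, and the resource path.
    """
    string_to_sign = "\n".join([
        "PUT",               # HTTP verb
        "",                  # Content-MD5 (empty here)
        content_type,
        date,
        f"/{bucket}/{key}",  # canonicalized resource
    ])
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Hypothetical credentials and names, for illustration only:
sig = sign_s3_put("EXAMPLE-SECRET-KEY", "my-ec2-backups", "snapshot.tar",
                  "application/x-tar", "Tue, 27 Mar 2007 21:15:45 +0000")
```

The request would then carry a header of the form
Authorization: AWS <access-key-id>:<signature>, and since EC2-to-S3 transfer is
free, such backups add no bandwidth cost.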
Amazon Web Services' computing cloud has opened a new market segment for businesses
and individual developers, with affordable prices and a highly efficient computing
environment. There is still plenty of room to implement and enhance a superior
service for customers, and the many players that have jumped in or expanded from
other web service markets will speed up the development and competitiveness of this
market.