4. Whoever first thought of creating a cloud must have asked:
“How can one earn from a computer network?”
A computer network used to earn money → Cloud Computing.
5. Evolution of Sharing on the Internet
Networking → Networking Networks
• Multiple regional networks linking computers.
• Initially at universities and national labs.
Network Sharing → Internetworking & the Internet
• Internetworking of regional networks with TCP/IP.
• Began to replace regional alternatives.
• Worldwide adoption; file transfer.
Information Sharing → The World Wide Web
• HTML page format, the HTTP protocol, and the Mosaic browser for document exchange.
• Initially in universities; worldwide adoption.
Resource Sharing → Grid Computing
• Standards & software for sharing remote resources and for collaboration.
• Mainly used for highly scalable HPC jobs.
Service Sharing → Cloud Computing
• Everything as a service over the web.
• SaaS, utility computing, IT services, …
• Ubiquitous, always available, scalable, …
9. How to use/access Amazon Web Services?
To create an AWS account (free of charge), you need:
1. An Internet connection
2. An email ID (or create a new email ID)
3. A contact number
4. Credit/debit card details (a small verification charge of INR 2 applies)
16. Amazon S3 (Simple Storage Service)
Storage for the Internet.
Provides a simple web interface to store and retrieve any amount of data, at any time, from anywhere on the web.
Stores virtually any kind of data in any format, with unlimited overall capacity.
S3 objects can range from 0 B to 5 TB.
The largest object that can be uploaded in a single PUT is 5 GB.
Amazon stores your data and tracks its associated usage for billing purposes only; it does not otherwise access your data, except when required to do so by law.
Amazon stores its own data in Amazon S3.
Amazon S3 offers a Service Level Agreement (SLA).
You specify an AWS Region when you create your Amazon S3 bucket.
For the S3 Standard, S3 Standard-IA, and Amazon Glacier storage classes, your objects are automatically stored across multiple devices spanning a minimum of three Availability Zones.
Availability Zones are multiple, physically separated and isolated locations within one AWS Region.
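The size limits above determine how an upload must be made: anything over a single 5 GB PUT has to go through S3's multipart upload mechanism. A minimal sketch of that decision (the helper name is illustrative, and binary units are used for simplicity):

```python
# Size limits from the slide: 5 GB max per single PUT, 5 TB max object size
# (binary units used here for simplicity).
SINGLE_PUT_LIMIT = 5 * 1024**3   # 5 GiB
MAX_OBJECT_SIZE = 5 * 1024**4    # 5 TiB

def choose_upload_method(size_bytes: int) -> str:
    """Pick an S3 upload method for an object of the given size."""
    if size_bytes > MAX_OBJECT_SIZE:
        raise ValueError("object exceeds the 5 TB S3 object-size limit")
    if size_bytes > SINGLE_PUT_LIMIT:
        return "multipart"   # must be split into parts above 5 GB
    return "single PUT"

print(choose_upload_method(100 * 1024**2))  # a 100 MiB object -> single PUT
```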
17. Amazon S3 Pricing
With Amazon S3, you pay only for what you use. There is no minimum fee.
You can estimate your monthly bill using the AWS Simple Monthly Calculator.
“We charge less where our costs are less.” Billing prices are based on the location of your bucket.
There is no data transfer charge for data transferred within an Amazon S3 Region via a COPY request.
Data transferred via a COPY request between AWS Regions is charged at the rates specified in the pricing section of the Amazon S3 detail page.
There is no data transfer charge for data transferred between Amazon EC2 and Amazon S3 within the same Region.
Data transferred between Amazon EC2 and Amazon S3 across all other Regions is charged at the rates specified on the Amazon S3 pricing page.
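Because billing is per-GB and per-Region, a monthly estimate is simple arithmetic. A sketch of the calculation; note that the rates below are placeholder values for illustration, not real AWS prices (actual rates are on the S3 pricing page and vary by bucket Region):

```python
# Illustrative only: these per-GB monthly rates are made-up placeholders,
# not real AWS prices. "We charge less where our costs are less" means the
# rate depends on the bucket's Region.
RATE_PER_GB_MONTH = {
    "us-east-1": 0.023,
    "ap-south-1": 0.025,
}

def estimate_monthly_storage_cost(region: str, gb_stored: float) -> float:
    """Pay only for what you use: cost = GB stored x the Region's rate."""
    return round(gb_stored * RATE_PER_GB_MONTH[region], 2)

print(estimate_monthly_storage_cost("us-east-1", 100))  # 100 GB stored
```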
18. Amazon S3 Cont…
Typical workflow:
Sign up for Amazon S3 → Create a Bucket → Add an Object to a Bucket → View an Object → Move an Object → Delete an Object and Bucket
Bucket names:
1. 3–63 characters long.
2. A series of one or more labels.
3. Adjacent labels are separated by a period.
4. Lowercase letters, numbers, and hyphens (no underscores).
5. Each label starts and ends with a lowercase letter or a number.
6. Examples: nitttrspell2, nitttr.spell2, etc.
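The naming rules above can be checked mechanically. A rough sketch in Python; the regex is my own approximation of the rules as listed on the slide, not AWS's official validator:

```python
import re

# Approximates the slide's rules: 3-63 characters, one or more labels
# separated by periods, each label made of lowercase letters, digits, and
# hyphens (no underscores), starting and ending with a letter or digit.
_LABEL = r"[a-z0-9](?:[a-z0-9-]*[a-z0-9])?"
_BUCKET_RE = re.compile(rf"{_LABEL}(?:\.{_LABEL})*")

def is_valid_bucket_name(name: str) -> bool:
    return 3 <= len(name) <= 63 and _BUCKET_RE.fullmatch(name) is not None

print(is_valid_bucket_name("nitttr.spell2"))  # True
print(is_valid_bucket_name("My_Bucket"))      # False: uppercase, underscore
```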
Bucket Restrictions:
1. Max. 100 buckets in one AWS account.
2. Bucket ownership cannot be transferred.
3. Unlimited number of objects in one bucket, with no degradation in performance.
The basic storage units of Amazon S3 are objects.
Objects in a bucket are accessed via URLs.
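The workflow above maps onto a handful of API calls. A hedged sketch in Python, where `s3` stands for an S3 client such as `boto3.client("s3")` (the client is passed in, so nothing here depends on AWS being reachable; bucket and key names are examples). Note that S3 has no native "move": the usual pattern is copy to the new key, then delete the old one:

```python
# `s3` is assumed to be an S3 client object, e.g. boto3.client("s3").
# Only the call pattern is sketched; names are illustrative.

def create_bucket(s3, bucket, region):
    # The Region is chosen at bucket-creation time.
    s3.create_bucket(Bucket=bucket,
                     CreateBucketConfiguration={"LocationConstraint": region})

def add_object(s3, bucket, key, data: bytes):
    s3.put_object(Bucket=bucket, Key=key, Body=data)

def view_object(s3, bucket, key) -> bytes:
    return s3.get_object(Bucket=bucket, Key=key)["Body"].read()

def move_object(s3, bucket, src_key, dst_key):
    # S3 has no rename/move operation: copy, then delete the source.
    s3.copy_object(Bucket=bucket, Key=dst_key,
                   CopySource={"Bucket": bucket, "Key": src_key})
    s3.delete_object(Bucket=bucket, Key=src_key)

def delete_object_and_bucket(s3, bucket, key):
    # A bucket must be empty before it can be deleted.
    s3.delete_object(Bucket=bucket, Key=key)
    s3.delete_bucket(Bucket=bucket)
```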
20. Amazon S3 Standard
Used for storing frequently accessed data.
Key Features:
• Low latency and high throughput performance
• Designed for durability of 99.999999999% of objects across multiple Availability
Zones
• Data is resilient in the event of one entire Availability Zone destruction
• Designed for 99.99% availability over a given year
• Backed with the Amazon S3 Service Level Agreement for availability
• Supports SSL for data in transit and encryption of data at rest
• Lifecycle management for automatic migration of objects
Uses
1. Dynamic Websites
2. Cloud Applications
3. Mobile Applications
4. Upload photos & videos
5. Mobile & gaming applications
6. Data Analytics
21. Amazon S3 Standard Infrequent Access
Used for storing data that is accessed less frequently, but requires rapid access when needed.
Stores data in a minimum of three Availability Zones.
Key Features:
• Same low latency and high throughput performance of S3 Standard
• Designed for durability of 99.999999999% of objects across multiple Availability Zones
• Data is resilient in the event of one entire Availability Zone destruction
• Designed for 99.9% availability over a given year
• Backed with the Amazon S3 Service Level Agreement for availability
• Supports SSL for data in transit and encryption of data at rest
• Lifecycle management for automatic migration of objects
Uses
Ideal for long-term storage, backups, and as a data store for disaster recovery.
22. Amazon S3 One Zone Infrequent Access
Used for storing data that is accessed less frequently, but requires rapid access when needed.
Stores data in a single Availability Zone.
Key Features:
• Same low latency and high throughput performance of S3 Standard and S3
Standard-IA.
• Designed for durability of 99.999999999% of objects in a single Availability
Zone, but data will be lost in the event of Availability Zone destruction.
• Designed for 99.5% availability over a given year.
• Backed with the Amazon S3 Service Level Agreement for availability.
• Supports SSL for data in transit and encryption of data at rest.
• Lifecycle management for automatic migration of objects
Uses
For example, storing secondary backup copies of on-premises data, storing easily re-creatable data, or storage used as an S3 Cross-Region Replication target from another AWS Region.
23. Amazon Glacier
• Used for data archiving.
• Three options for archive retrieval, ranging from a few minutes to several hours.
Key Features:
• Designed for durability of 99.999999999% of objects across
multiple Availability Zones.
• Data is resilient in the event of one entire Availability Zone
destruction
• Supports SSL for data in transit and encryption of data at rest.
• Vault Lock feature enforces compliance via a lockable WORM
policy.
• Extremely low cost design is ideal for long-term archive.
• Lifecycle management for automatic migration of objects
Uses
Data Archiving
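Lifecycle management, listed as a feature of every storage class above, is configured as rules attached to a bucket; objects can be aged automatically from S3 Standard into Standard-IA and then Glacier. A sketch of such a rule (the rule name, prefix, and day counts are arbitrary example values; with boto3 this dict would be passed to `put_bucket_lifecycle_configuration`):

```python
# Example lifecycle rule: migrate objects from S3 Standard to Standard-IA,
# then to Glacier. The day counts and names below are illustration values.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-logs",           # hypothetical rule name
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},  # apply only under logs/
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# With boto3 this would be applied as:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-bucket", LifecycleConfiguration=lifecycle_config)
```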
27. Amazon DynamoDB
• Fully managed NoSQL database service.
• You never have to worry about the maintenance or administrative
burdens of operating & scaling a distributed database.
• You don’t have to worry about hardware provisioning, setup and configuration, data replication, software patching, or cluster scaling.
• With DynamoDB you can create tables that store & retrieve any amount of data and serve any level of request traffic.
• Amazon DynamoDB runs exclusively on solid-state drives (SSDs).
• Amazon DynamoDB supports GET/PUT operations using a user-
defined primary key.
• Use the PutItem or BatchWriteItem APIs to insert items.
• Use the GetItem or BatchGetItem APIs to retrieve items.
• Support conditional operations
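The PutItem/GetItem calls above can be sketched the same way, where `db` stands for a low-level DynamoDB client such as `boto3.client("dynamodb")`. The table and attribute names are made up for illustration; DynamoDB's low-level API wraps every attribute value in a type tag such as `{"S": ...}` for strings:

```python
# `db` is assumed to be a DynamoDB client, e.g. boto3.client("dynamodb").
# "Users", "UserId", and "Name" are illustrative names, not from the slides.

def put_user(db, user_id: str, name: str):
    # PutItem inserts (or replaces) one item, addressed by its primary key.
    db.put_item(TableName="Users",
                Item={"UserId": {"S": user_id}, "Name": {"S": name}})

def get_user(db, user_id: str):
    # GetItem retrieves one item by its user-defined primary key.
    resp = db.get_item(TableName="Users", Key={"UserId": {"S": user_id}})
    return resp.get("Item")

def put_user_if_absent(db, user_id: str, name: str):
    # Conditional operation: write only if no item with this key exists yet.
    db.put_item(TableName="Users",
                Item={"UserId": {"S": user_id}, "Name": {"S": name}},
                ConditionExpression="attribute_not_exists(UserId)")
```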
29. Conclusion
• Amazon EC2 is a reliable compute service from Amazon.
• To optimize your costs across AWS services, store large objects or infrequently accessed data sets in Amazon S3, and save smaller data elements or file pointers (possibly to Amazon S3 objects) in Amazon DynamoDB.