Amazon Web Services for Java Developers - UCI Webinar


Amazon Web Services (AWS) offers IT infrastructure services to businesses in the form of web services - now commonly known as cloud computing. AWS is an ideal platform to develop on and host enterprise Java applications, due to the zero up front costs and virtually infinite scalability of resources. Learn basic AWS concepts and work with many of the available services. Gain an understanding of how existing JavaEE applications can be migrated to the AWS environment and what the advantages are. Discover how to architect a new JavaEE application from the ground up to leverage the AWS environment for maximum benefit.

  • Hello and welcome. In this webinar we will take a look at Amazon Web Services, which is a large topic area, so we are going to focus on how Amazon Web Services can be used by Java developers. At the end of the webinar I will introduce a new course that UC Irvine Extension is offering that is actually called Amazon Web Services for Java Developers.
  • My name is Craig Dickson and I will be the instructor for the Amazon Web Services for Java Developers class at UC Irvine Extension. I am an Advisory Board member for the JavaEE syllabus at UCI Extension. I spend most of my time, however, working as an independent software consultant with clients mostly in the US and Australia. I specialize in enterprise Java development and the systems that support it, like cloud computing and Developer Operations. I also teach and provide corporate training on various topics related to enterprise Java development. I am an Oracle Certified Master for the JavaEE 5 platform, a Certified ScrumMaster and a Certified Adobe Trainer for their enterprise Java content management tool called Communique. My contact details will be available on the last slide in case anyone has any questions that they don't get answered today.
  • So the agenda for today roughly tracks the content of the course. Obviously in 45 minutes we are going to be hitting these topics at a very high, 30,000 ft level. But hopefully you will get some useful information out of the webinar and also get a good understanding of what the course is going to cover. First we are going to take a quick look at some of the common issues that arise when developing a JavaEE application and how the cloud in general can help to resolve those issues. Next we will talk about what AWS actually is. Then we will take a closer look at the services that AWS provides that are of most interest to JavaEE developers. This is where we are going to spend most of our time today, and it is where we will spend most of our time in the course as well, by taking a deep dive into multiple services and seeing how they fit into a JavaEE application. Finally I will go over some of the details of the course and then we will open it up for some questions.
  • From an infrastructure standpoint, large scale enterprise Java applications can be very expensive to develop. Not only do you need to provision development environments for the developers, but hopefully you are also using some Continuous Integration tool like Jenkins or CruiseControl, which needs to have its own resources provisioned. You may also have multiple testing environments for Quality Assurance, User Acceptance Testing and Performance Testing. Maybe you even have a staging environment for customer sign-off or last minute testing. And that is just the hardware and software that is needed. You also need to pay for the developers to write the application and then system administrators to manage the infrastructure. Then, even after all that cost, you also need to deal with the special issues associated with a Production environment. So that means not only more hardware and more software for the Production environment, but also now you have non-functional requirements to deal with. What kind of uptime are you aiming for? How will you know when something is just starting to go wrong? How will you know when something catastrophic happens? How will you recover after a catastrophic failure? Will you lose your users' data as a result? Who has access to your production environment in an emergency? Many enterprise Java applications that are functionally correct fail to meet their non-functional goals, because the complexity and the cost involved (both in time and money) to make a highly scalable, reliable, secure application can be so prohibitive that organizations just stick their heads in the sand and hope for the best. So we have a combination of infrastructure issues and also application architecture issues to deal with as Java developers on a daily basis.
  • So how can the cloud help with these kinds of issues? Firstly, all of that hardware and software, and the cost of the system administrators to provision and maintain those systems even before the application goes live the first time, can be reduced by making use of the cloud. Budget for these kinds of expenses, even in established organizations, is rarely what we would like it to be. The problem is even more acute in small companies and startup environments. Many startups would simply not be considered financially viable if they had to purchase the necessary hardware up front, many months or even years before there is any revenue being generated. With a cloud solution we get to rent the capacity we need, instead of buying servers. Not only that, but if we need more hardware later we can buy small increments of server resources, or if we don't need the resources for a period of time we can simply give them back and stop incurring costs. So the cloud gives us the possibility of Just In Time infrastructure, which saves us upfront and ongoing costs for our project, but also gives us the security of knowing that there is redundant capacity available if our application happens to be a success. It also gives us a quick and clean exit strategy if our application is not a success. Also, because the hardware resources are much more flexible in the cloud, we have an opportunity to test various application architectures to see which is the most performant, which in a traditional environment might not be possible because of a lack of hardware or just the time involved in provisioning the different configurations. Finally, the cloud can also help us achieve many of our non-functional goals by taking advantage of services in the cloud that can reduce our development time while also giving our application many of those desirable qualities like scalability and reliability and many of the other 'ilities'.
  • So what is AWS? I expect many people watching this webinar probably have at least some idea or at least a notion of what AWS is, which is not surprising since the very earliest parts of the AWS world have been around since 2006. These days the official buzzword for what AWS is, is something called "Infrastructure as a Service". Most people today are familiar with the term Software as a Service, where applications are hosted somewhere remote to the end user instead of on their desktop, so things like Salesforce.com, which is a completely hosted CRM solution. IaaS is the same model, but instead of the software being the artifact that is hosted remotely in the cloud, it is now the infrastructure itself that is hosted in the cloud, so things like hardware and servers, plus services to support applications like databases and middleware messaging. Many people have experienced hardware virtualization in their jobs with tools like VMware, and while AWS as an IaaS provider does provide many of the features that hardware virtualization does, it actually provides much more than just servers in the cloud. In addition, AWS delivers services that you can use to help build your applications on top of all of that fancy virtualized hardware. AWS comes with a sophisticated security backbone known as the Identity and Access Management service, which controls infrastructure-type issues like who can provision new resources, but also low level issues like firewalls and opening ports between specific servers. However it goes even further and also handles user authentication and authorization for accessing AWS resources, so it can be integrated into your custom application to control certain security aspects of your application. All of the services offered by AWS have a pay-as-you-go model. So if you start a server and use it for 2 hours and then shut it down again, you will only be charged for those 2 hours. If you store a 10MB file for a whole month and also store a 1GB file for 1 week in that month, you will be charged for only the 10MB for most of the month and then for the extra storage needed for that one week. And the story gets even better, as some of the services are completely free, and many of the other services offer what AWS calls a Free Tier, so if your application doesn't consume many resources you might find it costs you nothing at all to host it on AWS. AWS has a pretty slick browser based management console, which has actually gone through a facelift in the last couple of weeks, where you can manage almost all of the AWS services directly from your browser.
  • As I said, almost all of the services can be managed and provisioned from the browser based management console. In addition to the Management Console there is a command line interface for each service that exposes the full API for each service. This is useful for scripted solutions for provisioning resources and other service management tasks. In addition to the command line, AWS has several SDKs available for various languages, including Java. The Java SDK includes the API client as a JAR file and some code samples for common tasks. If your Java application is actually going to make use of AWS services at runtime, you will often bundle the JAR from the SDK into your application so that it is part of your application's classpath. If you are smart and like to use Apache Maven to build your software, then the AWS JAR is available in the central repository, so it is easy to include it in your builds. The AWS API serves two main purposes for custom Java applications. The first use is to manage and provision AWS services from within your application. For example, you could request additional servers to be provisioned, or a user to be created, or a database server to be provisioned, all from within your running application. The second use is to actually interact with already provisioned AWS resources, so reading files from the S3 service (which we will talk about in a moment) or reading and writing to the Simple Queue Service (which we will also talk about later). Finally, for Java developers there is an AWS plugin that will allow you to access and manage some AWS services and resources right within your Eclipse environment. In addition, the plugin provides new project types that will allow you to quickly create new Eclipse projects that can interact with AWS resources. A short code sketch showing the basic SDK setup follows this note.
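As a concrete illustration, here is a minimal sketch of creating an SDK client, assuming the AWS SDK for Java 1.x is on the classpath; the access key values are placeholders and would normally come from configuration or an IAM role rather than being hard-coded.

    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3Client;

    public class AwsClientExample {
        public static void main(String[] args) {
            // Placeholder credentials; in a real application these come from
            // configuration or an IAM role, never from source code.
            BasicAWSCredentials credentials =
                    new BasicAWSCredentials("ACCESS_KEY_ID", "SECRET_ACCESS_KEY");

            // Every AWS service has a corresponding client class in the SDK.
            AmazonS3 s3 = new AmazonS3Client(credentials);
            System.out.println("Buckets: " + s3.listBuckets());
        }
    }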
  • Now let's take a closer look at some of the services that enterprise Java developers can make use of. One of the first two services that were available through AWS was something called the Simple Storage Service, or, more commonly, just S3. S3 gives you unlimited cloud based file storage. There is a simple API in the Java SDK that makes it straightforward to upload new files, read existing files, upload new versions of files and delete files. S3 is often used to store the file based assets of your application, like large data files that you don't want to deploy as part of your application artifacts such as your Java WAR files. All of the files on S3 are URL accessible, so they can be downloaded easily by clients, and access to those files is controlled through IAM. A recent update to S3 now allows you to host an entire website on S3, so if your site is essentially static, then you can host the whole thing on S3 with no need for a web server. Another common use case for application developers is to store binary data on S3 that would traditionally have been placed in a database, and instead the database simply contains the path of the file on S3. A short S3 code sketch follows this note.
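A minimal sketch of the S3 calls described above, again assuming the AWS SDK for Java 1.x; the bucket name, key and local file path are hypothetical.

    import java.io.File;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3Client;
    import com.amazonaws.services.s3.model.S3Object;

    public class S3Example {
        public static void main(String[] args) {
            // Uses default credentials; alternatively pass an AWSCredentials
            // object as in the earlier sketch.
            AmazonS3 s3 = new AmazonS3Client();

            String bucket = "my-example-bucket";          // hypothetical bucket
            String key = "data/large-data-file.csv";      // hypothetical key

            // Upload a local file as an object.
            s3.putObject(bucket, key, new File("/tmp/large-data-file.csv"));

            // Read it back; the content is available as a stream plus metadata.
            S3Object object = s3.getObject(bucket, key);
            System.out.println("Size: " + object.getObjectMetadata().getContentLength());

            // Delete it when it is no longer needed.
            s3.deleteObject(bucket, key);
        }
    }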
  • The other original service that AWS offered was the Elastic Compute Cloud, or just EC2. The service that EC2 provides is what many people think of when they think about hardware virtualization and cloud computing. EC2 allows you to provision servers of various types within the AWS infrastructure – there are small, medium and large servers, and there are servers that have more CPU resources and others that have more memory resources. So there is a lot of flexibility to choose exactly the right server for your custom application to run on. You can deploy most major operating systems, the most common being some flavor of Linux and then Windows. Amazon even publishes their own in-house flavor of Linux that is based on RHEL but is specially designed to run efficiently in the AWS environment. EC2 servers can have two different types of hard-drive-like storage associated with them: the first is called instance store, also known as ephemeral store, and the second is known as Elastic Block Store, or just EBS for short. The ephemeral storage only lasts for the lifetime of the EC2 instance; if the instance is stopped or terminated, any data on it is lost. An EBS volume is provisioned separately from any specific EC2 instance and the data survives beyond the instance itself. An EBS volume can be detached from one server and then mounted on another. You can easily take a snapshot of an EBS volume for backup purposes and restore it later. The Java API for EC2 gives you full control to provision and manage running EC2 instances; a short sketch follows this note. One other interesting point to keep in mind is that you have full root access to an EC2 based server. This is actually an important distinction that sets AWS apart from some other cloud based solutions, where multi-tenancy concerns mean you have limited access to control the low level details of the operating system.
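A minimal sketch of launching an instance through the SDK, assuming the AWS SDK for Java 1.x; the AMI id is a placeholder, not a real image id.

    import com.amazonaws.services.ec2.AmazonEC2;
    import com.amazonaws.services.ec2.AmazonEC2Client;
    import com.amazonaws.services.ec2.model.RunInstancesRequest;
    import com.amazonaws.services.ec2.model.RunInstancesResult;

    public class Ec2Example {
        public static void main(String[] args) {
            AmazonEC2 ec2 = new AmazonEC2Client();

            // Launch a single small instance from an AMI; "ami-xxxxxxxx" is a placeholder.
            RunInstancesRequest request = new RunInstancesRequest()
                    .withImageId("ami-xxxxxxxx")
                    .withInstanceType("m1.small")
                    .withMinCount(1)
                    .withMaxCount(1);

            RunInstancesResult result = ec2.runInstances(request);
            String instanceId = result.getReservation().getInstances().get(0).getInstanceId();
            System.out.println("Launched instance " + instanceId);
        }
    }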
  • So if you provision an EC2 instance, you can install just about any software you like on it, including web servers, application servers and database servers. But since provisioning databases in a "web scale" manner can be tricky, AWS has the Relational Database Service. With RDS you can provision a database server through the web Management Console or through the SDK API, so no install disks or install wizards to deal with. RDS supports MySQL and also Oracle and SQL Server – obviously in the case of the last two you need to deal with licensing issues, and AWS supports a Bring Your Own License model for users that already have licenses, and also a Provided model where AWS provides the license for you. Obviously there are different cost models involved with those two scenarios. The RDS service supports various sized servers, so you can select a size to match the load you are expecting. RDS also supports automated backups, where backups are performed and stored for a specific amount of time. You can also perform manual backup and restore operations, all through the web console. To improve your application's reliability and disaster recovery capabilities, you can also request to have live replication performed for your database, where the live copy is in a physically different location from your primary database. From a Java application perspective there is absolutely no difference when interacting with an RDS hosted database instance. All of your JDBC or Hibernate or JPA code will still work exactly as before, as the short sketch after this note illustrates. In addition, the Java SDK API gives your application the ability to provision additional database servers at runtime, or perform other database management tasks.
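Because RDS exposes a normal database endpoint, existing data-access code is unchanged. A minimal JDBC sketch, assuming a MySQL driver on the classpath; the endpoint, credentials and table are hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class RdsJdbcExample {
        public static void main(String[] args) throws Exception {
            // The host name comes from the RDS console once the instance is provisioned;
            // everything below it is standard JDBC.
            String url = "jdbc:mysql://mydb.xxxxxxxx.us-east-1.rds.amazonaws.com:3306/appdb";

            try (Connection conn = DriverManager.getConnection(url, "appuser", "secret");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM orders")) {
                rs.next();
                System.out.println("Orders so far: " + rs.getInt(1));
            }
        }
    }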
  • Recently the use of NOSQL based databases has been increasing, and AWS actually has two services in the NOSQL space. The SimpleDB service was the first NOSQL offering from AWS and supports the storing of structured data, but has some performance limitations. The IO speed can be an issue for high performance applications and there is a 10GB limit to the size of any single table. But don't be mistaken, there are many applications whose use cases are easily covered by SimpleDB. If you do find that the requirements of your application are beyond what SimpleDB can offer, then you could use DynamoDB instead, which is a brand new offering from AWS and also provides a NOSQL style database, but without many of the limitations that SimpleDB has. DynamoDB instances are all provisioned on Solid State Disks, so the IO is much better than SimpleDB instances, and there is no limit to the size of data that can be stored in a DynamoDB table. Obviously DynamoDB costs a little more though. Keep in mind that the term NOSQL does not mean you never use SQL based databases; it actually stands for Not Only SQL, meaning you might have a mix of NOSQL and traditional SQL based databases being used by the same application, and AWS not only supports this model, it is actually quite a common architecture in large applications. A short DynamoDB sketch follows this note.
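A minimal DynamoDB sketch, assuming the AWS SDK for Java 1.x (using the newer dynamodbv2 package); the table name and attributes are hypothetical.

    import java.util.HashMap;
    import java.util.Map;

    import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient;
    import com.amazonaws.services.dynamodbv2.model.AttributeValue;
    import com.amazonaws.services.dynamodbv2.model.PutItemRequest;

    public class DynamoDbExample {
        public static void main(String[] args) {
            AmazonDynamoDB dynamo = new AmazonDynamoDBClient();

            // An item is a map of attribute names to typed values; "Sessions" is a
            // hypothetical table with a string hash key "sessionId".
            Map<String, AttributeValue> item = new HashMap<String, AttributeValue>();
            item.put("sessionId", new AttributeValue("abc-123"));
            item.put("userId", new AttributeValue("42"));
            item.put("lastSeen", new AttributeValue().withN("1335830400"));

            dynamo.putItem(new PutItemRequest().withTableName("Sessions").withItem(item));
        }
    }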
  • One of the issues in a production environment is keeping an eye on your application and your infrastructure and making sure that it is all working as expected now, and that there are no imminent problems that are about to bring your application to its knees. This is where CloudWatch comes in. CloudWatch is a scalable system monitoring service that allows you to monitor various AWS services, all through the same place in the web console, and also through the SDK API. In addition to monitoring the AWS services, you can actually monitor custom metrics from your own application as well; a short sketch of this follows this note. Based on the data that CloudWatch collects from the AWS services and your own application, you can create alarms that are triggered when certain conditions occur. Metrics like CPU utilization, network throughput, error rates and many others can be used. Once an alarm is tripped, various responses can be configured to occur automatically, mostly via the Simple Notification Service, which we will talk about in a few minutes.
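A minimal sketch of publishing a custom application metric, assuming the AWS SDK for Java 1.x; the namespace and metric name are hypothetical.

    import com.amazonaws.services.cloudwatch.AmazonCloudWatch;
    import com.amazonaws.services.cloudwatch.AmazonCloudWatchClient;
    import com.amazonaws.services.cloudwatch.model.MetricDatum;
    import com.amazonaws.services.cloudwatch.model.PutMetricDataRequest;
    import com.amazonaws.services.cloudwatch.model.StandardUnit;

    public class CloudWatchExample {
        public static void main(String[] args) {
            AmazonCloudWatch cloudWatch = new AmazonCloudWatchClient();

            // Publish an application-level metric; alarms can then be defined
            // against it in the console or via the API.
            MetricDatum datum = new MetricDatum()
                    .withMetricName("OrdersProcessed")
                    .withUnit(StandardUnit.Count)
                    .withValue(17.0);

            cloudWatch.putMetricData(new PutMetricDataRequest()
                    .withNamespace("MyApplication")
                    .withMetricData(datum));
        }
    }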
  • One of the services that can be configured to listen for the CloudWatch alarms that we just talked about is the Auto Scaling service. The name Auto Scaling is pretty self-evident about what it does. Based on the CloudWatch alarms that are triggered - for example, the average CPU utilization in your application for the last 5 minutes was above 75% - the Auto Scaling service can schedule new EC2 instances to be provisioned until the CPU utilization drops back below the threshold that triggered the alarm in the first place. Once the alarm is no longer active, the Auto Scaling service can also shut down EC2 instances. So by using Auto Scaling you can be confident that you always have enough capacity to meet the load, but without having to pay up front for excess capacity. Auto Scaling can also be used to pre-emptively scale your application. For example, if you have a big 24 hour promotion on your ecommerce site tomorrow, you can configure Auto Scaling to scale your standard 2 EC2 instances up to 10 EC2 instances for the day and then drop them back down to 2 instances again the next day; a short sketch of this follows this note. Another use for Auto Scaling is to monitor the health of your EC2 instances. If Auto Scaling detects that one of your EC2 instances has stopped responding for whatever reason, it can shut it down and provision a new EC2 instance to replace it.
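A minimal sketch of the pre-emptive scaling case, assuming the AWS SDK for Java 1.x and an Auto Scaling group that already exists; the group name is hypothetical.

    import com.amazonaws.services.autoscaling.AmazonAutoScaling;
    import com.amazonaws.services.autoscaling.AmazonAutoScalingClient;
    import com.amazonaws.services.autoscaling.model.SetDesiredCapacityRequest;

    public class AutoScalingExample {
        public static void main(String[] args) {
            AmazonAutoScaling autoScaling = new AmazonAutoScalingClient();

            // Scale the (hypothetical) "web-tier" group up to 10 instances ahead
            // of a planned promotion; a second call would drop it back down later.
            autoScaling.setDesiredCapacity(new SetDesiredCapacityRequest()
                    .withAutoScalingGroupName("web-tier")
                    .withDesiredCapacity(10));
        }
    }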
  • The Elastic Load Balancing service is also pretty self-explanatory. It operates like a standard load balancer that many of us are used to, but with a few additional features. An ELB instance is inherently scalable and is also fault tolerant. So as the load increases, your load balancer does not become the bottleneck, and because of the way the ELB works it is highly fault tolerant. To achieve these two 'ilities' with a traditional load balancer would require additional resources and some sophisticated configuration. With AWS you get all of it with a few mouse clicks in a browser. An ELB instance works hand-in-hand with the CloudWatch and Auto Scaling services to maintain a pool of healthy EC2 instances that the ELB instance can balance incoming requests across. If an EC2 instance stops responding it can be decommissioned and replaced; if CloudWatch alarms get triggered, the pool of EC2 instances behind the load balancer can be scaled up or down as appropriate. A short sketch of registering an instance with a load balancer follows this note.
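A minimal sketch of adding an EC2 instance to the pool behind a load balancer, assuming the AWS SDK for Java 1.x; the load balancer name and instance id are hypothetical.

    import com.amazonaws.services.elasticloadbalancing.AmazonElasticLoadBalancing;
    import com.amazonaws.services.elasticloadbalancing.AmazonElasticLoadBalancingClient;
    import com.amazonaws.services.elasticloadbalancing.model.Instance;
    import com.amazonaws.services.elasticloadbalancing.model.RegisterInstancesWithLoadBalancerRequest;

    public class ElbExample {
        public static void main(String[] args) {
            AmazonElasticLoadBalancing elb = new AmazonElasticLoadBalancingClient();

            // Register a newly provisioned instance with the load balancer so it
            // starts receiving its share of incoming requests.
            elb.registerInstancesWithLoadBalancer(new RegisterInstancesWithLoadBalancerRequest()
                    .withLoadBalancerName("web-tier-lb")
                    .withInstances(new Instance("i-12345678")));
        }
    }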
  • So we have looked at a set of services that can be combined to deploy a highly scalable and fault tolerant Java web application. However, even though all of them can be provisioned and configured relatively easily with a browser, there is still a lot to keep track of. So last year AWS released a new service called Elastic Beanstalk, which makes it even easier to deploy certain types of applications, one of them being Java web applications. Elastic Beanstalk falls into a category called Platform as a Service, which is an abstraction layer above the infrastructure layer. Google App Engine is another example of Platform as a Service. With Elastic Beanstalk we can now talk simply about applications and environments; we don't need to talk about individual servers and load balancers and scaling, which are all infrastructure concerns. With a few clicks in a browser, or a few API calls, you can deploy a Java web application into a highly scalable and fault tolerant environment and be live in a matter of minutes. In addition to abstracting away some of the lower level infrastructure services, Beanstalk provides features to support easy live upgrading of applications and provisioning identical QA and production environments, which can be tricky and unreliable in traditional environments. And even though Beanstalk abstracts away several services to achieve the Platform as a Service model, the full AWS SDK is still available to the Java web applications that are deployed on it, so you don't lose any of the AWS power.
  • I already mentioned the Simple Notification Service in relation to the CloudWatch service. SNS provides a publish and subscribe model for messaging that is similar to the idea of a Topic in JMS, which Java developers should be familiar with. In fact, the SNS endpoint is actually also called a Topic. Because SNS is hosted on the AWS infrastructure it is inherently scalable and fault tolerant, with messages surviving outages and other interruptions. Clients can subscribe to the SNS Topic and be notified via various protocols, including HTTP, SMTP and SMS. So now that we know that SNS can send messages to clients via SMTP and SMS, we can see that when a CloudWatch alarm is triggered by a problem with our infrastructure or in our application, we can easily wake our system administrators up at 2am on a Sunday with a stream of email and text messages. A short SNS sketch follows this note.
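A minimal SNS sketch, assuming the AWS SDK for Java 1.x; the topic name, e-mail address and message are hypothetical.

    import com.amazonaws.services.sns.AmazonSNS;
    import com.amazonaws.services.sns.AmazonSNSClient;
    import com.amazonaws.services.sns.model.CreateTopicRequest;
    import com.amazonaws.services.sns.model.PublishRequest;
    import com.amazonaws.services.sns.model.SubscribeRequest;

    public class SnsExample {
        public static void main(String[] args) {
            AmazonSNS sns = new AmazonSNSClient();

            // Create (or look up) a topic, subscribe an e-mail endpoint to it,
            // and publish a notification to every subscriber.
            String topicArn = sns.createTopic(new CreateTopicRequest("ops-alerts")).getTopicArn();
            sns.subscribe(new SubscribeRequest(topicArn, "email", "oncall@example.com"));
            sns.publish(new PublishRequest(topicArn, "CloudWatch alarm: high CPU on the web tier"));
        }
    }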
  • The Simple Queue Service is the other piece of the messaging solution available with AWS. Instead of a publish and subscribe model, SQS provides a queue model that allows anyone to publish a message and anyone to consume a message. This model is excellent for farming out long running tasks to a variety of worker clients. The example often used in the AWS documentation is video encoding. Video encoding messages or "jobs" are published to a queue, and then one of many workers consumes the job and performs the video encoding. SQS also supports timeouts, so if a worker crashes before finishing a job, or is just taking too long to finish, the job is returned to the queue and another worker client can take over the job. Making use of the SNS and SQS services in your Java applications can help to realize many 'ilities' with very little effort and also allow your application to perform at a "web scale" level. A short SQS sketch follows this note.
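A minimal SQS sketch of the producer/worker pattern described above, assuming the AWS SDK for Java 1.x; the queue name and message body are hypothetical.

    import java.util.List;

    import com.amazonaws.services.sqs.AmazonSQS;
    import com.amazonaws.services.sqs.AmazonSQSClient;
    import com.amazonaws.services.sqs.model.CreateQueueRequest;
    import com.amazonaws.services.sqs.model.DeleteMessageRequest;
    import com.amazonaws.services.sqs.model.Message;
    import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
    import com.amazonaws.services.sqs.model.SendMessageRequest;

    public class SqsExample {
        public static void main(String[] args) {
            AmazonSQS sqs = new AmazonSQSClient();

            String queueUrl = sqs.createQueue(new CreateQueueRequest("video-encoding-jobs"))
                                 .getQueueUrl();

            // A producer publishes a job...
            sqs.sendMessage(new SendMessageRequest(queueUrl, "encode video 4711 to 720p"));

            // ...and a worker (typically a separate process) picks it up and
            // deletes it once the work is done, so it is not redelivered.
            List<Message> messages =
                    sqs.receiveMessage(new ReceiveMessageRequest(queueUrl)).getMessages();
            for (Message message : messages) {
                System.out.println("Working on: " + message.getBody());
                sqs.deleteMessage(new DeleteMessageRequest(queueUrl, message.getReceiptHandle()));
            }
        }
    }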
  • The Simple Workflow Service is one of the newest services offered by AWS. The workflow service essentially functions as you might expect if you have experience with other workflow engines like JBPM. It supports automated tasks and also tasks that require some kind of manual intervention before proceeding. Workflows can be defined using the web console interface and also through the SDK API. The SWF service's main purpose is to coordinate the tasks defined in a workflow and to distribute the actual automated tasks to be executed to available EC2 instances. The SDK API provides full access to the SWF service, particularly for triggering new instances of a workflow to start and providing the specific input data to be processed by the workflow; a short sketch follows this note. From a Java developer's perspective, the combination of the Simple Notification Service, the Simple Queue Service and the Simple Workflow Service provides an excellent platform for implementing a web scale application that is flexible, scalable and fault tolerant.
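A minimal sketch of starting a workflow execution through the SDK, assuming the AWS SDK for Java 1.x and a workflow type that has already been registered; the domain, workflow name, version and input are hypothetical.

    import com.amazonaws.services.simpleworkflow.AmazonSimpleWorkflow;
    import com.amazonaws.services.simpleworkflow.AmazonSimpleWorkflowClient;
    import com.amazonaws.services.simpleworkflow.model.StartWorkflowExecutionRequest;
    import com.amazonaws.services.simpleworkflow.model.WorkflowType;

    public class SwfExample {
        public static void main(String[] args) {
            AmazonSimpleWorkflow swf = new AmazonSimpleWorkflowClient();

            // Kick off one execution of a previously registered workflow type,
            // passing the data it should process as the input.
            swf.startWorkflowExecution(new StartWorkflowExecutionRequest()
                    .withDomain("video-processing")
                    .withWorkflowId("encode-job-4711")
                    .withWorkflowType(new WorkflowType()
                            .withName("VideoEncodingWorkflow")
                            .withVersion("1.0"))
                    .withInput("{\"videoId\": 4711}"));
        }
    }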
  • Big Data is another of the hot buzzwords floating around the IT industry in the last couple of years. In fact, many are predicting that jobs in the Big Data area will be some of the fastest growing and highest paid in the near future. The term Big Data loosely refers to data sets that are so large that traditional databases and data visualization tools are not able to process them effectively within a reasonable timeframe. So we are talking about datasets that are often in the multi-terabyte or petabyte ranges. There are many emerging tools for dealing with these large data sets that allow them to be analyzed and processed. One of the most popular frameworks for dealing with Big Data is called MapReduce, and the most popular open source tool that implements the MapReduce framework is Apache Hadoop. In simple terms, the MapReduce framework takes the large data set and breaks it down into small chunks which are then processed in parallel (this is the Map part), and then the results of all of the parallel computations are recombined into one result (which is the Reduce part). So for the Elastic MapReduce service, the data is read in from S3 or DynamoDB, the data is then mapped out to multiple EC2 instances for processing, and then the results are recombined and written back out to S3 or DynamoDB. The number and type of EC2 instances that are available for processing the data can be configured through the web console interface. So as with most things in IT, it's a tradeoff: the more EC2 instances that are available, the quicker the data will get processed, but the more it will cost you. The SDK API also provides full control of the EMR service, so your Java application can manage the EMR service and provision new job flows to process new data sets as they become available. A short EMR sketch follows this note.
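A minimal sketch of launching an EMR job flow from Java, assuming the AWS SDK for Java 1.x; the bucket paths, JAR and job name are hypothetical, and the word-count JAR itself is assumed to already exist on S3.

    import com.amazonaws.services.elasticmapreduce.AmazonElasticMapReduce;
    import com.amazonaws.services.elasticmapreduce.AmazonElasticMapReduceClient;
    import com.amazonaws.services.elasticmapreduce.model.HadoopJarStepConfig;
    import com.amazonaws.services.elasticmapreduce.model.JobFlowInstancesConfig;
    import com.amazonaws.services.elasticmapreduce.model.RunJobFlowRequest;
    import com.amazonaws.services.elasticmapreduce.model.StepConfig;

    public class EmrExample {
        public static void main(String[] args) {
            AmazonElasticMapReduce emr = new AmazonElasticMapReduceClient();

            // One step that runs a (hypothetical) word-count JAR stored on S3,
            // reading its input from and writing its output back to S3.
            StepConfig wordCount = new StepConfig()
                    .withName("word-count")
                    .withHadoopJarStep(new HadoopJarStepConfig()
                            .withJar("s3://my-example-bucket/jobs/wordcount.jar")
                            .withArgs("s3://my-example-bucket/input/",
                                      "s3://my-example-bucket/output/"));

            // The trade-off mentioned above: more (or larger) instances finish the
            // job faster but cost more per hour.
            emr.runJobFlow(new RunJobFlowRequest()
                    .withName("nightly-word-count")
                    .withLogUri("s3://my-example-bucket/logs/")
                    .withSteps(wordCount)
                    .withInstances(new JobFlowInstancesConfig()
                            .withInstanceCount(4)
                            .withMasterInstanceType("m1.small")
                            .withSlaveInstanceType("m1.small")));
        }
    }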
  • So those who were thinking at the start of this webinar that AWS was just S3 and EC2 are probably surprised to see the number of services I have already mentioned. This slide lists the rest of the current services that are also available that I have not yet mentioned. Some of these we will actually cover in the class, but we just didn't have time for all of them during the webinar. The remaining services may or may not be of interest to Java developers, and we will only cover them briefly as part of the class, so that you will at least know what is available even if you don't have any need for those services immediately.
  • The course is 10 weeks long, with one additional orientation week. The course is 100% online, so all of the lesson content, any class participation and any assignment submissions will be done via online methods. So all you are going to need is an internet connection. There will be a new lesson each week, published on a Monday. The lessons will be a combination of written material, video based material and recorded presentations. Also, each week there will be an activity to be completed. Normally this will involve some kind of assigned task or assigned reading and then participating in an online message board discussion related to the activity. The main deliverable for the class will be a project that will be completed gradually over roughly the second half of the class.
  • As I mentioned at the start of the webinar today, the course syllabus and this webinar are closely aligned. We will start the class by looking at some of the issues related to traditional vs cloud development. We will then explore AWS as a whole and get familiar with the terminology used. After that introductory work we will start the deep dive into the services I covered today, and this is where we will spend the bulk of the class. Finally, there will be a higher level review of the services that we do not cover in detail in the class.
  • This is an overview of the grading scheme for the class. Notice the project is worth 50% of the grade, so that represents a large part of the grade and, as a result, a large part of the work you will be doing. Then there are the weekly activities that make up most of the rest of your grade, so those will be things like assigned readings and participating in the message board discussions with other students. Finally, there is 10% allocated to general online participation. This is an online course, so your ability and willingness to interact with other students by participating in discussions or answering each other's questions is important for the success of the class.
  • OK, time for some questions. These are my contact details, so feel free to reach out to me if you have a question that you don't get an answer to today. Also, I wanted to mention that I have secured a grant from the AWS in Education team that will give each student a $100 credit to use to pay for the AWS services you use during the course. If you combine that $100 with the fact that many of the services we will be using are either 100% free or have free tiers, it is possible that you will not pay anything additional during the course to use the AWS resources you need to complete the project and weekly tasks. You might even come away with some credit left over.

Amazon Web Services for Java Developers - UCI Webinar: Presentation Transcript

  • Webinar: Amazon Web Services for Java Developers Craig S. Dickson extension.uci.edu
  • About Me• Advisory Board member for JavaEE at UCI• Independent software consultant – Enterprise Java – Cloud Computing – DevOps – Training• Oracle Certified Master, Certified ScrumMaster, Certified Adobe Trainer
  • Webinar Agenda• Traditional JavaEE development issues• How the cloud can help• What is/are Amazon Web Services• What services should Java developers be familiar with and why• Course details• Questions
  • Traditional JavaEE Apps• Large amounts of resources just to develop – dev, CI, QA, UAT, performance testing, staging – web servers, app. servers, databases …• Not to mention Production – scalability, reliability, security, resilience to failures, disaster recovery – the ‘ilities• Many JavaEE apps. fail to meet non-functional goals
  • How can the cloud help?• Reduce upfront expenses• Reduce ongoing expenses• JIT infrastructure if things go well (or not)• Efficiently test multiple scenarios• Leverage repeatable “web scale” patterns – get some ‘ilities for free
  • What is AWS?• Infrastructure as a Service (IaaS)• Cloud based hardware and services• More than just hardware virtualization• Identity and Access Management (IAM)• Pay only for what you use – sometimes FREE!• Redundant resources are a browser click away
  • Java and AWS• Command Line• AWS Java SDK• Full API to manage AWS services• Also to use services from within your application• Eclipse integration
  • Simple Storage Service (S3)• Cloud based file storage• Simple API to CRUD files• Unlimited capacity• Excellent for static web content• Move BLOBS from RDBMS to S3
  • Elastic Compute Cloud (EC2)• Hardware virtualization in the cloud• Various sized servers• All major operating systems• Uses Elastic Block Store (EBS) service• Provision, configure, start and stop machines all from your browser or Java app• Full operating system access
  • Relational Database Service (RDS)• Specialized RDBMS hosting• Supports MySQL, Oracle, SQL Server• Automated backups• Live replication for fail-over support• No difference for JDBC based apps• DB provisioning from your browser or from within your Java app.
  • SimpleDB + DynamoDB• NOSQL databases• SimpleDB – excellent for small amounts of structured data – 10GB limit per table• DynamoDB – hosted on SSD – no size or request limitations
  • CloudWatch• Scalable unified resource monitoring• Monitor EC2, RDS and other AWS resources• Also monitor your own application metrics• Create your own alarm conditions• Flexible notification system via Simple Notification Service (SNS)
  • Auto Scaling• Works with CloudWatch• Automatically provision additional EC2 instances when load increases• Shutdown instances when load decreases• Pre-emptive scaling• Monitor health of EC2 instances
  • Elastic Load Balancing (ELB)• Scalable fault-tolerant load balancing• Works with Auto Scaling and CloudWatch• Load balances requests over a set of EC2 instances• Can monitor health of EC2 instances and shut down non-performant instances• Can trigger scale-up and scale-down events
  • Elastic Beanstalk• Platform as a Service (PaaS)• Bundles up services provided by EC2, S3, CloudWatch, Auto Scaling and ELB• Browser based provisioning of production ready Java Web Applications• Supports live hot-swapping for app upgrades• Can use other AWS services from apps
  • Simple Notification Service (SNS)• Scalable fault tolerant messaging• Publish-Subscribe model• Messages are persisted and can survive common outages• Comparable to JMS Topics• Notifications via HTTP, SMTP, SMS
  • Simple Queue Service (SQS)• Complements SNS functionality• Queue model (FIFO)• Comparable to JMS Queues• Message publishers and consumers can be inside and outside of AWS• Using SNS and SQS as part of a Java application can realize many 'ilities
  • Simple Workflow Service (SWF)• Coordinates synchronous and asynchronous work in distributed applications• Comparable to tools like JBPM• Supports automated and manual (i.e. human) tasks• Like SNS and SQS, using SWF as part of a JavaEE application can help build web scale applications
  • Elastic MapReduce (EMR)• “Big Data”• Hosted Apache Hadoop environment• Uses EC2, S3 and DynamoDB services• Specify the number and type of EC2 instances used• Full control via Java API
  • Other AWS Services• Simple Email Service (SES)• CloudFront• ElastiCache• CloudFormation• CloudSearch• Route 53• Virtual Private Cloud (VPC)• Direct Connect• Flexible Payments Service (FPS)• DevPay• Import/Export• Storage Gateway• Mechanical Turk• Alexa Integration
  • Course Logistics• 10 week course• 100% online – lesson content, class participation, submissions• New lesson each week on Mondays – written, video, presentation• New activity each week• Multi-week project due at end of course
  • Syllabus Overview• Traditional vs. Cloud• AWS Overview, Web Console and Terminology• AWS Java SDK and Eclipse integration• Deep dive into multiple AWS services• Review of other AWS services
  • Grading• 50% - Project – develop and deploy a JavaEE application that makes use of multiple AWS services• 40% - Weekly Activities – assigned readings, discussion topics etc.• 10% - Online Participation
  • Questions? Craig S. Dickson Email - craig@craigsdickson.com Blog - http://bit.ly/csd-blog LinkedIn – http://bit.ly/csd-li