Masterclass Webinar: Application Services and Dynamic Dashboard


These slides from our Application Services and Dynamic Dashboard webinar introduce an example configuration that brings together topics from previous Masterclasses, such as Auto Scaling and Amazon S3, adding event notification via Simple Notification Service, persistence of events in Simple Queue Service, and API access from Python.

YouTube Demo: http://youtu.be/lb9qPhxIVNI





  1. Masterclass: Application Services and Dynamic Dashboard. Ryan Shuttleworth, Technical Evangelist, @ryanAWS
  2. Masterclass: a technical deep dive beyond the basics. Help educate you on how to get the best from AWS technologies. Show you how things work and how to get things done. Broaden your knowledge in ~45 mins.
  3. Application Services & Dynamic Dashboard: a grand title for a demonstration system that ties together services. Show you how to use services like SNS and SQS to carry AWS events. Use S3 as a web server to host dynamically generated content. Why? To show you some tips and tricks you can use in your own projects.
  4. So what are we going to run through?
  5. Services & topics: EC2 (instances to run our application code); SNS (to publish events from our instances); Auto Scaling (to generate events as our application scales up and down); SQS (to persist the event messages for processing); S3 (to store and serve the content we create); CloudFormation (to build our system as a managed stack); DynamoDB (to store all events from the application); IAM (to control the creation and management of resources).
  6. To do what? Mimic an application that implements auto scaling. Trap, transport, and store the scaling events produced. Use a simple technique to produce pseudo-dynamic content from S3.
  7. To do what? Mimic an application that implements auto scaling. Trap, transport, and store the scaling events produced. Use a simple technique to produce pseudo-dynamic content from S3. An exercise beyond compute and storage!
  8. There's a movie you can view: http://youtu.be/lb9qPhxIVNI (this link is shown again at the end).
  9. This demo is just an illustration of what you can do with these services.
  10. Built in this way…
  11. Built in this way… Auto Scaling group: an arbitrary application that we can scale up and down.
  12. Built in this way… Auto Scaling group → SNS notification from the Auto Scaling group → SNS event body as JSON → SQS queue to persist the event. Messages are produced when instances are started or terminated.
  13. Built in this way… as above, plus a DynamoDB table holding instance details and a monitoring instance.
  14. Built in this way… as above, plus an S3 bucket holding the dashboard web content.
  15. EC2 instance contents: a Python script that reads the SQS queue and generates data for S3; a static site (HTML, JavaScript, CSS) that reads the data file.
  16. SNS & SQS: essential glue between applications.
  17. Simple Notification Service. Reliable: redundant storage. Scalable: unlimited number of messages. Simple: CreateTopic, Subscribe, Publish. Flexible: HTTP, email, SQS. Secure: topic policies. Integrated: EC2, CloudWatch, Auto Scaling.
  18. A subscription by an SQS queue for messages published on this topic.
  19. A CloudWatch alarm that will publish to an SNS topic.
  20. Simple Queue Service. Reliable: queues store messages across availability zones. Scalable: designed for unlimited services reading an unlimited number of messages. Simple: CreateQueue, SendMessage, ReceiveMessage, DeleteMessage. Inexpensive: low per-request fees. Secure: authentication. Performance: excellent throughput.
  21. Python application A.
  22. Python application A creates a queue:
      >>> import boto
      >>> conn = boto.connect_sqs()
      >>> q = conn.create_queue('myqueue')
  23. Python application A writes a message:
      >>> from boto.sqs.message import Message
      >>> m = Message()
      >>> m.set_body('This is my first message.')
      >>> status = q.write(m)
  24. Python application B reads the message, which then becomes invisible to other applications for 60 seconds:
      >>> m = q.read(60)
      >>> m.get_body()
  25. Python application B deletes the message once processed:
      >>> q.delete_message(m)
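The read/delete sequence above hinges on SQS's visibility timeout: a received message is hidden from other consumers until it is either deleted or the timeout lapses, at which point it reappears. A minimal in-memory sketch of that semantic (purely illustrative; this is not the SQS API, and `ToyQueue` is a made-up class):

```python
import time

class ToyQueue:
    """In-memory queue mimicking SQS visibility-timeout semantics."""

    def __init__(self):
        self._messages = {}  # message id -> (body, time it becomes visible)
        self._next_id = 0

    def write(self, body):
        # New messages are immediately visible
        self._messages[self._next_id] = (body, 0.0)
        self._next_id += 1

    def read(self, visibility_timeout, now=None):
        """Return (id, body) of one visible message, hiding it for the timeout."""
        now = time.time() if now is None else now
        for mid, (body, visible_at) in self._messages.items():
            if visible_at <= now:
                # Hide the message from other readers until the timeout lapses
                self._messages[mid] = (body, now + visibility_timeout)
                return mid, body
        return None

    def delete_message(self, mid):
        self._messages.pop(mid, None)

q = ToyQueue()
q.write("This is my first message.")
mid, body = q.read(60, now=0.0)
assert q.read(60, now=30.0) is None      # hidden during the timeout
assert q.read(60, now=61.0) is not None  # visible again if not deleted
q.delete_message(mid)
assert q.read(60, now=120.0) is None     # deleted for good
```

This is why application B must delete the message after processing: if it crashes mid-way, the message simply becomes visible again and another reader picks it up.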
  26. The core trick… Auto Scaling → SNS (event notifications) → SQS (persistence of event data).
  27. The core trick… Auto Scaling → SNS (event notifications) → SQS (persistence of event data). You can do this from anything to anything.
  28. Create an Auto Scaling launch configuration:
      as-create-launch-config --image-id <ami id> --instance-type t1.micro --group <security group> --launch-config my-launch-cfg
  29. Create an Auto Scaling group:
      as-create-auto-scaling-group my-as-group --availability-zones <az list> --launch-configuration my-launch-cfg --max-size 20 --min-size 1
  30. Amazon Resource Name
  31. Attach notifications to the group:
      as-put-notification-configuration my-as-group --topic-arn <arn-from-SNS-topic> --notification-types autoscaling:EC2_INSTANCE_LAUNCH,autoscaling:EC2_INSTANCE_TERMINATE
  32. The same command; the available notification types are: autoscaling:EC2_INSTANCE_LAUNCH, autoscaling:EC2_INSTANCE_LAUNCH_ERROR, autoscaling:EC2_INSTANCE_TERMINATE, autoscaling:EC2_INSTANCE_TERMINATE_ERROR.
  33. A subscription by an SQS queue for messages published on this topic.
  34. The SNS envelope delivered to the queue:
      {
        "Type" : "Notification",
        "MessageId" : <message id>,
        "TopicArn" : <arn>,
        "Subject" : "Auto Scaling: termination for group \"SNS-Dashboard-ASG\"",
        "Message" : "…",
        "Timestamp" : "2013-05-21T09:13:09.555Z",
        "SignatureVersion" : "1",
        "Signature" : "…",
        "SigningCertURL" : "https://sns.us-east-1.amazonaws.com/SimpleNotificationService-f3ecfb7224c7233fe7bb5f59f96de52f.pem",
        "UnsubscribeURL" : "https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe&SubscriptionArn=arn:aws:sns:us-east-1:241861486983:SNS-Dashboard-ASNotifications-7GU41DCQW8HC:ed30bf6e-582c-4fd2-8e07-28f7d1ac6278"
      }
  35. …and the Auto Scaling event carried inside its "Message" field:
      {
        "StatusCode" : "InProgress",
        "Service" : "AWS Auto Scaling",
        "AutoScalingGroupName" : "SNS-Dashboard-ApplicationServerGroup-K61R5797WCMA",
        "Description" : "Terminating EC2 instance: i-8bb679eb",
        "ActivityId" : "dfc1181b-0df8-47dc-aa8d-79e13b8a33d1",
        "Event" : "autoscaling:EC2_INSTANCE_TERMINATE",
        "Details" : {},
        "AutoScalingGroupARN" : "arn:aws:autoscaling:us-east-1:241861486983:autoScalingGroup:77ef2778-ded1-451a-a630-6a35c8e67916:autoScalingGroupName/SNS-Dashboard-ApplicationServerGroup-K61R5797WCMA",
        "Progress" : 50,
        "Time" : "2013-05-21T09:13:09.442Z",
        "AccountId" : "241861486983",
        "RequestId" : "dfc1181b-0df8-47dc-aa8d-79e13b8a33d1",
        "StatusMessage" : "",
        "EndTime" : "2013-05-21T09:13:09.442Z",
        "EC2InstanceId" : "i-8bb679eb",
        "StartTime" : "2013-05-21T09:12:20.323Z",
        "Cause" : "At 2013-05-21T09:12:02Z a user request explicitly set group desired capacity changing the desired capacity from 5 to 1. At 2013-05-21T09:12:19Z an instance was taken out of service in response to a difference between desired and actual capacity, shrinking the capacity from 5 to 1. At 2013-05-21T09:12:19Z instance i-8fdbafed was selected for termination. At 2013-05-21T09:12:19Z instance i-8ddbafef was selected for termination. At 2013-05-21T09:12:20Z instance i-8bb679eb was selected for termination. At 2013-05-21T09:12:20Z instance i-85b778e5 was selected for termination."
      }
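Note that the payload is JSON twice over: the SNS envelope is JSON, and its "Message" field is itself a JSON string. Getting at the Auto Scaling event therefore takes two parses. A small sketch (field names as in the payloads above; the trimmed example message is illustrative):

```python
import json

def parse_autoscaling_event(raw_body):
    """Unwrap an SNS envelope and return (event_type, instance_id)."""
    envelope = json.loads(raw_body)            # outer SNS notification
    message = json.loads(envelope["Message"])  # inner Auto Scaling event
    return message["Event"], message["EC2InstanceId"]

# A trimmed-down stand-in for what arrives on the queue:
raw = json.dumps({
    "Type": "Notification",
    "Message": json.dumps({
        "Event": "autoscaling:EC2_INSTANCE_TERMINATE",
        "EC2InstanceId": "i-8bb679eb",
    }),
})
assert parse_autoscaling_event(raw) == (
    "autoscaling:EC2_INSTANCE_TERMINATE", "i-8bb679eb")
```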
  36. Template available: http://bootstrapping-assets.s3.amazonaws.com/as-register-instances.template
  37. CloudFormation resources: Auto Scaling group, notification topic, subscription, SQS queue.
  38. Adding a notification configuration for Auto Scaling group events, pointing at the topic:
      "NotificationConfiguration" : {
        "TopicARN" : { … "ASNotifications" … },
        "NotificationTypes" : [ "autoscaling:EC2_INSTANCE_LAUNCH", "autoscaling:EC2_INSTANCE_TERMINATE" ]
      },
  39. Subscription to the topic from an SQS queue:
      "ASNotifications" : {
        "Type" : "AWS::SNS::Topic",
        "Properties" : {
          "Subscription" : [ {
            "Endpoint" : { … ASNotificationsQueue … },
            "Protocol" : "sqs"
          } ]
        }
      },
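Stitched together, the two fragments sit in a template shaped roughly like this. This is a sketch, not the deck's actual template: resource names are illustrative, the Auto Scaling group's other required properties (availability zones, launch configuration, min/max size) are omitted, and a real stack also needs an SQS queue policy allowing SNS to send to the queue.

```json
{
  "Resources" : {
    "ASNotificationsQueue" : { "Type" : "AWS::SQS::Queue" },
    "ASNotifications" : {
      "Type" : "AWS::SNS::Topic",
      "Properties" : {
        "Subscription" : [ {
          "Endpoint" : { "Fn::GetAtt" : [ "ASNotificationsQueue", "Arn" ] },
          "Protocol" : "sqs"
        } ]
      }
    },
    "ApplicationServerGroup" : {
      "Type" : "AWS::AutoScaling::AutoScalingGroup",
      "Properties" : {
        "NotificationConfiguration" : {
          "TopicARN" : { "Ref" : "ASNotifications" },
          "NotificationTypes" : [
            "autoscaling:EC2_INSTANCE_LAUNCH",
            "autoscaling:EC2_INSTANCE_TERMINATE"
          ]
        }
      }
    }
  }
}
```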
  40. We now have events in SQS. Let's do something with them…
  41. Built in this way… Auto Scaling group → SNS notification from the Auto Scaling group → SNS event body as JSON → SQS queue to persist the event. Messages are produced when instances are started or terminated.
  42. Built in this way… as above, plus a DynamoDB table holding instance details and a monitoring instance.
  43. Built in this way… as above, plus an S3 bucket holding the dashboard web content.
  44. EC2 instance contents: a Python script that reads the SQS queue and generates data for S3; a static site (HTML, JavaScript, CSS) that reads the data file.
  45. Read messages from the SQS queue → write data to the DynamoDB table → form a JSON file from the updated results → write the file to S3 for the JavaScript to interpret.
  46. Script available: http://bootstrapping-assets.s3.amazonaws.com/as-node-manager.py
  47. Step 1: read the SQS queue.
  48. The monitoring loop (slides 48–51 show this same script, highlighting each section in turn):
      # Connect to SQS and open queue
      sqs = boto.connect_sqs()
      queue = sqs.get_queue(sqs_queue_name)
      queue.set_message_class(RawMessage)

      while True:
          rs = queue.get_messages(num_messages=10)
          for raw_message in rs:
              # Parse JSON message
              envelope = json.loads(raw_message.get_body())
              message = json.loads(envelope['Message'])
              # Trap the EC2_INSTANCE_LAUNCH event
              if message['Event'] == 'autoscaling:EC2_INSTANCE_LAUNCH':
                  save_instance(message['EC2InstanceId'], ddb_table_name)
              # Trap the EC2_INSTANCE_TERMINATE event
              elif message['Event'] == 'autoscaling:EC2_INSTANCE_TERMINATE':
                  delete_instance(message['EC2InstanceId'], ddb_table_name)
              # Delete the message from the queue and continue polling
              queue.delete_message(raw_message)
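The if/elif branch on the event type is really a dispatch table; factoring it out makes the routing testable without touching AWS. A sketch under that assumption (`dispatch` and the lambda handlers are illustrative, not part of the script above):

```python
def dispatch(message, handlers):
    """Route one parsed Auto Scaling event to a handler keyed by event type.

    Returns the handler's result, or None for event types we don't track
    (e.g. test notifications), which the loop can simply delete and skip.
    """
    handler = handlers.get(message["Event"])
    if handler is None:
        return None
    return handler(message["EC2InstanceId"])

seen = []
handlers = {
    "autoscaling:EC2_INSTANCE_LAUNCH": lambda i: seen.append(("save", i)),
    "autoscaling:EC2_INSTANCE_TERMINATE": lambda i: seen.append(("delete", i)),
}
dispatch({"Event": "autoscaling:EC2_INSTANCE_LAUNCH", "EC2InstanceId": "i-1234"}, handlers)
dispatch({"Event": "autoscaling:TEST_NOTIFICATION", "EC2InstanceId": "n/a"}, handlers)
assert seen == [("save", "i-1234")]
```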
  52. Step 2: write to DynamoDB.
  53. Saving a launched instance, with the item key and item 'fields' (slides 53–56 repeat this function with different sections highlighted):
      def save_instance(instance_id, ddb_table_name):
          instance = get_instance(instance_id)
          # Connect to DynamoDB (using key from env) and get table
          ddb = boto.connect_dynamodb()
          table = ddb.get_table(ddb_table_name)
          # Create a new record for this instance
          item = table.new_item(
              hash_key=instance.id,
              attrs={
                  'pub_hostname': instance.public_dns_name,
                  'pub_ip': instance.ip_address,
                  'priv_hostname': instance.private_dns_name,
                  'priv_ip': instance.private_ip_address,
                  'ami_id': instance.image_id,
                  'region': instance.region.name,
                  'availability_zone': instance.placement,
                  'terminated': 'false'
              })
          # Save the item to DynamoDB
          item.put()
  58. Soft-deleting a terminated instance (slides 58–60 repeat this function with different sections highlighted):
      def delete_instance(instance_id, ddb_table_name):
          # Connect to DynamoDB and get table
          ddb = boto.connect_dynamodb()
          table = ddb.get_table(ddb_table_name)
          # Get the item to soft delete
          item = table.get_item(instance_id)
          # Update the terminated flag
          item['terminated'] = 'true'
          # Save the item to DynamoDB
          item.put()
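Stripped of the boto plumbing, the two handlers reduce to two operations on a keyed table: insert a row with the terminated flag cleared, then flip the flag rather than remove the row, so the dashboard can still show recently terminated instances. Modelled on a plain dict (illustrative only; the function names here are made up):

```python
def save_instance_record(table, instance_id, details):
    """Insert a row for a newly launched instance, soft-delete flag cleared."""
    row = dict(details)
    row["terminated"] = "false"
    table[instance_id] = row

def soft_delete_instance(table, instance_id):
    """Mark an instance terminated but keep its row for the dashboard."""
    table[instance_id]["terminated"] = "true"

table = {}
save_instance_record(table, "i-5525c932", {"availability_zone": "us-east-1d"})
soft_delete_instance(table, "i-5525c932")
assert table["i-5525c932"]["terminated"] == "true"
assert table["i-5525c932"]["availability_zone"] == "us-east-1d"  # row survives
```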
  62. Step 3: write to S3.
  63. Writing the JSON document to the bucket (repeated on slide 64):
      def write_instances_to_s3(instances_json, s3_output_bucket, s3_output_key):
          # Connect to S3 and get the output bucket
          s3 = boto.connect_s3()
          output_bucket = s3.get_bucket(s3_output_bucket)
          # Create a key to store the instances_json text
          k = Key(output_bucket)
          k.key = s3_output_key
          k.set_metadata("Content-Type", "text/plain")
          k.set_contents_from_string(instances_json)
  65. The resulting instances document:
      {
        "instances" : [ {
          "id" : "i-5525c932",
          "terminated" : "true",
          "ami_id" : "ami-7341831a",
          "availability_zone" : "us-east-1d",
          "region" : "RegionInfo:us-east-1",
          "pub_ip" : "107.21.167.7",
          "pub_hostname" : "ec2-107-21-167-7.compute-1.amazonaws.com",
          "priv_ip" : "10.201.2.233",
          "priv_hostname" : "domU-12-31-39-13-01-1B.compute-1.internal"
        }, {
          "id" : "i-bb6a86dc",
          "terminated" : "false",
          "ami_id" : "ami-7341831a",
          "availability_zone" : "us-east-1a",
          "region" : "RegionInfo:us-east-1",
          "pub_ip" : "174.129.82.128",
          "pub_hostname" : "ec2-174-129-82-128.compute-1.amazonaws.com",
          "priv_ip" : "10.242.211.185",
          "priv_hostname" : "ip-10-242-211-185.ec2.internal"
        } ]
      }
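Producing that document is just wrapping the table rows in one object and serialising it. A sketch (`build_instances_json` is an illustrative helper, not from the script; key names follow the sample above):

```python
import json

def build_instances_json(rows):
    """Serialise instance rows into the {"instances": [...]} document
    that the dashboard's JavaScript fetches from S3."""
    return json.dumps({"instances": rows})

doc = build_instances_json([
    {"id": "i-5525c932", "terminated": "true", "availability_zone": "us-east-1d"},
    {"id": "i-bb6a86dc", "terminated": "false", "availability_zone": "us-east-1a"},
])
parsed = json.loads(doc)
assert [r["id"] for r in parsed["instances"]] == ["i-5525c932", "i-bb6a86dc"]
```

The string returned here is exactly what `write_instances_to_s3` would be handed as its `instances_json` argument.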
  66. …and now we have data in S3, we can build a web view…
  67. S3 bucket: instances.txt (from the monitoring app) alongside the HTML/CSS/JS.
  68. As above, plus: web page loaded from S3, with a periodic refresh.
  69. As above, plus: jQuery gets the instance data from S3.
  70. S3 bucket contents: css, img, js, index.html, instances.txt
  71. S3 bucket contents: css, img, js, index.html, instances.txt (JSON data).
  72. jQuery.getJSON()
  73. jQuery.getJSON(): JavaScript functions in the index page hosted in S3, built around an AJAX 'get' of the instances JSON in S3.
  74. …plus some jQuery table selectors/modifiers.
  75. Want to try this yourself? View the video tutorial here: http://youtu.be/lb9qPhxIVNI and grab the CloudFormation template here: http://bootstrapping-assets.s3.amazonaws.com/as-register-instances.template
  76. What the stack does: 1. create security groups; 2. create a notification topic of type SQS; 3. create the SQS queue; 4. create Auto Scaling launch configs & groups; 5. add Auto Scaling notifications delivered via SNS to SQS; 6. create the S3 bucket; 7. create the DynamoDB table; 8. start instances; 9. bootstrap the monitoring application.
  77. Summary
  78. Summary: pub/sub models with SNS; reliable delivery with SQS; S3 & pseudo-dynamic content; DynamoDB for high performance; more than compute & storage.
  79. Summary: given you some ideas? Introduced you to some handy services? Helped you with some CloudFormation?
  80. View the video: http://youtu.be/lb9qPhxIVNI Find out more: aws.amazon.com
