
Material for developers building on AWS and for people handling data with AWS


Hands-on 1: for developers who want to use AWS as a platform
Hands-on 2: for people moving data between AWS services

※The EMR hands-on would be better with sample code included.



  1. Using AWS as a platform: material for developers, and for people handling data with AWS (2013/9/29 PBL lecture, @mryoshio)
  2. Agenda • Preliminaries  Self-introduction  Audience  Goal (minimum)  Preparation  Notes • Introduction  AWS?  Your Experience? • Hands-on 1  EC2  Route53  RDS  VPC  Exercise • Hands-on 2  S3  DynamoDB  Elastic MapReduce  Exercise  Redshift (Bonus)
  3. Preliminaries
  4. Self-introduction: development / infrastructure / odd jobs. Recent obsession: staying relaxed
  5. Audience: people who have never used AWS, or who have never combined multiple AWS services
  6. Goal (minimum): build a simple web application using AWS services
  7. Goal (maximum): be able to picture, at the code and configuration level, how data is actually handled at a typical large company
  8. Preparation • Create an AWS account • Create a key pair
  9. Notes • This is mostly hands-on; that is not because preparing slides was too much trouble • Once you finish, you will be able to build web-application infrastructure in no time • Breaks will be taken as appropriate, but feel free to leave your seat at any time • This is the first run, so apologies if we finish far too early • AWS usage fees also apply for the few hours of the hands-on • Questions are welcome at any time, but there is only one instructor
  10. Introduction
  11. AWS?
  12. AWS? ※as of 2013/9/28
  13. Your Experience? ※as of 2013/9/28
  14. Hands-on 1
  15. Theme: Your Application Base
  16. EC2
  17. Vocabulary • Instance • AMI (Amazon Machine Image) • EBS (Elastic Block Store) • Security Group • EIP (Elastic IP) • Load Balancer
  18. Launch First Instance
  19. Points • AMI • Instance Type • EC2 or VPC? • Key Pair • Security Group
  20. SSH Login • Log in to the server with the key pair you selected in the wizard. • Log in as ec2-user if you use the Amazon Linux AMI.
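The login step above can be sketched as follows (the key file name is made up, and `<public-dns>` is a placeholder for the value shown in the EC2 console):

```shell
# The key pair file you downloaded when creating the key pair (hypothetical name).
# SSH refuses keys that are readable by others, so restrict permissions first.
chmod 400 mykey.pem
# Log in as ec2-user, the default user on the Amazon Linux AMI.
ssh -i mykey.pem ec2-user@<public-dns>
```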
  21. Build a Web Server • Install Apache and start it. • Access http://<Public DNS>/
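On an Amazon Linux instance, this step might look like the following (the test page content is made up; `<public-dns>` is a placeholder):

```shell
# Install Apache, start it, and enable it on boot.
sudo yum -y install httpd
sudo service httpd start
sudo chkconfig httpd on
# Put a trivial page in the document root so there is something to see.
echo 'hello from EC2' | sudo tee /var/www/html/index.html
# From your own machine, confirm the server answers
# (port 80 must be open in the Security Group):
curl http://<public-dns>/
```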
  22. Do Anything, Then Destroy the First Instance • CAUTION: first, Create Image (as a backup) … • Change the Security Group and observe the effect. … • Terminate.
  23. Restore the First Instance
  24. Set up a Load Balancer • [Before] Client -> EC2 Instance • [After] Client -> Load Balancer -> EC2 Instance ※note: the health check (Ping Target) is quite strict
  25. Route53
  26. Vocabulary • Hosted Zone • Record Set • Domain • A Record • NS Record
  27. Set up Your Route • Create a Hosted Zone for your domain. • Create a Record Set (an A Record) pointing to your EC2 Instance.
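Once the record set exists, you can check it from the command line (the domain is a placeholder; DNS changes can take a little while to propagate):

```shell
# Ask for the A record you just created.
dig +short A www.example.com
# Or query the hosted zone's own name server directly
# (copy an NS value from the hosted zone):
dig +short A www.example.com @<name-server-from-hosted-zone>
```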
  28. RDS
  29. Vocabulary • Instance • DB • Snapshot • Security Group • Parameter Group • Multi AZ • Subnet Group
  30. Launch First Instance
  31. Login • mysql -u <adminname> -h <endpoint> -p
  32. VPC
  33. Vocabulary • VPC • Public Subnet • Private Subnet • Elastic IP • Route Table • Network ACL • Security Group
  34. Create Your VPC
  35. Exercise for Hands-on 1
  36. 1. Create a Web - DB Application • EC2 Instance x 1 • DB server running on the EC2 Instance above. • Users can access http://<some domain or ip>:80/
  37. 2. Create a Web - DB Application • EC2 Instance x 1 • RDS Instance x 1 • Users can access http://<some domain or ip>:80/
  38. 3. Create a Web - DB Application • EC2 Instance x 2 • RDS Instance x 1 • Load Balancer x 1 • Users can access http://<some domain or ip>:80/ • The top page shows the host name or private IP. • You may use anything you prefer.
  39. 4. Create a Web - DB Application • EC2 Instance x 2 in a public subnet in a VPC • RDS Instance x 1 in a private subnet in the VPC, as Multi AZ • Load Balancer x 1 in the VPC (internet-facing) • Users can access http://<some domain or ip>:80/ • The top page shows the host name or private IP. • You may use anything you prefer.
  40. 4-1. Create a VPC • Start the VPC Wizard • Select “VPC with Public and Private Subnets”
  41. 4-2. Launch an EC2 Instance and ELB • Launch an EC2 Instance in your VPC  note: place it in the Public Subnet • Create a Load Balancer in your VPC. • Add your EC2 Instance to the ELB.
  42. Hands-on 2
  43. Theme: Handle Your Data
  44. S3
  45. Vocabulary
  46. Create Your Bucket and … • Create a bucket with any name. • Upload one image file into it.
  47. Light Exercise • Insert an img tag into your application's view that points to the image file in the bucket.
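For example, with a hypothetical bucket name and key (the object must be publicly readable, e.g. via its permissions or a bucket policy):

```html
<!-- Hypothetical bucket "my-handson-bucket" and key "sample.jpg" -->
<img src="https://my-handson-bucket.s3.amazonaws.com/sample.jpg" alt="sample">
```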
  48. DynamoDB
  49. Vocabulary • NoSQL • Hash Key • Range Key • Read Throughput • Write Throughput ※http://www.allthingsdistributed.com/2007/10/amazons_dynamo.html
  50. Create Your Table and … • Create your table.  Hash Key => item_id (String)  Range Key => selling_date (String)  Set Read/Write Throughput as you like • Insert test data any way you like (manually, or by program).  Sample program: https://gist.github.com/mryoshio/6744061
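As an alternative to the console or the sample program above, the same table and a test record could be created with the AWS CLI, roughly like this (the table name "items" and the throughput values are made up for illustration):

```shell
# Create the table with the hash/range keys from the slide.
aws dynamodb create-table \
  --table-name items \
  --attribute-definitions \
      AttributeName=item_id,AttributeType=S \
      AttributeName=selling_date,AttributeType=S \
  --key-schema \
      AttributeName=item_id,KeyType=HASH \
      AttributeName=selling_date,KeyType=RANGE \
  --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5

# Insert one test record.
aws dynamodb put-item --table-name items \
  --item '{"item_id": {"S": "1001"}, "selling_date": {"S": "2013-09-29"}}'
```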
  51. Light Exercise • Write code to read the data in DynamoDB. • Then change the Read/Write throughput dynamically.
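With the AWS CLI, the two tasks might look like this (the table name "items" is again a placeholder; note that DynamoDB limits how often throughput can be decreased):

```shell
# Read back everything in the table.
aws dynamodb scan --table-name items

# Change the provisioned throughput on the fly.
aws dynamodb update-table --table-name items \
  --provisioned-throughput ReadCapacityUnits=10,WriteCapacityUnits=10
```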
  52. Elastic MapReduce
  53. Vocabulary • Job Flow • Job • HDFS • MapReduce • Hive • Pig
  54. Concept • Treat S3 and DynamoDB like HDFS. • Supports Hive, Pig, and custom programs (e.g. Python). • Sample programs are provided in the AWS Console. • Job Flows (Jobs) can be managed in the AWS Console, but may be easier to handle with the command-line client.
  55. Exercise for Hands-on 2
  56. 1. Hive Program (DynamoDB -> S3) • Export the DynamoDB table you created earlier to S3.
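One way to express the export is a Hive script using EMR's DynamoDB storage handler; the sketch below writes such a script to a file (the table name "items", the bucket "my-handson-bucket", and the column list follow the earlier DynamoDB exercise but are otherwise placeholders):

```shell
# Write a Hive script that maps the DynamoDB table as an external table
# and dumps its contents into an S3 directory.
cat > export.hql <<'EOF'
CREATE EXTERNAL TABLE ddb_items (item_id string, selling_date string)
STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
TBLPROPERTIES (
  "dynamodb.table.name" = "items",
  "dynamodb.column.mapping" = "item_id:item_id,selling_date:selling_date"
);

INSERT OVERWRITE DIRECTORY 's3://my-handson-bucket/export/'
SELECT * FROM ddb_items;
EOF

# Run it on the EMR master node (or as a job-flow step):
#   hive -f export.hql
```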
  57. 2. Hive Program (S3 -> DynamoDB) • Import the S3 files into the DynamoDB table you created earlier. • You may reuse the S3 files exported on the previous page.
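The import is the mirror image of the export: map both sides as external tables and INSERT from the S3 table into the DynamoDB one. A sketch, with the same placeholder names as before (the field delimiter depends on how the export was written, so adjust it to match your files):

```shell
cat > import.hql <<'EOF'
-- The files previously exported to S3, mapped as a plain external table.
CREATE EXTERNAL TABLE s3_items (item_id string, selling_date string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION 's3://my-handson-bucket/export/';

-- The DynamoDB table, via the storage handler.
CREATE EXTERNAL TABLE ddb_items (item_id string, selling_date string)
STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
TBLPROPERTIES (
  "dynamodb.table.name" = "items",
  "dynamodb.column.mapping" = "item_id:item_id,selling_date:selling_date"
);

INSERT OVERWRITE TABLE ddb_items SELECT * FROM s3_items;
EOF

# On the EMR master node:
#   hive -f import.hql
```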
  58. 3. Streaming (S3 -> DynamoDB) • Design the S3 files and the DynamoDB table. • Create a new DynamoDB table. • Generate the files on S3. • Import the files on S3 into the DynamoDB table. • This time, your custom MapReduce script must do the work during the import.
  59. Redshift (Bonus)
  60. The End
