
Migrate from SQL Server or Oracle into Amazon Aurora using AWS Database Migration Service

As organizations look to improve application performance and decrease costs, they are increasingly migrating from commercial database engines to open source. Amazon Aurora is a MySQL-compatible relational database engine that combines the speed and availability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases. In this webinar, we cover how to use AWS Database Migration Service (DMS) to perform the migration and how to use the AWS Schema Conversion Tool to convert schemas for Amazon Aurora. We then follow with a quick demo of the entire process and close with tips and best practices.

Learning Objectives:
Understand how AWS Database Migration Service can help you migrate from a commercial database into Amazon Aurora to improve application performance and decrease database costs.


Migrate from SQL Server or Oracle into Amazon Aurora using AWS Database Migration Service

  1. 1. © 2017, Amazon Web Services, Inc. or its Affiliates. All rights reserved. Paras Bhuva, Solutions Architect 2/22/2017 Amazon Web Services bhuparas@amazon.com Migrate from SQL Server or Oracle into Amazon Aurora using AWS Database Migration Service @parasbhuva
  2. 2. What to Expect from the Session Agenda • Migrating to AWS • AWS Schema Conversion Tool Overview • Migration Considerations • AWS Database Migration Service Overview • Amazon Aurora Overview • Demo! • Best practices – SCT and DMS • Q&A
  3. 3. Migrating to AWS
  4. 4. • Quickly provision databases • Multiple Availability Zones • Rapid scaling • Automated patching • Easy read replica creation • High durability • Point in time recovery • Detailed metrics • Single-click encryption at rest Amazon RDS Why AWS?
  5. 5. • How will my on-premises data migrate to the cloud? • How can I make it transparent to my users? • How will on-premises and cloud data interact? • How can I integrate my data assets within AWS? • How can I move off of commercial databases? How?
  6. 6. Migration Options • Lift and shift • Leverage Amazon EC2 and Amazon S3 • Keep existing DB engine but migrate to Amazon RDS • For example Oracle on-premises to RDS Oracle • Migrate database engine • Commercial engine to open source • Maintenance window • Maintenance window duration vs. CDC with 0 downtime
  7. 7. Database Migration Process
  8. 8. AWS Schema Conversion Tool Overview
  9. 9. AWS Schema Conversion Tool Features • Converts schema of one database engine to another • Database Migration Assessment report for choosing the best target engine • Code browser that highlights places where manual edits are required The AWS Schema Conversion Tool helps automate many database schema and code conversion tasks when migrating from Oracle and SQL Server to open source database engines.
  10. 10. Convert Tables, Views, and Code • Sequences • User Defined Types • Synonyms • Packages • Stored Procedures • Functions • Triggers • Schemas • Tables • Indexes • Views
  11. 11. Components of the Console 1. Source Schema 2. Action Items 3. Target Schema 4. Schema Element Details 5. Edit Window
  12. 12. Supported Conversions
  13. 13. Pricing, Terms & Conditions: $0 for the software license. Allowed use: use the Schema Conversion Tool to migrate database schemas to Amazon RDS, Amazon Redshift, or Amazon EC2–based databases; to use the Schema Conversion Tool to migrate schemas to other destinations, contact AWS for special pricing. Pricing: free software license for active AWS customers with accounts in good standing.
  14. 14. Prerequisites • Create Databases • Source • Target • Download AWS Schema Conversion Tool • http://amzn.to/2b2YE2a • Download Drivers • http://amzn.to/2axE0Hn • Update Global Settings
  15. 15. Global Settings – Logging
  16. 16. Global Settings – Drivers Download Drivers here http://amzn.to/2axE0Hn
  17. 17. Global Settings – Performance and Memory
  18. 18. Global Settings – Assessment Report
  19. 19. Few considerations before you start your DB migration project…
  20. 20. Time Considerations • Any Hard Dates? • Planning Time? • Typically 2-3 Weeks • Several Iterations
  21. 21. Database Considerations • Number of Schemas? • Number of Tables? • Engine Specific Types? • Users/Roles/Permissions?
  22. 22. Network Considerations • Access (Firewalls, Tunnels, VPNs)? • Which VPC? • Which Security Groups?
  23. 23. Requirements Considerations • Engine Selection Criteria? • Which Tables Need to Move? • Same Target For All Tables?
  24. 24. Database Migration Phases: Phase 1 – Assessment (SCT); Phase 2 – Database schema conversion (SCT/DMS); Phase 3 – Application conversion/remediation (SCT); Phase 4 – Scripts conversion (SCT); Phase 5 – Integration with third-party applications; Phase 6 – Data migration (DMS); Phase 7 – Functional testing of the entire system; Phase 8 – Performance tuning (SCT); Phase 9 – Integration and deployment; Phase 10 – Training and knowledge; Phase 11 – Documentation and version control; Phase 12 – Post-production support
  25. 25. DMS Overview
  26. 26. • Start your first migration in 10 minutes or less • Keep your apps running during the migration • Replicate within, to, or from Amazon EC2 or RDS • Move data to the same or a different database engine AWS Database Migration Service (AWS DMS)
  27. 27. Keep your apps running during the migration (application users on customer premises connect to AWS over the Internet or a VPN) • Start a replication instance • Connect to source and target databases • Select tables, schemas, or databases • Let AWS DMS create tables, load data, and keep them in sync • Switch applications over to the target at your convenience
  28. 28. Multi-AZ option for high availability (replicate from customer premises or AWS over the Internet or a VPN, with AWS DMS replication instances in multiple Availability Zones)
  29. 29. AWS Database Migration Service pricing • T2 instances for development and periodic data migration tasks • C4 instances for large databases and for minimizing migration time • T2 pricing starts at $0.018 per hour for t2.micro • C4 pricing starts at $0.154 per hour for c4.large • 50 GB of GP2 storage included with T2 instances • 100 GB of GP2 storage included with C4 instances • Data transfer inbound and within an AZ is free • Data transfer across AZs starts at $0.01 per GB • Complete pricing details here: https://aws.amazon.com/dms/pricing/
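To make these rates concrete, here is a rough cost estimate; a minimal sketch in Python, assuming a hypothetical two-week migration window on a c4.large and the per-hour and per-GB figures quoted on the slide (check the pricing page for current numbers):

```python
# Rough DMS cost estimate using the rates quoted on the slide above.
# The run length and cross-AZ volume below are hypothetical examples.

HOURS = 14 * 24            # e.g. a two-week full-load + CDC migration window
C4_LARGE_PER_HOUR = 0.154  # USD per hour, from the slide
CROSS_AZ_GB = 200          # hypothetical change volume replicated across AZs
CROSS_AZ_PER_GB = 0.01     # USD per GB, from the slide

instance_cost = HOURS * C4_LARGE_PER_HOUR
transfer_cost = CROSS_AZ_GB * CROSS_AZ_PER_GB  # inbound/intra-AZ traffic is free

print(f"Replication instance: ${instance_cost:.2f}")
print(f"Cross-AZ data transfer: ${transfer_cost:.2f}")
print(f"Estimated total: ${instance_cost + transfer_cost:.2f}")
```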
  30. 30. Migration Scenarios and Options
  31. 31. On-Premises Migration Scenarios • An on-premises database to a database on Amazon RDS DB instance • An on-premises database to a database on an Amazon EC2 instance • Migration from an on-premises database to another on-premises database is not supported.
  32. 32. RDS Migration Scenarios • A database on an Amazon RDS DB instance to an on-premises database • A database on an Amazon RDS DB instance to a database on an Amazon RDS DB instance • A database on an Amazon RDS DB instance to a database on an Amazon EC2 instance
  33. 33. EC2 Migration Scenarios • A database on an Amazon EC2 instance to an on-premises database • A database on an Amazon EC2 instance to a database on an Amazon EC2 instance • A database on an Amazon EC2 instance to a database on an Amazon RDS DB instance
  34. 34. DMS Components • Replication Instances • Endpoints • Tasks
  35. 35. Replication Instances • Performs the work of the migration • Tasks run on instances • Can support multiple tasks • AWS DMS currently supports T2 and C4 instance classes for replication instances
  36. 36. Public and Private Replication Instances • A replication instance should have a public IP address if the source or target database is located in a network that is not connected to the replication instance's VPC by using a virtual private network (VPN), AWS Direct Connect, or VPC peering. • A replication instance should have a private IP address when both the source and target databases are located in the same network that is connected to the replication instance's VPC by using a VPN, AWS Direct Connect, or VPC peer.
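Provisioning the replication instance described above can also be scripted. A minimal sketch using boto3; the identifier, subnet group, region, and sizing are hypothetical placeholders, and PubliclyAccessible reflects the public/private guidance on this slide:

```python
import boto3

dms = boto3.client("dms", region_name="us-east-1")  # example region

# Create a private replication instance; choose a C4 class and Multi-AZ for
# large databases or long-running CDC workloads.
response = dms.create_replication_instance(
    ReplicationInstanceIdentifier="aurora-migration-ri",    # hypothetical name
    ReplicationInstanceClass="dms.t2.medium",                # e.g. dms.c4.large for big loads
    AllocatedStorage=50,                                     # GB of GP2 storage
    ReplicationSubnetGroupIdentifier="my-dms-subnet-group",  # hypothetical subnet group
    PubliclyAccessible=False,  # set True only when source/target are reachable
                               # solely over the public internet (no VPN/DX/peering)
    MultiAZ=False,
)
print(response["ReplicationInstance"]["ReplicationInstanceArn"])
```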
  37. 37. Sources for AWS Database Migration Service On-premises and Amazon EC2 instance databases: • Oracle versions 10.2 and later, 11g, and 12c, for the Enterprise, Standard, Standard One, and Standard Two editions • Microsoft SQL Server versions 2005, 2008, 2008R2, 2012, and 2014, for the Enterprise, Standard, Workgroup, and Developer editions. The Web and Express editions are not supported. • MySQL versions 5.5, 5.6, and 5.7 • MariaDB (supported as a MySQL-compatible data source) • PostgreSQL 9.3 and later • SAP Adaptive Server Enterprise (ASE) 15.7 and later Amazon RDS instance databases • Oracle versions 11g (versions 11.2.0.3.v1 and later), and 12c, for the Enterprise, Standard, Standard One, and Standard Two editions • Microsoft SQL Server versions 2008R2, 2012, and 2014, for the Enterprise and Standard editions. Note that change data capture (CDC) operations are not supported. The Web, Workgroup, Developer, and Express editions are not supported. • MySQL versions 5.5, 5.6, and 5.7 • PostgreSQL 9.4 • MariaDB (supported as a MySQL-compatible data source) • Amazon Aurora (supported as a MySQL-compatible data source)
  38. 38. Targets for AWS Database Migration Service On-premises and EC2 instance databases • Oracle versions 10g, 11g, 12c, for the Enterprise, Standard, Standard One, and Standard Two editions • Microsoft SQL Server versions 2005, 2008, 2008R2, 2012, 2014 for the Enterprise, Standard, Workgroup, and Developer editions. The Web and Express editions are not supported. • MySQL versions 5.5, 5.6, and 5.7 • MariaDB (supported as a MySQL-compatible data target) • PostgreSQL versions 9.3 and later • SAP Adaptive Server Enterprise (ASE) 15.7 and later Amazon RDS instance databases and Amazon Redshift • Oracle versions 11g (versions 11.2.0.3.v1 and later) and 12c, for the Enterprise, Standard, Standard One, and Standard Two editions • Microsoft SQL Server versions 2008R2, 2012, and 2014, for the Enterprise, Standard, Workgroup, and Developer editions. The Web and Express editions are not supported. • MySQL versions 5.5, 5.6, and 5.7 • MariaDB (supported as a MySQL-compatible data target) • PostgreSQL versions 9.3 and later • Amazon Aurora (MySQL and PostgreSQL) • Amazon Redshift
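Given a supported source and target from the lists above, the corresponding DMS endpoints can be defined with boto3. A minimal sketch; host names, credentials, and identifiers are hypothetical, with "sqlserver" as the source engine and "aurora" as the MySQL-compatible target:

```python
import boto3

dms = boto3.client("dms")

# Source endpoint: an on-premises SQL Server database (engine name "sqlserver").
source = dms.create_endpoint(
    EndpointIdentifier="sqlserver-source",        # hypothetical
    EndpointType="source",
    EngineName="sqlserver",
    ServerName="sqlserver.example.corp",          # hypothetical host
    Port=1433,
    DatabaseName="SalesDB",                        # hypothetical
    Username="dms_user",
    Password="********",
)

# Target endpoint: an Aurora (MySQL-compatible) cluster (engine name "aurora").
target = dms.create_endpoint(
    EndpointIdentifier="aurora-target",           # hypothetical
    EndpointType="target",
    EngineName="aurora",
    ServerName="my-aurora-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com",
    Port=3306,
    Username="admin",
    Password="********",
)

print(source["Endpoint"]["EndpointArn"], target["Endpoint"]["EndpointArn"])
```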
  39. 39. Tasks Overview • Run on a replication instance • Contain two and only two endpoints (source and target) • Different migration methods available • Specify selection and/or transformation rules • Can run multiple tasks
  40. 40. Migration Methods • Migrate existing data • Migrate existing data and replicate ongoing changes • Replicate data changes only
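Each task picks one of these migration methods through its migration type. A minimal sketch, assuming the replication instance and endpoints from the earlier snippets (the ARNs are placeholders) and a selection rule that includes every table in a hypothetical dbo schema:

```python
import json
import boto3

dms = boto3.client("dms")

# Include all tables in the "dbo" schema; add more selection or transformation
# rules to migrate only what you need.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-dbo",
            "object-locator": {"schema-name": "dbo", "table-name": "%"},
            "rule-action": "include",
        }
    ]
}

task = dms.create_replication_task(
    ReplicationTaskIdentifier="sqlserver-to-aurora",        # hypothetical
    SourceEndpointArn="arn:aws:dms:...:endpoint:SOURCE",    # from create_endpoint
    TargetEndpointArn="arn:aws:dms:...:endpoint:TARGET",
    ReplicationInstanceArn="arn:aws:dms:...:rep:INSTANCE",
    # "full-load", "cdc", or "full-load-and-cdc", matching the three methods above.
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps(table_mappings),
)
print(task["ReplicationTask"]["Status"])
```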
  41. 41. DMS – Change Data Capture (CDC) • "No touch" design: reads the recovery log of the source database using the engine's native change data capture API; no agent is required on the source • Some requirements: Oracle – supplemental logging required; MySQL – full-image, row-level binary logging required; SQL Server – recovery model set to bulk-logged or full; PostgreSQL – wal_level = logical, max_replication_slots >= 1, max_wal_senders >= 1, wal_sender_timeout = 0 • Changes are captured and applied as units of single committed transactions • CDC is activated when the load starts; no changes are applied until the load completes, then they are applied as soon as possible in near real time
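The PostgreSQL prerequisites listed on this slide can be checked before enabling CDC. A minimal sketch using psycopg2 (an assumption; any PostgreSQL driver works), with hypothetical connection details and the Oracle/SQL Server prerequisites noted in comments:

```python
# Check the PostgreSQL logical-replication settings that DMS CDC expects.
# Assumes psycopg2 is installed (pip install psycopg2-binary).
import psycopg2

CHECKS = [
    ("wal_level", lambda v: v == "logical"),
    ("max_replication_slots", lambda v: int(v) >= 1),
    ("max_wal_senders", lambda v: int(v) >= 1),
    ("wal_sender_timeout", lambda v: v in ("0", "0s", "0ms")),  # i.e. disabled
]

conn = psycopg2.connect(host="pg.example.corp", dbname="appdb",
                        user="dms_user", password="********")
with conn.cursor() as cur:
    for name, is_ok in CHECKS:
        cur.execute("SHOW " + name)  # SHOW takes no bind parameters; names are fixed above
        value = cur.fetchone()[0]
        print(f"{name} = {value}: {'OK' if is_ok(value) else 'NEEDS CHANGE'}")
conn.close()

# Oracle:     supplemental logging must be enabled, e.g.
#             ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
# SQL Server: recovery model must be BULK_LOGGED or FULL, e.g.
#             ALTER DATABASE SalesDB SET RECOVERY FULL;
# MySQL:      full-image, row-level binary logging (binlog_format = ROW).
```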
  42. 42. Oracle, SQL Server to Aurora migration • AWS Database Migration Service – Data copy: existing data is copied from source tables to tables on the target. Change data capture and apply: changes to data on the source are captured while the tables are loaded; once the load is complete, buffered changes are applied to the target, and additional changes captured on the source continue to be applied until the task is stopped or terminated • AWS Schema Conversion Tool – Assessment report: SCT analyzes the source database and provides a report with a recommended target engine and information on automatic and manual conversions. Code browser and recommendation engine: highlights places that require manual edits and provides architectural and design guidelines
  43. 43. Start Full Load (source → replication instance → target)
  44. 44. While Loading Data, Also Capture Changes (source → replication instance → target)
  45. 45. Load Complete – Apply Captured Changes (source → replication instance → target)
  46. 46. Changes Reach Steady State (source → replication instance → target)
  47. 47. Cutover – Shut Down Apps & Apply Remaining Changes (source → replication instance → target)
  48. 48. Flip! (switch applications over to the target)
  49. 49. Changes Are Transactional and Come from the Logs (transactions t1, t2 are applied on the target in commit order)
  50. 50. Multiple Targets (one source replicating through the instance to several targets)
  51. 51. Multiple Sources (several sources replicating through the instance to one target)
  52. 52. Multiple Sources and Targets
  53. 53. You Don’t Have to Take Everything (replicate only selected tables and schemas)
  54. 54. Homogeneous or Heterogeneous (e.g., SQL Server → MySQL, Oracle → Oracle, Oracle → Aurora)
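The full-load, CDC, and cutover sequence shown on the preceding slides can be driven and monitored programmatically. A minimal sketch with boto3; the task ARN and the polling interval are placeholders, and the cutover decision is simplified to "full load complete":

```python
import time
import boto3

dms = boto3.client("dms")
TASK_ARN = "arn:aws:dms:...:task:SQLSERVER-TO-AURORA"  # hypothetical

# Kick off the full load; CDC starts buffering source changes at the same time.
dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="start-replication",
)

# Poll until the full load finishes and ongoing replication takes over.
while True:
    task = dms.describe_replication_tasks(
        Filters=[{"Name": "replication-task-arn", "Values": [TASK_ARN]}]
    )["ReplicationTasks"][0]
    stats = task.get("ReplicationTaskStats", {})
    print(task["Status"], stats.get("FullLoadProgressPercent"))
    if task["Status"] == "running" and stats.get("FullLoadProgressPercent") == 100:
        break  # full load done; CDC keeps applying changes; plan the cutover window
    time.sleep(60)

# At cutover: stop the application, let remaining changes drain, then stop the
# task and point the application at the Aurora target.
dms.stop_replication_task(ReplicationTaskArn=TASK_ARN)
```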
  55. 55. Amazon Aurora
  56. 56. Enterprise customer wish list A database that …. Stays up, even when components fail …. Performs consistently at enterprise scale … Doesn’t need an army of experts to manage … Doesn’t cost a fortune; no licenses to handle …
  57. 57. Amazon Aurora: enterprise-class database for the cloud. We started with enterprise requirements and worked backwards to reimagine relational databases for the cloud: enterprise-class availability and performance, delivered as a fully managed service, with no licenses and 1/10 the cost of commercial databases
  58. 58. Perfect fit for enterprise  6-way replication across 3 availability zones  Failover in less than 30 secs  Near instant crash recovery  Up to 500 K/sec read and 100 K/sec write  15 low latency (10 ms) Read Replicas  Up to 64 TB DB optimized storage volume  Instant provisioning and deployment  Automated patching and software upgrade  Backup and point-in-time recovery  Compute and storage scaling Performance and scale Enterprise class availability Fully managed service
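If the Aurora target cluster does not exist yet, it can be created ahead of the migration. A minimal sketch with boto3; the identifiers, credentials, and instance class are hypothetical, and SCT/DMS then create the schema and load the data:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # example region

# Create the Aurora (MySQL-compatible) cluster that will be the DMS target.
rds.create_db_cluster(
    DBClusterIdentifier="aurora-migration-target",  # hypothetical
    Engine="aurora",
    MasterUsername="admin",
    MasterUserPassword="********",
    BackupRetentionPeriod=7,
)

# Aurora separates the cluster (storage) from its instances (compute);
# add at least one writer instance, and read replicas later as needed.
rds.create_db_instance(
    DBInstanceIdentifier="aurora-migration-target-1",  # hypothetical
    DBInstanceClass="db.r3.large",                     # example instance class
    Engine="aurora",
    DBClusterIdentifier="aurora-migration-target",
)
```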
  59. 59. Aurora customer adoption • Fastest growing service in AWS history • Used by 2/3 of the top 100 AWS customers • Used by 8 of the top 10 gaming customers
  60. 60. A service-oriented architecture applied to the database • Moved the logging and storage layer into a multitenant, scale-out, database-optimized storage service • Integrated with other AWS services like Amazon EC2, Amazon VPC, Amazon DynamoDB, Amazon SWF, and Amazon Route 53 for control plane operations • Integrated with Amazon S3 for continuous backup with 99.999999999% durability (data plane: SQL, transactions, caching, logging + storage)
  61. 61. Delivered as a managed service
  62. 62. Databases are hard to manage (where the effort goes: backup and recovery, data load and unload; performance tuning; scripting and coding; security planning; installing, upgrading, patching, and migrating; documentation, licensing, and training)
  63. 63. Hosting your databases on premises: you handle everything (power, HVAC, net; rack & stack; server maintenance; OS installation; OS patches; DB s/w installs; DB s/w patches; database backups; scaling; high availability; app optimization)
  64. 64. Hosting your databases in Amazon EC2: the data center and hardware layers (power, HVAC, net; rack & stack; server maintenance) are handled for you, while the OS, DB software installs and patches, database backups, scaling, high availability, and app optimization are still yours
  65. 65. If you choose a managed DB service: the infrastructure and database management tasks are handled for you, and you focus on app optimization
  66. 66. Learning Resources – Amazon Aurora Service page – https://aws.amazon.com/rds/aurora/ Deep dive video (from re:Invent 2016) here – https://youtu.be/duf5uUsW3TM Getting started with Aurora whitepaper – https://d0.awsstatic.com/whitepapers/getting-started-with-amazon-aurora.pdf Performance Benchmark Guide – https://d0.awsstatic.com/product-marketing/Aurora/RDS_Aurora_Performance_Assessment_Benchmarking_v1-2.pdf More resources found here – https://aws.amazon.com/rds/aurora/resources/
  67. 67. Before we get into the demo: Step 1: Database Migration Assessment 1. Connect Schema Conversion Tool to source and target databases. 2. Run Assessment Report. 3. Read Executive Summary. 4. Follow detailed instructions.
  68. 68. Demo time!
  69. 69. Best Practices – AWS Schema Conversion Tool • General memory management and performance options: configure the AWS Schema Conversion Tool with different memory settings; increasing memory speeds up the conversion but uses more memory resources on your desktop • Fast conversion, but large memory consumption – optimizes for conversion speed, but might require more memory for the object reference cache • Low memory consumption, but slower conversion – minimizes the amount of memory used, but results in a slower conversion; use this option if your desktop has a limited amount of memory • Balance speed with memory consumption – provides a balance between memory use and conversion speed • If you are converting large database schemas (e.g., a database with 3,500 stored procedures), you can configure the amount of memory available to the AWS Schema Conversion Tool. Details here: http://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_SchemaConversionTool.BestPractices.html
  70. 70. Best Practices – AWS Database Migration Service • Load multiple tables in parallel • Remove bottlenecks on the target • Use multiple tasks • Optimize change processing • Determine the optimal size for the replication instance based on: table size, data manipulation language (DML) activity, transaction size, total size of the migration, and number of tasks • Plan for migrating large binary objects (LOBs) • The complete list of best practices can be found here: http://docs.aws.amazon.com/dms/latest/userguide/CHAP_BestPractices.html
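Several of these knobs are expressed in the task settings JSON passed to create_replication_task or modify_replication_task. A minimal sketch of the relevant fragment; the values are illustrative starting points rather than recommendations, and the full list of settings is in the DMS user guide linked above:

```python
import json

# Illustrative DMS task settings touching the best practices above:
# parallel full load and limited-size LOB handling. Tune per workload.
task_settings = {
    "FullLoadSettings": {
        "MaxFullLoadSubTasks": 8,   # number of tables loaded in parallel
        "CommitRate": 10000,        # rows per commit during full load
    },
    "TargetMetadata": {
        "SupportLobs": True,
        "LimitedSizeLobMode": True,  # truncate LOBs larger than LobMaxSize
        "LobMaxSize": 32,            # KB; use full LOB mode if truncation is not acceptable
    },
}

# Pass as ReplicationTaskSettings=json.dumps(task_settings) when creating the task.
print(json.dumps(task_settings, indent=2))
```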
  71. 71. Thank you! Paras Bhuva, Solutions Architect 2/22/2017 Amazon Web Services bhuparas@amazon.com @parasbhuva
