
New AWS Services for Bioinformatics

New AWS Services for Bioinformatics - Athena, Batch, Step Functions, Glue and QuickSight


  1. Lynn Langit, New AWS Services for Bioinformatics Pipelines, Feb 2017
  2. New AWS Services • Useful for scaling bioinformatics pipelines • Announced at re:Invent (Nov 2016) • Athena • Step Functions • Batch • Glue • QuickSight
  3. Starting Point for CSIRO
  4. Serverless AWS Lambda Application
  5. Public Genomic Datasets
  6. About AWS Athena: Serverless SQL queries on S3 data
  7. AWS Athena Information • Add table (structure) to database via DDL from input file(s) • Write and execute SQL query • Optionally save query • Optionally review query history • View results • Optionally download result set to .csv
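     A minimal sketch of this flow from the AWS CLI (the same DDL and SQL can also be run from the Athena console). The bucket, database, and table names are hypothetical, and the layout assumes tab-delimited variant files already sit in S3 and that the 'genomics' database exists:
     $ aws athena start-query-execution \
         --result-configuration OutputLocation=s3://my-athena-results/ \
         --query-string "CREATE EXTERNAL TABLE IF NOT EXISTS genomics.variants (chrom STRING, pos INT, ref STRING, alt STRING, qual DOUBLE) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LOCATION 's3://my-genomics-bucket/variants/'"
     $ aws athena start-query-execution \
         --query-execution-context Database=genomics \
         --result-configuration OutputLocation=s3://my-athena-results/ \
         --query-string "SELECT chrom, pos, ref, alt FROM variants WHERE chrom = '22' AND qual > 30 LIMIT 10"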
  8. Athena - Demo
  9. Athena Genomics Query Example
  10. About AWS Step Functions: Serverless visual workflows for Lambdas
  11. AWS Step Functions 1. Define steps and services (activities or Lambdas) 2. Verify step execution(s) 3. Monitor and scale. “Your application as a state machine.”
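     As a hedged sketch of step 1, a two-state machine written in the Amazon States Language and created from the CLI; the account, role ARN, and Lambda function names are hypothetical:
     $ aws stepfunctions create-state-machine \
         --name variant-pipeline \
         --role-arn arn:aws:iam::123456789012:role/StepFunctionsExecutionRole \
         --definition '{
           "Comment": "Two Lambda tasks chained into a pipeline",
           "StartAt": "AlignReads",
           "States": {
             "AlignReads":   {"Type": "Task", "Resource": "arn:aws:lambda:us-east-1:123456789012:function:align-reads", "Next": "CallVariants"},
             "CallVariants": {"Type": "Task", "Resource": "arn:aws:lambda:us-east-1:123456789012:function:call-variants", "End": true}
           }
         }'
     # step 2: start an execution, then inspect it in the console or via describe-execution
     $ aws stepfunctions start-execution \
         --state-machine-arn arn:aws:states:us-east-1:123456789012:stateMachine:variant-pipeline \
         --input '{"sample": "NA12878"}'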
  12. AWS Step Functions – 1. Define Steps/Services
  13. AWS Step Functions – 2. Verify step execution
  14. Step Functions - Demo
  15. About AWS Batch: Fully managed batch processing at scale
  16. What is batch computing? Run jobs asynchronously and automatically across one or more computers. Jobs may have dependencies on each other, making the sequencing and scheduling of multiple jobs complex and challenging.
  17. What is AWS Batch? • Fully managed: no software to install or servers to manage. • Integrated with AWS: Batch jobs can easily and securely interact with services such as Amazon S3, DynamoDB, and Rekognition. • Cost-optimized provisioning: auto-provisions compute resources tailored to the job's needs using EC2 and EC2 Spot.
  18. AWS Batch Concepts 1. Jobs (job definitions, job queues, job states) 2. Compute Environments 3. Scheduler. Short video -- here
  19. Jobs: Jobs are the unit of work executed by AWS Batch as containerized applications running on Amazon EC2. Containerized jobs can reference a container image, command, and parameters, or users can simply provide a .zip containing their application and we will run it on a default Amazon Linux container. $ aws batch submit-job --job-name variant-calling --job-definition gatk --job-queue genomics
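     The submit-job call above can also override settings from the job definition at submission time; this is a sketch only, with a hypothetical script and bucket:
     $ aws batch submit-job \
         --job-name variant-calling-NA12878 \
         --job-queue genomics \
         --job-definition gatk \
         --container-overrides '{"vcpus": 4, "memory": 16000, "command": ["./call_variants.sh", "s3://my-genomics-bucket/NA12878.bam"]}'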
  20. Massively parallel jobs • Now – users can submit a large number of independent “simple jobs.” • Soon – AWS will add support for “array jobs” that run many copies of an application against an array of elements. Array jobs are an efficient way to run: • Parametric sweeps • Monte Carlo simulations • Processing a large collection of objects. NOTE: These use cases are possible today; simply submit more jobs.
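     Until array jobs arrive, a parametric sweep is just a loop over submit-job; a minimal sketch assuming the gatk job definition and genomics queue from the previous slide, plus a hypothetical SAMPLE_ID environment variable that the container reads:
     $ for sample in NA12878 NA12891 NA12892; do
           aws batch submit-job \
               --job-name "call-${sample}" \
               --job-queue genomics \
               --job-definition gatk \
               --container-overrides "{\"environment\": [{\"name\": \"SAMPLE_ID\", \"value\": \"${sample}\"}]}"
       done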
  21. Example Genomics Workflow
  22. Workflows, Pipelines, and Job Dependencies: Jobs can express a dependency on the successful completion of other jobs or specific elements of an array job. Use your preferred workflow engine and language to submit jobs. Flow-based systems simply submit jobs serially, while DAG-based systems submit many jobs at once, identifying inter-job dependencies. $ aws batch submit-job --depends-on jobId=606b3ad1-aa31-48d8-92ec-f154bfc8215f ...
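     A sketch of a two-step chain driven from the shell: the first jobId is captured with --query and fed to --depends-on (the bwa-mem job definition name is hypothetical):
     $ ALIGN_JOB=$(aws batch submit-job \
           --job-name align-NA12878 --job-queue genomics --job-definition bwa-mem \
           --query jobId --output text)
     $ aws batch submit-job \
           --job-name call-NA12878 --job-queue genomics --job-definition gatk \
           --depends-on jobId=${ALIGN_JOB}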
  23. Job Definitions: Batch Job Definitions specify how jobs are to be run. While each job must reference a job definition, many parameters can be overridden. Some of the attributes specified in a job definition: • IAM role associated with the job • vCPU and memory requirements • Mount points • Container properties • Environment variables $ aws batch register-job-definition --job-definition-name gatk --container-properties ...
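     Separately from the elided command above, a hedged sketch of what the container properties JSON can hold; the image, role, paths, and values are hypothetical:
     $ aws batch register-job-definition \
         --job-definition-name gatk \
         --type container \
         --container-properties '{
             "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/gatk:latest",
             "vcpus": 4,
             "memory": 16000,
             "jobRoleArn": "arn:aws:iam::123456789012:role/BatchJobRole",
             "command": ["./call_variants.sh"],
             "environment": [{"name": "REFERENCE", "value": "s3://my-genomics-bucket/hs37d5.fa"}],
             "mountPoints": [{"sourceVolume": "scratch", "containerPath": "/scratch"}],
             "volumes": [{"name": "scratch", "host": {"sourcePath": "/scratch"}}]
         }'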
  24. Job Queues: Jobs are submitted to a Job Queue, where they reside until they are able to be scheduled to a compute resource. Information related to completed jobs persists in the queue for 24 hours. $ aws batch create-job-queue --job-queue-name genomics --priority 500 --compute-environment-order ...
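     A fuller sketch of the elided call, assuming a compute environment named genomics-ce already exists (the name is hypothetical):
     $ aws batch create-job-queue \
         --job-queue-name genomics \
         --state ENABLED \
         --priority 500 \
         --compute-environment-order order=1,computeEnvironment=genomics-ce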
  25. Compute Environments: Mapped from job queues to run containerized batch jobs. • Managed CEs - you describe your requirements (instance types, min/max/desired vCPUs, and EC2 Spot bid as a % of On-Demand), AWS launches & scales resources for you. Pick specific instance types, instance families or simply choose “optimal” • Unmanaged CEs - you can launch and manage your own resources. Your instances need to include the ECS agent and run supported versions of Linux and Docker. AWS Batch will then create an Amazon ECS cluster which can accept the instances you launch. Jobs can be scheduled to your Compute Environment as soon as your instances are healthy and register with the ECS Agent. $ aws batch create-compute-environment --compute-environment-name unmanagedce --type UNMANAGED ...
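     The slide shows the unmanaged case; as a hedged counterpart, a managed compute environment using EC2 Spot might look like this sketch (roles, subnets, security group, and bid percentage are hypothetical):
     $ aws batch create-compute-environment \
         --compute-environment-name genomics-ce \
         --type MANAGED \
         --state ENABLED \
         --service-role arn:aws:iam::123456789012:role/AWSBatchServiceRole \
         --compute-resources '{
             "type": "SPOT",
             "bidPercentage": 40,
             "minvCpus": 0,
             "desiredvCpus": 0,
             "maxvCpus": 256,
             "instanceTypes": ["optimal"],
             "instanceRole": "ecsInstanceRole",
             "spotIamFleetRole": "arn:aws:iam::123456789012:role/AmazonEC2SpotFleetRole",
             "subnets": ["subnet-0abc1234"],
             "securityGroupIds": ["sg-0abc1234"]
         }'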
  26. AWS Batch Scheduler: The Scheduler evaluates when, where, and how to run jobs that have been submitted to a job queue. Jobs run in approximately the order in which they are submitted, as long as all dependencies on other jobs have been met.
  27. Queued Job States • SUBMITTED: Accepted into the queue, but not yet evaluated for execution • PENDING: Your job has dependencies on other jobs which have not yet completed • RUNNABLE: Your job has been evaluated by the scheduler and is ready to run • STARTING: Your job is in the process of being scheduled to a compute resource • RUNNING: Your job is currently running • SUCCEEDED: Your job has finished with exit code 0 • FAILED: Your job finished with a non-zero exit code, or was cancelled or terminated
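     To see where jobs sit in this lifecycle, list the queue by state or describe a specific job; a sketch using the genomics queue from earlier slides:
     $ aws batch list-jobs --job-queue genomics --job-status RUNNABLE
     # <job-id> is the id returned by submit-job
     $ aws batch describe-jobs --jobs <job-id> \
         --query 'jobs[0].[jobName,status,statusReason]'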
  28. AWS Batch Actions • CancelJob: Marks jobs that are not yet STARTING as FAILED. • TerminateJob: Cancels jobs that are currently waiting in the queue; stops jobs that are in a STARTING or RUNNING state and transitions them to FAILED. NOTE: Requires a “reason” which is viewable via DescribeJobs. $ aws batch cancel-job --reason "Submitted to wrong queue" --job-id 8a767ac8-e28a-4c97-875b-e5c0bcf49eb8
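     TerminateJob has a matching CLI command that also takes a reason; a sketch reusing the job id from the cancel-job example above:
     $ aws batch terminate-job \
         --job-id 8a767ac8-e28a-4c97-875b-e5c0bcf49eb8 \
         --reason "Too many retries, investigating input data"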
  29. AWS Batch Data Types • ComputeEnvironmentDetail • ComputeEnvironmentOrder • ComputeResource • ContainerProperties • ContainerPropertiesResource • CounterProperties • Host • Job • JobDefinition • JobQueueDetail • MountPoint • Parameter • Ulimit • Volume
  30. Batch - Demo
  31. AWS Batch Pricing and Functionality: There is no charge for AWS Batch; you only pay for the underlying resources that you consume! NOTE: Support for array jobs, retries, and jobs executed as AWS Lambda functions is coming soon.
  32. Use the Right Tool for the Job: Not all batch workloads are the same… • ETL and Big Data processing/analytics? Consider EMR, Data Pipeline, Redshift, and related services. • Lots of small cron jobs? AWS Batch is a great way to execute these jobs, but you will likely want a workflow or job-scheduling system to orchestrate job submissions. • Need to efficiently run lots of big and small compute jobs on heterogeneous compute resources? Use AWS Batch.
  33. Example: DNA Sequencing
  34. Example: Genomics on Unmanaged Compute Environments
  35. AWS Batch summarized: Fully managed • Integrated with AWS • Cost-optimized resource provisioning
  36. About AWS Glue: Serverless, managed, scalable ETL
  37. AWS Glue 1. Build a data catalog: discover and use your datasets via a Hive-compatible metastore; store versions, connection, and credential info; use crawlers to auto-generate schema from S3 data and partitions 2. Generate and edit transforms using PySpark 3. Schedule and run your jobs on a schedule, an event, or a Lambda. NOTE: Glue is announced, but there is no beta as of yet… video from re:Invent -- here
  38. An aside… EC2 Elastic GPUs
  39. About AWS QuickSight: Quick and easy data dashboards
  40. Resources for new AWS Services • Athena (SQL query on S3) – here • Batch (Optimized, chained EC2 batches) – here • Glue (Scaled ETL) -- here • Step Functions (Lambda workflows) – here • QuickSight (Data Dashboards) – here • Full list of AWS services announced at re:Invent 2016 -- here
