2. “GeoApps”
Geo-Mapping and Visualization
Location-Based Services (LBS)
Analyzing Geo-Data
Location-Based Mobile Apps
Geospatial Applications
Geo-Social Apps
Emergency Response Maps
Geographic Information Systems
Map-Based Mashups and Overlays
3. “GeoApps”
Geo-Mapping and Visualization
Location-Based Services (LBS)
Analyzing Geo-Data
Location-Based Mobile Apps
Geospatial Applications
Geo-Social Apps
Emergency Response Maps
Geographic Information Systems
Map-Based Mashups and Overlays
Powered by the Cloud
4. The AWS Cloud
Tools to access services
Cross-service features
High-level building blocks
Low-level building blocks
5. The Cloud as a Platform for Platforms
A platform that provides a foundation to build innovative solutions on top
A platform that provides abstraction to hide the underlying layers (hardware and software)
A platform that is self-service
11. SwissTopo - The Swiss Federal Office of Topography
GIS App: http://map.geo.admin.ch/
12. SwissTopo - The Swiss Federal Office of Topography
30,000 unique visitors per day
8 TB per month
250,000,000 preproduced map tiles
1,300 delivered map tiles per second
40 GIS applications
40 Web map services
14. Topographical Data + Hydrological Data + Cloud Computing = Good Dams
What is the best location for the dam for optimum results?
What is the best height for the dam?
On a single machine, it would take 35 days of non-stop computation.
15. Digital Elevation Model
Does it make economic sense? (elevation vs. distance profiles)
16. “Cloud HD”
High-Performance Cloud Computing with Amazon EC2
Cluster Compute Instance Family
2 × Xeon X5570 (“Intel Nehalem”)
23 GB RAM
10 Gbps Ethernet
1,690 GB local disk
HVM-based virtualization
$1.60/hour
17. Amazon EC2 Instances
Input dataset is uploaded to an Amazon S3 input bucket
The job is started and stopped from an IDE, the command line, or the Web Console
An AWS master node runs the computation on EC2 instances and sends a notification on completion
Results are written to an S3 output bucket, where they can be retrieved
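The slide sketches a common batch pattern: an input dataset lands in an S3 input bucket, a master node on EC2 processes it, results go to an S3 output bucket, and the caller is notified. Below is a minimal local sketch of that flow, with plain dicts standing in for the S3 buckets and a callback standing in for the notification; the bucket keys and the processing step are hypothetical:

```python
# Local sketch of the slide's batch flow:
# input dataset -> S3 input bucket -> EC2 master node -> S3 output bucket -> notify.
# Dicts stand in for buckets; in a real deployment these would be S3
# put/get calls and the workers would be EC2 instances.

def upload(bucket: dict, key: str, data: bytes) -> None:
    """Stand-in for uploading an object to an S3 bucket."""
    bucket[key] = data

def process(data: bytes) -> bytes:
    """Hypothetical per-object compute step run by the master node."""
    return data.upper()

def run_batch(input_bucket: dict, output_bucket: dict, notify) -> None:
    """Master node: read each input object, compute, store results, notify."""
    for key, data in input_bucket.items():
        output_bucket[f"results/{key}"] = process(data)
    notify(f"processed {len(input_bucket)} object(s)")

# Usage: seed the input bucket, run the job, fetch the results.
inbox, outbox, messages = {}, {}, []
upload(inbox, "tiles/a.dat", b"raw elevation data")
run_batch(inbox, outbox, messages.append)
print(outbox["results/tiles/a.dat"])  # b'RAW ELEVATION DATA'
```

The point of the pattern is that the compute fleet is stateless: all state lives in S3, so any number of worker instances can be started or terminated mid-run.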
18. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
App Server & Database: Massively Scalable Services, Service Orientation
19. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping
App Server & Database: Massively Scalable Services, Service Orientation
20. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
App Server & Database: Massively Scalable Services, Service Orientation
22. Recovery.gov
First government-wide system to move to the cloud
Savings of over $750,000 in the current budget cycle
“Cloud computing strikes me as a perfect tool to help achieve greater transparency and accountability. Moving to the cloud allows us to provide better service at lower costs. I hope this development will inspire other government entities to accelerate their own efforts. The American taxpayers would be the winners.” - Earl E. Devaney, the Board’s Chairman
28. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
App Server & Database: Massively Scalable Services, Service Orientation
29. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery
App Server & Database: Massively Scalable Services, Service Orientation
30. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
31. World Wide Storage and Delivery Infrastructure
Amazon S3: elastic, highly redundant, distributed data store
Amazon CloudFront: low-cost, no-commitment, distributed content delivery
5 Regions: US West (N. California), US East (N. Virginia), EU West (Ireland), APAC (Singapore), APAC (Japan)
18 Edge Locations: Ashburn, Dallas, Los Angeles, Miami, Newark, Palo Alto, Seattle, St. Louis, Amsterdam, Dublin, Frankfurt, London, Hong Kong, Singapore, Tokyo
36. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
37. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility
38. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility, Self Service
39.
40. Time to provision a server in an enterprise: 350,000 minutes (7-8 months); $1,000 to rack and stack on-premise
Time to provision a server in the cloud: <5 minutes; $260 for 3 years (reserved, 100% utilized)
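As a quick sanity check on the slide's figures, and assuming the $260 is the total cost of the full 3-year reserved term at 100% utilization, the implied effective hourly rate works out to about a penny:

```python
# Effective hourly rate implied by the slide (assumption: $260 covers the
# whole 3-year reserved term, run at 100% utilization).
total_cost = 260.0            # USD for the full term
hours = 3 * 365 * 24          # 26,280 hours in 3 years
hourly = total_cost / hours
print(f"${hourly:.4f}/hour")  # $0.0099/hour
```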
44. www.Gulfofmexicoresponsemap.com was built using ArcGIS Server Standard 9.3.1, leveraging the Flex API, which utilizes ArcGIS Online, Microsoft Bing, and response content
• Response content is refreshed twice a day
• Solution was deployed on the Amazon cloud
46. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility, Self Service
47. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility, Self Service
Geo-Collaboration
48. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility, Self Service
Geo-Collaboration: Easy
52. “Customers can immediately access and integrate most current geospatial data into business analyses, location-based marketing programs and risk management calculations”
The Geosk Platform and Geosk Library will be powered by WeoGeo, which is powered by Amazon Web Services
53. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility, Self Service
Geo-Collaboration: Easy
54. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility, Self Service
Geo-Collaboration: Easy
Innovation: Open
57. GeoApps Need Cloud
Geo-compute and Analysis: Elasticity
Visualization and Geo-Mapping: Flexibility
Geo-Data Storage and Delivery: Low Cost
App Server & Database: Massively Scalable Services, Service Orientation
Mapping during Crisis: Agility, Self Service
Geo-Collaboration: Easy
Innovation: Open
59. KEY BENEFITS TO RUNNING IN THE AWS CLOUD
Lowers Cost: eliminates capital investment; reduces operational costs; removes the “heavy lifting”
Increases Agility: reduces time to market; removes constraints; leverages scalability, reliability and security; provides a foundation for 21st-century architectures
Editor's Notes
Jinesh
Swisstopo currently serves up to 30,000 unique visitors per day from its Amazon EC2 instances. This equates to data transfer of 8 TB per month; up to 1,300 delivered map tiles per second; 250,000,000 pre-produced map tiles in Amazon Simple Storage Service (Amazon S3); and 40 GIS applications with 40 Web map services for Swiss Federal offices, among other customers. Swisstopo estimates it is now possible for the FSDI team to launch a new server within hours, instead of the weeks or months it took before moving to AWS.
The first diagram depicts a high-performance OGC WMS setup for a large number of source image files, usually uncompressed GeoTIFF files. This setup puts the source GeoTIFFs on S3, allowing any number of servers to be started without having to script the loading of large EBS volumes with data for the whole area. Each zone group acts like a stripe of the overall coverage area. This happens automatically because the front-end index cluster refers requests for certain geographic areas only to certain zones, making them cache just those files to local EC2 volumes.
This is an example of what data can do today. The analyst firm PSR took topographical data from NASA covering an area of 1° latitude × 1° longitude around a river in India (a tributary of the Ganga) and analyzed it together with hydrological information (river flow) to determine the best sites for the construction of hydro plants. Hydro inventory studies determine the best sites for the construction of hydro plants and their size in terms of installed capacity, dam height, reservoir area and storage capacity. These studies rely on topographical and hydrological information (hydropower is just the conversion of potential energy into the kinetic energy that moves the turbines, and finally the electric energy produced by the generator coupled to the turbine). Thus, the key components for hydropower are the artificially produced head from damming the river and the river streamflow (hydrology). In 2009, NASA and the Japanese government launched a public database with approximate topographical data for the entire Earth, based on satellite infrared measurements. One selects the area of interest (river basins where hydro projects are being planned) and downloads the data from NASA’s servers. Each file contains topographical data that covers an area of 1° latitude × 1° longitude (see below, and notice the “scarf” running above the red arrow, indicating the river’s path).
The first task then becomes to massively process this data, known as a Digital Elevation Model (DEM), in order to: (i) infer the river path (Geographic Information Systems are used for this task); (ii) determine the river elevation profile (from headwaters to mouth); (iii) calculate cross-section profiles at various points; and (iv) use all previous information to estimate the investment costs to build the hydro plant, deciding its best construction layout. For instance, a river with an extension of 500 km for which one investigates where to develop hydro plants, based on a discretization of one kilometer, with 20 candidate dam heights at each site and 5 different engineering layouts to choose from, would require 500 × 20 × 5 = 50,000 alternatives. Our model currently estimates construction costs in roughly one minute per alternative. Thus, on a single machine, the total processing time for these tasks would require 35 days of non-stop computation. As each computation may be carried out independently, the procedure may be processed in parallel using Amazon Web Services (AWS). After pre-processing (steps (i)-(iv) above), the best construction alternative at each point is known. It remains to be determined where to build the plants (i.e., which of those 500 sites are the “right” ones, and in each case, what is the best height for the dam?). The problem becomes (mathematically) interesting because the number of possible combinations of plants to explore the hydro potential is exponential. The next figure shows two alternatives, one with three larger plants (left) and another with four smaller ones (right). To answer these questions, we formulate a Mixed Integer Programming (MIP) model that uses all the pre-processed information about construction costs and alternatives and determines the optimum solution, that is, the one that maximizes the economic benefit of developing the projects.
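The note's back-of-the-envelope numbers can be reproduced directly; the worker count in the second half is a hypothetical figure, only to illustrate how the embarrassingly parallel pre-processing shrinks on an EC2 fleet:

```python
# 500 candidate sites x 20 dam heights x 5 engineering layouts,
# at roughly one minute of cost estimation per alternative (from the note).
sites, heights, layouts = 500, 20, 5
alternatives = sites * heights * layouts     # 50,000 alternatives
serial_days = alternatives / (60 * 24)       # one minute each, run serially
print(alternatives, round(serial_days))      # 50000 35

# Each alternative is independent, so N parallel workers divide the
# wall-clock time by N (N = 100 is a hypothetical fleet size).
workers = 100
parallel_hours = alternatives / workers / 60
print(round(parallel_hours, 1))              # 8.3
```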
This benefit is measured by a cash-flow analysis that considers, on one side, revenues from the sale of electricity produced by the hydro plants and, on the other, the investment and operation costs. Socio-environmental factors are also considered in the analysis, as economically interesting projects may not be feasible due to such constraints.
They provide a visual representation of the US with the distribution of stimulus funding in the form of contracts, loans and grants. If you drill down from the top-level map, there is a detailed map that lets you see the funding and programs right down to an individual township. For each contract, grant or loan you can see the dollars that were spent and the jobs that were created. This tracks the distribution of the funds made available as part of President Obama’s stimulus package, the Recovery Act.
We deploy GeoIQ through Amazon cloud services and are moving the entire GeoCommons to AWS. In about a month we will be launching several major NGO and US Federal Government geo-sites all on AWS.
The second diagram shows the first diagram as the WMS service on the left and, in the middle, yet another Elastic Beanstalk app, the tile-making app. The tile-making app is driven by messages that are recursively created in SQS to build up certain parts of the world into TMS-type databases. As mentioned earlier, this system takes advantage of the “auto” features of AWS, but more importantly, to run something like this without incurring licensing fees, all of the component parts want to be FOSS. One US state at an aerial image resolution of 50 cm/pixel can have on the order of 10 million tiles, just for the base layer. This system allows SpatialCloud to generate the full TMS “stack” for such an area in hours, not days. SpatialCloud has used setups like this diagram, and other similar tools, to process hundreds of millions of image tiles to S3. The SpatialCloud USA dataset alone is 200 million tiles. The reason you want to pre-generate the TMS on S3 is that one server instance of WMS (for an average state) would require about 2 TB of EBS volume to store the uncompressed source GeoTIFFs, and for it to be fast you would need multiple instances, caching configuration, and so on. It is less expensive to just run many servers once and pre-generate the files to S3. Once on S3, you can run other FOSS4G tools such as MapProxy to serve WMS using the TMS'ed image content. Handling larger raster datasets for image data, elevation data, etc., is a perfect use case for the on-demand nature of AWS services.
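The 10-million-tile figure in the note is easy to sanity-check. A rough count for a 256×256-pixel TMS base layer at 50 cm/pixel, taking ~200,000 km² as an assumed area for a mid-sized US state (the area is an assumption for illustration, not from the source):

```python
# Rough base-layer tile count at 50 cm/pixel with 256x256 tiles.
# The 200,000 km^2 state area is an assumption for illustration.
area_km2 = 200_000
pixel_m = 0.5                  # 50 cm ground resolution
tile_px = 256

pixels = area_km2 * 1_000_000 / pixel_m**2   # ground pixels to cover the area
tiles = pixels / tile_px**2                  # 256x256-pixel tiles
print(f"{tiles / 1e6:.1f} million tiles")    # 12.2 million tiles
```

This counts only the deepest zoom level; the full pyramid adds roughly a third more, which is consistent with "on the order of 10 million tiles, just for the base layer".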
Jinesh
The Haiti earthquake occurred on Jan 12th. The first satellite images were available on Jan 13th. A viewer needed to be brought up quickly to facilitate emergency response efforts.
ESRI built an application to allow for translation of up-to-the-minute satellite images of the affected area. Relief workers were able to quickly identify evacuation routes and get aid to the hardest-hit areas.
BP Gulf Oil Spill Response: In response to the Deepwater Horizon incident, the Joint Incident Command needed a publicly facing interactive website. They needed to rapidly deploy a site that would support the public’s need to have a view into the response. The timeframe was extremely tight: roughly a week from idea to global deployment. The Amazon cloud was the most cost-effective solution.
Mapufacture was actually acquired by FortiusOne in August of 2008 - we just haven't shut down the service. It has been entirely incorporated into the fuller-featured GeoCommons (http://geocommons.com). GeoCommons, and Mapufacture before it, aims to provide collaboration and access to data, visualization and analytics. On the public web with GeoCommons we allow anyone to upload and share open data. Our customers then license the platform, GeoIQ, to combine their business data with open data. For example, we work with real estate companies to compare their appraisals and listings with demographic information, school ratings, and even social media. We also actively work in crisis response - as a founding member of CrisisCommons and an original member of OpenStreetMap - using GeoCommons as a way to gather data that is useful to responders and citizens in dealing with a disaster.