Transcript of "JavaScript & Cloud: the AWS JS SDK and how to work with cloud resources"

  1. 1. AngularJS & Cloud CloudConf 2014 – Gabriele Mittica www.corley.it
  2. 2. JavaScript & Cloud? Cloud-based database and storage services are very popular. Why?
  3. 3. The Cloud is • Cheap for startup projects • Ready to scale for growing projects • Rich in services for complex projects. How can we use the cloud with JS?
  4. 4. [Diagram: the classic architecture — an HTML/JS FrontEnd talks over HTTP to a REST BackEnd (PHP with MySQL / Mongo).]
  5. 5. [Diagram: the HTML/JS frontend talks directly to the AWS cloud.]
  6. 6. • With AngularJS we can create apps that work with RESTful resources or directly with cloud services.
  7. 7. Next steps • #1 - sign up on the AWS website • #2 - access the AWS Web console • #3 - create an IAM user with access to all services • #4 - download and use the JavaScript SDK
  8. 8. Introducing Amazon Web Services • Over 25 cloud-based services available • Several regions across the world • JavaScript SDK available • http://aws.amazon.com
  9. 9. Sign up to AWS on aws.amazon.com
  10. 10. IAM: Identity and Access Management AWS Identity and Access Management (IAM) enables us to securely control access to AWS services and resources for our users, setting up users and groups and using permissions to allow and deny their access to AWS resources. [Diagram: our app reaches the AWS services through AWS IAM — e.g. a backup system with PUT & GET access to AWS Storage, a marketing app with full access to the AWS Email Service.]
  11. 11. We create a user (or a group of users) with the Power User Access level, in order to grant access to all services. Then we have to download the access and secret keys that we'll use with the JS SDK.
  12. 12. Now we can use the JS/Browser AWS SDK Paste in your HTML: <script src="https://sdk.amazonaws.com/js/aws-sdk-2.0.0-rc6.min.js"></script> available on http://aws.amazon.com/javascript Configure with your IAM credentials: <script> AWS.config.update({accessKeyId: 'akid', secretAccessKey: 'secret'}); AWS.config.region = 'eu-west-1'; //set your preferred region </script>
  13. 13. Upload a file to Amazon Simple Storage Service with classic JS: <input type="file" id="file-chooser" /> <button id="upload-button">Upload to S3</button> <div id="results"></div> <script type="text/javascript"> var bucket = new AWS.S3({params: {Bucket: 'myBucket'}}); var fileChooser = document.getElementById('file-chooser'); var button = document.getElementById('upload-button'); var results = document.getElementById('results'); button.addEventListener('click', function() { var file = fileChooser.files[0]; if (file) { results.innerHTML = ''; var params = {Key: file.name, ContentType: file.type, Body: file}; bucket.putObject(params, function (err, data) { results.innerHTML = err ? 'ERROR!' : 'UPLOADED.'; }); } else { results.innerHTML = 'Nothing to upload.'; } }, false); </script>
  14. 14. That's all! Very easy! [Diagram: the browser uploads the file straight to AWS storage — file uploaded.]
  15. 15. Our keys (primarily the secret one) are exposed. Bad guys (backend developers?) could use our keys for malicious purposes! <?php use Aws\S3\S3Client; $client = S3Client::factory(array( 'key' => 'our key', 'secret' => 'our secret' )); while(true) { $result = $client->putObject(array( 'Bucket' => 'myBucket', 'Key' => 'data.txt', 'Body' => 'Give me a million dollars!' )); }
  16. 16. Solutions: #1 Use read-only IAM keys (drawback: a read-only app). #2 Ask users to enter their own keys (drawback: hard IAM management). #3 Work with our own backend (drawback: target missed).
  17. 17. The #4 solution We can use AWS Security Token Service to grant temporary credentials to non-authenticated users. They are called Federated Users.
  18. 18. [Diagram: the HTML/JS app logs in with the identity provider, assumes a role through STS (validated against IAM), and then accesses the services (S3, DB) with the temporary credentials — OK at each step.]
  19. 19. Example: create an app that helps users store incomes/expenses and track cashflow. So we need: - a database service where we store private cashflow entries - a storage service where we upload private files (receipts, invoices, bills…) - an authentication service that manages the access to database and storage - an AngularJS app that merges all the previous pieces.
  20. 20. Simple Storage Service DynamoDB IAM STS
  21. 21. Step #1: set the storage service Simple Storage Service (S3) is a cloud storage that lets us PUT and GET private (backups, private images…) and public (js, css, public images) files. We just have to create a bucket (folder) in S3 where we'll store the uploaded files.
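The bucket can be created from the AWS console. As a rough sketch only, the same setup through the JS SDK (assuming the financeapptest bucket name used later and the eu-west-1 region) could look like this:

var s3 = new AWS.S3();
s3.createBucket({
  Bucket: 'financeapptest', // bucket names are global: pick a free one
  CreateBucketConfiguration: { LocationConstraint: 'eu-west-1' }
}, function (err, data) {
  console.log(err ? err : 'bucket ready: ' + data.Location);
});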
  22. 22. Step #2: set the database service DynamoDB is a fully managed NoSQL database stored in the cloud. We pay for the provisioned throughput. For example: 10 reads / 5 writes per sec = free 100 reads / 25 writes per sec = $31.58/month We just have to create a new table where we'll store the user's incomes and expenses. We set a low throughput for the beginning.
  23. 23. We have to choose the indexes for the table. We set a primary key (string type) called userID that will be useful later. We also set a range key (numeric type) called timestamp that lets us quickly query the entries ordered by «insert datetime».
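The table, too, is usually created from the console. A minimal sketch of the same setup through the JS SDK, assuming the finance table name and the free-tier throughput mentioned above:

var dynamo = new AWS.DynamoDB({ region: 'eu-west-1' });
dynamo.createTable({
  TableName: 'finance',
  AttributeDefinitions: [
    { AttributeName: 'userID',    AttributeType: 'S' },
    { AttributeName: 'timestamp', AttributeType: 'N' }
  ],
  KeySchema: [
    { AttributeName: 'userID',    KeyType: 'HASH'  }, // primary key
    { AttributeName: 'timestamp', KeyType: 'RANGE' }  // range key, orders entries by insert datetime
  ],
  ProvisionedThroughput: { ReadCapacityUnits: 10, WriteCapacityUnits: 5 } // low throughput for the beginning
}, function (err, data) {
  console.log(err ? err : data.TableDescription.TableStatus);
});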
  24. 24. We want to manage the authentication with certified external websites such as Amazon, Google and Facebook. Step #3: create federated apps Go to http://login.amazon.com and create a new app. There we can get the code for the login, as follows:
  25. 25. When creating the app we get an ID and we can set the allowed sources (the URLs of our test/production web application). HTTPS is required.
  26. 26. Go back to http://aws.amazon.com/console, and add a new role in the IAM area linked to our Amazon App. Step #4: create the IAM role
  27. 27. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "sts:AssumeRoleWithWebIdentity", "Principal": { "Federated": "www.amazon.com" }, "Condition": { "StringEquals": { "www.amazon.com:app_id": "XXYYZZ" } } } ] } IAM lets users from our Amazon Login app assume the role:
  28. 28. We add a policy to this role giving full access to S3 and DynamoDB thanks to the policy generator:
  29. 29. { "Version": "2012-10-17", "Statement": [ { "Sid": "Stmt1291088462000", "Effect": "Allow", "Action": [ "s3:*" ], "Resource": [ "arn:aws:s3:::financeapptest" ] }, { "Sid": "Stmt1291088490000", "Effect": "Allow", "Action": [ "dynamodb:*" ], "Resource": [ "arn:aws:dynamodb:eu-west-1:728936874546:table/finance" ] } ] } This is an example of the policy generated (full access to S3 bucket and DynamoDB table):
  30. 30. <div id="amazon-root"></div> <script type="text/javascript"> window.onAmazonLoginReady = function() { amazon.Login.setClientId('YOUR-CLIENT-ID'); }; (function(d) { var a = d.createElement('script'); a.type = 'text/javascript'; a.async = true; a.id = 'amazon-login-sdk'; a.src = 'https://api-cdn.amazon.com/sdk/login1.js'; d.getElementById('amazon-root').appendChild(a); })(document); </script> Get your code and add it after <body> <script type="text/javascript"> document.getElementById('LoginWithAmazon').onclick = function() { var options = { scope : 'profile' }; amazon.Login.authorize(options, 'https://www.example.com/handle_login.php'); return false; }; </script> Create a button (#LoginWithAmazon) and the click event:
  31. 31. new AWS.STS().assumeRoleWithWebIdentity({ RoleArn: 'the-arn-of-the-role', RoleSessionName: 'the-name-of-the-role', WebIdentityToken: ACCESS_TOKEN, ProviderId: "www.amazon.com" }, function(err, data){ if(data && data.Credentials) { console.log(data); //we get the Amazon User ID } }); User is redirected to https://www.example.com/handle_login.php?ACCESS_TOKEN=XYZ
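One way to use the temporary credentials returned in data.Credentials is to hand them straight to the SDK (the example app shown later relies on AWS.WebIdentityCredentials instead). A minimal sketch, to be placed inside the success callback above:

// assumes data.Credentials has just been returned by assumeRoleWithWebIdentity
AWS.config.credentials = new AWS.Credentials(
  data.Credentials.AccessKeyId,
  data.Credentials.SecretAccessKey,
  data.Credentials.SessionToken // temporary credentials always carry a session token
);
AWS.config.region = 'eu-west-1';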
  32. 32. angular.module('myApp.services', []) .factory('loggerManager', function(configLogger, $location, $rootScope){ var baseFactory = { handler: new AWS.STS(), provider: false, credentials: {}, id: false }; baseFactory.logout = function() { if(baseFactory.provider == "amazon") { amazon.Login.logout(); } }; baseFactory.login = function(provider, data, redirect) { … } …
  33. 33. var dynamo = new AWS.DynamoDB({region: "eu-west-1"}); dynamo.putItem({ TableName: "finance", Item: data }); Now we can PUT data to DynamoDB
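The data object passed as Item must use DynamoDB's attribute-value format, where every attribute is wrapped in its type descriptor (S for strings, N for numbers sent as strings). A hypothetical cashflow entry, just to illustrate the format (the amount and note attributes are made up for this example):

var data = {
  userID:    { S: 'amzn1.account.XXXX' },  // the federated user id returned by the login
  timestamp: { N: String(Date.now()) },    // range key: numbers are always sent as strings
  amount:    { N: '-42.50' },
  note:      { S: 'office supplies' }
};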
  34. 34. var bucket = new AWS.S3({params: {Bucket: 'financeapptest'}}); var fileChooser = document.getElementById('file-chooser'); var file = fileChooser.files[0]; if (file) { var params = {Key: file.name, ContentType: file.type, Body: file}; bucket.putObject(params, function (err, data) { console.log(data); }); } …and upload files to S3
  35. 35. • How to do that with AngularJS:
  36. 36. 'use strict'; angular.module('myApp', [ 'ngRoute', 'myApp.filters', 'myApp.services', 'myApp.directives', 'myApp.controllers' ]) .constant('configAWS', { tableName: "finance5", bucketName: "financeuploads", region: "eu-west-1" }) .constant('configLogger', { amazonAppId: 'your-amazon.com-app-id', amazonRoleArn: 'arn:aws:iam::xxxxxx:role/amazon-login', amazonRoleName: "amazon-login", }); When you start the app you have to set the Amazon.com app id and AWS role credentials:
  37. 37. Thanks to that you have a configuration available along your app. Now we have to find a way to work with cloud services integrating the AWS SDK in our app. There are several ways to do that with AngularJS. In this case we create factory services to wrap each needed feature. Firstly, a service to manage the auth.
  38. 38. 'use strict'; angular.module('myApp.services', []) //provide methods to manage credentials of federated user .factory('loggerManager', function(configLogger, $location, $rootScope){ var baseFactory = { handler: new AWS.STS(), provider: false, credentials: {}, id: false }; /** * logout method (based on ID provider) */ baseFactory.logout = function() { if(baseFactory.provider == "amazon") { amazon.Login.logout(); } };
  39. 39. /** * login method (based on provider) * @param provider the name of provider * @param data data used for the login * @param redirect the destination after login */ baseFactory.login = function(provider, data, redirect) { //get the access params from AWS with the amazon login if(provider == "amazon") { AWS.config.credentials = new AWS.WebIdentityCredentials({ RoleArn: configLogger.amazonRoleArn, ProviderId: 'www.amazon.com', // this is null for Google WebIdentityToken: data.access_token }); //assume role from AWS baseFactory.handler.assumeRoleWithWebIdentity({ RoleArn: configLogger.amazonRoleArn, RoleSessionName: configLogger.amazonRoleName, WebIdentityToken: data.access_token, ProviderId: "www.amazon.com" }, function(err, data){ //login ok if(data && data.Credentials) { baseFactory.provider = provider; baseFactory.credentials = data.Credentials; baseFactory.id = data.SubjectFromWebIdentityToken; if(redirect) { $location.path(redirect); $rootScope.$apply(); } } }); } };
  40. 40. /** * return the access key provided by amazon, google, fb... */ baseFactory.getAccessKeyId = function() { if(baseFactory.credentials.AccessKeyId) { return baseFactory.credentials.AccessKeyId; } else { return ""; } }; /** * return the secret access key provided by amazon, google, fb... */ baseFactory.getSecretAccessKey = function() { if(baseFactory.credentials.SecretAccessKey) { return baseFactory.credentials.SecretAccessKey; } else { return ""; } }; /** * return the user id */ baseFactory.getUserId = function() { if(baseFactory.id) { return baseFactory.id; } else { return ""; } }; return baseFactory; })
  41. 41. Then, a service to work with S3. This is a tiny example: // provides methods to put and get file on S3 .factory('s3Ng', function(configAWS, loggerManager){ var baseFactory = { handler:false }; /** * start the service */ baseFactory.build = function() { baseFactory.handler = new AWS.S3({params: {Bucket: configAWS.bucketName}}); }; /** * put file on the cloud storage * @param fileName * @param fileBody */ baseFactory.put = function(fileName, fileBody) { var params = {Key: loggerManager.provider + "/" + loggerManager.getUserId() + "/" + fileName, Body: fileBody}; baseFactory.handler.putObject(params, function (err, data) { console.log(data); }); }; return baseFactory; })
  42. 42. Working with DynamoDB is more complex. This is an example: .factory('dynamoNg', function (configAWS, loggerManager) { var baseFactory = { handler:false }; //build the service baseFactory.build = function() { baseFactory.handler = new AWS.DynamoDB({region: configAWS.region}); }; /** * put an element into the dynamo table. Data is a JSON formatted for DynamoDB * @param table name * @param data are the data in JSON formatted for DynamoDB * @return the result of the query */ baseFactory.put = function(table, data) { return baseFactory.handler.putItem({ TableName: table, Item: data }); }; /** * Get an element from a DynamoDB table * @param table name * @param data the key to fetch * @return elements from the table */ baseFactory.get = function(table, data) { console.log("getting"); return baseFactory.handler.getItem({ TableName: table, Key: data }); };
  43. 43. /** * parse the dynamo data * @param the data * @returns the data extracted */ baseFactory.reverseModel = function(response) { var result = []; if(response.data.Count) { for(var ii in response.data.Items) { var item = response.data.Items[ii]; result[ii] = {}; for(var kk in item) { if(item[kk].S) { result[ii][kk] = item[kk].S; } if(item[kk].N) { result[ii][kk] = item[kk].N; } //binary type is missing! } } } return result; }; return baseFactory; }) ;
  45. 45. In a controller, start the auth: .controller('HomeCtrl', function($scope) { // login button to auth with amazon.com app document.getElementById('LoginWithAmazon').onclick = function() { var options = { scope : 'profile' }; amazon.Login.authorize(options, '/dynamofinance/app/#/logged/amazon'); return false; }; }) And a controller to manage login (after app auth) and logout: .controller('LoginCtrl', function($scope, $routeParams, loggerManager) { //user comes back from amazon.com app login success if($routeParams.access_token) { //do the login with the provider got by the url loggerManager.login($routeParams.provider, $routeParams, "/finance/list"); }; }) .controller('LogoutCtrl', function($scope, $routeParams, loggerManager) { loggerManager.logout(); })
  46. 46. In a controller, how to work with services: .controller("FinanceCtrl", function($scope, $routeParams, dynamoNg, dynamoFinanceTable, s3Ng, loggerManager, configLogger, configAWS){ //build services dynamoNg.build(); s3Ng.build(); //.... More code here //upload file to S3 $scope.uploadFile = function() { s3Ng.put("your filename", $scope.upload); $scope.entryId = false; }; //store movement $scope.add = function(el) { //prepare the data to store el.date = el.date.toString(); var movement = dynamoFinanceTable.modelAmount(el); //store the data $scope.putMovement(movement); $scope.formReset(false); }; $scope.putMovement = function(movement) { dynamoNg.put(configAWS.tableName, movement) .on('success', function(response) { $scope.entryId = response.request.params.Item.date.S; $scope.$apply(); }) .on('error', function(error, response) { console.log(error); }) .send(); }; //... More code here });
  47. 47. You can find an example on GitHub: it's a work-in-progress app, don't use it in production. It's under dev and test. https://github.com/gmittica/angularjs-aws-test-app But our work is not finished.
  48. 48. The files & data that we're storing in AWS are protected from unauthorized users, but are fully visible to other authenticated users. Each user has access to the data of the others. Security problem We have to refine the policies adding fine-grained conditions.
  49. 49. Step #5: fix the role policy In Simple Storage Service, we can limit the access of each user to a specific subfolder named after his userID. { "Effect":"Allow", "Action":[ "s3:ListBucket" ], "Resource":[ "arn:aws:s3:::financeuploads" ], "Condition":{ "StringLike":{ "s3:prefix":[ "amazon/${www.amazon.com:user_id}/*" ] } } },
  50. 50. { "Effect":"Allow", "Action":[ "s3:GetObject", "s3:PutObject", "s3:DeleteObject" ], "Resource":[ "arn:aws:s3:::financeuploads/amazon/${www.amazon.com:user_id}", "arn:aws:s3:::financeuploads/amazon/${www.amazon.com:user_id}/*" ] },
  51. 51. In DynamoDB, thanks to fine-grained access we can allow the access only to the rows owned by the user (the rows with his userID). It is also possible to restrict the role's access to specific columns.
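As a sketch only (the attribute names listed are illustrative, not taken from the slides), a fine-grained condition limiting both the rows and the readable columns could look like:

"Condition": {
  "ForAllValues:StringEquals": {
    "dynamodb:LeadingKeys": [ "${www.amazon.com:user_id}" ],
    "dynamodb:Attributes":  [ "userID", "timestamp", "amount", "note" ]
  },
  "StringEqualsIfExists": { "dynamodb:Select": "SPECIFIC_ATTRIBUTES" }
}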
  52. 52. { "Effect":"Allow", "Action":[ "dynamodb:GetItem", "dynamodb:BatchGetItem", "dynamodb:Query", "dynamodb:PutItem", "dynamodb:UpdateItem" ], "Resource":[ "arn:aws:dynamodb:eu-west-1:728936874646:table/finance5" ], "Condition":{ "ForAllValues:StringEquals":{ "dynamodb:LeadingKeys":[ "${www.amazon.com:user_id}" ] } } }
  53. 53. The data are now protected in the right way. Each user has access only to his own data.
  54. 54. The app is now completed (Simple Storage Service, DynamoDB, IAM, STS). We can create other apps that work with the data thanks to different policies:
  55. 55. The cloud is perfect for growing projects, thanks to the scalability of its services and the cost savings, especially in the startup stage.
  56. 56. Thank you! Any questions? @gabrielemittica gabriele.mittica@corley.it