node.js, javascript and the future

Presentation Transcript

    • javascript & the future
    • Jeff Miccolis, Development Seed
    • Open Atrium, Features, Context, Strongarm
    • Jeff Miccolis, MapBox
    • First, a brief apology.
    • Three “war” stories about things that are hard with PHP & Drupal
    • ...and easier in node.js
    • Two lessons I learned
    • ...from Drupal and node.js work
    • V8
    • 1. Sending lots of email
    • OpenAtrium
    • Single emails, digest emails
    • SMS, XMPP (Jabber), Twitter, etc, etc, etc...
    • Channel-independent messages: don’t send e-mails to users, send them messages delivered by mail, IM, SMS, etc... (d.o/project/Messaging)
    • Problems
    • Problems: send an email to one user - 50 ms; 28 users - 1.4 seconds; 600 users - 30 seconds
    • Problems: run cron every 5 minutes and you can send a lot more emails, but if one run doesn’t complete...
    • <?php
      $addresses = array(
        'jeff@mapbox.com',
        'alex@mapbox.com',
        'eric@mapbox.com'
      );
      $message = 'I am writing to inform you of a...';
      $count = 0;
      foreach ($addresses as $sucker) {
        if (mail($sucker, 'New opportunity', $message)) {
          $count++;
        }
      }
      echo "$count messages sent";
    • var mail = require('mail');
      var addresses = [
        'jeff@mapbox.com',
        'alex@mapbox.com',
        'eric@mapbox.com'
      ];
      var message = 'I am writing to inform you of a...';
      var count = 0;
      addresses.forEach(function(sucker, i) {
        mail(sucker, 'New opportunity', message, function(err) {
          if (!err) { count++; }
          if (i == addresses.length - 1) {
            console.log(count + " messages sent");
          }
        });
      });
    • // Attempt to send email.
      if (mail($sucker, 'New opportunity', $message)) {
        $count++;
      }
      // Once email is sent, get the next address.
      echo 'next';

      // Attempt to send email.
      mail(sucker, 'New opportunity', message, function(err) {
        if (!err) { count++; }
      });
      // Don’t wait, get the next address NOW.
      console.log('next');
    • Stop waiting around: send an email ~ 5 ms processing, 45 ms waiting; 28 emails ~ 140 ms to process, done 45 ms later; 600 users ~ 3 seconds to process, done 45 ms later
    • Stop waiting around: 600 emails w/ PHP: ~30 seconds; 600 emails w/ node.js: ~3 seconds
    • Stop waiting around: DEMO
    • 2. Handling feeds
    • The task: fetch RSS feeds, fetch original article, tag items (Calais), tag items (watchlist), tag items (Wikipedia), geocode items
    • The scale: thousands of feeds, millions of items, gigs and gigs of data
    • Problems: cron.php
    • maggie
    • maggied: a multi-threaded Python daemon
    • maggied: 4 “retriever” workers get batches of 50 items; they fetch/tag/geocode each item
    • maggied: retrieve original story: 300 ms; tag: 100 ms; geocode: 150 ms; TOTAL: 550 ms
    • Stop waiting around: nearly all of that 550 ms is spent idle
    • ...but we could run “packs” of retrievers!
    • Is that really the bestidea?
    • Replace the retrievers with a single HYPERACTIVE SQUID!
    • The squid runs up and down as fast as it can, dealing with each item in turn. It fires off any long-running I/O operations and then moves on to the next item. When an I/O operation reports progress, it does a little more work on behalf of the corresponding item.
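A minimal sketch of that “one hyperactive squid” pattern, with made-up item names and a simulated 50 ms delay standing in for the real fetch/tag/geocode I/O: every item’s slow operation is started immediately, and the single event loop handles each result as it arrives.

```javascript
// Hypothetical sketch: `fetchItem` simulates one item's long-running
// I/O (e.g. retrieving the original story) with a 50 ms timer.
function fetchItem(item, callback) {
  setTimeout(function () {
    callback(null, item.toUpperCase());
  }, 50);
}

var items = ['a', 'b', 'c'];
var results = [];
items.forEach(function (item) {
  // Fire off the I/O for every item right away, then move on.
  fetchItem(item, function (err, processed) {
    if (err) throw err;
    results.push(processed);
    if (results.length === items.length) {
      // All three 50 ms waits overlapped: ~50 ms total, not ~150 ms.
      console.log('done: ' + results.join(','));
    }
  });
});
```

Because the waits overlap, total time is roughly the longest single wait, not the sum of them.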
    • 100% async: event loop
    • 100% async: anything that leaves V8 takes a callback
    • 100% async: filesystem, network, stdio, timers, child processes
    • var fs = require('fs');
      fs.readFile('/etc/passwd', function (err, data) {
        if (err) throw err;
        console.log(data);
      });

      var request = require('request');
      request('http://example.com', function (err, resp, body) {
        if (!err && resp.statusCode == 200) {
          console.log(body);
        }
      });
    • 100% async: limiting factors change: how many open sockets are you allowed? How much bandwidth can you grab? How fast can you issue requests?
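When concurrency becomes the limiting factor, the usual answer is to cap how much work is in flight at once. A hypothetical sketch (`runLimited` and the simulated 20 ms tasks are made up for illustration, not from the talk):

```javascript
// Run async `tasks` (functions taking a completion callback),
// keeping at most `limit` of them in flight at any moment.
function runLimited(tasks, limit, done) {
  var inFlight = 0, next = 0, finished = 0;
  function launch() {
    while (inFlight < limit && next < tasks.length) {
      inFlight++;
      tasks[next++](function () {
        inFlight--;
        finished++;
        if (finished === tasks.length) { return done(); }
        launch(); // a slot freed up; start the next task
      });
    }
  }
  launch();
}

// Usage: ten simulated 20 ms "requests", at most 3 at a time.
var tasks = [];
for (var i = 0; i < 10; i++) {
  tasks.push(function (cb) { setTimeout(cb, 20); });
}
runLimited(tasks, 3, function () { console.log('all done'); });
```

This keeps sockets, bandwidth, and request rate under your control without ever blocking the event loop.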
    • 3. Big files, long sessions.
    • What’s big? A gigabyte
    • What’s long? Hours
    • What’s a session? HTTP
    • A category of stories...
    • Big uploads in PHP: upload_max_filesize, post_max_size, max_input_time, max_execution_time
    • Big uploads in PHP: this approach caps out at ~500 MB
    • Problems: opens the door to DoS; tolerates application bloat; problems in production can get really bad; never gonna get gigabyte uploads
    • Problems: PHP makes you look elsewhere
    • If only we could stream...
    • Streaming: deal with things bucket by bucket and you don’t need all that memory
    • Streaming: write the file to disk as it comes in... or stream it off to S3!
    • var formidable = require('formidable');
      var http = require('http');
      http.createServer(function(req, res) {
        if (req.url == '/upload' && req.method == 'POST') {
          var form = new formidable.IncomingForm();
          form.parse(req);
          form.onPart = function(part) {
            part.addListener('data', function(chunk) {
              // Do cool stuff, like streaming!
              console.log(chunk.toString('utf8'));
            });
          };
        }
      }).listen(80);
    • _changes
    • _changes: a persistent connection to your database, which feeds you new data when it has some
    • _changes: it’s amazing
    • MapBox uploads: 1. map uploads to S3; 2. save a record to CouchDB; 3. “downloader” listens for new maps
    • MapBox uploads: a node.js process with a very long-lived HTTP connection to CouchDB
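CouchDB’s continuous _changes feed delivers one JSON object per line over that long-lived HTTP response. A sketch of consuming it: the parser below is hypothetical (as are the URL and database name in the comment), but it shows the buffering needed because a TCP chunk can end mid-line.

```javascript
// Returns a 'data' handler that buffers chunks and calls `onChange`
// once per complete JSON line in the feed.
function makeChangesParser(onChange) {
  var buffer = '';
  return function (chunk) {
    buffer += chunk.toString('utf8');
    var lines = buffer.split('\n');
    buffer = lines.pop(); // keep any partial trailing line for later
    lines.forEach(function (line) {
      if (line.trim()) { onChange(JSON.parse(line)); }
    });
  };
}

// Wired to a real server it would look roughly like:
//   var http = require('http');
//   http.get('http://localhost:5984/maps/_changes?feed=continuous',
//     function (res) { res.on('data', makeChangesParser(onNewMap)); });
```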
    • Everything is different: non-blocking I/O & a single event loop
    • 4. Package Management
    • drush make: package management for Drupal
    • drush make: d.o - project namespace; d.o - inclusive project policy
    • drush make: a way to stay sane
    • drush make: ...and is part of drush proper now!
    • drush make: but I’d been using Drupal for YEARS by then
    • pear
    • pear: PHP Extension and Application Repository
    • pear: high threshold for new projects
    • Imagine if pear was... wildly inclusive, awesomely useful, awesomely successful
    • npm: wildly inclusive, awesomely useful, awesomely successful
    • npm: node package manager
    • Packages: pear: 584; d.o: 15,296 (~3,600 for D7); npm: 7,976 (2 years old!)
    • npm - package.json:
      {
        "author": "Jeff Miccolis <jeff@miccolis.net>",
        "name": "portal",
        "description": "Data Catalog",
        "version": "0.0.0",
        "engines": { "node": "~v0.6.6" },
        "dependencies": {
          "couchapp": "https://github.com/.../attachment_operators",
          "underscore": "= 1.2.0",
          "request": "= 2.1.1"
        }
      }
    • npm: you’ll love it
    • 5. Nice hammer
    • Nice hammer...: “Javascript, really?”
    • Nice hammer...: “Clearly he’s overly excited about this async stuff”
    • Nice hammer...: “...and thinks it’ll work for everything.”
    • “Do it with Drupal”, eh?
    • node.js is bad for... computationally heavy tasks, databases
    • node.js is awesome for... interacting with other services
    • ...services like databases, mail servers, web services, web clients
    • Other people’s words: http://substack.net/posts/b96642 and http://blog.nelhage.com/2012/03/why-node-js-is-cool/
    • Limited surface area: “The primary focus of most node modules is on using, not extending... A big part of what makes node modules so great is how they tend to have really obvious entry points as a consequence of focusing on usability and limited surface area”
    • Callback Austerity: “Instead of the http server being an external service that we configure to run our code, it becomes just another tool in our arsenal”
    • Async by default: “The upshot of this pressure is that, since essentially every node.js library works this way, you can pick and choose arbitrary node.js libraries and combine them in the same program, without even having to think about the fact that you’re doing so.”
    • Try it! If nothing else you’ll get better at javascript
    • Try it! But I bet you’ll like it
    • Thanks!
    • Photo credit due to these wonderful people who offer their photos on flickr under a creative commons license:
      Spam - http://www.flickr.com/photos/olivabassa/
      Cows - http://www.flickr.com/photos/lynndombrowski/
      Dogs - http://www.flickr.com/photos/photosightfaces/
      Squid - http://www.flickr.com/photos/laughingsquid/
      Ants - http://www.flickr.com/photos/fuzzcat/
      Stream - http://www.flickr.com/photos/universalpops/
      Boxes - http://www.flickr.com/photos/sillydog/
      Pear - http://www.flickr.com/photos/reebob/
      Hammer - http://www.flickr.com/photos/kefraya
    • What did you think? Locate this session on the DrupalCon Denver website: http://denver2012.drupal.org/program and click the “Take the Survey” link. Thank you!