-
1.
PROMISES/A+
WAS BORN
BOOM!
-
2.
HI, I’M @DOMENIC
-
3.
The JavaScript community’s
greatest strength
is that we turn tiny primitives into
powerful patterns.
-
4.
But it is also
our greatest weakness.
-
5.
The Promises/A+ story is a story wherein
the community overcame this weakness.
-
6.
Let’s talk about async.
-
7.
CONTINUATION PASSING STYLE
var fileName = "data.txt";
fs.readFile(fileName, function (err, data) {
  // ...
});
-
8.
CPS is easy.
But it is not simple.
-
9.
CPS traps you in the Turing Tarpit.
return and throw, where did you go!
-
10.
ENTER PROMISES
Joule and E’s promises
Java’s java.util.concurrent.Future
Python’s Twisted deferreds and PEP-3148 futures
F#’s Async<T>
.NET’s Task<T>
C++11’s std::future
Dart’s Future<T>
JavaScript’s Promises/A+
-
11.
The point of promises is simple:
to give you back async versions of
return and throw.
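A minimal sketch of this parallel, using today’s native Promise and a hypothetical promise-returning readFile helper (neither is from the talk): then plays the role of return, and catch plays the role of a catch block around a throw.

```javascript
// Hypothetical async operation standing in for real I/O.
function readFile(name) {
  return name === "config.json"
    ? Promise.resolve('{"debug":true}')
    : Promise.reject(new Error("not found"));
}

// then/catch give us back return/throw, asynchronously.
function readConfig(name) {
  return readFile(name)
    .then(function (text) { return JSON.parse(text); }) // async "return"
    .catch(function () { return {}; });                 // async "catch": recover with a default
}

readConfig("config.json").then(function (config) {
  console.log(config.debug); // true
});
```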
-
12.
function* someNumbers() {
  console.log("A");
  yield 1;
  console.log("B");
  yield 2;
  console.log("C");
  yield 3;
  console.log("D");
}
var iterator = someNumbers();
console.log(iterator.next()); // "A", { value: 1, done: false }
console.log(iterator.next()); // "B", { value: 2, done: false }
console.log(iterator.next()); // "C", { value: 3, done: false }
console.log(iterator.next()); // "D", { value: undefined, done: true }
http://jsbin.com/eqamor/1/edit
-
13.
Q.async(function* () {
  $("#loading").text("Loading...").fadeIn();
  try {
    var repoEvents = yield getRepoEvents("kriskowal", "q");
    updateUI(repoEvents);
    $("#loading").text("Loaded!");
  } catch (e) {
    $("#loading").text("Error loading data: " + e.message);
  } finally {
    yield Q.delay(5000);
    $("#loading").fadeOut();
  }
})();
http://jsbin.com/igimow/1/edit
-
14.
As a bonus, we get time travel:
promises are objects representing
objects from a different time.
expect(promise).to.eventually.deep.equal(["zomg", "jsconf!"]);
-
15.
// ES5
userPromise
  .get("repos").get(0)
  .get("commitHistory")
  .invoke("toHTML")
  .then(displayInUI)
  .done();
// ES6
const html = yield userPromise.repos[0].commitHistory.toHTML();
displayInUI(html);
-
16.
// ES5
userPromise
  .get("repos").get(0)
  .get("commitHistory")
  .invoke("toHTML")
  .then(displayInUI)
  .done();
// ES6
const html = yield userPromise.repos[0].commitHistory.toHTML();
displayInUI(html);
-
17.
var Connection = require("q-connection");
var remote = Connection(port);
// a promise for the remote user object!
var userPromise = remote.invoke("getUser", "domenic");
Web Socket
Web Worker
Message Port
https://github.com/kriskowal/q-connection/
-
18.
PROMISES IN JAVASCRIPT
The path to Promises/A+ has been
long and treacherous.
-
19.
It all started with CommonJS Promises/A
https://groups.google.com/d/msg/commonjs/6T9z75fohDk/U_0Gl9fPJxsJ
-
20.
Enter $.Deferred
http://bugs.jquery.com/ticket/11010
-
21.
Not Again!!
https://github.com/emberjs/ember.js/pull/1406
Horrible lies!
-
22.
https://gist.github.com/domenic/3889970
-
23.
https://github.com/domenic/promise-tests
-
24.
https://gist.github.com/briancavalier/eb5fc157825a170c9957
-
25.
http://promisesaplus.com
-
26.
ok, so what?
-
27.
https://github.com/promises-aplus/promises-spec/blob/master/implementations.md
-
28.
http://dom.spec.whatwg.org/#futures
-
29.
And if you don’t have an API you might
as well set up some Futures. Just in case.
-
30.
http://wiki.ecmascript.org/lib/exe/fetch.php?media=strawman:roadmap.pdf
-
31.
… what just happened?
-
32.
OPEN SPECIFICATION DEVELOPMENT
What made the Promises/A+ effort work so well?
The cause
The people
The code
The contract
The setting
-
33.
the cause:
bringing sane asynchronicity to JS
-
34.
the people:
a strong and cooperative community
-
35.
the code:
existing convergent and widely-loved solutions
-
36.
the contract:
a small core for interoperation via standardization
-
37.
the setting:
GitHub
-
38.
An open standard for sound, interoperable JavaScript
promises—by implementers, for implementers.
-
39.
With these ingredients, you can extend the web forward.
http://yehudakatz.com/2013/05/21/extend-the-web-forward/
-
40.
THANKS!
promisesaplus.com
@promisesaplus
And I’m happy to be here.
The most amazing example of this is ES5 module systems. But we see this over and over, from modules to classes to callbacks to streams to events to something as small as chainable APIs… We use function and object to cobble together amazing applications and propagate the wisdom we used to create them. Sure, it’s a bunch of hacks, but they’re practical ones: and that’s what JavaScript’s always about.
We become prideful in our ability to shape worlds out of the primordial stardust of function. We get stuck in our patterns, saying that because a certain thing is easiest to express in terms of function, it is therefore best. More generally, we assume that the best ideas are those that express naturally in JS, and refuse to learn from other languages.
The async programming revolution has been brewing for a while. While JS made it popular, because JS is popular, it’s not the first place these ideas have been explored. Unfortunately, JS’s initial async APIs were designed as part of the DOM, and we all know how …. pleasant …. DOM APIs are.
Even Node.js was not able to escape the DOM’s legacy. It did decide on a common signature for its callbacks---mostly. But it never took a step back and surveyed the landscape of design options for asynchronous programming. It fell into the “moar functions” trap, and did literally the simplest thing.
CPS is the easiest thing to do in JavaScript, where function is your favorite tool. You reach out and it’s right there. You need to do an async operation? Just stuff the rest of your code into a function! Everyone can do that! Anyone who can’t is clearly a JavaScript noob (not kidding!). But it is not simple. What do I mean by that…
Why is CPS complex? Because it traps you in the Turing Tarpit: the idea that yes, indeed, JS is Turing complete, so we can build whatever system we want in it. This is usually applied to things like reinventing large parts of the web platform using JS, because HTML and CSS aren’t flexible enough. But when you end up using CPS, this is exactly what happens to async programming. It forces you to reinvent the basic features of the language, features like returning a value or throwing an error, in terms of function conventions. If you think about it, you have essentially reinvented the call stack! You end up using combinator functions tying together these callback functions just to express what was already in your language! And sure, it’s not “hard”---it is, in fact, easy. Everyone can do this, and everyone does. But it’s not simple. This kind of complexity has all kinds of cognitive and maintenance burdens, as you try to understand the gymnastics your intertwined function combinators are forcing you through. Did someone accidentally call the callback twice? What’s the value of that counter I use for doing things in parallel? Did I remember to pass up any errors to my caller? I don’t know, and I don’t want to know! I just want an asynchronous function call; I don’t care how!
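The hand-rolled counter these notes mention looks something like the following sketch (the names are illustrative, not from the talk):

```javascript
// CPS version of "run two async operations in parallel": the counter,
// the results array, and the failed flag are all manual reimplementations
// of machinery the language normally gives you for free.
function readBoth(readA, readB, callback) {
  var results = [];
  var remaining = 2;
  var failed = false;
  function done(index) {
    return function (err, data) {
      if (failed) return;                                // don't report twice
      if (err) { failed = true; return callback(err); }  // hand-rolled error propagation
      results[index] = data;
      if (--remaining === 0) callback(null, results);    // hand-rolled join point
    };
  }
  readA(done(0));
  readB(done(1));
}
```

With promises, all of this bookkeeping collapses into something like `Promise.all([a, b])`.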
An asynchronous function call. When you see something like this, generally it means one of two things: an idea so good that everyone had to copy it, or convergent evolution on a natural solution. Promises are a bit of both. I don’t really want to show you how to use promises in JavaScript. Read some blogs for that. Instead I want to make you understand why you should use promises in JavaScript.
Instead of thrashing wildly in the Turing Tarpit, only to sink further, we turn our attention toward an abstraction that can bring us back into the semantics of JavaScript. We create an async call stack, regaining all the semantics from our language and integrating well with synchronous return/throw. In this way promises are fundamentally simple, allowing composability in the same way our normal language constructs do. You can compose asynchronous functions without gymnastics, without having to entangle your concern of calling the function with the function’s concern of doing asynchronous work. There’s a rather drastic demonstration of this available to you behind some V8 flags. Let me introduce you to generators…
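A small illustration of that async call stack, using today’s native Promise for brevity: a plain throw anywhere in the chain skips ahead to the nearest rejection handler, just as it would unwind a synchronous stack.

```javascript
function step1() { return Promise.resolve(21); }

function step2(n) {
  if (n > 100) throw new Error("too big"); // an ordinary throw becomes a rejection
  return n * 2;                            // an ordinary return becomes a fulfillment
}

step1()
  .then(step2)
  .then(function (result) { console.log(result); })       // logs 42
  .catch(function (err) { console.error(err.message); }); // would catch "too big"
```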
The other major thing about promises is…
Not only do we get back our language semantics of return and throw, we also get as a bonus these first-class objects representing asynchronous operations, i.e. representing objects from a different time. You can see an example of how powerful this is, where we do unit test assertions on our promise, about what its value will eventually be. This is implemented today in a library I’ve created.
Here’s some code that uses promises. Note that this is beyond Promises/A+’s `then` interface. But the basic idea is that we have a promise for a user, then we call this method to get a promise for its “repos” property. (Etc.) Just pointing out that with ES6 proxies, we can make this a lot nicer. So that’s kind of cool, it’s some nice sugar over a bunch of traditional promise code. And of course with ES6 it’s delightful. But…
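This sugar can be layered on a bare `then` method; the following is an illustrative sketch, not Q’s actual implementation:

```javascript
// Wrap a promise with get/invoke helpers built purely on then.
function sugar(promise) {
  return {
    get: function (prop) {
      // A promise for a property of the eventual value.
      return sugar(promise.then(function (obj) { return obj[prop]; }));
    },
    invoke: function (method) {
      // A promise for the result of calling a method on the eventual value.
      var args = Array.prototype.slice.call(arguments, 1);
      return sugar(promise.then(function (obj) {
        return obj[method].apply(obj, args);
      }));
    },
    then: function (onFulfilled, onRejected) {
      return promise.then(onFulfilled, onRejected);
    }
  };
}

// Mirrors the chain on the slide, against a local stand-in object.
sugar(Promise.resolve({ repos: [{ name: "q" }] }))
  .get("repos").get(0).get("name")
  .then(function (name) { console.log(name); }); // "q"
```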
Everything here takes on a whole new meaning! Consider this in light of traditional “remote object” systems. They usually fall down in various ways. For example, they would translate this into a series of requests, e.g. first for the repos, then for the 0th property, etc. Or they would try to maintain a local copy of the remote object, which leads to synchronization problems and complex serialization and rehydration approaches, deciding when to do things locally and when remotely, and how to synchronize them. But with promises as the abstraction, just like we normally use them to co-locate our operations in time, we can also use them to co-locate in space. We can “pipeline” these messages from one side to the other, retrieving only the ultimately-desired result (in this case some server-rendered HTML).
How do you get a userPromise? Use Q-Connection. Q-Connection has the whole promise pipelining thing. Other approaches to promises representing remote objects can be found in a framework like OasisJS.
That brings us to the end of our “promises are really cool” portion. Now I want to talk about that story I promised you earlier, where as a community we overcame our greatest weakness in order to push promises to the level they’re at today.
Actually, it all started with Dojo, as did pretty much everything, apparently. Promises/A captured the core idea of promises from other languages. But: it was underspecified, missing key features, and written in prose that was easy to misinterpret. One of the consequences of this was… jQuery’s $.Deferred.
… yeah. They missed the whole async/sync parallel thing. They failed Promises/A reading comprehension.
Whatever. I can do my own thing. I tried. But I swore a solemn vow on the grave of the dead callbacks I replaced… I’M NOT GOING TO LET THIS HAPPEN AGAIN!
At the bottom of the gist, I wanted to end on a positive note, so I said I would produce a Promises/A test suite, and a few hours later I did. At this point Yehuda got in touch about RSVP.js.
And this time, we made a test suite!
It turns out that when you write a clear and thoughtful specification of something people have been implementing haphazardly for a while, and accompany it with a thorough test suite, people really like to implement that spec.
We’ve ended up with over thirty implementations, with new ones streaming in every day. Indeed, we’ve ended up with ones in ActionScript 3, Python, and Objective-C! O_o What’s wonderful is that, because Promises/A+ only specifies the core unit of interoperability---each implementation’s `then` method---anyone can build libraries that consume promises from any implementation. This is key!
Even the DOM spec authors wanted to get in on this Promises/A+-implementing action! And now we have… “DOM Futures”. O_o. But seriously, promises are now in the DOM! And there’s been major work throughout the WHATWG and W3C to encourage the use of “futures” in upcoming APIs. In fact…
So that’s that…
Somehow, Promises/A+ has become the starting point for any conversation about promises in JavaScript. How did Promises/A+ end up supplanting Promises/A in mindshare so drastically? Why am I getting weekly queries about whether jQuery will fix their promises implementation to conform to us? (jQuery! Think of how many users that has!) How did we go from some nerd rage over a pull request, to a specification that ended up influencing the DOM and possibly even ES7? How did a bunch of implementers congregating on GitHub, just doing our own thing, end up influencing the WHATWG and TC39?
The answer to this question, of “What made the Promises/A+ effort work so well?” boils down to a few principles of what I like to call “open specification development.”
This is the stuff we talked about earlier. You can’t build a specification like this around things nobody cares about. You need to solve real problems, and you need to solve them with coherent solutions.
Led by Brian Cavalier. We all cared deeply about these issues, but were not too far apart in our goals. We were willing to compromise (Brian notably enforcing asynchronicity in When.js 2.0).
The biggest reason harks back to what I was saying at the beginning. We harnessed JS developers’ greatest strength: how we had already built promises from the fundamental primitives available to us. We’d been doing this for literally years before banding together. Implementations like Q and When.js were in widespread use, and Yehuda’s RSVP.js was starting to make ripples. We all had the experience and knowledge to know what worked and what didn’t. In short, code before prose.
All we specified was the `then` method---because that was enough! In contrast, we could have been fighting over which library’s API, with all its glorious surface area and helper methods, became “standard.” Heck, we didn’t even specify how to create a promise! This is why I like to say that DOMFuture is a Promises/A+ promise implementation, even though it itself is a spec: it, like Q or When.js, builds on the core interoperable `then` method in order to create a larger surface area for its consumers to adopt (in this case the browsers).
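That interoperability is easy to demonstrate: because only `then` is pinned down, any conforming consumer can assimilate a promise from any other library. Here a hand-rolled thenable (standing in for some hypothetical foreign implementation) is consumed by today’s native Promise:

```javascript
// A minimal "thenable" from a hypothetical foreign promise library:
// all it offers is a then method, which is all Promises/A+ requires.
var foreignThenable = {
  then: function (onFulfilled) {
    setTimeout(function () { onFulfilled("interop!"); }, 0);
  }
};

// Promise.resolve assimilates anything with a then method.
Promise.resolve(foreignThenable).then(function (value) {
  console.log(value); // "interop!"
});
```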
GitHub is where we work and play, as a community. It encourages easy forking, pull requests, and discussion. It has great Markdown integration, which is notable since Markdown diffs are very easy to read. Revision history and past discussions are easily viewable and searchable, allowing new participants to jump into the standards process just like they would with a code project. Experimental spikes can take place in branches. Even the W3C and WHATWG are starting to see this, but they haven’t made the complete transition, with their heavy reliance on archaic mailing lists and their hard-to-read HTML diffs.
At this point I want to stop and reflect on the Promises/A+ tagline. We spent some time on this, but in the end I think it’s perfect: we mean every word. We’re open: we do everything on GitHub, out in the open. We’re sound: that’s our cause. We’re interoperable: that’s our contract. And it’s by implementers and for implementers, reflecting our commitment to build on existing code and to leverage the strength of the promise community. So if you have all these “open specification development” ingredients…
With this phrase, I’m referring to a specific philosophy which has recently been championed by the W3C’s newly-reformed Technical Architecture Group, most prominently by the efforts of Yehuda Katz, Alex Russell, and Brian Kardell. The essential idea is to build the web platform, not on magic browser APIs, but on composable JavaScript primitives. This leads to exactly the kind of virtuous cycle that Promises/A+ has exhibited: the community creates an API, competes and refines it together, reaches a convergence, and eventually the common primitives get incorporated into the web platform itself.