Performance
- Performance analysis is amazingly complex
- There is no single silver bullet
- We don't want to compromise quality in favor of performance
- We also want to communicate the changes in a realistic way
Analyzing Performance
- Wall-clock time
- Time in different browsers
- CPU consumption
- Memory consumption
- Memory leaks
- Bandwidth consumption
- Parse time
- Battery consumption (mobile!)
Naïve Solution
- Pull in a raw list of words
- Push it into an object for fast property lookups
- Uses a lot of file size
- Very fast lookups
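A minimal sketch of that naïve approach (the word list and helper names here are illustrative, not from the talk):

```javascript
// Naïve dictionary: ship a raw word list and index it in a plain object.
// The newline-separated string stands in for a real dictionary file.
const rawWords = "cat\ncar\ndog\ndoor";

// Build an object keyed by word, so lookups are a single property access.
const dict = {};
for (const word of rawWords.split("\n")) {
  dict[word] = true;
}

function isWord(word) {
  return Object.prototype.hasOwnProperty.call(dict, word);
}

console.log(isWord("cat")); // true
console.log(isWord("ca"));  // false
```

The trade-off the slide describes: lookups are very fast, but every word is stored in full, so the downloaded file is large.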
Trie
- A compact structure for storing dictionaries
- Optimizes heavily for file size
- Can be rather expensive to parse
- Can also use a lot of memory
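A minimal trie sketch (not the talk's actual implementation), using nested plain objects with a "$" end-of-word marker:

```javascript
// Build a trie: shared prefixes ("ca" in "cat"/"car") are stored once,
// which is where the file-size savings come from.
function buildTrie(words) {
  const root = {};
  for (const word of words) {
    let node = root;
    for (const ch of word) {
      node = node[ch] = node[ch] || {};
    }
    node.$ = true; // marks a complete word
  }
  return root;
}

// Lookup walks one node per character.
function trieHas(root, word) {
  let node = root;
  for (const ch of word) {
    node = node[ch];
    if (!node) return false;
  }
  return node.$ === true;
}

const trie = buildTrie(["cat", "car", "dog"]);
console.log(trieHas(trie, "car")); // true
console.log(trieHas(trie, "ca"));  // false
```

Building the structure illustrates the slide's downside too: every node is an allocated object, so constructing (parsing) the trie costs time and memory even though the serialized form is compact.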
Prove It.
- Any proposed performance optimization must be indisputably proven
- Show us the proposed changes and how they'll affect performance across all platforms
- How? JSPerf: http://jsperf.com/
JSPerf
- JSPerf is a great tool
- Makes it very easy to build a reproducible test: http://jsperf.com/valhooks-vs-val/2
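JSPerf itself runs test cases in the browser; as a rough illustration of what such a reproducible test does, here is a plain timing sketch (hypothetical helper, not JSPerf's API) that compares two equivalent snippets over many iterations:

```javascript
// Time a snippet over many iterations and report approximate ops/sec,
// the same metric a JSPerf test case produces.
function measure(label, fn, iterations) {
  const start = Date.now();
  for (let i = 0; i < iterations; i++) {
    fn();
  }
  const elapsed = Date.now() - start || 1; // guard against a 0ms reading
  const opsPerSec = Math.round((iterations / elapsed) * 1000);
  console.log(label + ": ~" + opsPerSec + " ops/sec");
  return opsPerSec;
}

const N = 1e6;
measure("parseInt", () => parseInt("42", 10), N);
measure("unary plus", () => +"42", N);
```

A hosted JSPerf test adds what this sketch lacks: statistical rigor across runs, and results gathered from many visitors' browsers rather than one machine.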
See the Big Picture.
- Micro-optimizations are death
- It doesn't matter how much you unroll a loop if that loop is doing DOM manipulation
- Most crippling web app performance comes from DOM performance issues
- Pure JS performance is rarely an issue
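One common way to act on this (a sketch, not from the talk): batch the DOM work out of the loop entirely, since the DOM access, not the loop mechanics, is what dominates. The function names below are illustrative, and the actual DOM write only applies in a browser page:

```javascript
// Cheap part: pure string work inside the loop.
function buildItems(labels) {
  let html = "";
  for (const label of labels) {
    html += "<li>" + label + "</li>";
  }
  return html;
}

// Expensive part, done exactly once (browser only):
//   document.getElementById("list").innerHTML = buildItems(labels);
// versus calling appendChild() on every iteration, which forces the
// browser to do DOM work N times. Unrolling the loop changes neither cost.

console.log(buildItems(["a", "b"])); // "<li>a</li><li>b</li>"
```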
Prove the Use Case.
- If you're proposing an optimization, you must prove what it'll help
- Show real-world applications that'll benefit from the change
- This is especially important: it'll help stop you from wasting time on performance issues that don't matter
Clean Code.We won’t compromise our code quality in exchange forperformance.Almost all code quality compromises come fromneedless micro-optimizations.~~(1 * string) vs. parseInt( string )+new Date vs. (new Date).getTime()Don’t even get me started on loop unrolling.
Don't Slow IE.
- Just because performance gets better in one browser doesn't mean it'll get faster in all browsers
- You shouldn't compromise performance in other browsers for the sake of one
- (Unless that browser is IE; always improve IE performance.)
Communicating the Results
- Creating realistic tests
- Communicating in an effective manner
Realism
- It's incredibly hard to create realistic test cases
- It's important to look at actual applications
- We frequently use Google Code Search to find out how people are using our APIs
- (This also gives us the knowledge we need when we want to deprecate an API.)
Browserscope
- A collection of performance results
- Organized by browser
- JSPerf plugs right in
Creating Results
- Pull the results directly from Browserscope
- Best: compare old versions to new versions, within the context of all browsers
.val() (get)
[Chart: number of test iterations (higher is better) for jQuery 1.5.2 vs. 1.6 across Chrome 11, Safari 5, Firefox 4, Opera 11, IE 7, IE 8, and IE 9]
Competition
- You might be inclined to compare performance against other frameworks, libraries, applications, etc.
- This tends to create more problems than it's worth
- And the comparison isn't always one-to-one
- If competing, agree on some tests first
- Work with your competition to create realistic tests
Compete Against Yourself
- In the jQuery project we work to constantly improve against ourselves
- Every release we try to have some performance improvements
- We always compare against our past releases
- Rewriting API internals is a frequent way of getting good performance results