Full Disclosure: I'm a member of the AVA team
I should start by saying there are lots of reasons to choose AVA, and I don't think speed is (necessarily) the most important one. Other good reasons include:
babel.transform(`foo()`, {
  plugins: [
    function () {
      return {
        visitor: {
          CallExpression() {
            console.log('plugin 1');
          }
        }
      };
    }
  ]
});
Full disclosure: I am a member of the AVA team
Between babel-plugin-espower#12 and ava#466 we are starting to see some pretty good performance compared to mocha:
I used emoji-aware for the benchmarks; it's a real-life test suite with 4,300+ tests. It's a good candidate for benchmarking the efficiency of AVA's test runner because:
// code
require('time-require');
require('bluebird');

// output
Start time: (2016-01-02 00:25:23 UTC) [treshold=1%]
 #  module                                          time  %
 1  pretty-ms (node_modules/pretty-ms/index.js)      1ms  ▇ 2%
 2  ansi-styles (node_modu...tyles/ansi-styles.js)   1ms  ▇ 2%
 3  strip-ansi (node_modul.../strip-ansi/index.js)   1ms  ▇ 2%
TAP version 13
# Subtest: (unnamed test)
    1..1
    ok 1 - should be equal
ok 1 - (unnamed test) # time=10.216ms
1..1
# time=36.141ms
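For reference, each line in that stream is part of the TAP protocol: a version line, plan lines (`1..N`), `ok`/`not ok` test points, and `#`-prefixed diagnostics. A tiny emitter sketch that produces the same shape (a hypothetical helper, not AVA's actual reporter):

```javascript
// Minimal TAP v13 emitter: version line, one test point per result,
// then the plan line at the end (as in the output above).
function tapReport(results) {
  const lines = ['TAP version 13'];
  results.forEach((result, index) => {
    const status = result.passed ? 'ok' : 'not ok';
    lines.push(status + ' ' + (index + 1) + ' - ' + result.title);
  });
  lines.push('1..' + results.length);
  return lines.join('\n');
}

console.log(tapReport([{ title: 'should be equal', passed: true }]));
// TAP version 13
// ok 1 - should be equal
// 1..1
```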
@bcoe @novemberborn The self-coverage stuff is actually a pretty interesting profiling tool when used in conjunction with npm link. Since we automatically use index.covered.js if it exists, we can get a better idea how our code behaves IRL. All these screenshots were generated running the AVA test suite:
First run, no cache hits:
We can see that out of 50 forked processes, only 3 needed to create an instrumenter (meaning that 47 forked processes simply pulled from the cache, even on the first run). This is consistent with the speedups I am seeing: you get the majority of the caching benefit even on your first run. The second run may be faster, but imperceptibly so.
Doing require extensions correctly is essential, because tools like nyc need it to reliably supply coverage information that takes into account source maps from upstream transforms.
https://www.firebase.com/blog/2015-10-07-how-to-keep-your-data-consistent.html
function fanoutPost({ uid, followersSnapshot, post, postId }) {
  // Turn the hash of followers into an array of follower ids
  var followers = Object.keys(followersSnapshot.val());
  var fanoutObj = {};
  // write to each follower's timeline
  // Correction: I think it needs something like this:
  followers.forEach((key) => fanoutObj['/timeline/' + key + '/' + postId] = post);
  return fanoutObj;
}
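A quick check of the shape this fan-out produces, using a plain object to mimic a Firebase snapshot's `val()` (the follower ids and post here are made-up test data):

```javascript
// Stand-in for a Firebase DataSnapshot of the followers node.
const followersSnapshot = {
  val: () => ({ alice: true, bob: true })
};

const post = { text: 'hello world' };
const postId = 'post-1';

// Same shape fanoutPost builds: one timeline path per follower.
const fanoutObj = {};
Object.keys(followersSnapshot.val()).forEach((key) => {
  fanoutObj['/timeline/' + key + '/' + postId] = post;
});

console.log(fanoutObj);
// { '/timeline/alice/post-1': { text: 'hello world' },
//   '/timeline/bob/post-1': { text: 'hello world' } }

// In the blog post's setup, an object of this shape is then written
// atomically with a multi-path update: ref.update(fanoutObj);
```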
Got Value: { DC: 'District of Columbia', DE: 'Delaware', FL: 'Florida', FM: 'Federated States of Micronesia', GA: 'Georgia', GU: 'Guam', HI: 'Hawaii', ID: 'Idaho', IL: 'Illinois', IN: 'Indiana'