@weyoss
Last active February 15, 2025 09:56
Callback vs Promise vs Async/Await
Callback vs Promise vs Async/Await benchmarks
Benchmark Files
https://github.com/petkaantonov/bluebird/tree/master/benchmark
Platform Info
Linux 5.13.0-40-generic x64
Intel(R) Core(TM) i5-3210M CPU @ 2.50GHz × 4
Summary
In these benchmarks, callbacks remain the fastest and most memory-efficient option.
Promises and async/await are slower and use more memory.
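For context, here is a minimal illustration of the three styles being compared; the readFile call and file name are my own example, not taken from the benchmark files.

const fs = require('node:fs');

// Callback style: results are delivered as arguments to a continuation.
fs.readFile('./data.txt', 'utf8', (err, text) => {
  if (err) throw err;
  console.log(text.length);
});

// Promise style: the same operation consumed through .then()/.catch().
fs.promises.readFile('./data.txt', 'utf8')
  .then((text) => console.log(text.length))
  .catch((err) => console.error(err));

// async/await style: the promise consumed with synchronous-looking syntax.
(async () => {
  const text = await fs.promises.readFile('./data.txt', 'utf8');
  console.log(text.length);
})();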
ls ./doxbee-sequential/*.js | sed -e 's|\.js||' | xargs node ./performance.js --p 1 --t 1 --n 10000
results for 10000 parallel executions, 1 ms per I/O op
file time(ms) memory(MB)
callbacks-baseline 329 24.58
callbacks-caolan-async-waterfall 420 50.55
callbacks-suguru03-neo-async-waterfall 426 41.49
promises-bluebird-generator 499 42.61
promises-native-async-await 558 56.47
promises-bluebird 570 50.09
promises-ecmascript6-native 614 67.87
promises-lvivski-davy 622 91.29
promises-cujojs-when 705 67.07
promises-then-promise 782 75.79
generators-tj-co 804 59.77
promises-tildeio-rsvp 911 91.99
promises-calvinmetcalf-lie 1099 141.34
promises-dfilatov-vow 1445 141.14
promises-obvious-kew 1471 104.65
observables-pozadi-kefir 1499 146.91
streamline-generators 1540 77.86
promises-medikoo-deferred 1758 132.80
streamline-callbacks 2190 102.14
observables-Reactive-Extensions-RxJS 2732 218.70
promises-kriskowal-q 5838 359.06
observables-caolan-highland 6688 488.01
observables-baconjs-bacon.js 10233 761.92
Platform info:
Linux 5.13.0-40-generic x64
Node.JS 16.14.0
V8 9.4.146.24-node.20
Intel(R) Core(TM) i5-3210M CPU @ 2.50GHz × 4
ls ./madeup-parallel/*.js | sed -e 's|\.js||' | xargs node ./performance.js --p 25 --t 1 --n 10000
results for 10000 parallel executions, 1 ms per I/O op
file time(ms) memory(MB)
callbacks-baseline 624 82.00
callbacks-suguru03-neo-async-parallel 719 88.20
promises-bluebird 1037 105.51
promises-lvivski-davy 1099 156.89
callbacks-caolan-async-parallel 1141 116.48
promises-bluebird-generator 1198 106.91
promises-cujojs-when 1391 157.09
promises-ecmascript6-native 2280 212.14
generators-tj-co 2289 225.09
promises-native-async-await 2346 218.36
promises-then-promise 2358 235.82
promises-calvinmetcalf-lie 2927 330.71
promises-tildeio-rsvp 3006 315.84
promises-medikoo-deferred 3859 356.98
promises-dfilatov-vow 5261 476.34
promises-obvious-kew 5971 657.50
streamline-generators 14209 857.03
streamline-callbacks 20183 1066.83
Platform info:
Linux 5.13.0-40-generic x64
Node.JS 16.14.0
V8 9.4.146.24-node.20
Intel(R) Core(TM) i5-3210M CPU @ 2.50GHz × 4
@ruxxzebre

Hi! Can you share the exact scripts you used for benchmarking?

@iambumblehead

iambumblehead commented Nov 7, 2023

This benchmark can be run with Node's official benchmark harness (benchmark/common.js from the node repository). The last number is the rate of operations, measured in ops/sec (higher is better).

async-vs-cb.nested-benchmark.js
// Requires a local checkout of the Node.js repository, so that
// ./node/benchmark/common.js resolves to the official benchmark harness.
const common = require('./node/benchmark/common.js');

const bench = common.createBenchmark(main, {
  n: [ 1400 ],
  type: [ 'await-deep', 'await-shallow', 'cb-deep', 'cb-deep-promisified' ]
});

async function main(conf) {
  let res = Math.random()

  const type = conf.type

  // await-deep: one top-level await over a deeply recursive async function
  if (type === 'await-deep') {
    bench.start();
    res = await (async function nestedAsync (val, count) {
      if (!count--) return val

      return nestedAsync(Math.random() > Math.random() ? 1 : -1, count)
    })(Math.random(), conf.n)
    bench.end(conf.n);
  }

  // await-shallow: one await per iteration of a flat loop
  if (type === 'await-shallow') {
    const arr = Array.from({length: conf.n}, (v, i) => i)
    const oneAsyncRes = async (val, count) => (
      count + Math.random() > Math.random() ? 1 : -1)
    
    bench.start();
    for (const n of arr)
      res = await oneAsyncRes(n, res)
    bench.end(conf.n);
  }

  // cb-deep: the same deep recursion driven by a plain callback
  if (type === 'cb-deep') {
    bench.start();
    (function nestedCb(val, count, cb) {
      if (!count--) return cb(null, val)

      return nestedCb(Math.random() > Math.random() ? 1 : -1, count, cb)
    })(Math.random(), conf.n, (err, res) => {
      bench.end(conf.n)
    })
  }

  // cb-deep-promisified: the callback recursion wrapped in a single Promise
  if (type === 'cb-deep-promisified') {
    bench.start();
    const res = await new Promise(resolve => (
      (function nestedCb(val, count, cb) {
        if (!count--) return cb(null, val)

        return nestedCb(Math.random() > Math.random() ? 1 : -1, count, cb)
      })(Math.random(), conf.n, (err, res) => {
        resolve(res)
      })
    ))
    bench.end(conf.n)    
  }  

  return res
}
$ node async-vs-cb.nested-benchmark.js
.js type="await-deep"          n=1400: 1,516,848.9413477853
.js type="await-shallow"       n=1400: 1,137,102.9378678831
.js type="cb-deep"             n=1400: 2,214,625.7045278023
.js type="cb-deep-promisified" n=1400: 1,907,328.3642888186

Conclusion: performance-sensitive, deeply-stacked callback code should not be converted to async-await.


Also, updating the benchmarks to send and receive destructured values shows a significant performance drop (details below).

To elaborate, callback functions may receive multiple values as separate arguments, for example,

callback(function (a, b, c, d) {
  // ...
})

async function callers, by contrast, receive a single value, and the separate values must be extracted with property lookups or slower destructuring,

const [a, b, c, d] = await asyncfn();

Sending and receiving destructured values drops performance significantly: async/await with array destructuring only reaches roughly 200,000-300,000 ops/sec on the local machine here.
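A minimal sketch of the two calling conventions (the function names and values are hypothetical, for illustration only):

// Callback style: several results are passed as separate arguments,
// so no container needs to be allocated per call.
function getValuesCb(cb) {
  cb(null, 1, 2, 3, 4);
}

getValuesCb((err, a, b, c, d) => {
  // a, b, c and d arrive directly as arguments
});

// async/await style: a single value must be returned, so several results
// are packed into an array (or object) and unpacked by the caller.
async function getValuesAsync() {
  return [1, 2, 3, 4]; // allocates an array on every call
}

(async () => {
  const [a, b, c, d] = await getValuesAsync(); // destructured on the caller side
})();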

@weyoss
Author

weyoss commented Dec 2, 2023

@iambumblehead

Conclusion: performance-sensitive, deeply-stacked callback code should not be converted to async-await.

I completely agree.

@iambumblehead

GitHub doesn't give me a place to "thumbs up" your reply, however, "thumbs up" :)

@iambumblehead

iambumblehead commented Jan 29, 2025

Hello, I wanted to report back an interesting finding. I was preparing to replace async/await with callbacks in a JS-defined runtime/state-machine and decided to first write benchmarks. That runtime is constructed around async/await (or callbacks) and makes for a worthy, "real-world" performance test of both approaches.

I wrote the benchmarks using the same benchmark.js-style approach as above. The first benchmark resolved "deep" values with the existing async runtime. The second resolved the same values using a separately modified copy of the runtime that uses callbacks rather than async/await.

Surprisingly to me, there was no performance difference between the two. When the benchmark was configured to run both tests many times (40,000 iterations), callbacks showed a small advantage, but otherwise there was no practical difference between them.

I trust this more recent benchmark over the older one shared here about a year ago and have basically decided to stop using callbacks. Possibly the earlier callback test was too trivial and hit a special case that Node.js happens to optimize well.

Anyway, anyone facing this question should be encouraged to use benchmarks and verify things themselves before moving ahead.
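A minimal, self-contained sketch of such a verification, using only Node built-ins; the depth and iteration counts are arbitrary, and it measures only a bare recursion rather than the runtime described above.

// verify-yourself.js -- times a deep async/await recursion against the
// equivalent callback recursion; adjust DEPTH/ITERATIONS to taste.
const { performance } = require('node:perf_hooks');

const DEPTH = 1400;
const ITERATIONS = 10000;

async function deepAsync(count) {
  if (count === 0) return 0;
  return deepAsync(count - 1);
}

function deepCb(count, cb) {
  if (count === 0) return cb(null, 0);
  deepCb(count - 1, cb);
}

(async () => {
  let t = performance.now();
  for (let i = 0; i < ITERATIONS; i++) await deepAsync(DEPTH);
  console.log('async/await:', (performance.now() - t).toFixed(1), 'ms');

  t = performance.now();
  for (let i = 0; i < ITERATIONS; i++) deepCb(DEPTH, () => {});
  console.log('callbacks:  ', (performance.now() - t).toFixed(1), 'ms');
})();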

An additional factor for me: code written around async/await will be more easily ported to other languages. When JavaScript has been fully vandalised and undermined, it will be a little easier to move over to whatever comes next.

@ruxxzebre

If the benchmark was configured to run both tests many many times (40,000 times) then callbacks were seen to have a small performance benefit, but otherwise no practical difference between them.

Thanks for the update! It seems the V8 and Node.js teams have done a great job of ensuring we won't need the outdated callback style anymore.

@iambumblehead

thanks for the thanks :)

@shd101wyy

Nice 👍
