Node.js v22, AI-Warp, a deep dive into setImmediate, and many other Adventures in Nodeland
Debunking a performance 'hack', Fastify v5 prep, an intro to the asyncforge module, Node.js v22 - all in one!
Hi Folks,
I have been silent for a long time, and I've lost the habit of writing my thoughts in this newsletter. The weekly cadence that was once my target has been slipping for the last six months, culminating in two and a half months of silence. I need to get my writing mojo back and keep you all up to date.
This edition includes:
An in-depth explanation of why deferring work with `setImmediate()` worsens your application's performance in the long term.
Fastify v5 release preparation starts!
`asyncforge`, a new module to implement impressive Developer Experiences.
...and the usual summary of OSS releases and interesting articles.
What's new in Node.js v22?
A performance trick that backfires: setImmediate
In 2017, Tomas Della Vedova added a change to Fastify that seemed to drastically improve benchmarks: deferring all responses by one event loop turn with `setImmediate()`. This technique can give a significant boost because it allows more than one event to be processed before going back to C++. However, it uses significantly more memory, leading to more GC work and effectively slowing everything down whenever there is no "spare" CPU available to run the GC. This is a very nice way to "cheat" benchmarks. We reverted it in [#545](https://github.com/fastify/fastify/pull/545) after Eran Hammer found the problem.
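To make the pattern concrete, here is a minimal sketch of the trick on a bare `node:http` server (not Fastify's actual code):

```js
import { createServer } from 'node:http'

const server = createServer((req, res) => {
  // Defer the response by one event loop turn. More responses get
  // batched per loop iteration, but every in-flight response (and its
  // closures) stays alive longer, increasing memory and GC pressure.
  setImmediate(() => {
    res.setHeader('content-type', 'application/json; charset=utf-8')
    res.end(JSON.stringify({ hello: 'world' }))
  })
})

server.listen(3000)
```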
This walk down memory lane was prompted by the question: "Why is H3 faster than Fastify?". h3 uses that exact technique (https://github.com/unjs/h3/blob/8345c1f493abc284e6272621929ecb6385f3e4f2/src/utils/response.ts#L12), and the problem shows up once the server is under significant stress.
Under significant stress, h3 has a P99 latency of roughly 1000 ms:
```
➜ autocannon -c 1000 -d 40 -p 10 http://127.0.0.1:3000
Running 40s test @ http://127.0.0.1:3000
1000 connections with 10 pipelining factor

┌─────────┬───────┬────────┬────────┬─────────┬──────────┬───────────┬──────────┐
│ Stat    │ 2.5%  │ 50%    │ 97.5%  │ 99%     │ Avg      │ Stdev     │ Max      │
├─────────┼───────┼────────┼────────┼─────────┼──────────┼───────────┼──────────┤
│ Latency │ 58 ms │ 200 ms │ 806 ms │ 1004 ms │ 276.4 ms │ 600.84 ms │ 21301 ms │
└─────────┴───────┴────────┴────────┴─────────┴──────────┴───────────┴──────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Req/Sec   │ 75,455  │ 75,455  │ 84,735  │ 88,255  │ 84,049.6 │ 3,085.43 │ 75,439  │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Bytes/Sec │ 14.1 MB │ 14.1 MB │ 15.8 MB │ 16.5 MB │ 15.7 MB  │ 576 kB   │ 14.1 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴──────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 40

3391k requests in 40.08s, 629 MB read
4k errors (2k timeouts)
```
Fastify, instead, has a P99 latency of roughly 820 ms under the same conditions:
```
➜ autocannon -c 1000 -d 40 -p 10 http://127.0.0.1:3000
Running 40s test @ http://127.0.0.1:3000
1000 connections with 10 pipelining factor

┌─────────┬───────┬────────┬────────┬────────┬───────────┬───────────┬──────────┐
│ Stat    │ 2.5%  │ 50%    │ 97.5%  │ 99%    │ Avg       │ Stdev     │ Max      │
├─────────┼───────┼────────┼────────┼────────┼───────────┼───────────┼──────────┤
│ Latency │ 49 ms │ 156 ms │ 598 ms │ 816 ms │ 192.95 ms │ 345.04 ms │ 13477 ms │
└─────────┴───────┴────────┴────────┴────────┴───────────┴───────────┴──────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg       │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Req/Sec   │ 103,615 │ 103,615 │ 113,087 │ 115,711 │ 112,758.4 │ 2,060.4 │ 103,554 │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼─────────┼─────────┤
│ Bytes/Sec │ 19.5 MB │ 19.5 MB │ 21.3 MB │ 21.8 MB │ 21.2 MB   │ 388 kB  │ 19.5 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 40

4536k requests in 40.06s, 848 MB read
3k errors (1k timeouts)
```
As you can see, the throughput of h3 is also affected by this additional GC activity (during the benchmark, the CPU usage of h3 spiked to 150%). These numbers were taken on my M1 Max, with plenty of CPU and memory available; on a constrained system, the difference would likely be wider.
Even though h3 uses this technique, it's not faster than Fastify. Note that when serving JSON, h3 does not add the mandatory `charset=utf-8`, which is required to support Chinese characters in some odd environments. I also amended the benchmark to take that into consideration.
Open Source Updates
The Road to Fastify v5
The target release date for Fastify v5 is July 19th, 2024. You can follow progress in issues [#5453](https://github.com/fastify/fastify/issues/5453) and [#5116](https://github.com/fastify/fastify/issues/5116). The date is mostly indicative: things might shift a few days or even a week, depending on how the updates of all the official plugins are going.
New module: asyncforge
Global values and singletons make a code base tightly coupled, reducing the ability to refactor and extract parts of the project later on. Moreover, they force you to rely on module mocking to write unit tests.
Recently, I have been exploring the use of `AsyncLocalStorage` to solve some long-standing developer experience issues, specifically scoping global values to the context of an HTTP request and/or an HTTP server. Check out `asyncforge` and `fastify-asyncforge`.
The key idea is to store the db connection pool (and all the other parts of the app you need) attached to an `AsyncLocalStorage`. In db.js:
```js
import { memo } from 'asyncforge'

// a memo is an accessor for a value scoped to the current async context
export const db = memo('db')

export function buildDB () { /* create and return your connection pool */ }
```
Then, when the application boots:

```js
import { db, buildDB } from './db.js'

db.set(await buildDB())
```
However, in tests one could do:

```js
import { test } from 'node:test'
import { buildDB } from './db.js'

test('my db', async () => {
  // build a fresh, isolated instance instead of mocking the module
  const db = await buildDB()
})
```
To use it (in routes and other modules):

```js
import { db } from './db.js'

export function myFn () {
  // db() retrieves the instance from the current async context
  return db().query(/* ... */)
}
```
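Under the hood, this kind of API can be built on top of `AsyncLocalStorage`. Here is a minimal, hypothetical sketch of a `memo()` factory; the `runWithContext()` helper is my own name for illustration, and asyncforge's actual implementation differs:

```js
import { AsyncLocalStorage } from 'node:async_hooks'

const storage = new AsyncLocalStorage()

// Run fn with a fresh store attached to the current async context;
// everything awaited inside fn sees the same store.
export function runWithContext (fn) {
  return storage.run(new Map(), fn)
}

// memo(name) returns an accessor bound to the current async context.
export function memo (name) {
  const accessor = () => {
    const store = storage.getStore()
    if (!store || !store.has(name)) {
      throw new Error(`${name} is not set in this async context`)
    }
    return store.get(name)
  }
  accessor.set = (value) => {
    const store = storage.getStore()
    if (!store) {
      throw new Error('memo.set() called outside runWithContext()')
    }
    store.set(name, value)
  }
  return accessor
}
```

The important property is that a value set at startup or at the beginning of a request is visible to any function deeper in the call chain, without passing it around explicitly.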
In Fastify:
```js
import fastify from 'fastify'
import { start } from 'fastify-asyncforge'
import doWork from './do-work.mjs'

const app = fastify({
  logger: true
})

// expose app, request, reply and logger via AsyncLocalStorage
await start(app)

app.decorate('foo', 'bar')
app.decorateRequest('a')
app.decorateReply('b')

app.addHook('onRequest', async function (req, reply) {
  req.a = 'a'
  reply.b = 'b'
})

app.get('/', async function (request, reply) {
  doWork()
  return { hello: 'world' }
})

app.listen({ port: 3000 })
```
and in `do-work.mjs`:

```js
import { logger, app, request, reply } from 'fastify-asyncforge'

export default function doWork () {
  const log = logger()
  log.info({
    foo: app().foo,
    a: request().a,
    b: reply().b
  }, 'doing work')
}
```
This is essentially the same pattern behind Next.js's `headers()` helper and friends.
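For comparison, here is a minimal sketch of the Next.js equivalent (assuming a Next.js 14 App Router route handler; the route path is hypothetical):

```js
// app/api/whoami/route.js
import { headers } from 'next/headers'

export async function GET () {
  // headers() reads the current request's headers from the surrounding
  // async context, with no request object passed in explicitly
  const ua = headers().get('user-agent')
  return Response.json({ ua })
}
```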
Quite a lot of undici updates
v6.7.0 ships with a new `RetryAgent` class (see the sketch below), performance improvements, and a smaller package. v6.7.1 reverts the package size improvements.
v6.8.0 improves `util.inspect` output for web specifications.
v6.9.0 adds `compose()` to dispatchers, making the composition of dispatchers incredibly more flexible.
v6.10.2 fixes a regression when using WebSocket.
v6.12.0 blocks ports 4190 and 6679 for fetch(), and ships many fixes.
v6.13.0 brings 10% more throughput and automated releases.
v6.14.0 adds `EnvHttpProxyAgent`.
v6.15.0 removes a memory leak of `AbortSignal` listeners; it also adds support for `if-match` headers on the retry handler.
v6.16.0 uses `FinalizationRegistry` to cancel the body stream when the `Response` is garbage collected. v6.16.1 ships `fetch()` bugfixes.
v6.17.0 fixes `fetch()` stack traces, adds the `dump` interceptor, and improves the `WebSocket` implementation.
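As a quick illustration of the `RetryAgent` mentioned above, here is a minimal sketch (the target URL is a placeholder, and `maxRetries: 3` is just one plausible configuration):

```js
import { Agent, RetryAgent, request } from 'undici'

// Wrap a plain Agent so that failed requests are retried automatically
const dispatcher = new RetryAgent(new Agent(), { maxRetries: 3 })

const { statusCode, body } = await request('https://example.com', {
  dispatcher
})
console.log(statusCode, await body.text())
```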
Releases
@fastify/auth v4.6.0 improves typing to allow custom request and reply types.
fastify v4.26.1 fixes a regression introduced by avvio v8.3.0.
fastify-cli v6.1.1 adds the missing c8 dependency.
@fastify/secure-session v7.2.0 adds an expected error when the key is missing.
@fastify/swagger-ui v3.0.0 reduces the package size by removing source maps.
@matteo.collina/snap v0.2.0 adds types.
async-cache-dedupe v2.1.0 only uses setImmediate if it is defined, and adds support for invalidation of invalid Redis references.
close-with-grace v1.3.0 adds support for a custom logger.
pino v8.19.0 ships many improvements to our browser version.
@fastify/websocket v9.0.0 adds injectWS() to easily test WebSocket apps.
@fastify/view v9.0.0 reworks async flow control.
light-my-request v5.11.1 fixes a tiny bug on types.
fastify v4.26.2 ships with types updates and perf improvements.
light-my-request v5.12.0 updates process-warning to v5.
borp v0.10.0 adds monorepo support via the --build flag.
fast-json-stringify v5.13.0 adds a "raw" format for strings; it bypasses escaping, enabling faster throughput.
@fastify/websocket v10.0.0 exposes the ws.WebSocket object instead of the ws.WebSocketStream. v10.0.1 adds a re-export of the WebSocket type.
mercurius v13.4.1 locks the graphql-jit dependency due to a bad ESM migration. v14.0.0 updates to the latest @fastify/websocket.
pino-roll v1.0.0 is the first major release of pino-roll.
@fastify/one-line-logger v1.3.0 adds colors.
@fastify/bearer-auth v9.4.0 uses @fastify/error to create errors.
@fastify/http-proxy v9.5.0 keeps the WebSocket path when proxying.
pino-pretty v11.0.0 provides additional functionality to customPrettifiers and messageFormat.
@fastify/restartable v2.3.0 adds a generic to the types.
@fastify/multipart v8.2.0 makes sure the handler resolves in all cases.
@fastify/response-validation v2.6.0 fixes an issue with passing in a non-draft-7 AJV instance.
@fastify/compress v7.0.2 fixes an incorrect vary header.
pino-roll v1.1.0 rolls to the next file only once.
pino v8.20.0 documentation improvements.
fast-json-stringify v5.14.0 improves performance; v5.14.1 fixes allOf with the $ref property; v5.15.0 adds many performance improvements; v5.15.1 escapes single quotes when building the error message for a required property.
light-my-request adds response.stream() and support for FormData.
@fastify/compress v7.0.1 applies defaults to createBrotliCompress and createBrotliDecompress; v7.0.3 replaces into-stream with Readable.from().
@fastify/static v7.0.2 retains the path when using the fallback precompressed path; v7.0.3 fixes wildcard HEAD requests returning the correct 'content-length' header; v7.5.1 adds the types.
@fastify/session v10.7.1 ensures the maxAge type includes milliseconds. v10.7.2 rejects invalid signer objects.
avvio v8.3.1 fixes a crash and the error message in case of a plugin timeout with promises.
@fastify/view v9.1.0 adds the viewAsync reply decorator.
pino-abstract-transport v1.2.0 allows loading the pino config at startup.
thread-stream v2.6.0 posts a message to the worker when the message event is emitted.
borp v0.12.0 adds support for checking c8 coverage.
mercurius v14.1.0 fixes multiple bugs.
@fastify/etag v5.2.0 supports matching weak etags.
@fastify/reply-from v9.8.0 passes request to the queryString function and adds an option for the undici proxy agent.
@mercurius/gateway v3.0.1 implements multiple bugfixes.
borp v0.13.0 adds --no-timeout.
@fastify/sensible v5.6.0 exposes the HttpError type.
async-cache-dedupe v2.2.0 aligns the TypeScript types with the code.
@fastify/flash v5.2.0 always applies declaration merging.
@fastify/oauth2 v7.8.1 fixes a TypeError when omitting credentials.auth.
mqemitter-redis v5.1.0 allows connecting to a sharded Redis cluster.
find-my-way v8.2.0 adds support for optional parameters on the root endpoint.
middie v8.3.1 only sets req.raw.body if the body is defined, aligning behavior with node:http.
@fastify/autoload v5.8.1 adds better support for TS. v5.8.2 fixes autohooks.
@fastify/jwt v8.0.1 ships minor fixes.
mqemitter v6.0.0 drops old Node.js versions.
pino-http v10.0.0 updates for pino v9.
fastify v4.27.0 handles synchronous errors thrown in error handlers and adds more WebDAV HTTP methods.
@fastify/session v10.8.0 fixes type argument inference of session.get(key) and session.set(key, value).
@fastify/aws-lambda v4.1.0 allows users to opt out of parsing literal commas (",") in query parameters.
Articles
Migrating from Node Redis to Ioredis: a slightly bumpy but faster road
Tobie Langel on LinkedIn: 1 Billion for Open Source Maintainers
Node.js Core Values (https://medium.com/the-node-js-collection/node-js-core-values-ab5387c4fd49)
ESLint v9.0.0 released - ESLint - Pluggable JavaScript Linter
Secfault Security - Deno: Digging Tunnels out of a JS Sandbox
Should Node.js be built with ClangCL under Windows? – Daniel Lemire's blog
What we published at Platformatic
HTTP Fundamentals: Understanding Undici and its Working Mechanism
HTTP Fundamentals: Routing fetch() to an in-memory server instance
HTTP Fundamentals: How to Easily Make HTTP Calls with Platformatic
Upcoming Events
Building AI apps with Platformatic (remote) - May 23rd
Node.js Philly (remote) - June 5th
CityJS Athens - June 6-8
This Next Thing - June 16–19 - I lead the Milan chapter.
The Geek Conf (remote) - July 18th