Introduction: Why Benchmark Express vs Fastify
Express has been the de facto web framework for Node.js since its launch in 2010, known for its simplicity and huge ecosystem of middleware and plugins (Express vs Fastify: A Performace Comparison | by Chetan Jain | Medium). Fastify, on the other hand, is a newer Node.js framework (first released in 2017) that was designed from the ground up with performance in mind. Many developers hear claims that Fastify can handle far more requests per second than Express, but how much faster is it in practice? In this post, we’ll benchmark a simple “Hello World” HTTP server in Express and Fastify to quantify the performance difference. This live benchmark will illustrate where Fastify’s speed comes from and help you understand whether switching to Fastify is worth it for your use case.
Before we start, you can get the full code for this example on GitHub.
Setting up the Benchmark
To compare Express and Fastify, we will create a small Node.js project that runs both frameworks and measures their throughput using Autocannon (a Node.js HTTP benchmarking tool). Follow these steps to set up the benchmark environment:
Create Project
I’ll use pnpm as my package manager of choice. Let’s create a new project for this example by running:
pnpm init
This will provide us with a starting package.json that we can use to add our dependencies.
Install Dependencies
pnpm add express fastify autocannon
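After installing, your package.json should list all three dependencies. The version numbers below are only examples of what you might see; yours will depend on when you run the install:

```json
{
  "name": "express-fastify-benchmark",
  "version": "1.0.0",
  "dependencies": {
    "autocannon": "^7.15.0",
    "express": "^4.19.2",
    "fastify": "^4.28.0"
  }
}
```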
Benchmark Script (benchmark.js)
Here’s our benchmarking script, which sequentially benchmarks both Express and Fastify:
const express = require("express");
const fastify = require("fastify");
const autocannon = require("autocannon");

async function startExpress() {
  const app = express();

  app.get("/", (req, res) => {
    res.send("Hello from Express");
  });

  // app.listen is callback-based, so wrap it in a Promise
  return new Promise((resolve) => {
    const server = app.listen(3000, () => resolve(server));
  });
}

async function startFastify() {
  const app = fastify();

  app.get("/", async (request, reply) => {
    return "Hello from Fastify";
  });

  await app.listen({ port: 3001 });
  return app;
}

function runAutocannon(url) {
  return new Promise((resolve, reject) => {
    const instance = autocannon(
      {
        url,
        connections: 100, // concurrent connections
        duration: 10, // seconds
      },
      (err, result) => {
        if (err) {
          reject(err);
        } else {
          resolve(result);
        }
      }
    );

    // Render a live progress bar while the benchmark runs
    autocannon.track(instance, { renderProgressBar: true });
  });
}

async function runBenchmarks() {
  console.log("Starting servers...");
  const expressServer = await startExpress();
  const fastifyApp = await startFastify();

  console.log("\nBenchmarking Express...");
  const expressResult = await runAutocannon("http://localhost:3000");
  console.log(`\nExpress req/sec: ${expressResult.requests.average}`);

  console.log("\nBenchmarking Fastify...");
  const fastifyResult = await runAutocannon("http://localhost:3001");
  console.log(`\nFastify req/sec: ${fastifyResult.requests.average}`);

  // http.Server#close is callback-based (it does not return a Promise)
  expressServer.close();
  await fastifyApp.close();
}

runBenchmarks();
What Does the Code Do?
- Server Initialization: Both servers are initialized with a single route ("/") responding with plain text.
- Autocannon Benchmarking: autocannon runs benchmarks against each server, simulating 100 concurrent connections for 10 seconds.
- Sequential Benchmarking: Express is benchmarked first, followed by Fastify, with the average requests per second displayed after each benchmark completes.
Running the Benchmark
To run the benchmark, execute the script with Node:
➜ express-fastify-benchmark node benchmark.js
Starting servers...
Benchmarking Express...
Running 10s test @ http://localhost:3000
100 connections
┌─────────┬───────┬───────┬───────┬───────┬──────────┬──────────┬────────┐
│ Stat │ 2.5% │ 50% │ 97.5% │ 99% │ Avg │ Stdev │ Max │
├─────────┼───────┼───────┼───────┼───────┼──────────┼──────────┼────────┤
│ Latency │ 14 ms │ 15 ms │ 20 ms │ 23 ms │ 15.84 ms │ 13.29 ms │ 640 ms │
└─────────┴───────┴───────┴───────┴───────┴──────────┴──────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬────────┬─────────┬────────┬─────────┐
│ Stat │ 1% │ 2.5% │ 50% │ 97.5% │ Avg │ Stdev │ Min │
├───────────┼─────────┼─────────┼─────────┼────────┼─────────┼────────┼─────────┤
│ Req/Sec │ 4,803 │ 4,803 │ 6,303 │ 6,503 │ 6,150.8 │ 467.85 │ 4,800 │
├───────────┼─────────┼─────────┼─────────┼────────┼─────────┼────────┼─────────┤
│ Bytes/Sec │ 1.18 MB │ 1.18 MB │ 1.55 MB │ 1.6 MB │ 1.51 MB │ 115 kB │ 1.18 MB │
└───────────┴─────────┴─────────┴─────────┴────────┴─────────┴────────┴─────────┘
Req/Bytes counts sampled once per second.
# of samples: 10
62k requests in 10.09s, 15.1 MB read
Express req/sec: 6150.8
Benchmarking Fastify...
Running 10s test @ http://localhost:3001
100 connections
┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬────────┐
│ Stat │ 2.5% │ 50% │ 97.5% │ 99% │ Avg │ Stdev │ Max │
├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼────────┤
│ Latency │ 5 ms │ 6 ms │ 9 ms │ 10 ms │ 6.48 ms │ 3.61 ms │ 258 ms │
└─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬────────┬─────────┬──────────┬─────────┐
│ Stat │ 1% │ 2.5% │ 50% │ 97.5% │ Avg │ Stdev │ Min │
├───────────┼─────────┼─────────┼─────────┼────────┼─────────┼──────────┼─────────┤
│ Req/Sec │ 12,655 │ 12,655 │ 14,247 │ 15,855 │ 14,460 │ 1,209.41 │ 12,648 │
├───────────┼─────────┼─────────┼─────────┼────────┼─────────┼──────────┼─────────┤
│ Bytes/Sec │ 2.32 MB │ 2.32 MB │ 2.61 MB │ 2.9 MB │ 2.65 MB │ 221 kB │ 2.31 MB │
└───────────┴─────────┴─────────┴─────────┴────────┴─────────┴──────────┴─────────┘
Req/Bytes counts sampled once per second.
# of samples: 10
145k requests in 10.04s, 26.5 MB read
Fastify req/sec: 14460
When you run the test, Autocannon simulates 100 parallel connections hitting each server for the specified duration. The script prints the average number of requests per second handled by each framework. In the example output above, Express handled around 6,150 requests per second while Fastify handled about 14,460 under the same conditions. (Your exact numbers will vary depending on your machine, Node.js version, and configuration, but you should consistently see Fastify outperforming Express by a significant margin.)
Note: We used a simple “Hello World” plain-text response with no additional middleware. This scenario highlights the baseline overhead of each framework. We also chose 100 concurrent connections for a short duration to stress the frameworks a bit. Feel free to adjust the connections or duration values in the autocannon options to see how results change. On more powerful hardware or over longer test durations, the gap might be even larger.
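For quick experiments, you could make these knobs configurable from the command line instead of editing the script each time. Here is a minimal sketch; the buildOptions helper and flag names are my own inventions, not part of the script above:

```javascript
// Hypothetical helper: build autocannon options from CLI flags like
//   node benchmark.js --connections 200 --duration 30
function buildOptions(url, argv = process.argv.slice(2)) {
  const opts = { url, connections: 100, duration: 10 }; // defaults from the post
  for (let i = 0; i < argv.length - 1; i += 2) {
    if (argv[i] === "--connections") opts.connections = Number(argv[i + 1]);
    if (argv[i] === "--duration") opts.duration = Number(argv[i + 1]);
  }
  return opts;
}

// The returned object can be passed straight to autocannon(opts, callback)
console.log(buildOptions("http://localhost:3000", ["--connections", "200", "--duration", "30"]));
```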
Analyzing the Results
In our benchmark, Fastify achieved roughly 2.4 times the throughput of Express for a simple plain-text response. This aligns with other findings; for example, in one test Fastify handled ~45k req/sec versus Express’s ~10k req/sec on the same hardware (Express vs Fastify: A Performace Comparison | by Chetan Jain | Medium). The significantly lower latency and higher throughput in MB/s indicate Fastify’s efficiency in handling requests with minimal overhead.
The average response latency was also much lower with Fastify: 15.84 ms for Express versus 6.48 ms for Fastify. In other words, Fastify delivered both higher overall throughput and faster individual responses.
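As a quick sanity check, we can derive the ratios directly from the numbers reported in the run above:

```javascript
// Numbers copied from the benchmark output above
const express = { reqPerSec: 6150.8, avgLatencyMs: 15.84 };
const fastify = { reqPerSec: 14460, avgLatencyMs: 6.48 };

const throughputRatio = fastify.reqPerSec / express.reqPerSec;
const latencyRatio = express.avgLatencyMs / fastify.avgLatencyMs;

console.log(`Throughput: ${throughputRatio.toFixed(2)}x`); // Throughput: 2.35x
console.log(`Latency: ${latencyRatio.toFixed(2)}x lower`); // Latency: 2.44x lower
```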
Why is Fastify so much faster? There are several technical reasons for Fastify’s performance advantage:
- Efficient Routing: Fastify uses an extremely fast routing engine under the hood, find-my-way, which is a highly optimized trie-based router. This allows route lookups to be very quick; in fact, Fastify’s router can outperform Express’s router by about 3x in routing throughput (Express vs Fastify: A Performace Comparison | by Chetan Jain | Medium). For a simple single route this difference isn’t visible to the user, but in an application with many routes, Fastify’s routing incurs less overhead per request.
- Faster JSON Serialization: When sending JSON responses, Fastify leverages schema-based serialization (using libraries like fast-json-stringify, with AJV for validation), which can be much faster than Express’s default JSON.stringify approach. By compiling JSON schemas, Fastify can transform response objects to JSON up to 2x faster than Express (Express vs Fastify: A Performace Comparison | by Chetan Jain | Medium). Our benchmark servers return plain text rather than JSON, so this particular optimization doesn’t come into play here, but it matters for typical JSON APIs.
- Minimal Overhead and Native Code: Fastify is built on top of Node.js core HTTP modules without monkey-patching them (Express vs Fastify: A Performace Comparison | by Chetan Jain | Medium). It avoids much of the magic and patching that some frameworks do, which means each request goes through less overhead. The framework is optimized for asynchronous performance: it was designed with async/await in mind, whereas Express’s core was originally callback-based. Fastify’s lifecycle and plugin system have been engineered to reduce overhead for each request.
- Plugin Architecture vs Middleware: Both frameworks extend functionality through plugins/middleware, but Fastify’s plugin system is designed for encapsulation and performance. Fastify plugins can be registered in a way that scopes their effect and avoids global overhead. In contrast, Express middleware runs through a centralized pipeline for each request, which can add overhead especially as the number of middleware grows. Fastify’s approach tends to scale better with many plugins (it even supports zero-cost async hooks).
- Logging and Other Internals: Out of the box, Fastify includes the extremely fast Pino logger (Express vs Fastify: A Performace Comparison | by Chetan Jain | Medium). If you enable logging, Fastify will handle it more efficiently than Express combined with a typical logging middleware like Morgan. All these internal choices (like using Pino, and other micro-optimizations) contribute to Fastify’s speed.
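To make the routing point concrete, here is a heavily simplified sketch of the trie idea behind find-my-way. This is not Fastify’s actual implementation (the real router also handles path parameters, wildcards, and HTTP methods); it only illustrates why lookup cost depends on the depth of the path rather than on the total number of registered routes:

```javascript
// Toy trie-based router: each path segment is one hop down the tree,
// so find() does O(path depth) work regardless of how many routes exist.
class TrieRouter {
  constructor() {
    this.root = { children: new Map(), handler: null };
  }

  add(path, handler) {
    let node = this.root;
    for (const segment of path.split("/").filter(Boolean)) {
      if (!node.children.has(segment)) {
        node.children.set(segment, { children: new Map(), handler: null });
      }
      node = node.children.get(segment);
    }
    node.handler = handler;
  }

  find(path) {
    let node = this.root;
    for (const segment of path.split("/").filter(Boolean)) {
      node = node.children.get(segment);
      if (!node) return null; // no route registered along this branch
    }
    return node.handler;
  }
}

const router = new TrieRouter();
router.add("/users/list", () => "users");
router.add("/health", () => "ok");

console.log(router.find("/health")()); // ok
console.log(router.find("/missing")); // null
```

A naive middleware-style router, by contrast, checks each registered route in order, so lookup cost grows with the number of routes.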
It’s important to note that our “Hello World” benchmark is a best-case scenario for Fastify. In a real-world application, the difference may be less stark: if your endpoints do significant work (e.g. database queries or heavy computation), the framework’s overhead becomes a smaller portion of the total response time. Nonetheless, Fastify’s lean design means it will use less CPU per request, leaving more headroom for your actual application logic. Even with additional middleware or plugins, Fastify tends to maintain a performance lead over Express, though the gap might shrink as the work per request increases.
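A back-of-the-envelope model shows why the gap shrinks. The per-request overhead figures below are made up purely for illustration (they are not measurements from this benchmark):

```javascript
// Illustrative only: assumed per-request framework overhead in milliseconds.
// The point is the shape of the math, not the specific numbers.
function totalLatency(frameworkOverheadMs, handlerWorkMs) {
  return frameworkOverheadMs + handlerWorkMs;
}

// "Hello World": the handler does almost no work, so overhead dominates
const helloRatio = totalLatency(0.15, 0.01) / totalLatency(0.05, 0.01);

// Realistic endpoint: 10 ms of database work dwarfs the framework overhead
const dbRatio = totalLatency(0.15, 10) / totalLatency(0.05, 10);

console.log(helloRatio.toFixed(2)); // 2.67 -> large relative gap
console.log(dbRatio.toFixed(2)); // 1.01 -> gap nearly vanishes
```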
Conclusion
In this benchmark comparison, Fastify clearly outperformed Express in raw throughput, handling several times more requests per second for a simple scenario. Fastify’s advantage comes from its modern, performance-first architecture – from faster routing and serialization to an efficient plugin system. However, choosing a web framework involves more than just raw speed. Both Express and Fastify have their own strengths, and the “best” choice depends on your project’s needs:
- When should you still use Express? If you have an existing codebase built on Express or your team is already highly familiar with Express, sticking with it can be wise. Express has a vast ecosystem of middleware and plugins, and a huge community; practically any feature or integration you need has an Express middleware available (sessions, auth, etc.). This means less reinventing the wheel. Additionally, for many projects with moderate traffic, the performance difference might not be noticeable (Express.js vs Fastify – In-Depth Comparison of the Frameworks): Express can already handle thousands of requests per second, which is sufficient for a large class of applications. Its stability and over a decade of community support make it a safe, well-documented choice. In short, use Express if you value its maturity and rich middleware ecosystem, and if maximum performance is not the top priority or bottleneck for your app.
- When is Fastify a better choice? If you are building a new service where performance is critical – for example, an API expected to handle a very high load or a microservice where every millisecond of latency matters – Fastify is often the better choice. Fastify shines in scenarios with lots of concurrent requests or when you want to optimize server resource usage. It’s also a great choice if you want to leverage its modern features like built-in schema validation (to automatically validate and serialize request/response data) or the integrated fast logger. Fastify has first-class TypeScript support and a growing ecosystem of plugins, making it increasingly feasible to replace Express in new projects. Keep in mind that Fastify’s ecosystem, while rapidly growing, is still smaller than Express’s, so you may occasionally need to implement a custom plugin for very niche functionality (Express.js vs Fastify – In-Depth Comparison of the Frameworks). But for most common needs (routing, validation, CORS, JWT auth, etc.), Fastify’s plugins have you covered. Choose Fastify when you need top-tier performance and are ready to adopt a newer framework that might require learning new patterns (like its plugin system), but will pay off in efficiency and speed.
In summary, Express vs Fastify is a trade-off between extreme familiarity and ecosystem (Express) versus raw performance and modern design (Fastify). This benchmark showed Fastify’s impressive throughput advantage (Express vs Fastify: A Performace Comparison | by Chetan Jain | Medium), but remember to consider the requirements of your own project. You might even use both in different contexts: Express for quick prototypes or apps where developer productivity matters more than performance, and Fastify for high-performance backend services. By understanding the strengths of each framework, you can make an informed decision and get the best of the Node.js ecosystem.