Express is 579KB. For edge computing, that’s a disaster. When you deploy to Cloudflare Workers or Deno Deploy, bundle size directly impacts cold start time—every kilobyte adds milliseconds to your users’ first request. Hono solves this: a 14KB web framework built on Web Standards that runs on Cloudflare Workers, Deno, Bun, AWS Lambda, and Node.js with the same code. Cloudflare’s D1 and Workers Logs teams already switched to it internally. Here’s why you should consider it for your next edge API.
Why Bundle Size Destroys Edge Performance
Edge computing platforms like Cloudflare Workers and Deno Deploy have fundamentally different constraints from traditional servers. Functions spin up on-demand across a global network, which means cold starts are real—and bundle size determines how fast your code initializes. Express at 579KB might be fine on a persistent Node.js server, but on the edge, it’s 41 times heavier than Hono’s 14KB footprint.
This isn’t just theoretical optimization. The Cloudflare Workers runtime favors lightweight code because ephemeral functions distributed globally need to start fast. Every unnecessary dependency adds latency. Hono avoids this by using only Web Standard APIs: no Node-specific dependencies, no bloat. The framework was built edge-first from day one, not adapted from server-side origins.
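The Web Standards contract that Hono builds on is deliberately small: a handler receives a standard `Request` and returns a standard `Response`. A framework-free sketch of that interface (runnable on Node 18+, where `Request` and `Response` are globals; the handler and routes here are invented for illustration) looks like this:

```typescript
// A bare fetch-style handler: the same shape Cloudflare Workers,
// Deno, and Bun all invoke. No framework, no Node-specific APIs.
const handler = {
  async fetch(req: Request): Promise<Response> {
    const url = new URL(req.url)
    if (url.pathname === '/') {
      return new Response('Hello from Web Standards!')
    }
    return new Response('Not Found', { status: 404 })
  },
}

export default handler
```

Everything Hono adds—routing, middleware, typed context—sits on top of exactly this interface, which is why it needs nothing from any particular runtime.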
Multi-Runtime Portability as Strategic Advantage
Hono’s architecture delivers something rare: true multi-runtime portability. The same codebase runs on Cloudflare Workers, Deno, Bun, AWS Lambda, Fastly Compute, and Node.js without modification. Deploy to Cloudflare today, migrate to AWS Lambda tomorrow—your code doesn’t change.
This works because Hono depends exclusively on Web Standard APIs. No platform-specific libraries, no runtime assumptions. Portkey AI demonstrates this in production: they run Hono on both Cloudflare Workers and Node.js for their AI gateway, using the same framework logic across entirely different deployment targets.
But there’s a trade-off. Hono is slower on Node.js than Fastify because it uses an adapter to convert Web Standard APIs to Node’s input/output model. Fastify handles 40,000+ requests per second on Node; Hono can’t match that in the same environment. This is intentional—Hono optimizes for edge environments where it hits 402,820 operations per second on Cloudflare Workers. If you’re deploying exclusively to traditional Node.js servers, Fastify is the better choice.
Getting Started: Setup to Deploy in Minutes
Creating a Hono application for Cloudflare Workers takes three commands:
```sh
npm create hono@latest my-app -- --template cloudflare-workers
cd my-app
npm run dev
```
Your development server runs on localhost:8787. The basic application structure is minimal:
```ts
import { Hono } from 'hono'

const app = new Hono()
app.get('/', (c) => c.text('Hello Cloudflare Workers!'))

export default app
```
TypeScript support is first-class. Integrating with Cloudflare services like D1 databases requires type definitions passed as generics:
```ts
type Bindings = {
  DB: D1Database
}

const app = new Hono<{ Bindings: Bindings }>()

app.get('/users', async (c) => {
  const result = await c.env.DB.prepare('SELECT * FROM users').all()
  return c.json(result)
})
```
Deployment is equally straightforward: `npm run deploy` publishes your API to a `*.workers.dev` subdomain or a custom domain. The entire workflow—from scaffold to production—takes minutes, not hours.
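For reference, the D1 binding that handlers read as `c.env.DB` is declared in the project’s Wrangler configuration. A sketch of the relevant entries, with placeholder names, dates, and IDs you would replace with your own:

```toml
name = "my-app"
main = "src/index.ts"
compatibility_date = "2024-01-01"

[[d1_databases]]
binding = "DB"            # surfaced as c.env.DB in handlers
database_name = "my-db"
database_id = "<your-database-id>"
```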
Cloudflare’s Internal Adoption Validates Production Readiness
Hono isn’t a community experiment. Cloudflare uses it internally for core infrastructure. D1’s internal Web API runs on Hono. Workers Logs migrated from Baselime infrastructure to Hono-powered APIs on Workers—both internal and customer-facing endpoints. KV and Queues also integrate Hono into their internals.
This matters because Cloudflare operates at scale. Their D1 database service handles production traffic for thousands of developers. Workers Logs processes massive volumes of telemetry data. If Hono couldn’t handle real-world load, Cloudflare wouldn’t bet their products on it.
The framework was created by Yusuke Wada, a Cloudflare employee, to solve a practical problem: building applications for Cloudflare Workers without verbose boilerplate. What started as “yak shaving”—wanting to build apps but ending up creating a framework—became the foundation for Cloudflare’s internal tooling. With 200+ GitHub contributors and adoption by companies like Portkey AI, IntelliQ, Toddle, and Skill Struck, Hono has proven itself beyond Cloudflare’s walls.
When to Choose Hono, Fastify, or Express
No framework fits every use case. Here’s the decision matrix:
Choose Hono if:
- You’re deploying to Cloudflare Workers, Deno Deploy, or other edge platforms where bundle size impacts cold starts
- You want multi-runtime flexibility to avoid vendor lock-in (deploy anywhere without code changes)
- TypeScript-first development with full type safety matters to your team
Choose Fastify if:
- You’re building exclusively for Node.js servers where Fastify’s 40,000+ req/sec throughput outperforms alternatives
- You need a mature ecosystem with extensive middleware options
- Traditional server deployment is your only target
Choose Express if:
- You require the largest possible ecosystem with maximum middleware and package availability
- Your team is deeply familiar with Express patterns and migrating isn’t justified
- Performance isn’t a critical differentiator for your workload
Performance context matters: Hono delivers 402,820 ops/sec on Cloudflare Workers but is slower than Fastify on Node.js. Different environments produce different winners. Hono optimizes for edge constraints—minimal bundle size, Web Standard APIs, multi-runtime support. If those aren’t your constraints, other frameworks may serve you better.
The Edge Demands Edge-Optimized Tools
Edge computing is replacing traditional server architectures, but most frameworks weren’t designed for edge constraints. Express dominated the Node.js era because it solved server-side routing elegantly. That same 579KB bundle size becomes a liability when functions are ephemeral, distributed globally, and billed by compute time.
Hono exists because edge platforms need edge-native frameworks. Cloudflare’s internal teams recognized this and built their infrastructure on it. If you’re deploying to Cloudflare Workers, Deno Deploy, or similar platforms, Hono eliminates the bundle size penalty while maintaining clean APIs and TypeScript safety. The trade-off—slower performance on traditional Node.js servers—is intentional. Edge computing requires edge-optimized frameworks, and Hono delivers precisely that.