aws-lambda · performance · serverless · typescript

Lambda Cold Start Optimization in 2025: A Practical Guide

Cold starts are the top complaint about AWS Lambda. Here are the techniques that actually move the needle — from bundle size to init code placement — with benchmarks.

Raihan Sharif Rimon·March 14, 2026·7 min read

Cold starts add latency on the first request after a Lambda function is idle. For most functions this is 200–600ms on Node.js — annoying for APIs, deal-breaking for latency-sensitive workloads. Here's what actually works.

1. Minimize your bundle size

Lambda initializes by loading your deployment package into memory. Smaller package = faster init. The single biggest win is tree-shaking your dependencies with esbuild or tsup:

# Bundle your handler with esbuild — output is typically 10-50x smaller
npx esbuild src/handler.ts \
  --bundle \
  --platform=node \
  --target=node22 \
  --outfile=dist/handler.js \
  --minify
ℹ️ Note: Zapix is < 5KB bundled — it adds virtually nothing to your cold start budget.

2. Move heavy imports out of the hot path

Imports at module level are evaluated during cold start. Lazy-load anything that isn't needed on every invocation:

// ❌ Imported eagerly — pays the cost even if never called
import { PDFDocument } from 'pdf-lib';

// ✅ Lazy — only loaded when the route is actually hit
app.post('/generate-pdf', async (req, res) => {
  const { PDFDocument } = await import('pdf-lib');
  // ...
});
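If the same heavy module is used from several routes, a small memoization helper keeps the lazy-loading pattern tidy. This is a sketch — `lazy` is a hypothetical helper, not part of any framework — and note that `import()` itself already caches modules; the helper just makes the intent explicit and works for any async loader:

```typescript
// Generic lazy loader: runs the loader at most once and caches the promise,
// so warm invocations reuse the already-loaded value.
function lazy<T>(loader: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => (cached ??= loader());
}

// Hypothetical usage:
// const loadPdfLib = lazy(() => import('pdf-lib'));
// const { PDFDocument } = await loadPdfLib();
```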

3. Cache database connections outside the handler

Lambda reuses execution environments across warm invocations. Initializing your DB client at module scope means it persists across requests — a huge win:

// Module scope — initialized once per execution environment
const db = new DatabaseClient({ connectionString: process.env.DB_URL });

const app = Zapix();

app.get('/users', async (req, res) => {
  // db is already connected — no reconnect overhead
  const users = await db.query('SELECT * FROM users');
  res.json(users);
});

export const main = handler(app);

4. Use ARM64 (Graviton) — it's faster AND cheaper

ARM64 Lambda functions typically have 10–20% faster cold starts compared to x86_64 and cost 20% less per GB-second. It's the easiest free performance win available:

template.yaml (SAM):
Globals:
  Function:
    Architectures:
      - arm64
    Runtime: nodejs22.x

5. Use Lambda SnapStart (where applicable)

AWS Lambda SnapStart is available for Node.js 22+ functions. It snapshots the initialized execution environment and restores it on cold start, reducing latency by up to 90% for some workloads. Enabling it is a configuration change, but review your init code first: state captured at snapshot time (random seeds, temporary credentials, open network connections) is shared across restores and may need to be re-established inside the handler.
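In SAM, enabling it is one property on the function. SnapStart applies to published versions, so you also need an alias that you invoke instead of `$LATEST` — a minimal fragment:

```yaml
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    AutoPublishAlias: live # SnapStart only applies to published versions
    SnapStart:
      ApplyOn: PublishedVersions
```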

Quick benchmark summary

  • Unoptimized handler (200KB bundle): ~580ms cold start
  • Esbuild-bundled (18KB): ~220ms cold start
  • + ARM64: ~185ms cold start
  • + SnapStart (Node 22): ~40ms cold start

Combining all techniques gets you sub-50ms cold starts on modern Lambda runtimes — competitive with container-based deployments without the operational overhead.
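To reproduce numbers like these yourself, the `Init Duration` field in Lambda's `REPORT` log lines is the ground truth. A small sketch for pulling it out of CloudWatch logs (the parsing helper is mine, not an AWS API):

```typescript
// Extract "Init Duration" (cold-start init time, in ms) from a Lambda
// REPORT log line. Returns null for warm invocations, which omit the field.
function parseInitDuration(reportLine: string): number | null {
  const match = reportLine.match(/Init Duration: ([\d.]+) ms/);
  return match ? parseFloat(match[1]) : null;
}

const cold =
  'REPORT RequestId: 7f2a Duration: 12.3 ms Billed Duration: 13 ms ' +
  'Memory Size: 256 MB Max Memory Used: 71 MB Init Duration: 221.47 ms';
console.log(parseInitDuration(cold)); // 221.47
```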

💡 Tip: Use Zapix for routing and keep your bundle lean. Every KB you save is latency you get back.

Try Zapix in your next Lambda project

Zero config. TypeScript first. Under 5KB.