Published on Jan 23, 2025
If you’ve worked with serverless functions, you’ve probably encountered cold starts - that initial delay when a function hasn’t been used recently and needs to spin up from scratch. While investigating slow cold start times in our Firebase Cloud Functions, I discovered something that might seem obvious in hindsight but caught me off guard.
Our serverless functions were experiencing 11-second cold starts, which became problematic when we needed to scale up quickly. Initially, I suspected either bloated dependencies or slow infrastructure provisioning. After adding timing logs throughout the codebase, I found that the container itself started almost immediately - the delay was happening during code evaluation.
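One lightweight way to get such timing logs (a sketch, not the exact code we used): process.uptime() reports seconds since the Node process launched, so logging it at the end of the entry module approximates total container start plus code-evaluation time. The function name here is illustrative.

```typescript
// Sketch: process.uptime() counts seconds since the Node process started,
// so a log line at the end of the entry module (after all imports have
// evaluated) approximates container start + code evaluation time.
export function logBootTime(label: string): number {
  const seconds = process.uptime();
  console.log(`${label}: code evaluated after ${seconds.toFixed(2)}s`);
  return seconds;
}

logBootTime("entrypoint");
```

Scattering a few of these through the import chain narrows down which module is eating the time.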
Here’s what our main entry point looked like:
// Note: console.time/console.timeEnd must share the same label. Also, with
// ES modules static imports are hoisted above this code, so this timing is
// only accurate when the project compiles to CommonJS, where the generated
// require() calls run in place.
console.time("imports");
import { abc } from "./api/abcHandler";
import { def } from "./api/defHandler";
import { ghi } from "./api/ghiHandler";
// ... many more imports
import { xyz } from "./api/xyzHandler";
console.timeEnd("imports");

const handlers: BaseContainerFunction[] = [
  new abc(),
  new def(),
  new ghi(),
  // ... instantiating all handlers
];

const server = new ContainerServer(handlers);
server.serve();
This seemingly innocent code was taking 10 seconds to evaluate - nearly our entire cold start time.
Module imports in JavaScript aren’t just copy-paste operations. When you import a module, Node.js evaluates all the code in that module immediately. If those modules have their own imports, it creates a cascade of evaluations that can significantly impact startup time.
For example:
// greet.js
console.log("Module loaded");
// index.js
import "./greet.js";
Running node index.js immediately prints "Module loaded", even though index.js never uses anything from greet.js - evaluation happens at import time, not at first use.
This becomes especially problematic when your modules import heavy dependencies like ORMs, database clients, or third-party services - all of which get evaluated at startup regardless of whether they’re actually needed for the current request.
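A common counter-pattern is to wrap the heavy dependency in a lazy singleton so its cost is paid on first use rather than at startup. A minimal self-contained sketch, where loadHeavy() stands in for an actual await import(...) of an ORM or database client (the names are hypothetical):

```typescript
// Sketch: defer a heavy dependency until first use. loadHeavy() stands in
// for `await import("...")` of an ORM or database client (hypothetical).
type Heavy = { query: (sql: string) => string };

let heavyPromise: Promise<Heavy> | undefined;

async function loadHeavy(): Promise<Heavy> {
  // simulate expensive one-time module evaluation
  return { query: (sql) => `result for ${sql}` };
}

export async function getHeavy(): Promise<Heavy> {
  // The first caller pays the load cost; later callers reuse the same
  // promise, mirroring how Node caches a dynamically imported module.
  heavyPromise ??= loadHeavy();
  return heavyPromise;
}
```

Because the promise itself is cached, concurrent first requests share one load instead of racing to create duplicate clients.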
The fix was to replace static imports with dynamic imports in our handler classes:
// Before
import { AbcService } from "~/services/abcService";

export class Abc extends BaseCloudFunction {
  protected override async execute(payload: AbcPayload, auth: Auth) {
    const service = new AbcService();
    return service.execute({ ...payload, auth });
  }
}

// After
export class Abc extends BaseCloudFunction {
  protected override async execute(payload: AbcPayload, auth: Auth) {
    // The module is loaded (and cached) on the first request,
    // not during cold start
    const { AbcService } = await import("~/services/abcService");
    const service = new AbcService();
    return service.execute({ ...payload, auth });
  }
}
With dynamic imports, modules are only loaded when they’re actually needed. Node.js caches dynamically imported modules after their first evaluation, so subsequent calls to the same endpoint don’t incur the import cost again.
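You can observe this caching directly with a built-in module (a quick illustrative check, not code from our codebase): two import() calls for the same specifier resolve to the identical module object.

```typescript
// Two dynamic imports of the same specifier return the same cached
// module object; only the first call evaluates the module.
export async function checkImportCache(): Promise<boolean> {
  const first = await import("node:crypto");
  const second = await import("node:crypto");
  return first === second; // true: Node caches the evaluated module
}

checkImportCache().then((same) => console.log(`cached: ${same}`));
```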
This change reduced our cold start time from 11 seconds to 500ms - a 95% improvement. The first request to each endpoint might have a slight delay as its dependencies load, but this is a much better tradeoff than having every cold start take 11 seconds.
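The same idea can be pushed into the entry point itself: instead of statically importing every handler, register lazy factories and load each handler's module graph on its first request. A self-contained sketch of that pattern (fakeImport simulates import("./api/...") so the example runs on its own; none of this is from our actual codebase):

```typescript
// Sketch: a route table of lazy handler factories. fakeImport simulates
// `import("./api/abcHandler")` so this compiles and runs standalone.
type Handler = { handle: (payload: unknown) => string };

const fakeImport = async (name: string) => ({
  handler: {
    handle: (payload: unknown) => `${name}:${JSON.stringify(payload)}`,
  } as Handler,
});

const routes: Record<string, () => Promise<Handler>> = {
  abc: async () => (await fakeImport("abcHandler")).handler,
  def: async () => (await fakeImport("defHandler")).handler,
};

export async function dispatch(route: string, payload: unknown): Promise<string> {
  const factory = routes[route];
  if (!factory) throw new Error(`Unknown route: ${route}`);
  const handler = await factory(); // handler module loads on first hit
  return handler.handle(payload);
}
```

With this layout, a cold start only evaluates the route table, and each endpoint's dependencies load the first time that endpoint is actually called.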
This optimization isn’t limited to serverless functions. The same lazy-loading technique can improve startup times anywhere heavy modules are loaded eagerly but only needed occasionally.
While it’s easy to blame performance issues on frameworks or infrastructure, the root cause is often in how we structure our own code. JavaScript makes it surprisingly easy to accidentally load more than you need at startup, but with the right patterns, you can avoid these pitfalls.