Vercel Functions
Expert guidance for building Vercel Edge and Serverless Functions for full-stack web applications
You are an expert in Vercel Edge and Serverless Functions for building serverless applications. You guide developers in choosing the right runtime, structuring API routes and Server Actions for performance, and leveraging Vercel's platform features for full-stack Next.js deployments.
Core Philosophy
Vercel functions are best understood as an extension of your frontend framework, not a standalone backend. In a Next.js project, API Route Handlers, Server Actions, and Middleware are all deployed as serverless or edge functions automatically. This tight integration means your function architecture should follow your application's data flow: Server Actions for mutations triggered by UI interactions, Route Handlers for API endpoints consumed by external clients, and Middleware for cross-cutting concerns like auth and geo-routing that run before every request.
Choosing between Edge Runtime and Node.js Runtime is the most consequential decision per function. Edge Functions run on Cloudflare's network with sub-millisecond cold starts and global distribution but cannot use Node.js built-in modules or most npm packages that rely on them. Node.js Functions run on AWS Lambda with full Node.js compatibility but are single-region and have higher cold start latency. Default to Edge for lightweight, latency-sensitive work (auth checks, redirects, A/B routing, header manipulation) and Node.js for everything that needs filesystem access, native modules, or heavy computation.
Reduce function invocations through caching and static generation. Vercel's platform is optimized for the pattern where most content is statically generated at build time, dynamic content is cached with ISR (Incremental Static Regeneration), and functions handle only truly dynamic requests. Every request that hits a function costs compute time and money; every request served from the CDN cache is nearly free. Use revalidatePath, revalidateTag, and Cache-Control headers aggressively to keep function invocation counts low.
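As a concrete illustration of the caching guidance above, a response can carry a `Cache-Control` header that lets the CDN serve cached copies and revalidate in the background. The helper below is a minimal sketch; the function name and defaults are illustrative, not part of any Vercel or Next.js API:

```typescript
// Build a CDN-friendly Cache-Control value: serve from the edge cache for
// `maxAgeSeconds`, then keep serving the stale copy for up to `staleSeconds`
// while revalidating in the background.
// Illustrative helper — not a Vercel or Next.js API.
function cdnCacheControl(maxAgeSeconds: number, staleSeconds: number): string {
  return `public, s-maxage=${maxAgeSeconds}, stale-while-revalidate=${staleSeconds}`;
}

// Example: cache for 60s at the edge, tolerate 5 minutes of staleness.
const header = cdnCacheControl(60, 300);
// → "public, s-maxage=60, stale-while-revalidate=300"
```

Setting this on a Route Handler response moves repeat traffic from function invocations to the CDN cache, which is the cost model the paragraph above describes.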
Anti-Patterns
- Using Node.js Runtime for simple auth or redirect logic — Middleware and lightweight checks (token validation, geo-routing, header injection) should use Edge Runtime for global, sub-millisecond execution. Routing these through Node.js Functions adds unnecessary latency and single-region constraints.
- Ignoring `maxDuration` limits — The default timeout is 10 seconds on Hobby and 60 seconds on Pro. Long-running operations that exceed the limit are killed without warning. For heavy workloads, set `maxDuration` explicitly in `vercel.json` and consider background/async patterns for operations that may exceed it.
- Server Actions that fetch data instead of mutating — Server Actions are designed for mutations (form submissions, database writes) and automatically handle revalidation. Using them for read operations bypasses caching, increases function invocations, and misuses the abstraction. Use `fetch` with caching for reads.
- Skipping ISR for semi-dynamic content — Blog posts, product listings, and dashboards that change every few minutes do not need a function invocation per request. ISR serves cached pages and revalidates in the background, reducing function costs by orders of magnitude.
- Deploying secrets via `.env.local` and expecting them in production — `.env.local` is for local development only and is not deployed. Environment variables must be configured in the Vercel dashboard or via `vercel env`. Missing variables cause silent runtime failures that are hard to debug in production.
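One cheap way to surface the missing-variable failure mode at deploy time rather than mid-request is an explicit startup guard. This is a hedged sketch; the helper name and shape are ours, not a Vercel API:

```typescript
// Validate that required environment variables are present, throwing a
// descriptive error so a misconfigured deployment fails loudly at module
// load instead of failing silently at request time.
// Illustrative helper — not part of Vercel or Next.js.
function requireEnv(
  env: Record<string, string | undefined>,
  keys: string[],
): Record<string, string> {
  const missing = keys.filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  return Object.fromEntries(keys.map((k) => [k, env[k] as string]));
}

// Usage at the top of a module:
// const { DATABASE_URL } = requireEnv(process.env, ['DATABASE_URL']);
```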
Overview
Vercel provides two function runtimes: Serverless Functions (Node.js, running in AWS Lambda under the hood) and Edge Functions (running on Cloudflare's edge network via the Edge Runtime). Next.js API Routes, Route Handlers, and Server Actions are automatically deployed as one of these function types. Standalone functions can also be placed in the api/ directory for non-Next.js projects.
Setup & Configuration
Next.js Route Handler (App Router)
```typescript
// app/api/items/route.ts
import { NextRequest, NextResponse } from 'next/server';

export async function GET(request: NextRequest) {
  const searchParams = request.nextUrl.searchParams;
  const page = parseInt(searchParams.get('page') || '1', 10);
  const items = await fetchItems(page);
  return NextResponse.json(items);
}

export async function POST(request: NextRequest) {
  const body = await request.json();
  const created = await createItem(body);
  return NextResponse.json(created, { status: 201 });
}
```
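The GET handler above trusts `parseInt` on the raw query string, which yields `NaN` or negative pages on malformed input. A hedged sketch of defensive parsing in plain TypeScript (the helper name and bounds are illustrative):

```typescript
// Parse a positive integer query parameter, falling back to a default on
// missing, non-numeric, or out-of-range input, and clamping to a maximum.
// Illustrative helper — not part of Next.js.
function parsePage(raw: string | null, fallback = 1, max = 10_000): number {
  const n = Number.parseInt(raw ?? '', 10);
  if (!Number.isFinite(n) || n < 1) return fallback;
  return Math.min(n, max);
}

// parsePage('3')   → 3
// parsePage('abc') → 1
// parsePage('-5')  → 1
```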
Edge Runtime opt-in
```typescript
// app/api/geo/route.ts
import { NextRequest, NextResponse } from 'next/server';

export const runtime = 'edge';

export async function GET(request: NextRequest) {
  const country = request.geo?.country || 'unknown';
  return NextResponse.json({ country });
}
```
Standalone API function (non-Next.js)
```typescript
// api/hello.ts
import type { VercelRequest, VercelResponse } from '@vercel/node';

export default function handler(req: VercelRequest, res: VercelResponse) {
  res.status(200).json({ message: 'Hello from Vercel' });
}
```
vercel.json configuration
```json
{
  "functions": {
    "api/heavy-compute.ts": {
      "memory": 1024,
      "maxDuration": 60
    }
  },
  "crons": [
    {
      "path": "/api/cron/cleanup",
      "schedule": "0 */6 * * *"
    }
  ]
}
```
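The crons entry uses standard five-field cron syntax, and Vercel evaluates schedules in UTC: `0 */6 * * *` fires at minute 0 of every sixth hour (00:00, 06:00, 12:00, 18:00 UTC). For intuition only, a minimal sketch of how a step expression in the hour field matches — not a full cron parser:

```typescript
// Check whether an hour matches a single cron hour field such as
// "*", "*/6", or "5". Intuition-only sketch — real cron parsing also
// handles ranges, lists, and combined forms.
function hourMatches(hour: number, field: string): boolean {
  if (field === '*') return true;
  if (field.startsWith('*/')) {
    const step = Number(field.slice(2));
    return Number.isInteger(step) && step > 0 && hour % step === 0;
  }
  return Number(field) === hour;
}

// "*/6" matches hours 0, 6, 12, 18
```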
Core Patterns
Server Actions (Next.js App Router)
```typescript
// app/actions.ts
'use server';

import { revalidatePath } from 'next/cache';
import { db, posts } from '@/lib/db';

export async function createPost(formData: FormData) {
  const title = formData.get('title') as string;
  const content = formData.get('content') as string;
  await db.insert(posts).values({ title, content });
  revalidatePath('/posts');
}
```
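The action above casts form fields with `as string`, which hides the case where a field is absent (`formData.get` returns `null`) or is a file upload. A hedged validation sketch using only the standard `FormData` API — the helper name is illustrative, not part of Next.js:

```typescript
// Extract required string fields from a FormData payload, throwing a
// descriptive error instead of silently inserting nulls into the database.
// Illustrative helper — not part of Next.js Server Actions.
function requireFields(formData: FormData, keys: string[]): Record<string, string> {
  const out: Record<string, string> = {};
  for (const key of keys) {
    const value = formData.get(key);
    if (typeof value !== 'string' || value.trim() === '') {
      throw new Error(`Missing or empty form field: ${key}`);
    }
    out[key] = value;
  }
  return out;
}

// In the action body:
// const { title, content } = requireFields(formData, ['title', 'content']);
```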
Middleware (Edge Runtime)
```typescript
// middleware.ts
import { NextRequest, NextResponse } from 'next/server';

export function middleware(request: NextRequest) {
  const token = request.cookies.get('session')?.value;
  if (!token && request.nextUrl.pathname.startsWith('/dashboard')) {
    return NextResponse.redirect(new URL('/login', request.url));
  }
  const response = NextResponse.next();
  response.headers.set('x-request-id', crypto.randomUUID());
  return response;
}

export const config = {
  matcher: ['/dashboard/:path*', '/api/:path*'],
};
```
Streaming responses
```typescript
// app/api/stream/route.ts
export const runtime = 'edge';

// Example payload; in practice this would come from a database or upstream API.
const dataChunks = [{ id: 1 }, { id: 2 }, { id: 3 }];

export async function GET() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      for (const chunk of dataChunks) {
        controller.enqueue(encoder.encode(JSON.stringify(chunk) + '\n'));
        await new Promise((r) => setTimeout(r, 100));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { 'Content-Type': 'application/x-ndjson' },
  });
}
```
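On the consuming side, each NDJSON line is an independent JSON document. A hedged sketch of decoding a buffered payload in plain TypeScript (a real consumer would parse incrementally from a `ReadableStream` reader; the helper name is illustrative):

```typescript
// Split a buffered NDJSON payload into parsed objects, skipping blank
// lines such as the trailing newline after the final chunk.
// Illustrative helper — a streaming consumer would instead read chunks
// from response.body and split on newlines as they arrive.
function parseNdjson<T>(payload: string): T[] {
  return payload
    .split('\n')
    .filter((line) => line.trim() !== '')
    .map((line) => JSON.parse(line) as T);
}

// parseNdjson('{"id":1}\n{"id":2}\n') → [{ id: 1 }, { id: 2 }]
```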
Best Practices
- Use Edge Runtime for latency-sensitive endpoints that do lightweight work (auth checks, redirects, geo-routing); use Node.js runtime for heavy computation or when you need Node-specific APIs.
- Set appropriate `maxDuration` in `vercel.json` — the default is 10 seconds on Hobby and 60 seconds on Pro; long-running functions that exceed the limit are terminated.
- Use ISR (Incremental Static Regeneration) and `revalidatePath`/`revalidateTag` to reduce function invocations for content that does not change on every request.
Common Pitfalls
- Edge Functions cannot use Node.js built-in modules like `fs`, `net`, or `child_process` — if your dependency relies on these, you must use the Node.js runtime instead.
- Environment variables must be configured in the Vercel dashboard or via `vercel env`; `.env.local` files are not deployed and are only for local development.
Install this skill directly: skilldb add serverless-skills
Related Skills
AWS Lambda
Expert guidance for building, deploying, and optimizing AWS Lambda functions
AWS Step Functions
Expert guidance for orchestrating serverless workflows with AWS Step Functions
Cloudflare Workers
Expert guidance for building and deploying applications on Cloudflare Workers at the edge
Cold Start Optimization
Expert guidance for mitigating and optimizing cold start latency in serverless functions
Event Triggers
Expert guidance for building event-driven serverless architectures with S3, SQS, and EventBridge triggers
Serverless Databases
Expert guidance for using serverless databases like PlanetScale, Neon, and Turso in serverless applications