# AWS Lambda

Expert guidance for building, deploying, and optimizing AWS Lambda functions.
You are an expert in AWS Lambda for building serverless applications. You guide developers toward lean, event-driven functions that do one thing well, initialize efficiently, and fail gracefully with structured error responses.
## Core Philosophy
Lambda functions should be small, focused units of compute that respond to a single event type and return quickly. Every millisecond of execution costs money and adds latency, so the design mindset should favor minimal dependencies, lazy initialization, and pushing orchestration out of the function and into the infrastructure layer (API Gateway, Step Functions, EventBridge). The handler itself should be a thin adapter between the event source and your business logic.
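The thin-adapter shape can be sketched as follows. `listItems` is a hypothetical business-logic function, not part of any AWS API:

```javascript
// listItems is a hypothetical business-logic function: no event parsing,
// no HTTP concerns, easy to unit test in isolation.
async function listItems(filter) {
  const items = [
    { id: '1', name: 'widget' },
    { id: '2', name: 'gadget' },
  ];
  return filter ? items.filter((item) => item.name.includes(filter)) : items;
}

// The handler is a thin adapter: translate the event, call the logic,
// shape the HTTP response.
export const lambdaHandler = async (event) => {
  const filter = event.queryStringParameters?.filter;
  const result = await listItems(filter);
  return { statusCode: 200, body: JSON.stringify(result) };
};
```

Because the business logic takes plain arguments, it can be tested without constructing API Gateway event payloads.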
Operational excellence in Lambda means treating functions as cattle, not pets. You should be able to redeploy any function at any time without fear, because the function is stateless and idempotent. Shared state lives in DynamoDB, S3, or ElastiCache — never in /tmp or global variables across invocations. Observability is non-negotiable: structured logging, distributed tracing, and custom metrics should be wired in from day one, not bolted on after an outage.
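One dependency-free way to get structured logs is to emit one JSON object per line, which CloudWatch Logs Insights can then query by field. `logEvent` is an illustrative helper name, not an AWS API:

```javascript
// Emit one JSON object per log line so fields are queryable downstream.
// logEvent is an illustrative helper, not an AWS API.
const logEvent = (level, message, fields = {}) =>
  JSON.stringify({ level, message, timestamp: new Date().toISOString(), ...fields });

export const lambdaHandler = async (event, context) => {
  console.log(logEvent('INFO', 'request received', {
    requestId: context.awsRequestId,
  }));
  // ... business logic ...
  return { statusCode: 200, body: '{}' };
};
```

For production use, AWS Lambda Powertools provides a maintained logger with the same one-JSON-object-per-line approach.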
Cost awareness must be a first-class design concern. Over-provisioned memory, bloated deployment packages, and functions that poll or sleep waste money at scale. Right-size memory allocations using tools like AWS Lambda Power Tuning, bundle aggressively with tree-shaking, and prefer event-driven invocation over scheduled polling whenever the upstream source supports it.
## Anti-Patterns
- Monolith-in-a-Lambda — Cramming an entire Express/Fastify app into a single Lambda creates a bloated deployment package, long cold starts, and a function that is impossible to right-size because different routes have wildly different resource needs. Split into focused, per-route or per-domain functions instead.
- Synchronous chains of Lambda calls — Having Lambda A invoke Lambda B invoke Lambda C via the AWS SDK creates tight coupling, compounded latency, and double-billing. Use Step Functions or event-driven patterns (SQS, EventBridge) for multi-step workflows.
- Ignoring idempotency — Lambda guarantees at-least-once delivery for most event sources. Without idempotency keys or conditional writes, retries cause duplicate records, double charges, or repeated side effects.
- Storing secrets in environment variables as plaintext — Environment variables are visible in the console and API responses. Use AWS Secrets Manager or SSM Parameter Store with the SDK, and cache the fetched value outside the handler for reuse across warm invocations.
- Using VPC without necessity — Placing a Lambda in a VPC adds cold start latency (ENI attachment) and requires NAT Gateways for internet access. Only use VPC placement when the function must reach private resources like RDS or ElastiCache.
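A minimal sketch of the idempotency pattern, assuming a DynamoDB table whose partition key is `id`. The object below is valid input for a `PutCommand` from `@aws-sdk/lib-dynamodb`; the condition turns a retried delivery into a no-op. Both helper names are illustrative:

```javascript
// Build a conditional put: the write succeeds only if no item with this
// idempotency key exists yet. Pass the result to a PutCommand.
const buildIdempotentPut = (tableName, idempotencyKey, payload) => ({
  TableName: tableName,
  Item: { id: idempotencyKey, ...payload },
  ConditionExpression: 'attribute_not_exists(id)',
});

// DynamoDB rejects a duplicate write with ConditionalCheckFailedException:
// treat that as "already processed" and skip, instead of failing the batch.
const isDuplicateWrite = (err) => err.name === 'ConditionalCheckFailedException';
```

In an SQS-triggered handler, wrap `client.send(new PutCommand(buildIdempotentPut(...)))` in a try/catch and swallow errors where `isDuplicateWrite(err)` is true; SQS message IDs make natural idempotency keys.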
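The secret-caching advice can be sketched as a module-scoped memo. `fetchSecret` stands in for a Secrets Manager or SSM Parameter Store call; the cached value survives for the lifetime of the warm execution environment:

```javascript
// Module scope: survives across warm invocations of the same environment.
let cachedSecret;

// fetchSecret stands in for an SDK call such as GetSecretValueCommand
// (Secrets Manager) or GetParameterCommand with WithDecryption (SSM).
const getSecret = async (fetchSecret) => {
  if (cachedSecret === undefined) {
    cachedSecret = await fetchSecret();
  }
  return cachedSecret;
};
```

A production version would also expire the cache after some TTL so rotated secrets are eventually picked up.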
## Overview

AWS Lambda lets you run code without provisioning or managing servers; you pay only for the compute time consumed, billed per millisecond. Lambda provides managed runtimes for Node.js, Python, Java, .NET, and Ruby, supports Go via the OS-only `provided.al2023` runtime, and can run any language through custom runtimes or container images.
## Setup & Configuration
### Project initialization with AWS SAM

```bash
sam init --runtime nodejs20.x --app-template hello-world --name my-lambda-app
cd my-lambda-app
```
### SAM template (`template.yaml`)

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 10
    MemorySize: 256
    Runtime: nodejs20.x
    Architectures:
      - arm64

Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: src/handler.lambdaHandler
      Events:
        Api:
          Type: Api
          Properties:
            Path: /items
            Method: get
      Environment:
        Variables:
          TABLE_NAME: !Ref MyTable
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref MyTable
  MyTable:
    Type: AWS::Serverless::SimpleTable
```
### Handler structure (Node.js)

```javascript
export const lambdaHandler = async (event, context) => {
  try {
    const body = JSON.parse(event.body || '{}');
    // processRequest is your business logic, kept separate from the handler
    const result = await processRequest(body);
    return {
      statusCode: 200,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(result),
    };
  } catch (err) {
    console.error('Handler error:', err);
    return {
      statusCode: 500,
      body: JSON.stringify({ message: 'Internal server error' }),
    };
  }
};
```
### Deploy

```bash
sam build
sam deploy --guided
```
## Core Patterns
### Initializing SDK clients outside the handler (connection reuse)

```javascript
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, GetCommand } from '@aws-sdk/lib-dynamodb';

// Initialized once per execution environment — reused across invocations
const client = DynamoDBDocumentClient.from(new DynamoDBClient({}));

export const lambdaHandler = async (event) => {
  const result = await client.send(new GetCommand({
    TableName: process.env.TABLE_NAME,
    Key: { id: event.pathParameters.id },
  }));
  return { statusCode: 200, body: JSON.stringify(result.Item) };
};
```
### Middy middleware pattern

```javascript
import middy from '@middy/core';
import httpJsonBodyParser from '@middy/http-json-body-parser';
import httpErrorHandler from '@middy/http-error-handler';
import validator from '@middy/validator';

// JSON Schema for the incoming event; the validator rejects requests whose
// parsed body is missing name or email. (Middy v4+ expects schemas wrapped
// with transpileSchema from @middy/validator/transpile.)
const mySchema = {
  type: 'object',
  required: ['body'],
  properties: {
    body: {
      type: 'object',
      required: ['name', 'email'],
      properties: {
        name: { type: 'string' },
        email: { type: 'string' },
      },
    },
  },
};

const baseHandler = async (event) => {
  const { name, email } = event.body; // body is already parsed by middleware
  return { statusCode: 201, body: JSON.stringify({ name, email }) };
};

export const handler = middy(baseHandler)
  .use(httpJsonBodyParser())
  .use(validator({ eventSchema: mySchema }))
  .use(httpErrorHandler());
```
### Lambda Layers for shared dependencies

```yaml
Resources:
  SharedLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      ContentUri: layers/shared/
      CompatibleRuntimes:
        - nodejs20.x
    Metadata:
      BuildMethod: nodejs20.x
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Layers:
        - !Ref SharedLayer
```
## Best Practices

- Use ARM64 (Graviton) architecture — AWS cites up to 34% better price-performance than x86_64 for most workloads.
- Keep deployment packages small; use Layers or esbuild bundling with tree-shaking to reduce cold start times.
- Set reserved concurrency on critical functions to guarantee them capacity — it also caps each function so no single one can exhaust the account's shared concurrency pool.
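With AWS SAM, esbuild bundling is enabled per function through build metadata. A minimal sketch (the entry point and property values are illustrative):

```yaml
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    Handler: handler.lambdaHandler
    CodeUri: src/
  Metadata:
    BuildMethod: esbuild
    BuildProperties:
      Minify: true
      Target: es2020
      EntryPoints:
        - handler.ts
```

`sam build` then produces a single tree-shaken bundle per function instead of shipping `node_modules`.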
## Common Pitfalls

- Placing a Lambda in a VPC without a NAT Gateway and then wondering why it cannot reach the internet or AWS services — use VPC endpoints or move the function out of the VPC when possible.
- Forgetting that the `/tmp` directory persists across warm invocations — stale temp files can cause subtle bugs if you assume a clean filesystem each time.
## Related Skills

- **AWS Step Functions** — Expert guidance for orchestrating serverless workflows with AWS Step Functions
- **Cloudflare Workers** — Expert guidance for building and deploying applications on Cloudflare Workers at the edge
- **Cold Start Optimization** — Expert guidance for mitigating and optimizing cold start latency in serverless functions
- **Event Triggers** — Expert guidance for building event-driven serverless architectures with S3, SQS, and EventBridge triggers
- **Serverless Databases** — Expert guidance for using serverless databases like PlanetScale, Neon, and Turso in serverless applications
- **Serverless Testing** — Expert guidance for testing serverless applications locally and in CI/CD pipelines