
Pino Logger

Pino: fast JSON logger for Node.js — child loggers, serializers, transports (pino-pretty, pino-http), redaction, Next.js integration, and log levels


Pino Logger

Core Philosophy

Pino is the fastest JSON logger for Node.js, designed around the principle that logging should never become a bottleneck. It achieves this by deferring serialization to a worker thread via transports, producing newline-delimited JSON (NDJSON) by default, and keeping the hot path minimal. Pino logs are structured data first — human readability is handled downstream by tools like pino-pretty, not in the application process.

Key tenets:

  • Speed over convenience — Pino avoids synchronous string formatting in the main thread.
  • Structured by default — Every log line is a valid JSON object, ready for ingestion by ELK, Datadog, or any log aggregator.
  • Child loggers for context — Rather than passing metadata through every call, bind context once via child loggers.
  • Transport pipeline — Heavy formatting, file rotation, and network shipping happen in worker threads.
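Because every line is a complete JSON object terminated by a newline, downstream tooling can consume a Pino stream one line at a time. A minimal sketch of parsing NDJSON output (the sample lines use Pino's default numeric levels, where 30 is info and 50 is error):

```typescript
// Each Pino log line is one JSON object; a stream of them is NDJSON.
// Field names mirror Pino's defaults: level, time, msg.
const ndjson =
  '{"level":30,"time":1700000000000,"msg":"server started"}\n' +
  '{"level":50,"time":1700000001000,"msg":"db connection lost"}\n';

const records = ndjson
  .trim()
  .split("\n")
  .map((line) => JSON.parse(line) as { level: number; time: number; msg: string });

// Filter to error severity and above (level >= 50).
const errors = records.filter((r) => r.level >= 50);
console.log(errors.map((r) => r.msg)); // one record: "db connection lost"
```

This line-at-a-time model is exactly what log collectors (Fluent Bit, Vector, Datadog agents) rely on when tailing stdout.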

Setup

Installation

# Core
npm install pino

# Development formatting
npm install -D pino-pretty

# HTTP request logging
npm install pino-http

# For Next.js / Edge
npm install pino pino-pretty

Basic Configuration

import pino, { type Logger, type LoggerOptions } from "pino";

const config: LoggerOptions = {
  level: process.env.LOG_LEVEL ?? "info",
  timestamp: pino.stdTimeFunctions.isoTime,
  formatters: {
    level(label: string) {
      return { level: label };
    },
    bindings(bindings: pino.Bindings) {
      return { pid: bindings.pid, host: bindings.hostname };
    },
  },
  redact: {
    paths: ["req.headers.authorization", "req.headers.cookie", "*.password", "*.ssn"],
    censor: "[REDACTED]",
  },
};

// Development: pipe through pino-pretty
// Production: raw NDJSON to stdout for log collectors
const transport =
  process.env.NODE_ENV !== "production"
    ? pino.transport({ target: "pino-pretty", options: { colorize: true, translateTime: "SYS:HH:MM:ss" } })
    : undefined;

export const logger: Logger = pino(config, transport);
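The config above falls back silently to `info` when `LOG_LEVEL` is unset, but a typo like `LOG_LEVEL=inf` would also slip through. If you prefer to fail fast, a small validation helper can wrap the lookup. This is our own sketch, not a pino API; the label list matches pino's default levels:

```typescript
// Pino's default level labels, lowest to highest severity.
const LEVELS = ["trace", "debug", "info", "warn", "error", "fatal"] as const;
type Level = (typeof LEVELS)[number];

// Resolve a level from an env var, rejecting unknown values instead of
// silently logging at the wrong level. (Sketch; not part of pino.)
function resolveLevel(raw: string | undefined, fallback: Level = "info"): Level {
  if (raw === undefined || raw === "") return fallback;
  if ((LEVELS as readonly string[]).includes(raw)) return raw as Level;
  throw new Error(`invalid LOG_LEVEL "${raw}"; expected one of: ${LEVELS.join(", ")}`);
}

const level = resolveLevel(process.env.LOG_LEVEL);
```

Pass the result as the `level` option: `pino({ level: resolveLevel(process.env.LOG_LEVEL) })`.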

Child Loggers

// Bind context that appears on every subsequent log line
const requestLogger = logger.child({ requestId: "abc-123", userId: "user-42" });

requestLogger.info("processing payment");
// {"level":"info","time":"...","requestId":"abc-123","userId":"user-42","msg":"processing payment"}

// Nest further — child of a child
const paymentLogger = requestLogger.child({ service: "stripe" });
paymentLogger.info({ amount: 4999 }, "charge created");
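Conceptually, each child's bindings are merged over its parent's, and the per-call object is merged over both. The sketch below illustrates the shape of the resulting log line only; it is not how pino builds lines internally, and field order, `level`, and `time` representation differ in real output:

```typescript
// Illustrative merge of bindings: parent -> child -> per-call object.
const parentBindings = { requestId: "abc-123", userId: "user-42" };
const childBindings = { ...parentBindings, service: "stripe" };

// The per-call object and message are merged over the child's bindings.
const logLine = { ...childBindings, amount: 4999, msg: "charge created" };
console.log(JSON.stringify(logLine));
```

Because bindings are captured once at `child()` time, the hot-path cost of repeating them on every call is avoided.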

Key Techniques

Custom Serializers

import pino from "pino";
import type { IncomingMessage, ServerResponse } from "http";

const logger = pino({
  serializers: {
    req(req: IncomingMessage) {
      return {
        method: req.method,
        url: req.url,
        remoteAddress: req.socket?.remoteAddress,
      };
    },
    res(res: ServerResponse) {
      return { statusCode: res.statusCode };
    },
    err: pino.stdSerializers.err, // stack, message, type
  },
});

// `incomingRequest` here stands for an http.IncomingMessage from your server
logger.info({ req: incomingRequest }, "request received");
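A serializer is nothing more than a function from the logged value to the plain object that ends up in the JSON line. The sketch below mirrors the core fields that `pino.stdSerializers.err` emits (type, message, stack) without depending on pino itself; the real serializer also copies additional enumerable properties:

```typescript
// Sketch of an error serializer's output shape. Not the real
// pino.stdSerializers.err — just the core fields it produces.
function serializeError(err: Error): { type: string; message: string; stack?: string } {
  return { type: err.name, message: err.message, stack: err.stack };
}

const out = serializeError(new TypeError("boom"));
console.log(out.type, out.message); // TypeError boom
```

Without such a serializer, `JSON.stringify(new Error("boom"))` yields `{}` because an Error's properties are non-enumerable, which is why anti-pattern 5 below loses stack traces.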

HTTP Request Logging with pino-http

import pinoHttp from "pino-http";
import { randomUUID } from "crypto";
import express from "express";
// assumes the shared pino instance from the configuration above;
// the module path is illustrative
import { logger } from "./logger";

const app = express();

app.use(
  pinoHttp({
    logger,
    genReqId: (req) => req.headers["x-request-id"]?.toString() ?? randomUUID(),
    customLogLevel(_req, res, error) {
      if (error || res.statusCode >= 500) return "error";
      if (res.statusCode >= 400) return "warn";
      return "info";
    },
    customSuccessMessage(_req, res) {
      return `${res.statusCode} ${res.req.method} ${res.req.url}`;
    },
    customErrorMessage(_req, _res, error) {
      return `request failed: ${error.message}`;
    },
    customProps(req) {
      return { correlationId: req.id };
    },
    // Silence health-check noise
    autoLogging: { ignore: (req) => req.url === "/health" },
  }),
);

app.get("/api/users", (req, res) => {
  // req.log is a child logger with request context already bound
  req.log.info("fetching user list");
  res.json([]);
});

Multi-Transport Pipeline

import pino from "pino";

const transport = pino.transport({
  targets: [
    // Pretty-print to stdout in dev
    {
      target: "pino-pretty",
      options: { colorize: true, destination: 1 },
      level: "debug",
    },
    // JSON to a rotating file
    {
      target: "pino/file",
      options: { destination: "/var/log/app/app.log", mkdir: true },
      level: "info",
    },
    // Errors additionally to a dedicated file
    {
      target: "pino/file",
      options: { destination: "/var/log/app/errors.log", mkdir: true },
      level: "error",
    },
  ],
});

const logger = pino({ level: "debug" }, transport);

Next.js Integration

// lib/logger.ts — works in both Node.js and Edge runtimes
import pino from "pino";

function buildLogger() {
  if (typeof window !== "undefined") {
    // Client-side: use browser console binding
    return pino({ browser: { asObject: true }, level: "info" });
  }

  const isEdge = typeof EdgeRuntime === "string"; // global set only in the Edge runtime

  if (isEdge) {
    // Edge runtime: no worker threads, no transports
    return pino({ level: "info" });
  }

  // Node.js server runtime: full features
  return pino({
    level: process.env.LOG_LEVEL ?? "info",
    transport:
      process.env.NODE_ENV !== "production"
        ? { target: "pino-pretty", options: { colorize: true } }
        : undefined,
  });
}

export const logger = buildLogger();
// middleware.ts
import { NextResponse, type NextRequest } from "next/server";
import { logger } from "@/lib/logger";

export function middleware(request: NextRequest) {
  const start = Date.now();
  const requestId = request.headers.get("x-request-id") ?? crypto.randomUUID();

  const log = logger.child({ requestId, path: request.nextUrl.pathname });
  log.info("incoming request");

  const response = NextResponse.next();
  response.headers.set("x-request-id", requestId);

  // Note: NextResponse.next() does not await the route handler, so
  // durationMs covers middleware work only, not the full request.
  log.info({ durationMs: Date.now() - start }, "request completed");
  return response;
}

Redaction

const logger = pino({
  redact: {
    paths: [
      "password",
      "creditCard",
      "user.email",
      "req.headers.authorization",
      "*.secret",
      "records[*].ssn",
    ],
    censor: "***",
    remove: false, // set true to omit the key entirely
  },
});

logger.info({ user: { name: "Alice", email: "alice@example.com" } }, "user login");
// email field appears as "***"
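Under the hood, pino compiles these paths into fast accessors via the fast-redact library; conceptually, a dotted path just walks into the object and overwrites the leaf. A stripped-down sketch of that semantics for simple dotted paths (no wildcards, no bracket syntax, no `remove` handling — the real implementation covers all of those):

```typescript
// Illustrative only: walk a dotted path like "user.email" and replace
// the leaf value with the censor.
function redactPath(obj: Record<string, unknown>, path: string, censor = "***"): void {
  const keys = path.split(".");
  let node: any = obj;
  for (const key of keys.slice(0, -1)) {
    if (node == null || typeof node !== "object") return; // missing path: no-op
    node = node[key];
  }
  const leaf = keys[keys.length - 1];
  if (node != null && typeof node === "object" && leaf in node) node[leaf] = censor;
}

const payload = { user: { name: "Alice", email: "alice@example.com" } };
redactPath(payload, "user.email");
console.log(payload.user.email); // "***"
```

Declaring paths once in the logger config beats scattering ad-hoc scrubbing like this through application code, which is exactly the point of the `redact` option.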

Best Practices

  1. Set level from an environment variable — allows runtime adjustment without redeployment.
  2. Use child loggers to propagate request-scoped context (request ID, user ID, tenant) rather than repeating fields.
  3. Offload formatting to transports — never use pino-pretty in production; it defeats the performance advantage.
  4. Lean on pino.stdSerializers.err for error objects to capture stack traces consistently.
  5. Redact sensitive fields declaratively — use the redact option with glob paths rather than manually scrubbing.
  6. Log objects first, message second — logger.info({ orderId }, "order placed"), not logger.info("order placed " + orderId).
  7. Keep log messages static strings — dynamic data goes into the object argument so log aggregators can group by message.
  8. Set timestamp: pino.stdTimeFunctions.isoTime for ISO-8601 timestamps that every log platform understands.

Anti-Patterns

  1. String interpolation in messages — logger.info(`User ${id} logged in`) prevents log grouping. Use logger.info({ userId: id }, "user logged in") instead.
  2. Using pino-pretty as a production transport — it serializes synchronously and negates Pino's speed advantage.
  3. Creating a new logger per request — use logger.child() instead; it shares the parent's configuration and transport pipeline.
  4. Logging at trace/debug level without gating — even though Pino is fast, producing millions of debug lines saturates transport workers. Guard with if (logger.isLevelEnabled("debug")) for expensive computations.
  5. Ignoring the err serializer — passing raw Error objects without the standard serializer loses the stack trace in JSON output.
  6. Synchronous file writes — never use pino.destination({ sync: true }) in production; it blocks the event loop on every log line.
  7. Mixing console.log and Pino — breaks structured logging; all output should go through the same logger instance so transports and redaction apply uniformly.
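The level-gating guard in point 4 works because level labels map to ordered numbers under the hood. A sketch of the comparison that `isLevelEnabled` performs, using pino's default numeric values (this illustrates the semantics, not pino's implementation):

```typescript
// Pino's default numeric level values.
const LEVEL_VALUES = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 } as const;
type Level = keyof typeof LEVEL_VALUES;

// A call at `candidate` level is emitted only if its value meets the
// logger's current threshold.
function isLevelEnabled(current: Level, candidate: Level): boolean {
  return LEVEL_VALUES[candidate] >= LEVEL_VALUES[current];
}

console.log(isLevelEnabled("info", "debug")); // false — debug is below info
console.log(isLevelEnabled("info", "error")); // true
```

Guarding with this check means the expensive object you would have passed to `logger.debug()` is never even constructed when the level is disabled.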
