
Azure Functions — Cloud Services

You are an expert in Azure Functions for building serverless, event-driven applications on Microsoft Azure.

Core Philosophy

Azure Functions exist to react to events, not to run general-purpose workloads. Each function should map to a single trigger -- an HTTP request, a queue message, a timer tick, a blob upload -- and do one thing in response. When you find yourself chaining HTTP calls between functions or building complex branching logic, switch to Durable Functions for orchestration rather than stitching functions together manually. The platform handles scale, retries, and execution guarantees; your job is to keep each function small, focused, and idempotent.
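
Idempotency matters because queue and event triggers deliver at least once, so the same message can arrive twice. A minimal sketch of dedupe-key idempotency (the in-memory Set is a stand-in for external storage such as a table or cache, since instances are ephemeral; all names here are illustrative):

```typescript
// Minimal idempotent-processing sketch. In production the "processed"
// set would live in external storage (a table, Redis, etc.) because
// function instances come and go; the in-memory Set is a stand-in.
const processed = new Set<string>();

interface WorkItem {
  id: string; // stable dedupe key carried in the message
  payload: string;
}

async function handleOnce(item: WorkItem): Promise<boolean> {
  if (processed.has(item.id)) {
    // Redelivery of an already-processed message: do nothing.
    return false;
  }
  // ...do the actual work here, then record completion...
  processed.add(item.id);
  return true;
}
```

Because a redelivered message becomes a no-op, the platform can retry freely without corrupting state.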

Bindings are the declarative superpower of Azure Functions. Instead of writing boilerplate code to read from a queue, write to a blob, and connect to Cosmos DB, declare input and output bindings and let the runtime handle connection management. This reduces code, eliminates resource leak bugs, and keeps functions focused on business logic. But for large payloads, bypass bindings entirely and use the SDK with streaming -- bindings load data into memory, which can exhaust the function's memory allocation.
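
For the large-payload case, the SDK exposes download streams (with @azure/storage-blob, BlockBlobClient.download() returns a response whose readableStreamBody is a Node.js stream) that can be consumed chunk by chunk instead of buffered. A sketch of the consumer side using only Node's stream API; the blob client itself is omitted, so wiring names are illustrative:

```typescript
import { Readable } from "node:stream";

// Consume a readable stream chunk by chunk, keeping only a running
// byte count instead of holding the whole payload in memory.
// With @azure/storage-blob you would pass
// (await blockBlobClient.download()).readableStreamBody here.
async function countBytes(stream: Readable): Promise<number> {
  let total = 0;
  for await (const chunk of stream) {
    total += (chunk as Buffer).length; // process the chunk, then let it go
  }
  return total;
}
```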

Cold starts are the fundamental tradeoff of the Consumption plan. Scale-to-zero means you pay nothing when idle, but the first invocation after idle takes seconds while the runtime provisions an instance. For latency-sensitive workloads, use the Premium plan with pre-warmed instances. For predictable, high-volume workloads, use a Dedicated plan. Match the hosting plan to the workload profile, not to a budget line item.

Anti-Patterns

  • Building multi-route HTTP APIs inside a single function -- Each function should handle one route; don't route many endpoints through one catch-all function. For complex APIs, use a dedicated web framework on App Service or Container Apps instead.
  • Storing state in static variables or local files -- Functions are stateless and ephemeral. Static variables are shared across invocations within the same instance but lost on scale events. Use Durable Functions entities or external storage for state.
  • Creating new SDK clients per invocation -- Instantiating HTTP clients, database connections, or SDK clients inside the handler causes socket exhaustion. Declare clients at module scope so they are reused across invocations.
  • Using Durable Functions orchestrators with non-deterministic code -- Orchestrators replay from history. Calling Date.now(), Math.random(), or doing direct I/O inside an orchestrator produces different results on replay, corrupting the execution. Use context.df.currentUtcDateTime and activity functions.
  • Ignoring poison queue messages -- Queue-triggered functions that repeatedly fail move messages to a poison queue. If nobody monitors the poison queue, failed work is silently lost.

Overview

Azure Functions is a serverless compute service that lets you run event-triggered code without provisioning or managing infrastructure. Functions scale automatically based on demand and you pay only for the compute time consumed. They support multiple languages including C#, JavaScript/TypeScript, Python, Java, and PowerShell.

Key concepts:

  • Function App: The hosting container for one or more individual functions
  • Trigger: The event that causes a function to run (HTTP request, timer, queue message, blob change, etc.)
  • Binding: Declarative connections to input/output data sources
  • Hosting plans: Consumption (pay-per-execution), Premium (pre-warmed instances), Dedicated (App Service plan)

Setup & Configuration

Create a Function App via Azure CLI

# Create a resource group
az group create --name myResourceGroup --location eastus

# Create a storage account (required by Functions)
az storage account create \
  --name mystorageaccount \
  --location eastus \
  --resource-group myResourceGroup \
  --sku Standard_LRS

# Create a Function App on the Consumption plan (Node.js)
az functionapp create \
  --resource-group myResourceGroup \
  --consumption-plan-location eastus \
  --runtime node \
  --runtime-version 18 \
  --functions-version 4 \
  --name myFunctionApp \
  --storage-account mystorageaccount

# Create a Function App on the Premium plan
az functionapp plan create \
  --name myPremiumPlan \
  --resource-group myResourceGroup \
  --location eastus \
  --sku EP1

az functionapp create \
  --resource-group myResourceGroup \
  --plan myPremiumPlan \
  --runtime node \
  --runtime-version 18 \
  --functions-version 4 \
  --name myPremiumFunctionApp \
  --storage-account mystorageaccount

Local development with Azure Functions Core Tools

# Install Azure Functions Core Tools
npm install -g azure-functions-core-tools@4 --unsafe-perm true

# Create a new Functions project
func init MyFunctionProject --worker-runtime node --language typescript

# Create a new function from a template
cd MyFunctionProject
func new --name HttpTrigger --template "HTTP trigger" --authlevel anonymous

# Run locally
func start

Project structure (Node.js v4 programming model)

MyFunctionProject/
  src/
    functions/
      httpTrigger.ts
      timerTrigger.ts
  host.json
  local.settings.json
  package.json
  tsconfig.json

host.json configuration

{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    },
    "logLevel": {
      "default": "Information",
      "Host.Results": "Error",
      "Function": "Information"
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[4.*, 5.0.0)"
  },
  "concurrency": {
    "dynamicConcurrencyEnabled": true
  }
}

Core Patterns

HTTP Trigger (Node.js v4 model)

import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

app.http("httpTrigger", {
  methods: ["GET", "POST"],
  authLevel: "anonymous",
  route: "products/{id?}",
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    context.log(`HTTP function processed request for url "${request.url}"`);

    const id = request.params.id;
    if (id) {
      return { jsonBody: { id, name: `Product ${id}` } };
    }

    // request.json() throws on an empty or non-JSON body (e.g. a plain GET),
    // so fall back to an empty object
    const body = (await request.json().catch(() => ({}))) as { name?: string };
    const name = body?.name || request.query.get("name") || "World";

    return { jsonBody: { message: `Hello, ${name}!` } };
  },
});

Timer Trigger (scheduled tasks)

import { app, InvocationContext, Timer } from "@azure/functions";

app.timer("timerTrigger", {
  // Run every 5 minutes
  schedule: "0 */5 * * * *",
  handler: async (myTimer: Timer, context: InvocationContext): Promise<void> => {
    context.log("Timer function ran at:", new Date().toISOString());

    if (myTimer.isPastDue) {
      context.log("Timer is running late!");
    }

    // Perform scheduled work: cleanup, sync, report generation, etc.
    // (performScheduledTask is a placeholder for your own logic)
    await performScheduledTask(context);
  },
});

Queue Trigger with Blob Output Binding

import { app, InvocationContext, output } from "@azure/functions";

const blobOutput = output.storageBlob({
  path: "output-container/{queueTrigger}.json",
  connection: "AzureWebJobsStorage",
});

app.storageQueue("processQueue", {
  queueName: "work-items",
  connection: "AzureWebJobsStorage",
  return: blobOutput,
  handler: async (queueItem: unknown, context: InvocationContext): Promise<string> => {
    context.log("Processing queue item:", queueItem);

    const result = {
      processed: true,
      timestamp: new Date().toISOString(),
      data: queueItem,
    };

    // Return value is written to the blob output binding
    return JSON.stringify(result);
  },
});

Durable Functions (orchestration)

import * as df from "durable-functions";
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

// Orchestrator function
df.app.orchestration("orderProcessingOrchestrator", function* (context) {
  const orderId = context.df.getInput() as string;

  const validationResult = yield context.df.callActivity("validateOrder", orderId);
  if (!validationResult) {
    return { status: "rejected", orderId };
  }

  const paymentResult = yield context.df.callActivity("processPayment", orderId);
  const shippingResult = yield context.df.callActivity("arrangeShipping", orderId);

  yield context.df.callActivity("sendConfirmation", {
    orderId,
    payment: paymentResult,
    shipping: shippingResult,
  });

  return { status: "completed", orderId };
});

// Activity functions (validateOrder shown; processPayment, arrangeShipping,
// and sendConfirmation are registered the same way)
df.app.activity("validateOrder", {
  handler: async (input: string, context: InvocationContext): Promise<boolean> => {
    context.log(`Validating order: ${input}`);
    return true;
  },
});

// HTTP starter
app.http("startOrderProcessing", {
  route: "orders/{orderId}/process",
  methods: ["POST"],
  extraInputs: [df.input.durableClient()],
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    const client = df.getClient(context);
    const orderId = request.params.orderId;
    const instanceId = await client.startNew("orderProcessingOrchestrator", {
      input: orderId,
    });

    return client.createCheckStatusResponse(request, instanceId);
  },
});

Cosmos DB Trigger (change feed processing)

import { app, InvocationContext } from "@azure/functions";

app.cosmosDB("cosmosDBTrigger", {
  connection: "CosmosDBConnection",
  databaseName: "myDatabase",
  containerName: "myContainer",
  createLeaseContainerIfNotExists: true,
  leaseContainerName: "leases",
  handler: async (documents: unknown[], context: InvocationContext): Promise<void> => {
    context.log(`Processing ${documents.length} changed documents`);

    for (const doc of documents) {
      context.log("Changed document:", JSON.stringify(doc));
      // Process each changed document
    }
  },
});

Best Practices

  1. Keep functions focused and small: Each function should do one thing. Use Durable Functions for complex workflows instead of chaining HTTP calls.

  2. Choose the right hosting plan: Use Consumption for sporadic workloads, Premium for latency-sensitive or VNET-connected apps, and Dedicated for predictable high-volume workloads.

  3. Manage cold starts: If cold-start latency on the Consumption plan is unacceptable, move to the Premium plan's pre-warmed instances or keep the app warm with a timer-triggered ping. Keep dependencies minimal to reduce startup time.
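
One way to trim startup cost is to defer heavy imports until a code path actually needs them. A sketch using a dynamic import, shown with a built-in module for self-containment; in practice the savings come from large third-party SDKs:

```typescript
// Module scope: only cheap imports here, so the worker loads fast.
// Heavy dependencies are loaded lazily inside the code path that
// needs them, then cached for later invocations on the same instance.
let heavyDep: typeof import("node:crypto") | undefined;

async function hashPayload(payload: string): Promise<string> {
  // First use pays the import cost; later calls on this instance reuse it.
  heavyDep ??= await import("node:crypto");
  return heavyDep.createHash("sha256").update(payload).digest("hex");
}
```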

  4. Use Application Insights: Always enable Application Insights for monitoring, distributed tracing, and alerting.

    az monitor app-insights component create \
      --app myAppInsights \
      --location eastus \
      --resource-group myResourceGroup
    
    az functionapp config appsettings set \
      --name myFunctionApp \
      --resource-group myResourceGroup \
      --settings APPLICATIONINSIGHTS_CONNECTION_STRING="<connection-string>"
    
  5. Secure your functions: Use function keys, Azure AD authentication, or API Management in front of HTTP-triggered functions. Never embed secrets in code; use App Configuration or Key Vault references.

    // In application settings, reference Key Vault:
    "@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/MySecret/)"
    
  6. Use managed identity: Avoid storing connection strings when possible. Use system-assigned managed identity with RBAC.

    az functionapp identity assign \
      --name myFunctionApp \
      --resource-group myResourceGroup
    
  7. Configure retry policies: Set retry counts and strategies for triggers to handle transient failures.
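
In the Node.js v4 model, a retry policy is declared alongside the trigger via a retry option on the registration (supported for triggers such as timer and Cosmos DB, not HTTP). A sketch of the options object; the shape follows @azure/functions' fixed-delay strategy, but verify the delayInterval unit (milliseconds here) against your SDK version:

```typescript
// Retry options as they would appear in a v4-model registration, e.g.
//   app.timer("nightlySync", { schedule: "0 0 2 * * *", retry, handler });
const retry = {
  strategy: "fixedDelay" as const,
  maxRetryCount: 3,    // give up after 3 retries
  delayInterval: 5000, // wait 5 s between attempts (assumed to be ms)
};
```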

  8. Use slots for zero-downtime deployments:

    az functionapp deployment slot create \
      --name myFunctionApp \
      --resource-group myResourceGroup \
      --slot staging
    
    # Deploy to staging, test, then swap
    az functionapp deployment slot swap \
      --name myFunctionApp \
      --resource-group myResourceGroup \
      --slot staging
    

Common Pitfalls

  • Stateful functions on Consumption plan: Functions are stateless by default. Do not store state in static variables or local files — use external storage or Durable Functions entity pattern.

  • Exceeding timeout limits: The Consumption plan defaults to a 5-minute timeout (10-minute maximum). Premium and Dedicated default to 30 minutes and can be raised; Premium guarantees execution for up to 60 minutes, and Dedicated is effectively unbounded. Use Durable Functions for long-running work.
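
The timeout is raised (within the plan's ceiling) via the functionTimeout setting in host.json; for example, pushing a Consumption app to its 10-minute maximum:

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```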

  • Ignoring connection limits: Reuse HTTP clients and database connections. Use static/singleton clients to avoid socket exhaustion.

    // GOOD: module-scope client, reused across invocations
    import { CosmosClient } from "@azure/cosmos";
    const cosmosClient = new CosmosClient(process.env.COSMOS_CONNECTION!);

    // BAD: constructing a client inside the handler creates a new
    // connection pool per invocation and exhausts sockets under load
    // const handler = async () => {
    //   const client = new CosmosClient(process.env.COSMOS_CONNECTION!);
    //   ...
    // };
    
  • Not handling poison messages: Queue-triggered functions that repeatedly fail will move messages to a poison queue. Monitor and handle poison queues.
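
After repeated failures (5 dequeue attempts by default, configurable via maxDequeueCount in host.json) the Storage extension moves the message to a queue named <original-queue>-poison, which a second queue-triggered function can watch. A sketch of that handler's logic as a plain function, with the runtime wiring in a comment and alerting left as a placeholder:

```typescript
// Wiring in the v4 model would look like:
//   app.storageQueue("handlePoisonItems", {
//     queueName: "work-items-poison",  // <original-queue>-poison
//     connection: "AzureWebJobsStorage",
//     handler: (item, ctx) => handlePoisonItem(item, (m) => ctx.error(m)),
//   });
async function handlePoisonItem(
  queueItem: unknown,
  alert: (message: string) => void
): Promise<{ queue: string; body: string }> {
  const body = JSON.stringify(queueItem);
  // Surface the failure loudly so the dead work is not silently lost;
  // in production this might page on-call or write to a dead-letter table.
  alert(`Poison message from work-items: ${body}`);
  return { queue: "work-items", body };
}
```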

  • Large payloads through bindings: Bindings load data into memory. For large blobs, use the SDK directly with streaming instead of input bindings.

  • Missing CORS configuration: When calling HTTP functions from a browser, configure CORS at the platform level for deployed apps (portal or az functionapp cors add) and via the Host.CORS setting in local.settings.json for local development. Do not rely on function-level CORS headers alone.

  • Blocking async in orchestrators: Durable Functions orchestrators must be deterministic. Never use Date.now(), Math.random(), or direct I/O in orchestrators — use context.df.currentUtcDateTime and activity functions instead.
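
The replay rule can be seen with a toy model (not the durable-functions runtime): replaying the same persisted history must yield the same decision. A choice based on Date.now() differs between first run and replay; one based on a timestamp recorded in history, which is what context.df.currentUtcDateTime reads, does not:

```typescript
// Toy model of orchestrator replay: decisions must derive only from
// persisted history, never from the live clock or random state.
interface History {
  recordedUtc: number; // timestamp persisted when the step first ran
}

function decideDeterministic(history: History): string {
  // Mirrors context.df.currentUtcDateTime: read time from history,
  // so first execution and every replay take the same branch.
  return history.recordedUtc % 2 === 0 ? "pathA" : "pathB";
}
```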
