Cloud Functions
Build and deploy event-driven serverless functions on Google Cloud Functions
skilldb get gcp-services-skills/Cloud Functions
GCP Service — Cloud Functions
You are an expert in Google Cloud Functions for building lightweight, event-driven serverless compute.
Core Philosophy
A function should do exactly one thing. Cloud Functions is not a general-purpose compute platform -- it is designed for small, focused units of work triggered by events. If you find yourself routing multiple endpoints inside a single function or managing complex state, you have outgrown Functions and should use Cloud Run instead. The beauty of Functions is that each one maps cleanly to a single trigger and a single responsibility.
Event-driven functions must be idempotent. Cloud events can be delivered more than once, and functions can be retried on failure. If your function charges a credit card, sends an email, or writes a database record, duplicate execution must not produce duplicate side effects. Use a deduplication key (the event ID, a Firestore document ID, or a custom idempotency token) to detect and skip reprocessing.
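The dedup-key idea can be sketched as follows. The store here is any shared keyed storage (in production, for example, a Firestore collection where document creation fails if the event ID already exists); `handle_once`, `store`, and `work` are illustrative names, not a Cloud Functions API, shown behind a minimal mapping interface so the logic stays testable.

```python
# Idempotency sketch: run the side effect at most once per event ID.
# `store` is any shared keyed store -- e.g. a Firestore collection
# keyed by event ID in a real deployment.

def handle_once(event_id, store, work):
    """Execute `work` only the first time `event_id` is seen."""
    if event_id in store:
        return "skipped"      # duplicate delivery or retry: do nothing
    work()                    # the side effect (charge, email, write)
    store[event_id] = True    # record completion so retries are no-ops
    return "processed"
```

Note the ordering trade-off: recording completion after the work means a crash between the two can redo the work once; true exactly-once semantics require writing the marker and the side effect in a single transaction.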
Cold starts are the cost of scale-to-zero. Every serverless function trades startup latency for the ability to scale down to zero instances and pay nothing when idle. Minimize cold start impact by keeping dependencies lean, using lazy initialization for heavy clients, and setting min-instances on latency-sensitive endpoints. For functions that serve user-facing HTTP requests, the cold start tax is real and must be managed.
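Lazy initialization is a small pattern worth showing explicitly. A minimal sketch, where `make_client` stands in for any expensive constructor (a storage client, an ML model load):

```python
# Lazy-initialization sketch: defer an expensive client until the first
# request, then cache it for later warm invocations of the same instance.

_client = None

def get_client(make_client):
    global _client
    if _client is None:
        _client = make_client()   # paid once per instance, not at import time
    return _client
```

Because the constructor runs inside the first request rather than at module import, only that first invocation pays the cost, and subsequent warm invocations reuse the cached client.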
Anti-Patterns
- Building monolithic HTTP routers inside a single function -- Routing multiple endpoints through one function defeats the single-responsibility model. Use Cloud Run for multi-route services.
- Processing event-triggered functions without idempotency -- Retries and duplicate delivery will cause duplicate side effects (double charges, duplicate emails) unless the handler is idempotent.
- Loading heavy dependencies at module scope -- Importing large libraries (ML models, big SDKs) at the top level increases cold start time for every invocation. Lazy-load expensive dependencies.
- Ignoring the background work freeze -- In HTTP functions, the runtime may freeze after the response is sent. Any background work (analytics pings, async writes) not completed before the response is at risk of being lost.
- Using 1st gen functions for new projects -- 2nd gen functions are built on Cloud Run, offer longer timeouts, concurrency, traffic splitting, and Eventarc integration. There is no reason to choose 1st gen for new work.
Overview
Cloud Functions is a Functions-as-a-Service (FaaS) platform that executes single-purpose code in response to HTTP requests, Cloud events, or Pub/Sub messages. It handles provisioning, scaling, and patching automatically.
Key capabilities:
- HTTP-triggered and event-triggered functions
- 2nd gen functions built on Cloud Run and Eventarc
- Native integration with Pub/Sub, Cloud Storage, Firestore, and other GCP event sources
- Automatic scaling including scale-to-zero
- Support for Node.js, Python, Go, Java, .NET, Ruby, and PHP runtimes
Setup & Configuration
Enable APIs
gcloud services enable cloudfunctions.googleapis.com \
  cloudbuild.googleapis.com \
  eventarc.googleapis.com \
  run.googleapis.com
Deploy an HTTP function (2nd gen)
gcloud functions deploy hello-http \
  --gen2 \
  --runtime nodejs20 \
  --region us-central1 \
  --source . \
  --entry-point helloHttp \
  --trigger-http \
  --allow-unauthenticated \
  --memory 256Mi \
  --timeout 60s
Deploy an event-triggered function
gcloud functions deploy process-upload \
  --gen2 \
  --runtime python312 \
  --region us-central1 \
  --source . \
  --entry-point process_upload \
  --trigger-event-filters="type=google.cloud.storage.object.v1.finalized" \
  --trigger-event-filters="bucket=my-bucket"
Set environment variables and secrets
gcloud functions deploy my-func \
  --gen2 \
  --set-env-vars "API_URL=https://api.example.com" \
  --set-secrets "API_KEY=my-secret:latest"
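Inside the function, both kinds of configuration arrive as environment variables: `--set-env-vars` values directly, and `--set-secrets` values resolved from Secret Manager when the instance starts. A minimal read, with a local fallback for development:

```python
import os

# Deploy-time configuration surfaces as environment variables at runtime.
API_URL = os.environ.get("API_URL", "https://api.example.com")  # from --set-env-vars
API_KEY = os.environ.get("API_KEY")  # from --set-secrets; None when running locally
```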
Core Patterns
HTTP function (Node.js)
const functions = require('@google-cloud/functions-framework');

functions.http('helloHttp', (req, res) => {
  const name = req.query.name || req.body.name || 'World';
  res.json({ message: `Hello, ${name}!` });
});
Cloud Storage event function (Python)
import functions_framework
from google.cloud import storage

@functions_framework.cloud_event
def process_upload(cloud_event):
    data = cloud_event.data
    bucket_name = data["bucket"]
    file_name = data["name"]
    print(f"Processing file: gs://{bucket_name}/{file_name}")
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(file_name)
    content = blob.download_as_text()
    # Process the content
Pub/Sub triggered function (Go)
package myfunc

import (
	"context"
	"encoding/base64"
	"fmt"

	"github.com/GoogleCloudPlatform/functions-framework-go/functions"
	"github.com/cloudevents/sdk-go/v2/event"
)

func init() {
	functions.CloudEvent("ProcessMessage", processMessage)
}

type MessagePublishedData struct {
	Message struct {
		Data string `json:"data"`
	} `json:"message"`
}

func processMessage(ctx context.Context, e event.Event) error {
	var msg MessagePublishedData
	if err := e.DataAs(&msg); err != nil {
		return fmt.Errorf("event.DataAs: %w", err)
	}
	data, err := base64.StdEncoding.DecodeString(msg.Message.Data)
	if err != nil {
		return fmt.Errorf("base64.DecodeString: %w", err)
	}
	fmt.Printf("Received message: %s\n", string(data))
	return nil
}
Firestore trigger function (Node.js)
const functions = require('@google-cloud/functions-framework');

functions.cloudEvent('onUserCreate', (cloudEvent) => {
  const data = cloudEvent.data;
  const doc = data.value.fields;
  console.log(`New user created: ${doc.email.stringValue}`);
});
Local development and testing
# Install the Functions Framework
npm install @google-cloud/functions-framework
# Run locally
npx functions-framework --target=helloHttp --port=8080
# Test
curl "http://localhost:8080?name=Test"
Best Practices
- Use 2nd gen functions. They are built on Cloud Run and Eventarc, offering longer timeouts (up to 60 minutes), larger instances, concurrency, and traffic splitting.
- Keep functions focused. Each function should do one thing. Compose complex workflows using Cloud Workflows or Pub/Sub.
- Set appropriate timeouts. The default is 60 seconds for both generations; 2nd gen HTTP functions can run for up to 60 minutes, event-driven functions up to 9 minutes. Adjust based on actual workload needs.
- Use the Functions Framework for local testing. It replicates the production execution environment locally.
- Minimize cold start impact. Keep dependencies lean, use lazy initialization, and consider min-instances for latency-critical functions.
- Return proper HTTP status codes. Cloud Functions retries event-driven invocations that return errors; be deliberate about retry behavior.
- Pin runtime versions. Specify exact runtime versions (e.g., nodejs20, python312) to avoid unexpected upgrades.
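The retry bullet above can be made concrete. For event-driven functions, raising an exception signals failure (and triggers a retry when retries are enabled), while returning normally acknowledges the event. A sketch that drops permanently bad events instead of retrying them forever -- `PermanentError`, `process`, and `work` are illustrative names, not platform APIs:

```python
class PermanentError(Exception):
    """Raised by business logic for failures a retry can never fix."""

def process(event_data, work):
    try:
        work(event_data)              # the actual business logic
    except PermanentError as exc:
        # Acknowledge and log: retrying a malformed event would loop forever.
        return f"dropped: {exc}"
    # Any other exception propagates, signalling the platform to retry.
    return "ok"
```

The key design choice is classifying failures: transient errors (network timeouts, contention) should propagate so the event is redelivered; permanent ones (bad payloads) should be logged and swallowed, ideally after routing the event to a dead-letter topic.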
Common Pitfalls
- Not handling idempotency for event functions. Cloud events can be delivered more than once. Design functions to handle duplicate invocations safely.
- Exceeding memory limits. Functions that process large files in memory will crash. Stream data or increase the memory allocation.
- Background work after response. In HTTP functions, the runtime may freeze after the response is sent. Complete all work before responding.
- Ignoring cold start latency. First invocation after idle can take seconds. Use min-instances for user-facing endpoints.
- Hardcoding project IDs. Use environment variables or metadata server queries instead of hardcoding GCP project identifiers.
- Missing retry configuration. Event-driven functions retry on failure by default. If your function is not idempotent, disable retries or add deduplication logic.
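The streaming advice above can be sketched briefly. Current versions of google-cloud-storage expose file-like reads via `blob.open("rt")`, so a handler can process an object incrementally instead of calling `download_as_text()`. The helper below works on any file-like object, which also makes it testable against a local stand-in; the name `count_lines` is illustrative:

```python
def count_lines(fileobj, chunk_size=64 * 1024):
    """Count lines without loading the whole object into memory."""
    total = 0
    while True:
        chunk = fileobj.read(chunk_size)   # read one bounded chunk at a time
        if not chunk:
            break
        total += chunk.count("\n")
    return total
```

In the storage handler this would be used as `with blob.open("rt") as f: count_lines(f)`, keeping peak memory bounded by the chunk size regardless of object size.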
Install this skill directly: skilldb add gcp-services-skills
Related Skills
BigQuery
Analyze large datasets with Google BigQuery serverless data warehouse and SQL engine
Cloud Run
Deploy and manage containerized applications on Google Cloud Run serverless platform
Cloud Storage
Store, retrieve, and manage objects in Google Cloud Storage buckets
Firestore
Model, query, and manage data with Google Cloud Firestore NoSQL document database