# Vercel Blob
You are a storage specialist who integrates Vercel Blob into projects. Vercel Blob is a serverless file storage service integrated with Vercel, providing simple APIs for uploading, serving, and managing files with automatic CDN delivery.
## Core Philosophy

### Zero configuration on Vercel

Vercel Blob works out of the box on Vercel deployments. Add the package, get a token from the dashboard, and start uploading. No bucket configuration, no CORS setup, no CDN provisioning.

### Client uploads with server validation

Vercel Blob supports client-side uploads via a two-step flow: the client requests an upload token from your API, then uploads directly to Blob storage. Your server controls who can upload without proxying the file.
## Setup

### Install

```bash
npm install @vercel/blob
```

### Environment variable

```env
BLOB_READ_WRITE_TOKEN=vercel_blob_...
```
## Key Techniques

### Server upload

```typescript
import { put } from '@vercel/blob';

// Upload a file received by a route handler
export async function POST(req: Request) {
  const formData = await req.formData();
  const file = formData.get('file') as File;

  const blob = await put(file.name, file, {
    access: 'public',
    contentType: file.type,
  });

  return Response.json({ url: blob.url });
}
```
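Pathname collisions are worth thinking about with server uploads: two users uploading `photo.png` would target the same pathname. One common approach, sketched here with an illustrative helper (`buildPathname` is not part of the SDK; `put` also accepts an `addRandomSuffix` option for this), is to namespace uploads per user:

```typescript
// Build a namespaced pathname like "avatars/user-123/photo.png".
// Purely illustrative; the real uniqueness guarantees depend on your
// put() options (e.g. addRandomSuffix) and your app's logic.
function buildPathname(prefix: string, userId: string, filename: string): string {
  // Strip path separators from the user-supplied filename
  const safe = filename.replace(/[/\\]/g, '_');
  return `${prefix}/user-${userId}/${safe}`;
}

console.log(buildPathname('avatars', '123', 'photo.png'));
// → "avatars/user-123/photo.png"
```

The resulting pathname would then be passed as the first argument to `put` in place of the bare `file.name`.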
### Client upload (recommended for large files)

```typescript
// Server: handle upload token generation and the completion callback
// app/api/upload/route.ts
import { handleUpload, type HandleUploadBody } from '@vercel/blob/client';

export async function POST(req: Request) {
  const body = (await req.json()) as HandleUploadBody;

  const response = await handleUpload({
    body,
    request: req,
    onBeforeGenerateToken: async (pathname) => {
      // Auth check goes here
      // Return allowed content types and max size
      return {
        allowedContentTypes: ['image/jpeg', 'image/png', 'image/webp'],
        maximumSizeInBytes: 10 * 1024 * 1024, // 10MB
      };
    },
    onUploadCompleted: async ({ blob }) => {
      // Save to database
      await db.insert(uploads).values({
        url: blob.url,
        pathname: blob.pathname,
      });
    },
  });

  return Response.json(response);
}
```
// Client component
'use client';
import { upload } from '@vercel/blob/client';
function FileUploader() {
const [url, setUrl] = useState<string>();
async function handleUpload(e: React.ChangeEvent<HTMLInputElement>) {
const file = e.target.files?.[0];
if (!file) return;
const blob = await upload(file.name, file, {
access: 'public',
handleUploadUrl: '/api/upload',
});
setUrl(blob.url);
}
return (
<div>
<input type="file" onChange={handleUpload} />
{url && <img src={url} alt="Uploaded" />}
</div>
);
}
### List and delete

```typescript
import { list, del } from '@vercel/blob';

// List blobs under a prefix
const { blobs } = await list({ prefix: 'avatars/' });

// Delete a single blob by URL
await del(blobUrl);

// Delete multiple blobs at once
await del([url1, url2, url3]);
```
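`list` is paginated; when a prefix holds more blobs than one page returns, the response includes a `cursor` for the next page. Below is a sketch of draining every page, with the SDK call stubbed behind a minimal interface so the loop is self-contained (the `Page` type and `listAll` helper are illustrative names, not SDK exports):

```typescript
// The page shape we rely on (a subset of what @vercel/blob's list() returns).
type Page = { blobs: { pathname: string }[]; cursor?: string };

// Drain all pages from any cursor-based list function.
async function listAll(
  listPage: (cursor?: string) => Promise<Page>,
): Promise<{ pathname: string }[]> {
  const all: { pathname: string }[] = [];
  let cursor: string | undefined;
  do {
    const page = await listPage(cursor);
    all.push(...page.blobs);
    cursor = page.cursor; // undefined once the last page is reached
  } while (cursor);
  return all;
}

// Usage with the real SDK (not run here):
//   const blobs = await listAll((cursor) => list({ prefix: 'avatars/', cursor }));
```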
### Copy

```typescript
import { copy } from '@vercel/blob';

const newBlob = await copy(sourceUrl, 'new-path/file.jpg', { access: 'public' });
```
## Best Practices

- Use client uploads for large files; don't proxy through your server
- Use `onBeforeGenerateToken` to validate auth and restrict file types
- Store the blob URL in your database; it's the permanent reference
- Set `access: 'public'` for user-facing assets
- Use `onUploadCompleted` to run post-upload logic (database saves, notifications)
- Use `list` with a prefix for organized file management
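The token-side validation above can be factored into a small pure helper so the same rules are easy to test; `UPLOAD_RULES` and `isAllowedUpload` are illustrative names, not part of the @vercel/blob API:

```typescript
// Illustrative upload policy; adjust types and limits for your app.
const UPLOAD_RULES = {
  allowedContentTypes: ['image/jpeg', 'image/png', 'image/webp'],
  maximumSizeInBytes: 10 * 1024 * 1024, // 10MB
};

// Pure check mirroring what onBeforeGenerateToken enforces server-side.
function isAllowedUpload(contentType: string, sizeInBytes: number): boolean {
  return (
    UPLOAD_RULES.allowedContentTypes.includes(contentType) &&
    sizeInBytes <= UPLOAD_RULES.maximumSizeInBytes
  );
}

console.log(isAllowedUpload('image/png', 1024));       // true
console.log(isAllowedUpload('application/zip', 1024)); // false
```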
## Anti-Patterns

- Proxying large file uploads through API routes; use client uploads instead
- Not validating file types in `onBeforeGenerateToken`; allows any file
- Not setting size limits; allows arbitrarily large uploads
- Storing files without database references; no way to track or clean up
- Using server uploads for user-facing features; client uploads scale better
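To avoid the orphaned-file anti-pattern, delete the blob and its database row together. A sketch with both stores behind minimal interfaces (the `UploadTable` and `BlobStore` shapes are illustrative; with the real SDK, `blobStore.del` would be `del` from `@vercel/blob`):

```typescript
// Minimal shapes for the two stores involved; illustrative only.
type BlobStore = { del: (url: string) => Promise<void> };
type UploadRow = { id: string; url: string };
type UploadTable = {
  find: (id: string) => Promise<UploadRow | undefined>;
  remove: (id: string) => Promise<void>;
};

// Delete a blob and its database reference together so neither is orphaned.
async function deleteUpload(
  id: string,
  table: UploadTable,
  blobStore: BlobStore,
): Promise<boolean> {
  const row = await table.find(id);
  if (!row) return false;
  await blobStore.del(row.url); // remove the file from Blob storage first
  await table.remove(id);       // then drop the database reference
  return true;
}
```

Deleting the blob first means a crash between the two steps leaves a dangling database row (easy to retry) rather than an untracked file.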
Install this skill directly: `skilldb add storage-services-skills`
## Related Skills

- **AWS S3**: Build with AWS S3 for object storage. Use this skill when the project needs to
- **Backblaze B2**: Build with Backblaze B2 for low-cost S3-compatible object storage.
- **Cloudflare R2**: Build with Cloudflare R2 for S3-compatible object storage with zero egress fees.
- **Cloudinary**: Build with Cloudinary for image and video management. Use this skill when the
- **ImageKit**: Build with ImageKit for real-time image optimization and delivery. Use this skill
- **Tigris**: Build with Tigris for globally distributed S3-compatible object storage.