Critical severity · Next.js

Exposed OpenAI API key in your Next.js app

Your OpenAI API key (starts with `sk-...`) is in your client JavaScript. Key-scraping bots find exposed keys within minutes and run expensive models (GPT-4o, o1) on your bill. To fix:

1. Revoke the key in the OpenAI dashboard.
2. Set a hard spending limit on your account.
3. Move every OpenAI call to a server endpoint.
4. Optionally, use ephemeral keys for client-side streaming.

Do not rely on obfuscation: minification does not hide API keys.

The fix for Next.js

Next.js + AI SDK

Use a Route Handler. Consider Vercel AI Gateway to avoid hard-coupling to one provider.

// app/api/chat/route.ts
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    // A plain model string routes through Vercel AI Gateway;
    // the provider key stays in server-side env vars.
    model: 'openai/gpt-4o-mini',
    messages,
  });
  return result.toTextStreamResponse();
}
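With the route handler in place, the browser talks only to your own endpoint. A minimal client-side caller might look like the sketch below; the `streamChat` helper, its `Message` type, and the file path are illustrative, not part of the AI SDK.

```typescript
// chat-client.ts (hypothetical file) — the browser calls /api/chat, never api.openai.com
type Message = { role: 'user' | 'assistant' | 'system'; content: string };

export async function streamChat(
  messages: Message[],
  onChunk: (text: string) => void = () => {},
): Promise<string> {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages }),
  });
  if (!res.ok || !res.body) throw new Error(`chat request failed: ${res.status}`);

  // Read the text stream chunk by chunk so the UI can render tokens as they arrive.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let full = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text);
  }
  return full;
}
```

Note that no API key appears anywhere in this file; the secret lives only in the server environment the route handler runs in.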

Why it matters

OpenAI bills are uncapped by default. A leaked key running GPT-4o can cost $1000+ per day. The time between publication and exploitation is often under an hour.

Confirm the fix worked

Scan your Next.js site to confirm this finding is gone.
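Independent of any scanner, you can also grep your own built client bundles for key-shaped strings. A sketch in Node, assuming Next.js's default `.next/static` build output; the helper names and the regex are mine, so widen the pattern if OpenAI's key format changes.

```typescript
// scan-bundles.ts (hypothetical helper) — search built client JS for key-shaped strings
import { readdirSync, readFileSync, statSync } from 'node:fs';
import { join } from 'node:path';

// Matches common OpenAI key shapes (sk-..., sk-proj-...); an assumption, not exhaustive.
const KEY_PATTERN = /sk-[A-Za-z0-9_-]{20,}/;

export function containsKey(source: string): boolean {
  return KEY_PATTERN.test(source);
}

// Recursively scan a directory of client bundles; returns paths that contain a key.
export function scanDir(dir: string): string[] {
  const hits: string[] = [];
  for (const name of readdirSync(dir)) {
    const path = join(dir, name);
    if (statSync(path).isDirectory()) hits.push(...scanDir(path));
    else if (name.endsWith('.js') && containsKey(readFileSync(path, 'utf8')))
      hits.push(path);
  }
  return hits;
}
```

Usage: run `next build`, then `scanDir('.next/static')` should return an empty array once the fix is in place.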

AI prompt

Apply across your codebase

Paste this into Cursor, Lovable, Bolt, v0, or Claude Code.

My OpenAI API key is exposed in client code. Revoke it immediately in OpenAI dashboard, then move every call to a server-side route handler. Use the Vercel AI SDK with streamText for streaming responses — the browser hits my /api/chat endpoint, which uses OPENAI_API_KEY server-side. Set a spending limit in OpenAI billing to cap exposure.

Frequently asked questions

What about ephemeral keys for client-side streaming?
OpenAI supports `dangerouslyAllowBrowser` for a reason: it is dangerous. If you need the lowest latency, mint short-lived ephemeral credentials server-side via the beta endpoints and hand those to the browser, never the master key.
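A hedged sketch of the server side of that flow, assuming OpenAI's beta Realtime sessions endpoint (`POST /v1/realtime/sessions`), which returns a short-lived `client_secret`. The route path and model name here are illustrative choices, not requirements.

```typescript
// app/api/realtime-token/route.ts (hypothetical path)
// Mints an ephemeral credential server-side; only the short-lived secret reaches the browser.
export async function POST(): Promise<Response> {
  const res = await fetch('https://api.openai.com/v1/realtime/sessions', {
    method: 'POST',
    headers: {
      // The master key is read from server env and never leaves this process.
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ model: 'gpt-4o-realtime-preview' }),
  });
  if (!res.ok) return new Response('token mint failed', { status: 502 });

  const session = await res.json();
  // The browser uses this value directly with the Realtime API; it expires quickly.
  return Response.json({ clientSecret: session.client_secret?.value });
}
```

The browser then authenticates to the Realtime API with `clientSecret`, so even if that value leaks it is worthless minutes later.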