How to fix an exposed OpenAI API key in your frontend
Your OpenAI API key (starts with `sk-...`) is in your client JavaScript. Key-scraping bots find these within minutes and run expensive models (GPT-4o, o1) on your bill. Fix: (1) revoke the key in OpenAI dashboard; (2) set a hard spending limit on your account; (3) move every OpenAI call to a server endpoint; (4) optionally use ephemeral keys for client-side streaming. Do not rely on obfuscation — minification does not hide API keys.
Why it matters
OpenAI bills are uncapped by default. A leaked key running GPT-4o can cost $1000+ per day. The time between publication and exploitation is often under an hour.
How to check
1. Search your bundle for `sk-` followed by 40+ characters.
2. In the OpenAI dashboard → Usage, look for usage spikes you cannot explain.
3. The API keys page shows a 'Last used' timestamp and IP for each key.
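Step 1 can be automated. Here is a minimal sketch of a Node script that walks a build directory and flags OpenAI-style keys — the `dist` path and file-extension filter are assumptions, so point it at your actual build output:

```typescript
// scan-bundle.ts — rough bundle scan for leaked OpenAI-style keys.
import { existsSync, readdirSync, readFileSync, statSync } from "node:fs";
import { join } from "node:path";

// "sk-" followed by a long run of key characters (40+).
const KEY_PATTERN = /sk-[A-Za-z0-9_-]{40,}/g;

function scan(dir: string): string[] {
  const hits: string[] = [];
  for (const entry of readdirSync(dir)) {
    const path = join(dir, entry);
    if (statSync(path).isDirectory()) {
      hits.push(...scan(path)); // recurse into subdirectories
    } else if (/\.(js|mjs|cjs|map|html)$/.test(entry)) {
      const matches = readFileSync(path, "utf8").match(KEY_PATTERN) ?? [];
      // Print only a key prefix — never log the full secret.
      for (const m of matches) hits.push(`${path}: ${m.slice(0, 12)}…`);
    }
  }
  return hits;
}

if (existsSync("dist")) console.log(scan("dist"));
```

Any hit means the key is public: treat it as compromised and revoke it, even if usage looks normal.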
Or let SafeToShip check it for you in 60 seconds:
How to fix it
OpenAI dashboard
Settings → API keys → Revoke. Then Settings → Billing → Limits → set a monthly hard limit.
Next.js + AI SDK
Use a Route Handler. Consider Vercel AI Gateway to avoid hard-coupling to one provider.
```ts
// app/api/chat/route.ts
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: 'openai/gpt-4o-mini', // via AI Gateway
    messages,
  });
  return result.toTextStreamResponse();
}
```
AI prompt
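On the client, the browser only ever talks to your own endpoint — the API key never leaves the server. A minimal sketch of the calling side, assuming the `/api/chat` path from the Route Handler above (adjust if yours differs):

```typescript
// Client-side call: no API key here, just a fetch to your own route.
async function chat(messages: { role: string; content: string }[]): Promise<string> {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages }),
  });
  if (!res.ok) throw new Error(`chat failed: ${res.status}`);

  // Read the text stream incrementally instead of waiting for the full reply.
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}
```

In a real UI you would render each chunk as it arrives rather than accumulating into one string.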
Copy-paste into your AI tool
Paste this prompt into Cursor, Lovable, Bolt, v0, or Claude Code and it will walk through the fix for your specific codebase.
My OpenAI API key is exposed in client code. Revoke it immediately in OpenAI dashboard, then move every call to a server-side route handler. Use the Vercel AI SDK with streamText for streaming responses — the browser hits my /api/chat endpoint, which uses OPENAI_API_KEY server-side. Set a spending limit in OpenAI billing to cap exposure.
FAQ
Frequently asked questions
- What about ephemeral keys for client-side streaming?
- OpenAI supports `dangerouslyAllowBrowser` for a reason — it is dangerous. If you need the lowest latency, mint short-lived ephemeral credentials server-side via the beta endpoints and hand those to the browser, never the master key.
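The ephemeral-key flow can be sketched as a second Route Handler that trades your server-side key for a short-lived client token. The route path, model name, and response shape here are assumptions based on the OpenAI Realtime beta — verify against the current docs before relying on it:

```typescript
// app/api/realtime-token/route.ts — mint a short-lived client credential.
export async function POST() {
  const res = await fetch("https://api.openai.com/v1/realtime/sessions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`, // master key stays server-side
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ model: "gpt-4o-realtime-preview" }),
  });
  const session = await res.json();
  // client_secret.value is an ephemeral "ek_..." token that expires within
  // minutes — far safer to hand to the browser than the master sk- key.
  return Response.json({ token: session.client_secret?.value });
}
```

Even with ephemeral tokens, keep the spending limit in place: a token is still a credential, just one with a much smaller blast radius.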
Related fix guides
Fix these too
Hardcoded API key in JS
Any secret in your client bundle is public. Here is how to find them, rotate them, and move the calls server-side.
Exposed Anthropic key
Claude API keys (sk-ant-...) leaked in client code get drained like any other LLM key. Here is the fix.
Exposed .env file
An exposed .env file is a critical leak — it contains API keys, database URLs, and secrets. Here is why it happens in vibe-coded apps and how to lock it down.
Free tools
Check this yourself
Platform guides
Building on these platforms?
Next.js security
Next.js is the most popular React framework, but even experienced developers miss security headers and accidentally expose server files in production.
Lovable security
Lovable makes it easy to ship fast, but AI-generated backends often ship with open Supabase tables and leaked API keys. Scan your Lovable app before your users find out.
Bolt security
Bolt generates full-stack apps in seconds, but speed can leave security gaps. Exposed environment files and missing CORS configuration are common in Bolt projects.
Cursor security
Cursor helps you write code faster with AI, but AI-assisted code can introduce subtle security issues. Missing headers, exposed files, and insecure cookies slip through easily.
Scan your site for this and 50+ other issues
Free scan. Results in 60 seconds. No account required.