API Key Exposed in Frontend? Secure It with Workers Proxy in 5 Minutes, 100K Free Requests Daily

Introduction
Last month, I built a small tool that called the ChatGPT API. For convenience, I hardcoded the API Key directly in the frontend code. The next morning, I woke up to find my account had been charged over $40 in unauthorized usage.
Checking the dashboard, I saw thousands of API calls made overnight. I panicked and immediately deleted the key and regenerated a new one. But here’s the problem: that key was already exposed in the frontend code. Even after changing it, the same issue would happen again.
You might say, “Just use environment variables!” To be honest, I thought the same thing at first. But I later discovered that Vite or Webpack environment variables still get bundled into your JS files. Users can open their browser’s Network panel and see the complete requests, including your API Key.
So what’s the solution? The traditional approach is to set up a backend server as a proxy to hide the API Key server-side. But this brings new problems: servers cost money (at least a few dollars a month for the cheapest options), you need to configure the environment, handle SSL certificates, deal with CORS… It’s way too heavy for personal projects.
That’s when I discovered Cloudflare Workers. I deployed an API proxy in 5 minutes. The API Key stays in server-side environment variables where the frontend can never access it. Best part? It’s completely free with 100,000 daily requests, which is more than enough for personal projects. And it solves CORS issues too.
In this article, I’ll walk you through the setup step by step. All the code can be copied and used directly. If you’re worried about API Key security, this solution will definitely help.
Why You Can’t Put API Keys in Frontend
Frontend Code is Completely Transparent
Many people think using .env files or Vite’s import.meta.env is secure. But that’s just a development convenience tool. After build, all environment variables get hardcoded into your JS files.
Don’t believe me? Try this: open any frontend project in production, press F12 to open developer tools, switch to the Network tab, and refresh. You’ll see all API requests, including headers, request bodies, and URL parameters—everything exposed.
Even if you obfuscate or minify your code, it just makes it harder to read. But API calls must send the real Key, which can’t be obfuscated. Some people think about encryption. The problem? Both encryption and decryption code are in the frontend, so users can still see everything.
Put simply, frontend code runs in users’ browsers. Anything you can do, users can do. There’s no way around this.
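To make this concrete, here's a rough simulation of what a bundler's build-time replacement step does. This is an illustrative sketch, not Vite's actual implementation — the regex and the example key are made up:

```javascript
// Simplified sketch of a bundler's define-replacement (illustrative only,
// not Vite's real code): every import.meta.env.* reference in your source
// becomes a plain string literal in the shipped bundle.
const sourceCode =
  'fetch(url, { headers: { Authorization: "Bearer " + import.meta.env.VITE_API_KEY } })';

const envFile = { VITE_API_KEY: 'sk-example-not-a-real-key' }; // from your .env

// Stand-in for the bundler's static text replacement
const bundled = sourceCode.replace(
  /import\.meta\.env\.(\w+)/g,
  (_, name) => JSON.stringify(envFile[name])
);

console.log(bundled);
// The "secret" now sits in dist/*.js as a readable string literal.
```

Run the built bundle through a search for your key's prefix and you'll find it verbatim — that's all an attacker needs to do.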
The Cost of Key Leaks
I used to think, “Who would bother scraping my code to steal an API Key?” Turns out, there are people who specialize in this on the internet.
There are automated tools on GitHub that scan projects specifically looking for exposed API Keys. Once found, people either use them to abuse APIs or sell them to others. OpenAI’s API is billed per token, so getting charged a few hundred dollars overnight is normal—serious cases can reach thousands.
I saw a discussion in a developer forum where someone made an AI chat page with the key in the frontend. After someone scraped it and made massive calls, the monthly bill shot up to over $2,000. While they successfully appealed it later, the process was truly frustrating.
And it’s not just OpenAI. Google Maps API, weather APIs, translation APIs—anything with usage-based billing faces the risk of unauthorized charges.
Problems with Traditional Solutions
After learning about the risks, I started researching solutions. Online suggestions basically boil down to: “Set up a backend server as a proxy.”
Sounds simple, but in reality:
- Cost issue: The cheapest cloud servers (like Alibaba Cloud or Tencent Cloud lightweight servers) still cost $7-15/month. While not expensive, it’s painful for small personal projects.
- Complex configuration: You need to install Node.js or other runtimes, configure Nginx as a reverse proxy, apply for SSL certificates for HTTPS, and handle CORS configuration… Just understanding all this takes half a day.
- Maintenance burden: Servers need regular updates, monitoring, and restarts if they go down. Small projects really can’t handle this hassle.
China-based serverless platforms (like Alibaba Cloud Function Compute or Tencent Cloud Functions) can save money, but the configuration is even more complex, cold starts are slow, and the documentation isn't user-friendly. I tried several times without success.
API gateways sound professional, but those are enterprise-grade products. The barrier is too high for individual developers.
Advantages of Cloudflare Workers
Completely Free with Powerful Performance
Cloudflare Workers’ free tier offers 100,000 requests per day. For personal projects, this is really more than enough. My little tool only makes a few hundred calls a day—the free tier is plenty.
Plus, Cloudflare has over 200 data centers worldwide. Your code automatically deploys to these nodes. When users access it, they’re automatically routed to the nearest node with very fast response times. Unlike traditional servers where if you buy in one region, other regions will be slow.
Another important point: Workers don’t have cold start issues. Serverless functions (like AWS Lambda) can have several seconds of startup delay if they haven’t been called in a while. Workers respond in milliseconds, similar to traditional servers.
Deployment is Incredibly Simple
The first time I used Workers, from registration to deployment took 5 minutes. I’m not exaggerating.
No need to configure server environments, install Node.js or Nginx, or even apply for SSL certificates (Workers give you HTTPS automatically). All you need to do is:
- Write code (one JS file, done in a few dozen lines)
- Run one command: wrangler deploy (formerly wrangler publish)
- Done
After deployment, Cloudflare gives you a domain like your-worker.your-subdomain.workers.dev that you can use directly. If you want to use your own domain, just bind it in the console—no extra configuration needed.
Compare that to traditional approaches: buy server → configure environment → write code → configure Nginx → apply for certificate → deploy → test… Just looking at the workflow is overwhelming.
Naturally Solves CORS Issues
Frontend calls to third-party APIs often encounter CORS errors, like: Access to fetch at 'xxx' from origin 'yyy' has been blocked by CORS policy.
This is due to browser security restrictions blocking requests between different domains. The traditional solution is to have the API provider add CORS configuration in response headers, but you can’t change third-party API configurations.
Workers as a proxy solves this perfectly:
- Frontend calls your own Workers address (like https://api.yourdomain.com)
- Workers calls the third-party API
- Workers adds CORS headers when returning the response
To the browser, you’re calling a same-origin interface—no CORS issue. To the third-party API, the caller is the Workers server—also no CORS issue.
When I was calling the Amap API, I kept getting CORS errors. After using a Workers proxy, I fixed it with two lines of code.
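The fix really is that small. Here's a minimal sketch of the idea — the helper name is mine, and the headers are the standard CORS response headers (this runs anywhere `Response` is available, e.g. Workers or Node 18+):

```javascript
// Minimal sketch: wrap whatever the upstream API returned and add the
// CORS headers the browser needs to accept a cross-origin response.
function withCors(body, status = 200) {
  return new Response(body, {
    status,
    headers: {
      'Access-Control-Allow-Origin': '*',             // who may read the response
      'Access-Control-Allow-Headers': 'Content-Type', // which request headers are allowed
    },
  });
}

const res = withCors(JSON.stringify({ ok: true }));
console.log(res.headers.get('Access-Control-Allow-Origin')); // '*'
```

Since the headers are set on *your* response, it doesn't matter that the third-party API never sends them.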
Hands-On: Building Your First API Proxy
Alright, enough about the advantages. Let me walk you through the setup step by step. I’ll use proxying the OpenAI API as an example—other APIs work similarly.
Step 1: Environment Preparation
1. Register a Cloudflare Account
Go to cloudflare.com and register an account—the free tier works. Registration is simple, just verify your email.
2. Install Wrangler CLI
Wrangler is Cloudflare’s official command-line tool for creating and deploying Workers.
```shell
npm install -g wrangler
```

If you don't have Node.js installed, download and install it from nodejs.org first.
3. Login Authorization
```shell
wrangler login
```

Running this command will open your browser for authorization. Just click agree, and the CLI can then manage your Workers.
Step 2: Create a Worker Project
```shell
wrangler init openai-proxy
```

Running this command will ask you several questions:
- “Would you like to use TypeScript?” → Choose No (unless you’re familiar with TS)
- “Would you like to create a new Worker?” → Choose Yes
- “Would you like to install dependencies?” → Choose Yes
This will generate a project folder with a structure like:

```
openai-proxy/
├── src/
│   └── index.js      # Your code goes here
├── wrangler.toml     # Configuration file
└── package.json
```

Step 3: Write Proxy Code
Open src/index.js, delete the default code, and replace it with this:
```js
export default {
  async fetch(request, env) {
    // Only allow POST requests
    if (request.method !== 'POST') {
      return new Response('Method not allowed', { status: 405 });
    }

    // Read the OpenAI API Key from environment variables
    const apiKey = env.OPENAI_API_KEY;
    if (!apiKey) {
      return new Response('API Key not configured', { status: 500 });
    }

    try {
      // Get the request body from the frontend
      const body = await request.json();

      // Call the real OpenAI API
      const response = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Authorization': `Bearer ${apiKey}`, // Using the server-side Key
        },
        body: JSON.stringify(body),
      });

      // Get the response data
      const data = await response.json();

      // Return it to the frontend with CORS headers
      return new Response(JSON.stringify(data), {
        status: response.status,
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': '*', // Allow all domains
          'Access-Control-Allow-Methods': 'POST',
          'Access-Control-Allow-Headers': 'Content-Type',
        },
      });
    } catch (error) {
      return new Response(JSON.stringify({ error: error.message }), {
        status: 500,
        headers: { 'Content-Type': 'application/json' },
      });
    }
  },
};
```

The code is straightforward:
- Receive POST request from frontend
- Read the real API Key from environment variables (frontend can never see this)
- Use this Key to call OpenAI API
- Return results to frontend while adding CORS headers to solve cross-origin issues
Step 4: Configure Secrets (Store API Key)
This is the most critical step. API Keys can’t be written in code—use Cloudflare’s Secrets feature for encrypted storage.
Run this command:
```shell
wrangler secret put OPENAI_API_KEY
```

After hitting enter, it will prompt you to input the Key value. Paste your OpenAI API Key and hit enter.
This Key will be encrypted and saved to Cloudflare’s servers. You can’t see the plaintext even in the console. In code, you can read it via env.OPENAI_API_KEY.
What about local development?
Create a .dev.vars file in your project root:

```
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx
```

This file is only for local development. Never commit it to Git. Add a line to .gitignore:

```
.dev.vars
```

Step 5: Local Testing
Run this in your project directory:
```shell
wrangler dev
```

This starts a local server at http://localhost:8787 by default. You can test it with Postman or frontend code:
```js
fetch('http://localhost:8787', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
})
  .then(res => res.json())
  .then(data => console.log(data));
```

If it returns a proper OpenAI response, your proxy is working correctly.
Step 6: Deploy to Production
Once testing is good, deploy with one command:

```shell
wrangler deploy
```

(On older versions of Wrangler, this command was wrangler publish.) It takes just a few seconds. After deployment, Cloudflare gives you an address like:

```
https://openai-proxy.your-subdomain.workers.dev
```

Change your frontend API address to this, and you're done! The API Key won't be exposed to the frontend at all.
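On the frontend side, the only thing that changes is the URL. A sketch — the Worker URL is a placeholder for whatever your own deploy prints; note there's no Authorization header anywhere in client code:

```javascript
// Sketch: point the frontend at the Worker instead of api.openai.com.
// WORKER_URL is a placeholder — use the address from your own deploy.
const WORKER_URL = 'https://openai-proxy.your-subdomain.workers.dev';

const options = {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' }, // no Authorization header at all
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
};

// fetch(WORKER_URL, options).then(res => res.json()).then(console.log);
console.log('Authorization' in options.headers); // false — the key lives only in the Worker
```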
Want to use your own domain? In the Cloudflare console Workers page, click your Worker → Settings → Triggers → Add Custom Domain, enter your domain (like api.yourdomain.com), and follow the prompts to configure DNS.
Advanced Techniques and Best Practices
The above code works, but there are some optimizations worth exploring. Let me share a few advanced tips.
Prevent Proxy Abuse
Right now your Worker is public—anyone who knows the address can call it. If someone maliciously spams requests, your free tier might run out quickly, or worse, incur charges.
Simple Token Validation
You can add a simple validation mechanism:
```js
export default {
  async fetch(request, env) {
    // Validate the token in the request headers
    const token = request.headers.get('X-API-Token');
    if (token !== env.MY_SECRET_TOKEN) {
      return new Response('Unauthorized', { status: 401 });
    }

    // ... rest of proxy logic
  },
};
```

Then use wrangler secret put MY_SECRET_TOKEN to set a secret. When calling from the frontend, include this token:
```js
fetch('https://your-worker.workers.dev', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-API-Token': 'your-secret-token', // Can also be stored in frontend env variables
  },
  body: JSON.stringify(data),
});
```

While this token is still visible in the frontend, it at least raises the barrier. You can periodically rotate tokens or assign different tokens to different users.
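Rotation is easy to support too. Here's a sketch — the TOKEN_CURRENT and TOKEN_PREVIOUS names are hypothetical secrets you'd set with wrangler secret put, not anything built into Workers:

```javascript
// Sketch of multi-token validation (hypothetical secret names): keep the
// previous token valid during a rotation so existing clients don't break.
function isAuthorized(token, env) {
  const validTokens = new Set(
    [env.TOKEN_CURRENT, env.TOKEN_PREVIOUS].filter(Boolean) // ignore unset secrets
  );
  return validTokens.has(token);
}

// Example with dummy values standing in for Workers secrets:
const env = { TOKEN_CURRENT: 'token-v2', TOKEN_PREVIOUS: 'token-v1' };
console.log(isAuthorized('token-v1', env));     // true — old clients still work
console.log(isAuthorized('stolen-guess', env)); // false
```

Once all clients have picked up the new token, delete the old secret and the previous token stops working.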
Origin Whitelist
If your app only runs under specific domains, you can restrict origins:
```js
const allowedOrigins = ['https://yourdomain.com', 'http://localhost:3000'];
const origin = request.headers.get('Origin');
if (!allowedOrigins.includes(origin)) {
  return new Response('Forbidden', { status: 403 });
}
```

Support Multiple APIs
If you need to proxy multiple different APIs, you can differentiate by path:
```js
export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    // Forward to different APIs based on the path
    if (url.pathname.startsWith('/openai')) {
      return proxyOpenAI(request, env);
    } else if (url.pathname.startsWith('/maps')) {
      return proxyMaps(request, env);
    } else {
      return new Response('Not found', { status: 404 });
    }
  },
};

async function proxyOpenAI(request, env) {
  // OpenAI proxy logic
}

async function proxyMaps(request, env) {
  // Maps API proxy logic
}
```

This way, one Worker can conveniently handle multiple APIs.
Handle OPTIONS Requests (Complete CORS Support)
The previous code only handles POST requests. But browsers send an OPTIONS preflight request before cross-origin requests. Complete CORS handling should look like this:
```js
export default {
  async fetch(request, env) {
    // Handle CORS preflight requests
    if (request.method === 'OPTIONS') {
      return new Response(null, {
        headers: {
          'Access-Control-Allow-Origin': '*',
          'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
          'Access-Control-Allow-Headers': 'Content-Type, X-API-Token',
          'Access-Control-Max-Age': '86400',
        },
      });
    }

    // ... normal request handling
  },
};
```

Monitoring and Debugging
The Cloudflare console lets you view Worker performance:
- Request count
- Error rate
- Response time
For real-time logs, use the wrangler tail command:

```shell
wrangler tail
```

This outputs logs for all requests, useful for debugging. Any console.log() output in your code will also appear here.
FAQ
What if I run out of free tier?
100,000 requests per day is really enough for personal projects. I’ve used it for several months and never exceeded the free tier.
If it’s truly not enough, the paid tier is also inexpensive—$5/month gets you 10 million requests. Compared to buying a server, this price is already very reasonable.
Is Workers stable? Will it suddenly go down?
Cloudflare is one of the world’s largest CDN providers with very reliable infrastructure. I’ve used it for over half a year without encountering any service unavailability.
They have a 99.99% SLA guarantee. The probability of issues is much lower than running your own server.
Can I use my own domain?
Yes. Just bind a custom domain in the Cloudflare console and configure DNS. The whole process takes 5 minutes, no need for extra certificate configuration (automatic HTTPS).
How’s the access speed in China?
Cloudflare has nodes in China, so speed is decent. I’ve tested it—response times are generally 100-300ms, much faster than directly calling foreign APIs.
But it’s definitely not as good as services specifically optimized for China. If speed is critical, consider using domestic Serverless platforms (like Alibaba Cloud Function Compute). Though the configuration is more complex.
Does it support languages other than JavaScript?
Workers mainly support JavaScript and TypeScript. If you prefer other languages, you can consider compiling to WebAssembly, but that has a higher barrier to entry.
For simple scenarios like API proxying, JavaScript is perfectly adequate.
Is this really secure?
As long as you don’t return the API Key in responses to the frontend, it’s secure. Secrets are stored encrypted—you can’t see plaintext even in the console.
Of course, you still need proper access controls to prevent proxy abuse. The token validation and IP whitelisting mentioned above can both help.
Conclusion
API Key leakage troubled me for quite a while. At first, I thought there was no good solution—either accept the risk or pay for a server.
Not until I discovered Cloudflare Workers did I realize it could be this simple. Deploy in 5 minutes, completely free, secure API Key storage, CORS issues solved—all these advantages combined make Workers nearly the perfect solution for personal projects.
If you’re also building projects that need to call third-party APIs, I strongly recommend trying this method. It’s really not difficult. Follow the steps above and you’ll definitely get it done in half an hour.
I’ve written the code in detail—just copy and modify as needed. If you have questions, check the Cloudflare official documentation or leave comments. I’ll reply when I see them.
One last reminder: After deployment, remember to implement access controls so your proxy doesn’t get abused. Adding token validation or restricting origin domains are both simple yet effective measures.
Go register a Cloudflare account and give it a try now!
Published on: Dec 1, 2025 · Modified on: Dec 4, 2025