1 in 9 Apps Ship Supabase Keys: Here’s How It Happens

Better Stack

Transcript

00:00:00A report that came out this month scanned about 20,000 indie apps and found that one in nine
00:00:04were exposing Supabase credentials in front-end code.
00:00:08This is not in a server log or a private repo.
00:00:11This is just in the JavaScript every visitor downloads.
00:00:14And the thing is, these apps weren't hacked, they just shipped the secret by accident.
00:00:18To be clear, some keys are meant to be public.
00:00:21The problem is the pipeline, because the same mistake can ship a key that should never be
00:00:25in the browser in the first place.
00:00:27We have videos coming out all the time.
00:00:28Be sure to subscribe.
00:00:35Here's the simple truth that trips some people up.
00:00:38We all kind of know it, but it's easy to miss when you're moving fast.
00:00:41When you mark an environment variable as public, your build tool treats it like it belongs in
00:00:46the browser and it gets added to the client bundle.
00:00:50Next.js does this with NEXT_PUBLIC_, Vite does this with VITE_, and SvelteKit does this using
00:00:56PUBLIC_.
00:00:57That prefix is not a safety label of any kind; that's literally the shipping label.
00:01:01Now we know that, but here's the Supabase part.
00:01:04Some keys are meant to be public, like anon or publishable key, and some are private, like
00:01:10service role or secret keys.
00:01:12If a private key ends up in the browser, well, I think you can guess what happens.
00:01:17Row Level Security won't save you in that case.
00:01:20Supabase has clear docs that say service role keys can bypass RLS.
00:01:24So if that key is in your front end, all your policy work doesn't really matter anymore.
00:01:29Now let me show you exactly how this happened using my own sandbox.
00:01:33First, I built out a simple CRUD app using Next.js and I linked my Supabase project into it.
00:01:38Here's my .env file, and here's the mistake.
00:01:41I put a key behind a public prefixed variable.
00:01:45This key is publishable, so it's expected to be visible.
00:01:48The real problem is that the exact same Next.public path can accidentally ship a private key too.
00:01:53Then I ran npm run build to build it out and started my app.
00:01:59Here I am in Chrome.
00:02:00Let me just add some real quick info to our database, our CRUD app.
00:02:05Okay, now I can open the compiled JavaScript bundle and I'm going to search for it.
00:02:10There it is.
00:02:12The URL and the key are literally sitting inside the file your users just pulled into their
00:02:18browser.
00:02:19And notice what did not happen.
00:02:21Nobody broke into this.
00:02:22I found this, right?
00:02:24Nobody exploited anything.
00:02:25This is just reading what the app already shipped to the public internet.
00:02:29If you can see it, anyone can see it.
00:02:32Open dev tools, look at the JavaScript files and search for it.
00:02:35That's all you got to do.
00:02:36So if your plan is nobody's going to look, well, the internet is full of people and bots
00:02:41whose job is literally to look for these kinds of things.
00:02:44So here's the fix that actually works.
00:02:47The browser should only call your API.
00:02:50Your API runs server side.
00:02:52That's where private keys live.
00:02:54Move the private operation into an API route or server function.
00:02:58The client calls your endpoint, your endpoint calls Supabase, then rebuild and recheck
00:03:03in the bundle.
00:03:04If the key is gone from the bundle, you actually fixed it, but don't stop there and call
00:03:09it quits, because there are other things you should do.
00:03:11Make sure row level security is enabled for user facing tables and make sure your policies
00:03:16do what you think they do.
00:03:18Spend some time testing too.
00:03:19I think this gets brushed over some.
00:03:21Now the part that keeps this from coming back, most people fix this once, then reintroduce
00:03:26it later during a rush.
00:03:28So add some guardrails. Start with secret scanning in CI so builds fail if a key shows
00:03:34up where it shouldn't.
00:03:36Then a PR rule that anything with NEXT_PUBLIC_ or VITE_ is treated as public by default, because
00:03:41it is.
00:03:42Finally, add some rotation.
00:03:43If you even have a slight hint that keys were exposed, just rotate them.
00:03:47That's better than seeing how it plays out over a few days.
00:03:50Here's what you can try right now.
00:03:52Build your app the way you ship it.
00:03:55Search the output for supabase, JWT, service, secret, and anything that looks like a token.
00:04:01If you find anything private, assume it's compromised because you found it.
00:04:05Rotate it and then change up your logic server side.
00:04:08If you remember one line from this video, make it this.
00:04:11If it's in the bundle, it's public.
00:04:13We'll see you in another video.

Key Takeaway

Any environment variable prefixed for client-side use is publicly accessible, making it critical to keep private keys on the server and use automated scanning to prevent accidental exposure.

Highlights

Approximately 1 in 9 indie apps scanned were found to be exposing private Supabase credentials in their front-end code.

Environment variable prefixes like "NEXT_PUBLIC" or "VITE_" are not security labels but shipping labels that bundle secrets into the browser.

Private keys such as the service role key bypass Row Level Security, so leaking one to the browser invalidates all policy work.

Timeline

The Scope of the Exposure Crisis

A recent report revealed that nearly 11% of indie apps are inadvertently leaking their Supabase credentials directly within their front-end JavaScript. This vulnerability is not the result of a database hack or a private repository leak, but rather a mistake in how code is bundled for the browser. The speaker emphasizes that while some keys are intended to be public, the current development pipeline makes it far too easy to ship private secrets by accident. This section establishes the gravity of the situation by highlighting that these secrets are visible to any visitor who downloads the site's assets. It sets the stage for a deeper look into the technical triggers of these leaks.

Understanding Public Prefixes and Build Tools

The root cause of these leaks often lies in how modern frameworks like Next.js, Vite, and SvelteKit handle environment variables. When a developer prefixes a variable with names like "NEXT_PUBLIC", the build tool is instructed to include that value in the client-side bundle. The speaker warns that these prefixes are "shipping labels" rather than safety labels, meaning they guarantee exposure to the browser. This becomes catastrophic when private keys, such as the Supabase service role key, are labeled this way because they are designed to bypass Row Level Security (RLS). Consequently, all security policies meant to protect user data become irrelevant if a private key is leaked to the front end.
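As a sketch, a Next.js-style .env file with both kinds of keys might look like this (all variable names and values here are illustrative assumptions, not taken from the video):

```shell
# Public by construction: the NEXT_PUBLIC_ prefix tells the bundler to
# inline these values into the client JavaScript. That is a shipping
# label, not a safety label.
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=...   # designed to be public; RLS still applies

# Server-only: no public prefix, so the bundler never inlines it.
# This key bypasses RLS and must never appear in the browser.
SUPABASE_SERVICE_ROLE_KEY=...
```

The prefix, not the variable's contents, is what decides whether a value ships to the browser.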

Demonstrating the Leak in a Live Sandbox

To illustrate the danger, the speaker demonstrates a simple CRUD application where a private key was accidentally placed behind a public prefix in the .env file. After running a standard production build, the speaker opens the Chrome developer tools and searches the compiled JavaScript bundle for the secret key. The demonstration proves that no hacking skills are required; the key is sitting in plain text for anyone or any bot to find. This section highlights the reality that the internet is constantly being crawled by automated systems looking for exactly these types of vulnerabilities. If a developer can see the key in their own browser's dev tools, it must be considered public information.

Architectural Fixes and Security Guardrails

The definitive solution for protecting private keys is to move sensitive logic into server-side API routes or server functions. By ensuring the browser only communicates with your own API rather than directly with the database using a secret key, the credentials never leave the server. Once the logic is moved, developers should rebuild their apps and verify that the keys no longer appear in the bundle. Additionally, the speaker recommends implementing long-term guardrails such as CI/CD secret scanning and mandatory peer reviews for any changes to public environment variables. If there is even a suspicion that a key has been exposed, the immediate best practice is to rotate the key rather than waiting to see if it gets exploited.

Final Testing and the Golden Rule of Bundling

In the concluding segment, the speaker provides a concrete checklist for developers to try on their own applications immediately. Users are encouraged to build their apps in production mode and perform a manual search for keywords like "JWT" or "service_secret" within the output files. If any private tokens are found, the speaker advises assuming they are already compromised and moving to rotate them immediately. The core philosophy of the video is summarized in one memorable line: "If it's in the bundle, it's public." This final reminder serves as a warning to always treat client-side code as a completely transparent environment.
