00:00:00Millions of developers have built on Vercel's front-end cloud for over a decade.
00:00:04We've all been using Vercel to deploy everything from personal projects to enterprise software.
00:00:10And when you're building agents and AI features to reduce friction for your users or your team,
00:00:15the last thing you want is something complicated to maintain in production.
00:00:21Vercel's self-driving infrastructure makes DevOps as easy as deployment.
00:00:25You generate code and build agents, then Vercel autonomously optimizes for global-scale performance,
00:00:31provisions compute, orchestrates AI workflows, and even investigates anomalies.
00:00:38All with no configuration required.
00:00:40Now let me show you what I mean with a speed run.
00:00:43Let's take on the challenge of migrating an existing web app.
00:00:46Now this is a demo app of a hiring platform for a midsize coffee company that has a roastery and cafes.
00:00:53It's a simple app with a front-end, a database to store job applications and resumes,
00:00:58and a back-end service that summarizes cover letters for faster application review.
00:01:03Now it's on an ancient hosting provider, I'm not going to tell you which one,
00:01:07but we definitely want to modernize it, so let's do that.
00:01:10We'll migrate to Vercel, which is going to speed up feature development, deployment, and collaboration.
00:01:17Now let's import this app into Vercel.
00:01:19I'll start with the back-end, which is a basic FastAPI Python service.
00:01:24I'll start by clicking Add New.
00:01:26I've already connected my Git account and I can see that my repositories are here.
00:01:31I'll click Import on the API repo.
00:01:33The import is fully configurable, including build and output configs,
00:01:38but Vercel supports FastAPI and other back-end frameworks out of the box.
00:01:44So I can just click Deploy and it's going to work.
00:01:46And within a few seconds, my Python back-end service is deployed and running on Vercel.
00:01:52Now Vercel automatically generated a URL for this project.
00:01:55I'm going to copy it so that I can add it as an environment variable for the next app.
00:02:01I'll add another new project, click Import on the next app,
00:02:05and I'm going to import my environment variables for this app so that it can talk to Supabase.
00:02:11Great, now my app can talk to Supabase, and for the FastAPI variable,
00:02:15I'll add the URL I just copied, and I'll deploy the app.
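The env-var wiring just described can be sketched like this. The variable name `FASTAPI_URL` and the `/summarize` endpoint path are assumptions for illustration, not the demo app's actual names:

```typescript
// Hypothetical sketch: the frontend reads the backend's URL from an
// environment variable (the name FASTAPI_URL is assumed) and calls a
// summarize endpoint (path also assumed) on the deployed FastAPI service.

function summaryEndpoint(baseUrl: string, applicationId: string): string {
  // Resolve the endpoint path against the deployed backend URL.
  return new URL(`/summarize/${encodeURIComponent(applicationId)}`, baseUrl).toString();
}

async function fetchSummary(applicationId: string): Promise<string> {
  const baseUrl = process.env.FASTAPI_URL; // assumed variable name
  if (!baseUrl) throw new Error("FASTAPI_URL is not set");
  const res = await fetch(summaryEndpoint(baseUrl, applicationId));
  if (!res.ok) throw new Error(`Backend responded with ${res.status}`);
  return res.text();
}
```

Keeping the backend URL in an environment variable means the preview and production deployments can each point at their own backend without code changes.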
00:02:20Vercel is building and deploying the project,
00:02:23and this will run on the same infrastructure that powers millions of websites and apps.
00:02:27Now Vercel's 126 global points of presence automatically locate content close to the user,
00:02:33which means it will always be super fast.
00:02:37And here we can see that the app has deployed successfully.
00:02:40Let's take a look at it.
00:02:42I can see the jobs, and I can see in the dashboard the back-end with our Python summary feature.
00:02:49Now once this is in production, I'll be able to monitor the entire app.
00:02:53But for now, I'll jump into another project where we can see some live data.
00:02:57I can get the full picture with web analytics, speed insights, logs, and a queryable observability dashboard.
00:03:04For example, here are the analytics for our website, skills.sh, where people go to find agent skills.
00:03:11I can find detailed traffic and referral information, including referral sites.
00:03:15If I click over here on speed insights, it shows me Core Web Vitals.
00:03:19This way I can diagnose and fix any speed or load issues.
00:03:23And if I click on observability, I get the dashboard, which enables me to explore and query any diagnostic from my app,
00:03:29from logs to functions to edge requests to data transfer.
00:03:34All right, now I want to make the summarization feature more useful to hiring managers.
00:03:38Currently, FastAPI just summarizes the cover letter with a Python library.
00:03:43But we can build a simple agent to analyze both the cover letter and the PDF resume,
00:03:48compare them to the job description, make an initial evaluation of the candidate,
00:03:52and generate an email for the hiring manager to send.
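The four steps just listed can be sketched as a typed pipeline. Everything here is a conceptual stand-in: the type names are hypothetical, and the model calls are stubbed with trivial string logic so the shape of the agent is visible:

```typescript
// Conceptual sketch of the agent's steps (all names hypothetical).
// In the real app each step would call a model; here they are stubbed.

interface Candidate {
  coverLetter: string;
  resumeText: string; // text extracted from the PDF resume
}

interface Evaluation {
  summary: string;
  recommendation: "advance" | "reject";
  email: string;
}

function evaluateCandidate(candidate: Candidate, jobDescription: string): Evaluation {
  // Step 1: summarize the cover letter and resume (stubbed as truncation).
  const summary =
    `Cover letter: ${candidate.coverLetter.slice(0, 80)}... ` +
    `Resume: ${candidate.resumeText.slice(0, 80)}...`;
  // Step 2: compare against the job description (stubbed as keyword overlap).
  const matches = jobDescription
    .split(/\s+/)
    .filter((word) => word.length > 4 && candidate.resumeText.includes(word));
  // Step 3: make an initial recommendation.
  const recommendation: Evaluation["recommendation"] = matches.length > 0 ? "advance" : "reject";
  // Step 4: draft an email for the hiring manager to send.
  const email =
    recommendation === "advance"
      ? "We'd like to schedule an interview."
      : "Thank you for applying; we won't be moving forward.";
  return { summary, recommendation, email };
}
```

The value of splitting the agent into discrete steps is that each one can later be swapped for a real model call, or retried independently, without touching the others.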
00:03:56I already have the repo cloned locally, so I'll jump into the directory and install the tools that I need.
00:04:01Now I'm going to use the Vercel CLI to connect my local project to Vercel.
00:04:06The CLI makes it easy for me to control the Vercel platform right from my command line.
00:04:11I'll start with Vercel link.
00:04:14Yeah, I'm going to link this local project.
00:04:16It'll be the Vercel demo org and the coffee shop jobs project.
00:04:21I already found it.
00:04:22And we'll pull the environment variables.
00:04:24There we go.
00:04:25I can also run the app locally with Vercel dev, which replicates the Vercel deployment environment locally.
00:04:31Try that, Vercel dev.
00:04:34Start up the server, let's just go ahead and test it.
00:04:37And there we go.
00:04:38Pretty cool.
00:04:40Now let's install the AI tools we'll need to build the agent.
00:04:43Vercel gives me all of the AI tools I need to build features and agents.
00:04:47First, I need to integrate an actual AI model for summarization.
00:04:51I can use hundreds of different models through the Vercel AI gateway,
00:04:55and running Vercel dev gives me automatic access with OIDC tokens, which is pretty cool.
00:05:00Next, I'll install the AI SDK, which gives me a full set of AI primitives for things like summarizing text.
00:05:07Let's do pnpm install ai.
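With the AI SDK installed, a summarization call might look roughly like this. The details are hedged: the `generateText` options shape and the gateway model id `"openai/gpt-4o-mini"` are assumptions, so the generate function is injected rather than imported directly:

```typescript
// Sketch of summarizing with the AI SDK (API details assumed, not
// verified against the current "ai" package).

function buildPrompt(coverLetter: string): string {
  return `Summarize this cover letter in two sentences for a hiring manager:\n\n${coverLetter}`;
}

// In the real app this would be generateText from the "ai" package;
// it is passed in here so the sketch stays self-contained.
type GenerateFn = (opts: { model: string; prompt: string }) => Promise<{ text: string }>;

async function summarizeCoverLetter(coverLetter: string, generate: GenerateFn): Promise<string> {
  const { text } = await generate({
    model: "openai/gpt-4o-mini", // assumed AI Gateway model id
    prompt: buildPrompt(coverLetter),
  });
  return text;
}
```

Because the model is addressed by a provider/model string rather than a hard-coded SDK client, swapping models through the gateway is a one-line change.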
00:05:12Great.
00:05:13After that, I'm going to install the workflow dev kit, which will durably orchestrate each step the agent takes.
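"Durably orchestrate" is worth unpacking. This is a conceptual illustration only, not the Workflow DevKit's actual API: each step's result is checkpointed as it completes, and a failed step is retried without re-running the steps that already succeeded:

```typescript
// Conceptual sketch of durable step execution (not the Workflow
// DevKit's real API): checkpoint each result, retry only failed steps.

type Step<T> = { name: string; run: () => T };

function runDurably<T>(steps: Step<T>[], checkpoints: Map<string, T>, maxRetries = 2): T[] {
  const results: T[] = [];
  for (const step of steps) {
    // Skip steps whose results were already checkpointed on a prior run.
    if (checkpoints.has(step.name)) {
      results.push(checkpoints.get(step.name)!);
      continue;
    }
    let lastError: unknown;
    for (let attempt = 0; attempt <= maxRetries; attempt++) {
      try {
        const result = step.run();
        checkpoints.set(step.name, result); // persist before moving on
        results.push(result);
        lastError = undefined;
        break;
      } catch (err) {
        lastError = err;
      }
    }
    if (lastError !== undefined) throw lastError;
  }
  return results;
}
```

For an agent whose steps are slow, paid model calls, this matters: a transient failure on the final email-drafting step shouldn't force the summarization steps to run (and bill) again.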
00:05:21If my app was generating code, I could also use the Vercel sandbox to run it safely.
00:05:26But this is a simple summarization agent, so I don't need sandbox for it.
00:05:31All right, last but not least, the front end is Next.js.
00:05:34So I'll add the React best practices skill to my project to ensure the front end is clean and fast.
00:05:40I'm going to use skills.sh.
00:05:43Head on over here, search for React best practices.
00:05:47I'm going to copy the command to install it.
00:05:50And there we go.
00:05:53Let's make sure to install it for Claude Code and a few other things.
00:05:56Globally, I like the symlinked version, and it's installed.
00:06:01Now let's build out the agent.
00:06:02It's pretty simple, so I'm just going to prompt Claude Code.
00:06:06This is the prompt that I started with.
00:06:08It tells Claude Code to build the agent, to summarize the cover letter and PDF resume,
00:06:12combine the summaries, compare them to the job description, make an initial recommendation, and generate a follow-up email.
00:06:20But of course, I spent some time with Claude refining this into a one-shot prompt,
00:06:25which I'm going to paste now into Claude Code and just let it rip.
00:06:29Through the magic of video demos, let's assume I YOLO'd my way through the Claude prompts,
00:06:35which of course I did, and I'll just bring up the other directory with that output.
00:06:39All right, let's deploy.
00:06:41I'm going to commit the changes and push the branch that I'm working on.
00:06:47Now Vercel's self-driving infrastructure is going to automatically recognize the AI workloads
00:06:51and provision infrastructure for those compute jobs.
00:06:55And active compute pricing is really great here.
00:06:57I'm only going to be charged for the actual compute, not the round-trip time while we're waiting on the model API to respond.
00:07:03All right, now I can see the app, and let's go into the dashboard and check out that new feature and view an application.
00:07:10And in fact, there we have the rejection text for this application and the generated rejection email.
00:07:17Now the best part about preview environments, I think, is that anyone on my team can comment on any part of the app.
00:07:24I'm going to do that right now, and I'm just going to ask Eric to add a button.
00:07:31Cool, and that's just one feature of the Vercel toolbar.
00:07:34You can test flags, run accessibility audits, run traces, and more.
00:07:38Now the preview environment runs on the same infrastructure as production,
00:07:42so what I test here is exactly what my end users will see in prod.
00:07:46Okay, we've added an agent to the app, but I want other people on the team to be able to iterate on the UI as well.
00:07:52V0 makes it easy for other people to work on the project in a web-based dev environment,
00:07:57but keeps everything version-controlled and safe through Git workflows.
00:08:01This is really cool.
00:08:03I'm going to import the GitHub project, choose main as the base branch, and V0 will set up the project.
00:08:13And as you can see, my environment variables were imported as well.
00:08:19And now, in less than a minute, my app is running in V0.
00:08:23Because this is an existing project, V0 spins up a sandbox to run the code.
00:08:28It's the exact same sandbox primitive that we can use on Vercel to spin up isolated environments.
00:08:33And if I click on Git, you can see that V0 has automatically created a new branch for me to work on.
00:08:39I want to adjust the jobs page design just a bit.
00:08:42Might be crazy, but let's make the job cards full-width on the page.
00:08:50V0 cooked on that, and, huh, I don't love that choice that I've made, but let's see what the design team thinks.
00:08:57I can send this V0 chat to my team so they can continue to iterate,
00:09:00or I can open a PR and share the preview environment for collaboration.
00:09:06Once I deploy my app and it's being served at global scale, Vercel ensures that it's secure by default.
00:09:13The same self-driving infrastructure that runs my app also protects it at the edge.
00:09:18This is the firewall view for our Next.js site.
00:09:20It gets a lot of traffic, and not all of it is good.
00:09:24Vercel's web application firewall automatically inspects and filters malicious requests at the edge before they ever reach your app.
00:09:33Bot ID quietly distinguishes real users from automated traffic, blocking abusive bots without adding friction like CAPTCHAs.
00:09:40Vercel's global edge network automatically detects and mitigates large-scale attacks, keeping your app responsive under load.
00:09:49Everything you just saw—migration, AI workflows, collaboration, security—ran on one platform with zero infrastructure configuration.
00:09:59That's what self-driving means.
00:10:01The platform gets out of the way so you can focus on shipping.
00:10:05We're at 11 million projects and counting.
00:10:08Yours is next.