▲ Community Session: Ship and validate faster with PostHog + v0

Vercel
Computers/Software · Entrepreneurship/Startups · AI/Future Technology

Transcript

00:01:29>> Hello, everyone.
00:01:32Welcome to the Vercel community session for this week.
00:01:36We're so excited to have you here.
00:01:41It's really nice to do these live sessions because I feel like I haven't
00:01:44done them in a while.
00:01:46So hi, everyone.
00:01:47[LAUGH] If this is your first time in one of our community sessions,
00:01:53I'm Pauline Navas from the community team here at Vercel.
00:01:57You may have seen me around the community spaces on our own community platform
00:02:03community.vercel.com, or maybe on X or LinkedIn, answering questions and
00:02:08engaging with the community to really understand what you guys are all building
00:02:12on Vercel.
00:02:13So this is always an opportunity to connect with our community,
00:02:18our customers, and our users live.
00:02:22Cool, it's awesome to see so many of you already here.
00:02:25I see a good evening from Adam in the chat.
00:02:29Hello, hello, hello.
00:02:31So drop a message in the chat and let us know where you're tuning in from today.
00:02:36If you're watching this on X or on YouTube or LinkedIn and
00:02:39you want to join the chat live, head over to community.vercel.com/live.
00:02:46You should find the session right at the top and drop in a comment in the chat.
00:02:53So today's session, I'm really excited whenever we have these Vercel
00:02:58marketplace integration sessions, because it's really cool to see what you
00:03:03can build on top of Vercel using our integrations.
00:03:07So today we're talking with PostHog.
00:03:11And if you haven't come across them before, the quick version is PostHog
00:03:16is an open source product analytics platform.
00:03:19You can think of it as feature flags, A/B testing, event tracking, and
00:03:25basically all of the tools you need to help you understand what's actually happening
00:03:29when people use your product.
00:03:31What makes today's session super exciting is that PostHog now
00:03:36integrates directly with v0.
00:03:38That means that you can go from an idea to a live experiment without ever having to
00:03:43leave your builder.
00:03:45So it's no more "ship it and then figure out analytics later."
00:03:50You can set up the feature flags, run tests, and really track those custom events
00:03:55as part of the building itself.
00:03:58So I will stop yapping, but hey, everyone, I can see loads of you joining the chat here.
00:04:06Nice.
00:04:06Tell us where you're tuning in from, folks.
00:04:09Cool.
00:04:10So without further ado, I would love to welcome our guest, Brooker, from PostHog.
00:04:16Hey, Brooker.
00:04:17- Hi.
00:04:17Happy to be here.
00:04:19Yep, so like Pauline said, my name is Brooker.
00:04:21I'm a product engineer on the growth team.
00:04:25And to get started, I'll tell you a little bit about what PostHog offers,
00:04:30why you would want to use something like PostHog in v0.
00:04:34And then we'll go through a few use cases in v0, in Vercel and v0.
00:04:41I'll do a live demo.
00:04:42So we'll actually implement some of the use cases.
00:04:46So the two that I want to cover in particular, well, I guess I'll get started
00:04:49on what PostHog is, why you would be interested in it as a user of v0
00:04:54and Vercel.
00:04:56So, you know, you use v0, you built the coolest app ever, you deployed it,
00:05:02you want to see how it's working, how do you figure that out?
00:05:06How do you understand how people are using your app, what they like about it,
00:05:11what they don't like about it?
00:05:12How do you understand when something goes wrong?
00:05:15So a lot of times, you deploy an app and, you know, it works great
00:05:19in the preview mode.
00:05:20Maybe you ran some tests on it yourself, but when it's running in production,
00:05:26it runs into some issues that you didn't foresee.
00:05:29That's when an exception might be thrown, and it would be helpful for you
00:05:33to have visibility into that, and especially bring that visibility
00:05:39and context into v0, into the agent, so the agent can understand everything
00:05:45that's going on in production and fix the issue for you without you needing
00:05:50to run around gathering the context in different, you know, whether it's your
00:05:55logging system, your error system, or, you know, reproducing it yourself
00:05:59and copy-pasting out of Chrome DevTools.
00:06:02Not that I've ever had to do that before.
00:06:04- We've all done that, I think.
00:06:07- Yeah.
00:06:09So that's one use case I want to show off is error handling.
00:06:13And then the other is feature flags and A/B testing.
00:06:17So, you know, I've actually really enjoyed this method of developing.
00:06:23So, you know, you're working with your team.
00:06:26You're talking about a feature that you want to add.
00:06:29And maybe you have a different opinion than your teammate about how you want
00:06:33to implement this feature.
00:06:34So, you know, I want to do it this way.
00:06:36They want to do it that way.
00:06:39One great way to resolve that is just do both and then test them
00:06:42in production and see what is giving you better results.
00:06:46So we're going to also go through that.
00:06:48So without further ado, let me share my screen.
00:06:51And I'm going to take you through the entire process end-to-end.
00:06:55So a little bit of it is pre-done on this account.
00:06:58So I'm just going to switch to an account where I don't have PostHog installed yet.
00:07:06And if you could let me know when my screen is visible.
00:07:09- The screen is visible, Brooker.
00:07:11Let's go.
00:07:11- Great.
00:07:12Okay.
00:07:12So we're in Vercel.
00:07:14We're going to go into the integrations and we're going to search for PostHog.
00:07:18And I don't know why it says...oh, Browse Marketplace.
00:07:21So we're going to Browse Marketplace.
00:07:22We're going to search for PostHog and select PostHog here and click Install.
00:07:29This is going to ask me if I want to create a new account or link to an existing.
00:07:32So if you already have a PostHog account, you can link them.
00:07:35And what this is going to do for me is...and we'll just install it here.
00:07:39So it's going to...let me just...sorry, I can't click and talk at the same time.
00:07:46Why do I have to enter a billing address?
00:07:48Okay.
00:07:49So you'll fill out this form.
00:07:52Hit Continue.
00:07:53This will actually sync your...let me show you what it will do.
00:07:59It will sync your environment variables so that if I go to Settings...
00:08:06Oh, well, I'll go to the Overview.
00:08:09Oh, I thought I had this installed already.
00:08:14Let's do this real quick.
00:08:15So we're going to install it for a project.
00:08:17So I'm picking which project I want to install it.
00:08:20Hit Continue.
00:08:22Great.
00:08:23This is going...so I picked the project.
00:08:25Let's see.
00:08:30Sorry, I thought I had one on here.
00:08:32Oh, I know what it is.
00:08:33I just have the wrong account up here.
00:08:35I think I'm on this one.
00:08:37- This is the best, by the way, like debugging and going through the steps.
00:08:41This is great.
00:08:42- Yeah.
00:08:44Okay, yeah, so I have it connected here.
00:08:47So what I wanted to show off, the really cool thing about this integration is you
00:08:51don't have to manage the environment variables yourself.
00:08:55So if you have experience with that, it's not very fun, especially when you have
00:08:59to rotate your keys, which is a really good idea.
00:09:03So Vercel, actually, the Vercel integration with PostHog will manage all that for you.
00:09:08So all you have to do is kind of click through the installation process,
00:09:12which I just showed.
00:09:13It also handles billing, which we have a really generous free tier.
00:09:17So hopefully you won't have to deal with that at first until your product
00:09:21takes off, and then it'll be the least of your worries.
00:09:24But yeah, so this will sync your environment variables.
00:09:28The main things we need here, the PostHog host is...I'm not going to show these off
00:09:33because I don't want to get flak for it, but the host is just a URL.
00:09:39So that's just going to be like us.i.posthog.com.
00:09:43This PostHog key is the important part, and that's going to tell PostHog,
00:09:48that's going to identify your project with PostHog.
00:09:50So when your application is running in production and, you know, errors and usage
00:09:58and feature flags are being reported back to PostHog, this key is what's going
00:10:02to identify your product with a PostHog project.
00:10:07So that's really useful to have that kind of automatically set up for you in here.
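To make the variable syncing concrete, here is a minimal sketch of how an app might assemble its PostHog settings from those synced variables. The variable names (`NEXT_PUBLIC_POSTHOG_KEY`, `NEXT_PUBLIC_POSTHOG_HOST`) and the helper itself are assumptions for illustration, not something shown in the session:

```typescript
// Hypothetical helper: assemble PostHog settings from the environment
// variables the Vercel integration syncs. The exact variable names
// (NEXT_PUBLIC_POSTHOG_KEY / NEXT_PUBLIC_POSTHOG_HOST) are assumptions.
function posthogConfig(env: Record<string, string | undefined>) {
  const key = env["NEXT_PUBLIC_POSTHOG_KEY"];
  // The host is just a URL, e.g. us.i.posthog.com, as mentioned above.
  const host = env["NEXT_PUBLIC_POSTHOG_HOST"] ?? "https://us.i.posthog.com";
  if (!key) {
    throw new Error("PostHog project key missing; is the integration installed?");
  }
  return { apiKey: key, apiHost: host };
}
```

With posthog-js, these values would typically then be passed to `posthog.init(apiKey, { api_host: apiHost })`.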
00:10:12And any questions on, like, the Vercel side?
00:10:16It's pretty simple.
00:10:17I kind of hand-waved, and I also ran into some issues, so sorry about that.
00:10:21But basically, you install it, link it to the project, the Vercel project that you want
00:10:25to use it in, and then you're off to the races.
00:10:28There's also, you know, we have some getting started guides here.
00:10:32You can see your feature flags in here if you have them set up.
00:10:35I don't have any in this one yet, but we will in a minute.
00:10:40- Yeah, this is great.
00:10:41I was going to say, there's no questions right now, which I think shows that it's quite seamless.
00:10:47I did ask in the chat, has anyone tried this integration before?
00:10:51So if you have, definitely let us know and ask questions as we go along.
00:10:54- Yeah, and feel free to, yeah, interrupt.
00:10:58So let's get to the demo now.
00:10:59So I have an app running.
00:11:01I built a little game.
00:11:02It's called Hog Hop.
00:11:04And so I have a little PostHog here.
00:11:06I got some bugs running around, and I'm trying to collect data points.
00:11:12So Max is jumping around collecting data points.
00:11:15I have a bug.
00:11:16So when I collect certain data points, it's just frozen.
00:11:21So I can't do anything now.
00:11:23I ran into this in production, and I need help figuring out what's going on.
00:11:28So I could, like I said before, I could pop open DevTools.
00:11:32I could, you know, look into like the Vercel logs to try to find errors.
00:11:41But I'm going to show you how you don't have to do any of that.
00:11:44And in v0, you can gather all the context you need about the error and fix it right
00:11:50in there.
00:11:51So first, I'll show you how we add the MCP.
00:11:54So in the bottom left of your chat in v0, there's this little plus button,
00:12:01and it has a spot for MCPs.
00:12:04And you can just click Add MCP.
00:12:06And then PostHog is set up as kind of a preset MCP.
00:12:10So I already have it connected for this, so I'm not going to disconnect it.
00:12:13But you're just going to click the plus button, and then it'll ask you to log
00:12:18in with your PostHog account.
00:12:19One thing I didn't show off was when you create this integration with PostHog
00:12:25and Vercel, you get this Open in PostHog button, and that's going to log me
00:12:30into PostHog so I don't have to manage, like, credentials or anything like that.
00:12:36So I'm logged in here, and then over in v0, when I want to connect the MCP,
00:12:42I'll just click to authenticate, and it will open up a PostHog tab to authenticate
00:12:48the MCP.
00:12:50I just realized I didn't define MCP, and I'm so sorry.
00:12:52So MCP, Model Context Protocol, it's basically a way
00:12:58to give the agent, the v0 agent, the ability to gather information or kind
00:13:03of call functions that are connected to this account.
00:13:07So in this case, we're saying, "Hey, v0, if you need to ask any questions or manage
00:13:14things in PostHog," and this is...it's a very fully featured product,
00:13:19the MCP of PostHog, so you can do quite a lot in there.
00:13:24I'll actually show...we have a Doc...PostHog Docs, Model Context Protocol.
00:13:29This shows you kind of all the things you can do.
00:13:33Don't get too overwhelmed with this list.
00:13:35This is all loaded into the agent.
00:13:36So you could actually ask, like, "What can I do with the PostHog MCP?"
00:13:42And it'll tell you.
00:13:43And just make sure I have this little toggle toggled on here.
00:13:50And, yeah, so what I wanted to show off here is, again,
00:13:55when I try to collect this third data point, I'm hitting a bug.
00:14:00It just freezes.
00:14:01So I'm going to describe that in here.
00:14:04So it told me all these things I can do with PostHog MCP.
00:14:08So I'm going to describe that to the agent.
00:14:10So I'm going to say, "When I collect data points with Max, sometimes it freezes.
00:14:23Can you find the bug using the PostHog MCP errors and fix it for me?"
00:14:35And while it's doing that, I'm just going to kind of show off the PostHog dashboard.
00:14:39So when I'm in PostHog here, I can go to apps and then error tracking.
00:14:47And like I said, PostHog does a lot of things.
00:14:50So there are quite a lot of features you can explore here.
00:14:55We're just kind of focusing in on a couple just to give you an idea of, like, tangibly what you can do.
00:15:01So if you are more technically minded and want to see what's actually going
00:15:04on with these errors, you can open this and see, you know, you could, like,
00:15:11click into the error and see, like, a stack trace.
00:15:14You can see how many times you're seeing it.
00:15:17And what else can you see?
00:15:20Similar issues.
00:15:21One, actually, let's see if this is still working.
00:15:25Oh, it fixed it.
00:15:26Okay. I'll show you another thing next time we have to wait for the agent.
00:15:30So this says that it fixed it.
00:15:32Let me see if I can deploy or we can test it in the preview.
00:15:37Let's just do that.
00:15:39All right.
00:15:39So I'm going to jump around, collect something.
00:15:41Oh, it's working.
00:15:42Sweet. So it's that easy.
00:15:46So I had a bug.
00:15:48I told it kind of what I saw happening.
00:15:51You could even, if you don't have that much information about how to reproduce it,
00:15:55you could give it less information, just tell it to look for errors.
00:15:59So I could say, you know, are there other errors in the PostHog MCP we should fix?
00:16:09I'll say from today, just so we're not getting too historical here.
00:16:12The other thing I wanted to show off: session replay is a pretty cool feature.
00:16:21So I can actually see -- oh, and it doesn't look that great here.
00:16:27Oh, that's interesting.
00:16:28Okay. We're going to skip that, but I'm going to look into this.
00:16:33Typically, you'd be able to actually see what people are doing.
00:16:36I think probably because of the technology we're using on this game,
00:16:42the recording is getting messed up.
00:16:44But probably in different types of apps -- definitely in other types of apps, it works.
00:16:50I've never seen this glitch in any other app.
00:16:51So you can see kind of how people are using your app and what they might be running into.
00:16:58And just seeing that live video of what they're doing can really help contextualize.
00:17:04And it just gives you so much more information than trying to look
00:17:08at like activity or events or things like that, or logs.
00:17:13So all right, so we have a couple other errors.
00:17:16And this is something I also want to stress.
00:17:18Like you as the builder might not know all the things people are running into in production.
00:17:25They might not be reporting it to you.
00:17:26You might not have run into it yourself.
00:17:28And so having visibility into all the errors can be super helpful for you.
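As a rough illustration of what an error-tracking SDK does under the hood, here is a self-contained sketch that catches an exception and records it as an event. The `$exception` event name and property names mirror PostHog's conventions, but the `reportException` function and the queue are made up for this example; the real SDK wires into `window.onerror` and friends for you:

```typescript
// Illustrative sketch, not the real posthog-js implementation.
// The queue stands in for the SDK's network transport.
type ExceptionEvent = { event: string; properties: Record<string, unknown> };
const reported: ExceptionEvent[] = [];

function reportException(err: Error) {
  reported.push({
    event: "$exception", // PostHog-style event name for error tracking
    properties: {
      $exception_type: err.name,
      $exception_message: err.message,
      $exception_stack_trace_raw: err.stack ?? "",
    },
  });
}

// In a browser you would hook window.onerror / unhandledrejection instead.
try {
  JSON.parse("{not json"); // throws a SyntaxError
} catch (e) {
  reportException(e as Error);
}
```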
00:17:34So in this case, we have a couple other errors that we're being run -- or that we're running into.
00:17:40Oh, it looks like they were both fixed.
00:17:43So cool. Any questions on that?
00:17:47And I want to move on to feature flags here.
00:17:51>> There's no questions in the chat for now, but there are comments.
00:17:54So someone in the chat has said they have a lot of tools in one place, an extremely generous free tier
00:18:01and an amazing DX in their platform, which is always great to hear.
00:18:05I guess a question I had actually, like I'm sure you'll probably touch on it in this next section.
00:18:11But beyond the errors, what are a couple of like other high value things the posthogs MCP can do
00:18:19for v0 users that you think people overlook?
00:18:23>> Yeah. So in my mind, feature flags and experiments is one of the most beneficial things.
00:18:30I think there are a lot of teams, a lot of applications are not using them that should be.
00:18:38So I'll show that off real quick.
00:18:40>> Yes, absolutely.
00:18:41Perfect segue.
00:18:42>> Yeah. So let's add -- so in this case, I have this game.
00:18:46I don't have a way right now to kill the bugs.
00:18:48So this one in particular, there's this bug like going back and forth on this little platform.
00:18:52And I'm having a really hard time getting that data point.
00:18:55And my users are too.
00:18:57So I want to build some extra functionality.
00:19:01So let's build like give my hedgehog the ability to shoot lasers out of its eyeballs.
00:19:16But I want to test this against a different piece of functionality.
00:19:22So I'm trying to think lasers.
00:19:24I guess we could do kind of the classic like Mario.
00:19:27Also -- well, let's start out with build a multivariant feature flag with lasers as one variant
00:19:43and the ability to jump on the bugs to kill the bugs as another.
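For context on what such a multivariant flag looks like from the game code's side, here is a small sketch. The variants `laser` and `stomp` mirror the demo, but the flag key and the `weaponFor` helper are hypothetical; in posthog-js the current variant would come from `posthog.getFeatureFlag("...")`, which can return a variant string, a boolean, or `undefined` while flags are still loading:

```typescript
// Hypothetical mapping from a multivariate flag value to game behavior.
// Variant names mirror the demo; the flag key v0 generates may differ.
type Weapon = "laser" | "stomp" | "none";

function weaponFor(variant: string | boolean | undefined): Weapon {
  if (variant === "laser" || variant === "stomp") return variant;
  return "none"; // flag off, unknown variant, or flags not yet loaded
}
```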
00:19:52And then -- so this will -- so what I want to show off here is a couple things.
00:19:59So the MCP is very powerful.
00:20:02So you can use the MCP for most things that you would use the dashboard for.
00:20:07So you can do this in the dashboard.
00:20:10I'll show you if we go to apps.
00:20:17And then feature flag.
00:20:21You could also -- most things you can do in the MCP you can do in the chat here as well.
00:20:25So if you use -- I'll just open a new tab here.
00:20:29Oh, there's no way to chat in the tab.
00:20:33Oops.
00:20:34I lost it.
00:20:37So open a new chat, here we go.
00:20:39I could kind of give this a similar prompt.
00:20:43The only difference is this chat is not going to have the ability to update my v0 code, obviously.
00:20:48I have to accept.
00:20:52But I could still use it to manage the feature flags.
00:20:55Or in here I can create a feature flag, you know, using -- and it should be creating it
00:21:01right now.
00:21:02Here we go.
00:21:03Experiment, create.
00:21:04I could create it through the UI.
00:21:07I'm showing off how to do that via the v0 agent using the MCP because I find that the more
00:21:16I can do in my agent, in like a v0, the better.
00:21:20I don't want to have to spend my time kind of clicking around, learning a new UI.
00:21:24Even though the dashboard is super cool and I love the interface here, I'd rather spend
00:21:31more time just in v0 building.
00:21:35So cool.
00:21:36So it set up this experiment.
00:21:39And we set up some metrics.
00:21:40So this is kind of one of the keys for experiment -- or the key for the experiment is like what
00:21:47is your hypothesis you're trying to test?
00:21:49v0 went ahead and kind of made up a hypothesis for us.
00:21:53So that's kind of cool.
00:21:55I might want to read this and kind of update that.
00:21:58So this is saying testing which one leads to better game completion rates.
00:22:02That's cool.
00:22:03Maybe my goal is game completion rates.
00:22:05But maybe my goal is just like time spent playing the game.
00:22:09I think game completion rate kind of makes sense.
00:22:13So it set that as the primary metric.
00:22:16So you're going to first come up with a hypothesis.
00:22:18What am I testing?
00:22:19In this case, I'm testing is stomp more engaging or is laser more engaging?
00:22:26And then we're going to set up primary metrics.
00:22:29And then there's also this concept of secondary metrics which I think are super important.
00:22:33So primary metrics are going to be the main goals that you want to achieve.
00:22:38But with any feature that you add or change that you make, there might be kind of secondary
00:22:43effects to that.
00:22:44So in this case, like maybe they're killing more or less bugs per session.
00:22:49Like maybe they're completing the game at a higher rate but killing less bugs.
00:22:52Probably unlikely.
00:22:53Or, you know, the deaths per session is another thing to look at.
00:22:57So anytime you're setting up an experiment, it's important to think about like, what's
00:23:02my goal?
00:23:03And then what are the things that this might be impacting that maybe aren't kind
00:23:09of the primary goal, but I want to be aware of as I'm making that decision.
00:23:14Quick question here, Brooker.
00:23:17So if someone accidentally sets the wrong goal metric at first, how easy is it in
00:23:23PostHog to adjust that experiment without losing everything in v0, I guess?
00:23:30Yeah, so you could do it either via the MCP again or the chat or in the UI here.
00:23:36So there's this little pencil icon next to the metric.
00:23:39You just click that.
00:23:41And then, you know, let's say you want to change it from game completed to like time spent in
00:23:46the app or something like that.
00:23:47You would click in here and find an event. Or maybe, you know, it's like we decided
00:23:53we actually want to check if they're leaving at a higher rate.
00:23:58So I could change it to page leave.
00:24:00And then in this case, I'm going to have the goal of that be to decrease.
00:24:03So I want people to leave the page at a lower rate.
00:24:08So it's as easy as that.
00:24:10You can also use, like I said, the MCP to do that, which I find much, much easier personally.
00:24:16Nice, it's good to see both ways.
00:24:19Yeah.
00:24:20Nice.
00:24:21Yeah.
00:24:22And it'll recalculate that.
00:24:23Sometimes what I run into is like, I might run an experiment.
00:24:27I'm seeing the results, and that kind of helps me realize I have another question
00:24:32that I want to ask.
00:24:33Like you said, I want to update the metrics that I'm tracking.
00:24:36There are times where maybe I don't have an event or a way of tracking that yet.
00:24:43So that's another place where the MCP can be really helpful.
00:24:46You can say something like add an event.
00:24:48I'm trying to think of a good example like I don't know, maybe jumps.
00:24:53I could add, like, an event for every time the hedgehog jumps.
00:25:02In that case, you would only have those events, probably, depending on what it is that
00:25:08you're tracking. For most things like that,
00:25:10you might only have that tracked from the time that you add it to your app.
00:25:15And then in that case, you can kind of change the duration of your experiment to start from
00:25:21a certain time.
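A sketch of what that added instrumentation could look like. The event name `hedgehog_jumped` and the `capture` stand-in are made up for illustration; in the real app this would be posthog-js's `posthog.capture(event, properties)`:

```typescript
// Stand-in queue so the sketch is self-contained; in a real app this
// would be posthog.capture(...) from posthog-js.
const captured: Array<{ event: string; properties: Record<string, unknown> }> = [];

function capture(event: string, properties: Record<string, unknown> = {}) {
  captured.push({ event, properties });
}

// Hypothetical hook fired from the game loop every time the hedgehog jumps.
function onHedgehogJump(heightPx: number) {
  capture("hedgehog_jumped", { height_px: heightPx });
}

onHedgehogJump(120);
```

Once an event like this is live, it can be picked as a primary or secondary metric in the experiment, as described above.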
00:25:22You could also target -- oh, sorry.
00:25:24I'm so sorry.
00:25:25Um, but if you do change that mid-experiment, is there a recommended
00:25:31best practice on whether you should restart the test, or is it safe to just keep collecting data
00:25:38in the same experiment, if that makes sense?
00:25:40Yeah, that's a good question.
00:25:42As long as you are kind of aware of the different changes that might be impacting what you're
00:25:48doing.
00:25:49I don't see a problem with just keeping the same experiment going.
00:25:55There might be a scenario where like, there are other changes happening at the same time
00:25:59that could be impacting your experiments.
00:26:02So you want to be really aware of like, all the different tests that you're running, and
00:26:06kind of how they could be impacting each other.
00:26:09That's why I feel like generally, if anything, like in this case, we did a multivariant,
00:26:15where I have, you know, stomp and laser. I guess it just set the laser to the control.
00:26:20But I might want a control that's like, you know, no weapon at all.
00:26:30It can be helpful to group all of those into one experiment so that you're not running into
00:26:35issues where like different experiments are competing with each other.
00:26:38If that makes sense?
00:26:39Yeah, that makes sense.
00:26:41Yeah.
00:26:42Um, and let's see where we're, oh, I didn't add that to the queue.
00:26:45Okay, so it's done.
00:26:47So we got laser eyes.
00:26:48Let's see what happens if we use the app now.
00:26:51Okay, cool.
00:26:52I have a laser, but it's not killing the guys.
00:26:57That's lame.
00:26:58So anyway, so and, you know, you might run into this as well.
00:27:02If I didn't, I didn't effectively prompt v zero.
00:27:05So we're not going to blame v zero for that.
00:27:07But I just said shoot lasers out of its eyeballs.
00:27:10But I didn't say kill the bugs with the laser.
00:27:12So you could also target different users.
00:27:16So I could say like, target my user for the stomp feature, for example.
00:27:23And then we'll see if it can find who my user is.
00:27:26So this is kind of pushing the agent and MCP to the limit a little bit, but the MCP does
00:27:31have the capability of like finding a user.
00:27:35And then setting the variant of the experiment for that user.
00:27:41And you can target different cohorts for different different variants.
00:27:46So if you want to say like, you know, everybody in Australia gets laser eyes, you could also
00:27:51do that.
00:27:53And yeah, I think we're running, are we running close to the end of time?
00:27:56I forget how long.
00:27:57Yes, we are.
00:27:58I actually just checked that.
00:27:59But yeah.
00:28:00Okay, yeah.
00:28:02I can kind of keep this running.
00:28:03But if you have any other like questions or anything that I didn't cover, that would be
00:28:08interesting.
00:28:09Yeah, another question that's come in is, you know, for the v0 users that are new to
00:28:15experimentation, do you have a checklist or something that people can follow to avoid
00:28:21setting up, in quotes, a "bad" or maybe misleading experiment?
00:28:27Yeah, that's, um, I'm not 100% sure.
00:28:31I know we have guides.
00:28:32I don't want to try to find it here live.
00:28:34But I could definitely follow up with a guide.
00:28:38We could.
00:28:39Yeah, we can definitely attach it to our resources section in this chat.
00:28:43So yeah, 100%.
00:28:44Yeah.
00:28:45And then you could kind of go through the AI. I would encourage people to, you know,
00:28:50ask the chat, go through the UI. There's a lot of helpful stuff -- like I mentioned the
00:28:55hypothesis, you know, you didn't see that in v0.
00:28:58So it could help to pop open the UI and kind of see what else is here that you could play
00:29:02around with.
00:29:05Or even just asking v0, you know, like, what kinds of things am I potentially
00:29:10missing out on here?
00:29:11What else can we do with this?
00:29:13So I'd encourage you to just kind of leverage v0 as much as you can for something like
00:29:18that. Or the PostHog AI is another option -- you can chat in PostHog.
00:29:23Wow, I didn't even know you guys had that.
00:29:26That's so cool.
00:29:27I guess I asked this in every one of our integration sessions, but longer term, what do you have
00:29:35cooking?
00:29:36Anything you can share with us?
00:29:38Yeah, so we view the future of software development and product development to be more autonomous.
00:29:45And I think it's already happening.
00:29:47But I think there's still a ways to go.
00:29:53So things like what I showed off with the error resolution, I don't think it's very long before
00:29:59that's kind of a standard in an application: I have an application running,
00:30:04and there's something running in the background that's just fixing errors without me having
00:30:09to tell it to, and maybe I get some kind of report about what happened.
00:30:15But ultimately, in the future, like I don't want to have to prompt an AI to check for errors,
00:30:20I don't want to have to, you know, get an alert in the middle of the night, get woken
00:30:24up with an error.
00:30:26I think, in the future, some kind of system -- it could be PostHog, it could
00:30:35be Vercel, it could be kind of them working together --
00:30:38they're going to be basically gathering information. Errors is just one thing.
00:30:43There's session replay, which I showed. The product analytics -- experiments is another one,
00:30:49where, like, do I really need to manually create different metrics?
00:30:53And it was interesting, actually, here, we got to see v0 actually picked some interesting
00:30:57metrics for us already.
00:31:00But I'm probably going to have an agent monitoring those things in the future and making code
00:31:05updates without me really needing to prompt it at that level of specificity. Like, I'm definitely
00:31:14going to be guiding strategically -- like, what kind of experience am I going
00:31:19for in my game?
00:31:20What are the metrics I'm targeting?
00:31:22But in terms of the lower level stuff, I think a lot of that's going to be picked
00:31:26up by autonomous development.
00:31:30Yeah.
00:31:31Yeah, I love that vision, because that basically aligns with everything we're talking
00:31:37about at Vercel and with v0 as well.
00:31:40So it's all about closing that loop from deploy,
00:31:43to observe in PostHog, to fix and iterate automatically as well.
00:31:49That's awesome.
00:31:50I guess one last question before we close off because I know we're at time here.
00:31:56How can people get involved in the PostHog community?
00:32:00I did say at the start that PostHog is open source.
00:32:03Do you accept contributions?
00:32:05I don't know if I got that correct.
00:32:06By the way.
00:32:07I read that.
00:32:08Yes.
00:32:09100%.
00:32:10There are some PRs that come in from all over the place that we review.
00:32:16There's also, like, a forum -- the PostHog forum -- you could ask questions in.
00:32:21We have live events.
00:32:22Again, I wish I had the URL handy, but I'll hand it to you after.
00:32:26Yes, absolutely.
00:32:27Yeah, there's live events.
00:32:29And just sign up.
00:32:30Use the app.
00:32:31Tell us what you think.
00:32:32You know, reach out on X or LinkedIn or whatever you use.
00:32:37And let us know what you think. We're very active across those platforms.
00:32:40Amazing.
00:32:41Thank you so much, Brooker.
00:32:43This was incredible.
00:32:44And for everyone who's watching, if you do have follow up questions, drop them in the
00:32:49chat and we will make sure that the PostHog team, you know, follows up with them.
00:32:54But yeah, thank you so much, Brooker.
00:32:56I appreciate your time here.
00:32:58Thank you.
00:32:59Yeah.
00:33:00Thanks for having me.
00:33:02Amazing.
00:33:03Thank you so much, everyone, for joining this chat and our live session today.
00:33:08If you don't know, we host a community live session pretty much every week.
00:33:08So if you head over to community.vercel.com/events, you'll see a very nice calendar where we post
00:33:21all of our in-person and online events.
00:33:24So let us know: what integration would you like to see next in our community sessions?
00:33:31And we will definitely get them on.
00:33:33Incredible.
00:33:34Well, thank you so much, everyone, for joining us today.
00:33:39And I hope to see you in our next live session.

Key Takeaway

The integration of PostHog with Vercel and v0 enables developers to transition from an idea to a live, monitored experiment with AI-driven error resolution and feature management.

Highlights

PostHog is an open-source product analytics platform offering feature flags, A/B testing, and event tracking.

The Vercel integration automatically manages environment variables and API keys, simplifying the deployment process.

The Model Context Protocol (MCP) allows AI agents like v0 to directly access PostHog data to identify and fix production errors.

Feature flags can be created and managed directly within the v0 chat interface using natural language commands.

Session replays and error tracking provide visual context that helps developers understand user behavior and bug reproduction.

The future of development involves autonomous agents that monitor, diagnose, and resolve application issues without manual intervention.

Timeline

Introduction and Community Welcome

Pauline Navas from the Vercel community team opens the session by welcoming attendees and explaining the value of Vercel marketplace integrations. She introduces PostHog as an all-in-one product analytics suite that includes tools for A/B testing and event tracking. The session highlights how developers can now integrate these tools directly into the v0 builder to shorten the feedback loop. This introduction sets the stage for a live demonstration of building and validating products faster. Pauline encourages viewers to participate in the live chat via the Vercel community platform.

PostHog Overview and Integration Setup

Brooker, a product engineer from PostHog, joins the session to explain the core benefits of using analytics during the development phase. He demonstrates the step-by-step process of installing the PostHog integration within the Vercel dashboard, emphasizing that it handles environment variables automatically. This automation is crucial because it removes the manual burden of rotating keys and managing host URLs like 'us.i.posthog.com'. Brooker explains that this seamless connection ensures that production errors and usage data are immediately reported back to the correct project. The section concludes with a look at the generous free tier available to new users.
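As a rough illustration of what the integration saves you from wiring by hand, the sketch below reads the kind of environment variables PostHog's Next.js setup uses (`NEXT_PUBLIC_POSTHOG_KEY`, `NEXT_PUBLIC_POSTHOG_HOST`); the exact variable names are assumptions based on PostHog's documented conventions, not something shown in the session:

```typescript
// Sketch: building a PostHog client config from environment variables that the
// Vercel integration injects automatically. Variable names are assumptions
// following PostHog's Next.js conventions; adjust to match your project.

interface PostHogConfig {
  apiKey: string;
  apiHost: string;
}

function buildPostHogConfig(env: Record<string, string | undefined>): PostHogConfig {
  const apiKey = env.NEXT_PUBLIC_POSTHOG_KEY;
  if (!apiKey) {
    // Without the integration you would have to create and rotate this key manually.
    throw new Error("Missing NEXT_PUBLIC_POSTHOG_KEY");
  }
  return {
    apiKey,
    // Fall back to the US cloud host mentioned in the session.
    apiHost: env.NEXT_PUBLIC_POSTHOG_HOST ?? "https://us.i.posthog.com",
  };
}
```

In a real app you would then pass this config to the SDK, e.g. `posthog.init(config.apiKey, { api_host: config.apiHost })`.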

AI-Driven Error Handling with MCP

The demonstration shifts to a live game called 'Hog Hop' which contains a production bug that causes the application to freeze. Brooker introduces the Model Context Protocol (MCP), which allows the v0 AI agent to interact with PostHog's error tracking data. By prompting v0 to 'find the bug using the PostHog MCP', the AI retrieves the stack trace and context needed to suggest a code fix. This process eliminates the need for manual debugging using Chrome DevTools or searching through raw server logs. Brooker also briefly mentions session replays as a powerful way to see exactly what a user was doing before a crash occurred.

Multivariant Feature Flags and Experiments

In this segment, the focus turns to feature experimentation where Brooker uses v0 to create a multivariant feature flag for new game mechanics. He compares a 'laser eyes' ability against a 'stomp' mechanic to see which drives better game completion rates. The AI agent automatically suggests primary and secondary metrics, such as session duration and death rates, to measure the success of the experiment. Brooker demonstrates how to adjust these hypotheses and metrics either through the PostHog UI or directly via MCP commands. This highlights the flexibility of the tool in testing different user experiences without deploying entirely separate codebases.
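Application code then branches on whichever variant PostHog assigns. The sketch below mirrors the session's example; the flag key `new-game-mechanic` and the injected lookup are assumptions (with posthog-js the variant would come from `posthog.getFeatureFlag(...)`):

```typescript
// Sketch: branching on a multivariant feature flag like the one created in the
// demo. The flag key and variant names mirror the session's 'laser eyes' vs
// 'stomp' example but are assumptions; the lookup is injected for testability.

type VariantLookup = (flagKey: string) => string | boolean | undefined;

function chooseMechanic(getVariant: VariantLookup): "laser-eyes" | "stomp" | "control" {
  switch (getVariant("new-game-mechanic")) {
    case "laser-eyes":
      return "laser-eyes";
    case "stomp":
      return "stomp";
    default:
      return "control"; // flag off, unknown variant, or user not enrolled
  }
}
```

Falling back to `control` for unknown values is what lets you run the experiment without shipping a separate codebase: users outside the experiment simply keep the existing behavior.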

Future of Autonomous Development and Closing

The session concludes with a discussion on the future of software development, which Brooker describes as becoming increasingly autonomous. He envisions a world where AI agents monitor applications and fix bugs in the middle of the night without human intervention. Pauline and Brooker discuss the importance of open-source contributions and how users can get involved with the PostHog community forum and live events. The integration between Vercel and PostHog is framed as a key step toward closing the loop between deployment and observation. Pauline ends the stream by inviting users to suggest future integrations for upcoming community sessions.

Community Posts

View all posts