00:00:00- I have two very important things to say.
00:00:01First one, look at this beard.
00:00:03Okay, I'm somehow accidentally gaining a beard.
00:00:06Second off, I read something on LinkedIn, okay?
00:00:09I'm sorry, but here we are.
00:00:11Look at this.
00:00:12A reflection on the year ahead of our industry.
00:00:15This is of course from CEO of Microsoft.
00:00:17And if you read the letter, of course, it goes on
00:00:20and it pretty much is just a ChatGPT inspo
00:00:23kind of experience
00:00:25where you just wanna feel really great, right?
00:00:27A new concept that evolves bicycles for the mind.
00:00:29Such that we always think of AI
00:00:32as a scaffolding for human potential versus substitute.
00:00:35Wow, man, hey, yo Satya, that's so inspirational.
00:00:38Like, dude, how many ChatGPT prompts?
00:00:40What's your prompt, bro?
00:00:41Bro, what's your prompt?
00:00:42That's some inspiration right there.
00:00:44But something that kind of caught everybody's eye
00:00:46from all of this that generated tons of memes,
00:00:49including this one, my personal favorite one right here,
00:00:51Micro Slop, is this line right here.
00:00:53We need to get beyond the arguments
00:00:55of slop versus sophistication
00:00:57and develop a new equilibrium in the theory of mind
00:01:00that accounts for human beings being equipped
00:01:02with these new cognitive amplifier tools
00:01:03as we relate to each other.
00:01:05First off, I mean, AI is not helping us relate to anybody,
00:01:08but we can set that aside.
00:01:09Okay, we'll set that part aside.
00:01:11But this idea that we just need to get
00:01:13beyond the arguments of slop.
00:01:14We don't need to use the term slop anymore.
00:01:15Hey, slop, a slop is just not needed.
00:01:17We don't need to say that, right?
00:01:19Right, bros?
00:01:20All right, so this is the part where I'm gonna try
00:01:21to break down this problem.
00:01:22And I'm gonna actually kind of give,
00:01:24I guess a different take than I normally do.
00:01:25The reason why I think we're seeing a lot of people,
00:01:28even like normies, right, not like tech people,
00:01:30just the average person going, "Oh, hey, that's slop."
00:01:32I'm gonna try my best to kind of give
00:01:34why I think people feel that and what's going on here.
00:01:38'Cause the primary problem I have with this whole statement
00:01:40is if something is really good,
00:01:43you've never had to ask somebody not to insult it.
00:01:46Like no one calls products that are good
00:01:48some sort of like slur, right?
00:01:50We don't refer to them.
00:01:52There's not like some contingent of people like,
00:01:53"Oh yeah, that's the shit stuff."
00:01:55Like, no, that's not what happens.
00:01:57The reason being is because the person's average experience
00:02:00with it is really, really good.
00:02:02And so when Satya is trying to tell us,
00:02:04"Hey, we need to stop saying these things."
00:02:06It's kind of like, why don't you just provide a product
00:02:09that's really, really good.
00:02:11And then people won't say it.
00:02:12They'll be like, "Oh yeah, yeah, yeah,
00:02:14it's actually pretty useful.
00:02:15Yeah, that's called useful.
00:02:16That's useful right there."
00:02:17It kind of blows my mind to try to be like,
00:02:19"Hey, don't do that.
00:02:20Don't you do that."
00:02:21Because from Microsoft's point of view,
00:02:23it's obvious where they're going, right?
00:02:25They've just renamed their, like, Microsoft 365 Office.
00:02:29Remember that, their productivity suite?
00:02:31It's been renamed from Microsoft 365 to Copilot.
00:02:34That's right.
00:02:35That means Copilot is the office software suite.
00:02:39Copilot is the thing inside the text documents
00:02:42making suggestions.
00:02:43Copilot is the thing on GitHub
00:02:44able to make PRs and interact with you.
00:02:46Copilot is the auto-complete
00:02:48inside of your VS Code editor.
00:02:50Why do you think we don't want people insulting the AI?
00:02:55Okay, we want them to perceive it as something good.
00:02:57But we as programmers,
00:02:58we should have like kind of special insights on this, right?
00:03:01Okay, hey, for all my pro AI people out there,
00:03:03I'm not even gonna say something negative here.
00:03:05You can't just simply ask the AI to do something, right?
00:03:08Like there's a lot of planning that you have to do.
00:03:11You have, you know, you guys call context engineering,
00:03:13prompt engineering,
00:03:14whatever the hell engineering going on out there, right?
00:03:17It's not something as simple as just like,
00:03:19"Hey, do this thing for me."
00:03:20Because that sometimes works,
00:03:22and sometimes it doesn't work at all, right?
00:03:23That's part of the problem.
00:03:25We, the people who've experimented with it,
00:03:27understand that problem.
00:03:29The average person gets this experience right here.
00:03:32So this is Microsoft's little finder thing
00:03:34so you can kind of search through your system.
00:03:36Look what it says.
00:03:37"Try my mouse pointer is too small."
00:03:40Okay, so you're a grandma.
00:03:42You don't know nothing about AI.
00:03:43You're gonna go in here and you're gonna type in,
00:03:45"My mouse pointer is too small."
00:03:48What happened?
00:03:49Well, you sit there
00:03:50and then you continue to sit there
00:03:52and nothing happens.
00:03:54Nothing happens.
00:03:55It just sits there.
00:03:56And so this is the experience
00:03:57that the average person is having.
00:03:58They have this AI experience shoved in
00:04:01all over throughout the average experience of Windows.
00:04:04And it's just like halfheartedly working.
00:04:06Sometimes it works.
00:04:08Sometimes it doesn't.
00:04:09People are like, "Yeah, that's sloppy."
00:04:11That's what sloppy means.
00:04:12We're calling it slop because that's what it is.
00:04:16But I think there's like kind of like an inverse
00:04:18to this argument,
00:04:18which is not just like, "Hey,
00:04:20you shouldn't call it slop anymore."
00:04:21I think the average kind of experience of people,
00:04:24especially people that are more online,
00:04:26is that they're having all these thought leaders, right?
00:04:29They're constantly telling you like,
00:04:30"Hey, brother, you got it wrong."
00:04:31Like, "Hey, I know you're sitting over there
00:04:34and you've generated some code."
00:04:35And you're like, "Oh, this was really good.
00:04:36Okay, this was really bad.
00:04:37Okay, I don't really like it for these reasons,
00:04:39but I like it for these reasons," right?
00:04:40Like you have a nuanced, normal opinion
00:04:42because you're nuanced and normal.
00:04:44But then you see all this like stuff on Twitter
00:04:46constantly telling you like, "Dude, bro,
00:04:48it's the end.
00:04:49Oh my gosh.
00:04:50Look at this one year of work done in an hour."
00:04:53So people's expectations are just so out of line.
00:04:56And I think there's been no worse example
00:04:59in my entire lifetime than this tweet right here.
00:05:03I will read it for you.
00:05:04"I'm not joking.
00:05:05This isn't funny."
00:05:06Okay, serious tweet people.
00:05:08We're not doing...
00:05:09Hey, this is not some soft soap, okay?
00:05:10This is the big stuff.
00:05:11"We've been trying to build distributed agent orchestrators
00:05:14at Google since last year.
00:05:15There are various options and not everyone is aligned.
00:05:18I gave Claude code a description of the problem.
00:05:20It generated whatever we built last year in an hour."
00:05:23So obviously, what do you read there?
00:05:26You read two things.
00:05:27One, this principal engineer and her group
00:05:31must be A, incompetent because several people spent a year
00:05:36and it was reproduced in Claude in an hour.
00:05:38Okay, so I have to be like,
00:05:40bro, that's kind of crazy.
00:05:42That kind of sounds like...
00:05:43What kind of incompetence is going on over at Google?
00:05:45I know Google is not the bastion of engineering
00:05:48that it once was, but this can't be right, right?
00:05:50Like this can't be the way to read it.
00:05:51Of course, the other way to read into this, of course,
00:05:54is Claude code is that amazing.
00:05:57Literally gave it a little description of the problem
00:06:00and bada-bing, bada-boom.
00:06:02Your entire year of effort solved in an hour
00:06:06and maybe like 50 bucks of tokens.
00:06:09Now this tweet, just by itself,
00:06:11I cannot imagine the amount of panic and problems
00:06:14it has caused. I have actually had several people reach out
00:06:18due to this one thing and ask me questions like,
00:06:20is it still safe actually to learn?
00:06:21Like I should probably just stop learning, right?
00:06:23I should just vibe code.
00:06:24I should quit doing this
00:06:24because obviously it's the best thing ever.
00:06:26Like this tweet ruins careers and people's lives.
00:06:30The worst part about this tweet,
00:06:3228 hours later, this is what we get.
00:06:34To cut through the noise on the topic,
00:06:36it's helpful to provide some more context.
00:06:38Yeah, oh, okay, okay, okay.
00:06:40Maybe the tweet from before wasn't quite as accurate
00:06:42as we were led to believe.
00:06:44We have built several versions of the system last year.
00:06:46There are trade-offs and there hasn't been a clear winner.
00:06:49When prompted with the best ideas that survived,
00:06:51coding agents are able to go very far
00:06:53and generate a good, decent toy version in an hour or so.
00:06:57Okay, so now it's not one hour.
00:07:00It could be an hour and a half.
00:07:01It could be 45 minutes.
00:07:02It's just some period of time.
00:07:03Okay, timeframe changes a little bit,
00:07:05but more so decent toy version.
00:07:08When I hear toy version,
00:07:09I don't hear what I've built last year.
00:07:11I hear a shadow of what was built,
00:07:15something that kind of looks like it, but is nothing like it.
00:07:18Also, on top of it,
00:07:20when prompted with the best ideas that survived,
00:07:22so let me get this straight,
00:07:23not only just the description of the problem,
00:07:25so you're not even, the whole thing was a lie, right?
00:07:28Because you said the description of the problem.
00:07:29I just gave them the description.
00:07:30No, you gave it a bunch of ideas
00:07:33that you spent a year researching.
00:07:34You came up probably with a really fantastic technical doc
00:07:37about all of this stuff and then fed it to Claude Code
00:07:40and then we're like, oh my gosh, it created something
00:07:42of what we just stated it should create
00:07:44and it did like a toy version of it.
00:07:46It just like this, this type of stuff,
00:07:48it just makes me hate AI, right?
00:07:49And it's not because AI isn't neat or whatever,
00:07:52text generation, all super cool.
00:07:53I mean, seriously, look at that Vim logo, okay?
00:07:56Now, sure, that burned down a forest,
00:08:00but it's this overselling.
00:08:02And so it just means every time I use AI,
00:08:05my expectation is so high,
00:08:08but my reality is so much different.
00:08:11I've done about nine hours straight of vibe coding
00:08:13just recently, trying to rebuild
00:08:15just a beginner Cloudflare application:
00:08:17one worker and a container.
00:08:19I've spent about $75 to $100 doing this.
00:08:22Out of nine hours, I got something that could be generated
00:08:25in maybe 15 minutes if you're familiar.
00:08:27And granted, hey, prompt issues, I was using planning,
00:08:30I was using Opus 4.5 Max.
00:08:32I was really asking a lot of questions.
00:08:34It went off the rails a lot.
00:08:36You know, maybe I was doing the wrong thing, whatever.
00:08:39It doesn't matter.
00:08:40I did something and I had an okay result.
00:08:42There are some parts of it I really liked.
00:08:44There are some parts of it I didn't like.
00:08:45There's parts of Cloudflare that I'm not familiar with
00:08:47in which it saved me like an hour of research.
00:08:50There's parts that I was familiar with
00:08:52and it just did a horrible job.
00:08:53And so it's like, hey, that was a cool experience.
00:08:55I learned a lot.
00:08:56I know more things about it, but this wasn't, oh my gosh,
00:09:00it did everything I did in a year
00:09:01and it completely wiped everything out.
00:09:03It's more like, hey, there's a bunch of sharp edges.
00:09:05There's some cool parts.
00:09:06There's not some cool parts.
00:09:07Like I just don't understand why people keep trying
00:09:10to sell it as something fantastic.
00:09:12It would literally make so many people
00:09:15stop calling things slop and just go, oh yeah, yeah,
00:09:18you gotta be careful about that, right?
00:09:19Like yeah, we would just talk about it normally
00:09:21if we didn't get just hyped up constantly.
00:09:24I am so sick of my expectations being here
00:09:27and my reality being here.
00:09:29Anyways, I just wanted to yap about this.
00:09:30I just, I'm sorry.
00:09:31I feel just frustrated by the whole thing
00:09:33'cause I just feel like this last two weeks on Twitter
00:09:35have been just nothing but just the craziest amount
00:09:38of Claude code glazing in my entire lifetime.
00:09:41I used the TUI app.
00:09:43There was just like, I mean,
00:09:44I encountered so many little bugs with it,
00:09:46which is also shocking.
00:09:47Flickering, ultra think not actually even coloring
00:09:50ultra think properly with the rainbow.
00:09:52Like it's just, man.
00:09:54I mean, those aren't even, I mean, what is going on here?
00:09:57If AGI's landed, I would like a working TUI app, okay?
00:10:01The scroll, the scroll doesn't even work right.
00:10:03The control O doesn't work right.
00:10:05You cannot control O open or expand.
00:10:07I mean, there was just so many issues, but again,
00:10:10it could be skill issues on my behalf.
00:10:12The name is the skill issue, Wijin.
00:10:15Also, have you noticed one thing?
00:10:17Why do communists and people who are super enthusiastic
00:10:21about AI always talk the same?
00:10:23They're always like, oh, well, you didn't actually do it right.
00:10:25See the problem was you actually, see you didn't do it right.
00:10:29It's just like, okay, we're all no true Scotsman here.
00:10:32I didn't realize that.
00:10:32Every bad experience is a no true Scotsman.
00:10:35Every good experience is, yeah,
00:10:36that's why all the losers are being left behind.
00:10:39I don't get it.
00:10:40I don't know.
00:10:40Sorry for the extra, bye bye.
00:10:42Hey, do you wanna learn how to code?
00:10:43Do you wanna become a better backend engineer?
00:10:45Well, you gotta check out boot.dev.
00:10:47Now, I personally have made a couple courses from them.
00:10:49I have live walkthroughs, free available on YouTube
00:10:52of the whole course, everything on boot.dev.
00:10:54You can go through for free,
00:10:56but if you want the gamified experience,
00:10:58the tracking of your learning and all that,
00:11:00then you gotta pay up the money, but hey, go check them out.
00:11:02It's awesome.
00:11:03For any content creators you know and you like,
00:11:05make courses there, boot.dev/prime for 25% off.