19 Uncomfortable Truths About Human Nature - Gurwinder Bhogal

Chris Williamson

Transcript

00:00:00- It's been too long, man.
00:00:01You write these awesome things on the internet.
00:00:03This is the eighth time we've done this now
00:00:05for the people that haven't seen you before.
00:00:06You come up with some of my favorite aphorisms
00:00:08and insights and stuff.
00:00:10And we just do that.
00:00:11We're kind of like the Bonnie Blue
00:00:13of interesting insights about the internet.
00:00:15We're just taking whatever we get.
00:00:17It's high velocity stuff.
00:00:19The first one that I want to get into,
00:00:21the oxytocin paradox.
00:00:22This is one of yours.
00:00:23Oxytocin, the love hormone, can also make people spiteful.
00:00:27Cruelty is not simply the opposite of compassion.
00:00:30It's often adjacent to it.
00:00:32For instance, the platform most dominated
00:00:34by social justice activists, Bluesky,
00:00:37is also the one with the highest support for assassinations.
00:00:41Beware of those quick to show empathy,
00:00:43for they are often just as quick to show barbarity.
00:00:46- Yeah, so this is a finding
00:00:50that I sort of came across quite recently,
00:00:52but it confirms something I've long known,
00:00:55which is that people who outwardly express a lot of empathy
00:00:59tend to also be capable of cruelty
00:01:02to that same extent.
00:01:03And I first learned about this
00:01:07from a book called "Against Empathy"
00:01:10by Paul Bloom, who's a psychologist.
00:01:14And in this book, I think you've had him on the show,
00:01:17he basically talks about
00:01:21how people tend to assume that empathy
00:01:25is just a good thing overall,
00:01:27that we need more empathy,
00:01:29that empathy is in short supply.
00:01:32But really, empathy is in-group loyalty.
00:01:36That's what it is, because we're tribal animals.
00:01:39And the way he describes it is that when you empathize,
00:01:43you don't empathize
00:01:46with everybody at the same time.
00:01:47You empathize with select people.
00:01:49Empathy, he says,
00:01:50is like a spotlight.
00:01:52So you shine it on people,
00:01:55a small group of people at a time,
00:01:56or just an individual at a time.
00:01:58But while you have empathy shined on that person,
00:02:01everybody else is in darkness,
00:02:02which basically means that you don't have
00:02:06any real feelings for the people
00:02:07that are outside of that spotlight.
00:02:09So what this can mean is that, well,
00:02:13let's take a real-world example.
00:02:15Let's say you're somebody who empathizes
00:02:18with the plight of the Palestinians.
00:02:20So you'll have a lot of love for those people,
00:02:23and you'll be very, very concerned about them.
00:02:25But there's a kind of yin-yang effect
00:02:28where because you have so much concern for them,
00:02:31you have negative concern for Israelis.
00:02:33So it's not like you just have love for one group of people
00:02:38and then everybody else you're sort of neutral to.
00:02:41It can actually have almost like a zero-sum effect.
00:02:44The more empathy you have for one group of people,
00:02:46the less empathy you have for other people.
00:02:49And this is, I think, a major driver
00:02:51of sort of cruelty and spite in the world.
00:02:54When you consider the people that go out there
00:02:56and commit political violence,
00:02:58what you often see is that these people empathize
00:03:00very strongly with one group of people.
00:03:03So again, if we go with the Palestinian analogy,
00:03:06a group like Hamas, for instance,
00:03:08now Hamas have a lot of empathy for Palestinians,
00:03:11at least they do claim to.
00:03:13But then that also equates to
00:03:17corresponding hostility toward Israelis.
00:03:21You see it also with, again, with the example
00:03:24that I gave in that piece, which is about Bluesky.
00:03:27So Bluesky obviously is where
00:03:29all the social justice people hang out.
00:03:31It's basically all the refugees from Musk's X.
00:03:34So these are all people that you would think
00:03:38would be extremely compassionate, extremely sort of empathic.
00:03:42And they are, they are, but only to a small group of people.
00:03:45For example, the left, when they call for empathy,
00:03:48they don't call for empathy for right-wingers.
00:03:51They call for empathy towards immigrants
00:03:54or towards trans people.
00:03:57So their empathy is very selective.
00:04:00And this is why when you look at recent research,
00:04:03you find that the amount of support for assassinations
00:04:07is strongest amongst the people that you would expect
00:04:09to be the most compassionate, basically.
00:04:11- Well, you saw that with Luigi Mangione, right?
00:04:15That he had a manifesto.
00:04:17He was very empathetic toward people
00:04:19who'd been screwed over by healthcare services.
00:04:21People who'd had their healthcare denied
00:04:23and their claims that had been rejected
00:04:25due to, you know, squirrelly manipulation behind the scenes.
00:04:30And that resulted in him shooting a guy in the head.
00:04:33- Yeah, yeah, so I mean, yeah, so I sort of met Luigi
00:04:41in 2024 and he seemed like a really nice guy, you know?
00:04:46I can't say a single bad thing about him
00:04:47from our conversation.
00:04:48He really did seem like a genuinely nice person.
00:04:50- You spoke to him for a couple of hours, right?
00:04:52- Yeah, I had a two-hour conversation with him
00:04:55because he was, you know, he was a big fan of my writing.
00:04:57And so he became a founding member
00:04:59and then we ended up having a two-hour video call.
00:05:01And yeah, he seemed like,
00:05:03he genuinely seemed like a really nice guy.
00:05:05And I just, you know, did not have any idea
00:05:07that he was planning this.
00:05:09I don't know if he was planning this
00:05:11at the time that I spoke to him, but I wasn't aware of it.
00:05:14I mean, although I was shocked, obviously,
00:05:15because when somebody you know is in the news
00:05:17for something like that, of course it's gonna be shocking.
00:05:20But at the same time, it didn't surprise me.
00:05:21From an intellectual point of view, it didn't surprise me.
00:05:24Because I've interacted with some extremely dangerous people.
00:05:28Early in my sort of writing career,
00:05:30I hung around with Al-Muhajiroun,
00:05:33just to try and find out who they are.
00:05:34And Al-Muhajiroun is the UK's deadliest jihadist organization.
00:05:39They've been responsible for quite a lot of
00:05:41terrorist attacks on UK soil.
00:05:43And I was sort of hanging out with these people for a while
00:05:47just to find out how their minds work.
00:05:50And they were really, really friendly people.
00:05:52'Cause they thought I was Muslim.
00:05:54Because I speak the same language as them.
00:05:55And so I was able to pretend I was one of them.
00:05:59And they were really, really nice to me.
00:06:01They would, you know, they would like,
00:06:02if they were going to the shop,
00:06:03they would ask me if I wanted anything.
00:06:04They were just kind of like really always concerned,
00:06:07you know, like, and stuff, and they barely knew me.
00:06:10And so, you know, I was kind of like,
00:06:12well, this was a bit strange,
00:06:13but then I'd later learn that one of them
00:06:16went to Syria to become a bomb maker for ISIS.
00:06:21He blew his arm off.
00:06:22Another thing he did was actually, before he did that,
00:06:26he stabbed a guy in the eye
00:06:27for apparently insulting the Prophet Muhammad.
00:06:30And then when he was on bail,
00:06:33he was able to skip bail.
00:06:35He fled, he went to Syria, became a bomb maker,
00:06:39blew his arm off, and then he got killed in a strike.
00:06:42So, you know, this guy's name was Abu Rahin Aziz.
00:06:45He also used to go by the name Abu al-Britani.
00:06:49And I think he was actually allowed to leave by MI6
00:06:54so that they could track him and then blow him up.
00:06:56But that's a whole other story.
00:06:57But basically, he was somebody who was really nice.
00:07:00He was a really kind guy. - But also the sort of guy
00:07:02that would say, hey, do you want a grenade bar?
00:07:04I'm going to the corner shop.
00:07:07Would you, do you want some crisps
00:07:09or some chocolate or something?
00:07:11- Yeah, I mean, they were just, you know,
00:07:12they were always looking out for each other.
00:07:14And they had a lot of empathy for each other, you know,
00:07:16and for their fellow Muslims.
00:07:18They had a lot of empathy for them.
00:07:20But then they had no empathy for, for example, Jewish people.
00:07:23I witnessed a lot of antisemitism
00:07:24when I was in Bury Park in Luton,
00:07:27which is just a Muslim sort of enclave.
00:07:31They were very, you know, they were very antisemitic.
00:07:35They dehumanized Jews and especially Israelis.
00:07:40But they had all the empathy in the world for Muslims.
00:07:44And, you know, you see this on the other side as well.
00:07:46You know, you see Israelis who have all the empathy
00:07:48in the world for Jews,
00:07:49but don't have any for the Palestinians.
00:07:51So it's, you know, this is not like just one side.
00:07:53This is a common human trait.
00:07:54You see this everywhere.
00:07:55You see it amongst the left.
00:07:57You see it amongst the right.
00:07:58You even see it amongst centrists.
00:08:00So, you know, when people say,
00:08:02"Oh, we need more empathy," I think, "Mm, do we?"
00:08:04You know, I think maybe the problem is
00:08:05that we have sort of selective empathy.
00:08:08And we maybe need to sort of understand
00:08:10that everybody's a human being,
00:08:12not just the people that we empathize with.
00:08:13- Yeah, less tribalism, not more empathy.
00:08:16- Exactly. - Yeah, interesting.
00:08:19Next one, Rumpelstiltskin effect.
00:08:21To name a problem is to tame it.
00:08:23Diagnosing one's suffering makes it feel more meaningful
00:08:26and thus manageable.
00:08:28Even if the diagnosis is wrong,
00:08:30major depressive disorder is easier to live with
00:08:32than anonymous sadness.
00:08:35This is one reason for the recent surge in diagnoses
00:08:37of disorders like depression, autism, and ADHD.
00:08:40And I pulled some data.
00:08:42Anxiety is now the most common mental health condition
00:08:45in the world, per the Global Burden of Disease study.
00:08:48359 million people, that's 4.4%, have an anxiety disorder.
00:08:53332 million, that's 4%, have depressive disorders.
00:08:5737 million, bipolar. 23 million, schizophrenia.
00:09:0116 million have eating disorders.
00:09:03That's, I would guess, bringing the Rumpelstiltskin effect
00:09:06into real life.
00:09:07- Yeah, so the Rumpelstiltskin effect takes its name
00:09:13from the fairy tale of Rumpelstiltskin,
00:09:15which most people should be familiar with
00:09:16from their childhood.
00:09:17I vaguely remember it, but what I remember is
00:09:20that basically Rumpelstiltskin is an imp
00:09:22who steals a woman's baby.
00:09:26And in order to get it back, she has to find out his name.
00:09:29And then one day she hears him dancing around a fire,
00:09:32singing about how nobody knows his name
00:09:35because his name is Rumpelstiltskin.
00:09:37So he's not very bright, but after she finds out,
00:09:40she has power over him.
00:09:42So it's the idea of when you name something,
00:09:43you have power over it.
00:09:45And there's a lot of kind of evidence of this
00:09:48because I've actually written about this in detail.
00:09:52When you look at, for example,
00:09:54the ways in which people come to understand themselves,
00:09:59how they come to sort of understand their own identities,
00:10:02often through their ailments.
00:10:04And this can kind of bring them a sense
00:10:07that they're in control.
00:10:09Well, at least it gives them more of a sense
00:10:11of control over their problems.
00:10:13So, for example, if you are shy, right,
00:10:18then you might consider that your shyness
00:10:21is a personality defect.
00:10:23And this can be quite hard on people, right?
00:10:25So it can compound the anxiety that you already feel
00:10:30from your shyness by making you believe that you're worthless
00:10:33or that you're defective in some way
00:10:36because it's a personality trait that you can't really grasp.
00:10:39You don't know why you're shy.
00:10:41You don't know why.
00:10:41And so you're just kind of stuck with it.
00:10:44But if somebody says to you,
00:10:46"Oh, no, no, no, you're not shy.
00:10:47"You have social anxiety disorder."
00:10:50Then suddenly you have something that you can direct
00:10:54all of your sort of your frustrations towards.
00:10:57You know, you have something concrete now.
00:10:59You're like, "Oh, okay."
00:11:00So now you can come to understand
00:11:01a little bit more about yourself
00:11:02by learning more about social anxiety disorder.
00:11:05So it helps you to sort of come to terms with your problems.
00:11:09There are many incentives
00:11:10why you would want to label yourself in such a way.
00:11:12I mean, one of the other incentives
00:11:14is that it kind of shifts responsibility from yourself
00:11:18to something that you can't really do much about.
00:11:20So something like your neurochemistry
00:11:23or your genetics or something like that.
00:11:26So you're like, "Okay, well, I can't do anything about it
00:11:29"because this is social anxiety disorder."
00:11:32But then at the same time,
00:11:33this can also prevent you from getting that thing treated
00:11:38because then you can kind of become quite resigned
00:11:40in a sense.
00:11:41I think with kind of labels like this,
00:11:43I think that they can be useful, right?
00:11:45But I think naming only helps
00:11:49if it leads to a tractable next step,
00:11:51a real tangible next step.
00:11:54Because if the label replaces action,
00:11:57then it's just an excuse, right?
00:12:00And I think that's the problem
00:12:01that a lot of people are facing at the moment
00:12:03where they're using the label as an excuse
00:12:05rather than as a motivation for more action.
00:12:07Because if you have,
00:12:08let's go back to the example of social anxiety disorder,
00:12:11but there's two ways to cope with labeling your problem
00:12:14as social anxiety disorder.
00:12:16You can either resign yourself and say,
00:12:18"Well, this is something biological or psychological
00:12:22"that I can't really do anything about.
00:12:23"So let's just not bother trying to fix it."
00:12:25That's one path.
00:12:27The other path is to say,
00:12:28"Okay, so what are the causes of social anxiety disorder?
00:12:31"What are the treatments for social anxiety disorder
00:12:33"and what's gonna work best for me?"
00:12:35Obviously, the latter is a much healthier attitude.
00:12:38But I think that too many people,
00:12:39what they're doing is they're using the label as an excuse
00:12:43to prevent action.
00:12:44So it actually has the opposite effect.
00:12:46They never fix it.
00:12:48So while I'm not against labeling one's problems
00:12:51in such a way,
00:12:52I think that it should always serve to further action.
00:12:55You should actually,
00:12:55it should be a step towards further action.
00:12:57If it's a step towards inaction,
00:12:59then it's just an excuse basically.
00:13:01- I saw a clip of someone on some women's show
00:13:06talking about, maybe it was Oprah,
00:13:11talking about how obesity is a disease
00:13:14and Ozempic is the medicine to the disease.
00:13:17And you wouldn't tell somebody that has diabetes
00:13:21that they shouldn't take their insulin
00:13:22because they have a disease
00:13:24and this is medicine for the disease.
00:13:26And it reminds me a little bit of concept creep,
00:13:29that idea that you taught me about probably four years ago,
00:13:32where over time, as racism goes down,
00:13:36objective measures of racism,
00:13:38which I know that you've done tons of research into this
00:13:40in your previous life, objective racism goes down,
00:13:44but subjective racism goes up
00:13:45because the demand for racism outstrips its supply.
00:13:49And the only way that you can keep the volume of racism going
00:13:52so that people who comment on it
00:13:53have got something to talk about and campaign against
00:13:56is to broaden the definition of racism
00:13:58until it becomes so large
00:14:00that basically anything could be racism
00:14:02or anything could be transphobia
00:14:04or anything could be xenophobia or anything could be.
00:14:06And the same thing goes for diseases, right?
00:14:09If you're diagnosing some issue,
00:14:11you had a passage from a book,
00:14:17maybe he was a clinical psychologist or something
00:14:19and he was saying how many patients he'd ever seen
00:14:21in his entire career who'd come in
00:14:23and labeled their suffering as just sadness.
00:14:27And it was three, three patients across thousands
00:14:31had ever come in and said sadness.
00:14:33Everybody else was depression or anxiety or schizophrenia
00:14:37or imposter syndrome or whatever.
00:14:39Even imposter syndrome, right?
00:14:41The fear that other people expect a standard of you
00:14:45which you can no longer meet.
00:14:48That, I mean, there are a million different terms for it.
00:14:52It could be uncertainty.
00:14:54It could be humility and humbleness.
00:14:57It could be low confidence.
00:14:59It could be low self-esteem.
00:15:00It could be low self-belief.
00:15:01But imposter syndrome,
00:15:03to put the word syndrome after something,
00:15:05and it's a cool term and I think it's a useful term
00:15:08to name something, but the danger is pathologization.
00:15:12And yeah, you're right.
00:15:13If being able to put a name to imposter syndrome
00:15:18means that you go,
00:15:21I'm gonna learn a little bit about what the research says
00:15:24to do with imposter syndrome.
00:15:25Oh, well, actually, if I do some positive self-appraisal
00:15:28and I journal a little bit and I have a gratitude practice,
00:15:31it seems that I can overcome my imposter syndrome.
00:15:34How wonderful.
00:15:35But if it is, oh, let's say we're in a different world
00:15:39that didn't have Ozempic.
00:15:40I have obesity.
00:15:41It's a disease.
00:15:42I can't lose weight.
00:15:44You have outsourced all of your agency now.
00:15:46So yeah, you have used the naming of it
00:15:51as a roadblock to action as opposed to a GPS
00:15:56that can help you find how you should act.
00:16:00- Exactly, yeah.
00:16:01And yeah, so the passage that you're talking about
00:16:03is from Theodore Dalrymple, who's a sort of
00:16:08clinician-cum-writer.
00:16:10And yeah, he's sort of talked about this quite a bit.
00:16:14But I mean, medicalization is a real problem.
00:16:16It's been a major problem since the 1970s.
00:16:19I think it's kind of like it's something that's sweeping
00:16:24across pretty much all sort of fields.
00:16:27And the reason is because it's kind of like the alignment
00:16:31of perverse incentives.
00:16:32So you have patients, right?
00:16:33Patients who want easy answers to their problems.
00:16:36So they are incentivized to pathologize.
00:16:39Then you have the medical industry,
00:16:41which is both financially and ideologically incentivized
00:16:45to sort of treat more and more things as medical problems
00:16:48for obvious reasons.
00:16:49Firstly, they make money if they are treating more things.
00:16:55So they have a sort of incentive
00:16:57to sort of just creep their definitions outwards.
00:17:02And then they have ideological issues as well.
00:17:04And this is because obviously they are not looking
00:17:08for signs of health.
00:17:11They're looking for signs of disease.
00:17:15That's essentially what physicians do, right?
00:17:17They don't look for signs of health.
00:17:18They look for signs of disease.
00:17:19And because of that, there's a certain sense
00:17:20of confirmation bias where if you're looking for something,
00:17:23you will tend to find it.
00:17:25And so it's very easy if you have that kind of mindset,
00:17:27the mindset of a clinician or a doctor
00:17:30where you're looking for disease to see it,
00:17:31even if it's not there.
00:17:33And this is, again, this has been shown throughout history.
00:17:38In the sort of, you know, in the 1980s,
00:17:40there was the whole thing about multiple personality disorder.
00:17:43It's now known as dissociative identity disorder.
00:17:46And this was basically from like, you know,
00:17:49you can actually trace the development of this,
00:17:52what is essentially like a kind of moral panic.
00:17:54I don't believe that multiple personality disorder
00:17:57or dissociative identity disorder are real things.
00:18:01I think that they're actually fictions
00:18:03because I've actually looked at the history
00:18:04and it really began in sort of like the late 1970s
00:18:08where I think there was one case of somebody claiming
00:18:11to have multiple personalities.
00:18:13And this case went kind of viral
00:18:15as things might go viral in those days,
00:18:16which was through newspapers.
00:18:18And after this, suddenly loads of people
00:18:20started coming forward saying that they also had this issue.
00:18:23And what's interesting is that the number
00:18:26of alternate identities that people claim to have
00:18:30increased over time.
00:18:31So I think like initially people averaged
00:18:34one alternate personality.
00:18:36And then apparently by like the 1990s,
00:18:39there was an average of about 17.
00:18:40It was absolutely ridiculous.
00:18:42Like just more people, just more and more,
00:18:44they're having more and more alternate identities, you know,
00:18:47and there's no real sort of like neurology behind it.
00:18:50It's just complete sort of nonsense.
00:18:52So, you know, this to me is a very good example
00:18:56of this whole pathologization sort of pandemic as I call it,
00:19:01where it's not just that definitions expand,
00:19:04but whole diseases can be invented out of nothing
00:19:07simply because people want to put a name on their discomfort.
00:20:08Malingering. Between 20% and 40% of undergraduates
00:20:13at many elite American universities
00:20:15are now registered as disabled.
00:20:17In the UK, one quarter of the entire population
00:20:20now identifies as disabled.
00:20:22The rewards for claiming a disability
00:20:24now outweigh the stigma and those hurt most
00:20:27by all the pretenders are ultimately those
00:20:29with genuine disabilities.
00:20:30- Yeah, so we're kind of living in a world now
00:20:35where I don't really see much stigma towards disabled people,
00:20:40at least not institutional stigma.
00:20:42I see a lot more benefits being given to people
00:20:46who claim to have disabilities.
00:20:47So in the example that I give,
00:20:49if you look at universities like elite universities
00:20:52such as Stanford, Harvard, Yale,
00:20:55they have really high percentages of disabled students
00:20:58or at least students who claim to be disabled.
00:21:00And when you look at why this is,
00:21:02you see it's pretty obvious.
00:21:04If you are registered as disabled
00:21:06with one of these universities,
00:21:07then you get extra time on exams.
00:21:09That's just one of the benefits you get,
00:21:10but you also get other benefits as well.
00:21:12But that's the main benefit,
00:21:14probably the most lucrative benefit.
00:21:16And so you get a lot of rich kids.
00:21:19It's weird because when you look at the people
00:21:21who are primarily claiming disability,
00:21:24it tends to be the rich kids,
00:21:26which is quite an odd sort of correlation, right?
00:21:30And it seems to be because they are the ones
00:21:33who can pay doctors to essentially
00:21:35fabricate their disabilities.
00:21:37- Oh, no. (laughs)
00:21:40- So yeah, so basically these kids now
00:21:43are getting extra time in exams
00:21:45because they're basically saying,
00:21:46oh, I have ADHD, I'm on the spectrum.
00:21:49I have some problem,
00:21:52like I have constant pain in my left leg.
00:21:54It could be anything, right?
00:21:56They'll just basically say something,
00:21:58you know, I'm dyslexic or whatever.
00:21:59And so what happens is that these kids
00:22:02basically get the extra time in exams.
00:22:05And why this is bad,
00:22:06I mean, it's obviously bad for being dishonest,
00:22:09but it's extra bad because it essentially makes it harder
00:22:13for people with genuine disabilities to be believed,
00:22:16like when they have a disability.
00:22:18Because it is true that there are some people
00:22:19who have disabilities that are not obvious
00:22:21that require a physician to actually do a check on them
00:22:24to find out.
00:22:25You know, I have an aunt, for example,
00:22:26who has osteoporosis,
00:22:28and it's not obvious watching her even walk
00:22:31that she has osteoporosis,
00:22:33but she actually does have it because it's all,
00:22:36you know, from x-rays,
00:22:37you can see that her bones are basically crumbling.
00:22:40And so, you know, it's actually quite common
00:22:44where somebody can have a disability, but it's not obvious.
00:22:46And so if you have like between 20% and 40% of everybody
00:22:50claiming to have a disability,
00:22:51then the people who actually have a disability
00:22:52get less attention, they're not believed as much,
00:22:56they're treated with skepticism.
00:22:58So not only are we creating a victimhood culture,
00:23:01but we're also creating a cynical culture
00:23:03where the people who genuinely need help won't be believed.
00:23:06So, yeah, it's a pretty bad situation, yeah.
00:23:09- Sloppaganda.
00:23:10More online articles are now written by AI than by humans,
00:23:13and research is increasingly finding that AI is better
00:23:16at persuading people than people are.
00:23:18Who wins in a world of unlimited propaganda?
00:23:21Not those with the best arguments,
00:23:22but those with the most slopp.
00:23:24This is similar to Moloch's bargain.
00:23:26When LLMs compete for votes or social media likes,
00:23:29they push lies and rage bait to win,
00:23:31even when explicitly instructed to stay grounded and honest.
00:23:34If chatbots conclude that getting our attention
00:23:36requires lying to us, is the AI misaligned or are we?
00:23:43- Yeah, so there's been a lot of talk about
00:23:46the kind of AI-driven disinformation age
00:23:49where basically nobody will be able to know what's true
00:23:53and everybody's gonna believe lies
00:23:54and all this kind of stuff.
00:23:55And I mean, yeah, that's probably part of it.
00:23:59I don't think it's as serious as people are claiming.
00:24:02I don't think the serious part of this
00:24:04is that people are gonna believe lies,
00:24:06because people have always believed lies.
00:24:10If you go back throughout any point in history,
00:24:13there were a lot of consensus beliefs
00:24:16that were ultimately proved to be wrong.
00:24:18So I don't actually think that people believing falsehoods
00:24:20is necessarily a bad thing.
00:24:21I think most of what people believe,
00:24:23as opposed to what they know, is false anyway.
00:24:26I think that the bigger problem
00:24:28is not the dissolution of truth,
00:24:31but the dissolution of trust.
00:24:33I think that's far more important,
00:24:34because a society can survive without truth
00:24:38pretty much most of the time.
00:24:40As long as you have very basic truths,
00:24:42like knowing that gravity is a thing, for example,
00:24:45as long as you have basic truths, society can survive.
00:24:48You don't need complex truths for society to survive.
00:24:51And history has demonstrated
00:24:53that beyond reasonable doubt.
00:24:56But trust is a whole different ballgame.
00:24:59A society can't survive without trust,
00:25:03because pretty much everything depends on
00:25:07being able to trust other people in society.
00:25:10If you can't trust other people,
00:25:12then you don't have a society.
00:25:13It's literally the glue that binds a society together.
00:25:17And what I think is a problem
00:25:19is not that people will believe falsehoods.
00:25:21I think the problem is that the cost
00:25:24of determining what's actually true is gonna become so high.
00:25:27It's gonna require so much effort
00:25:29that people are essentially gonna give up
00:25:31really valuing truth as a principle.
00:25:34- This is one of my favorites from you, reality apathy.
00:25:37When the sheer volume of conflicting information
00:25:39makes the effort of finding the truth
00:25:41costlier than the value of knowing it,
00:25:43people give up trying to be accurate
00:25:45and instead choose whatever bullshit stinks least.
00:25:47Slop doesn't just threaten the truth,
00:25:49but the very worth of truth.
00:25:50And it's this sort of overwhelm.
00:25:54Sometimes the goal of propaganda isn't to make you believe
00:25:57any one narrative.
00:25:59It's simply to make you more pliable
00:26:01by making you not want to believe anything.
00:26:05Earlier on today, in one of the old group chats
00:26:08from the guys that used to work for me in Newcastle,
00:26:10one of the guys said,
00:26:11is anybody else's algorithm getting peppered
00:26:15with all of this Epstein stuff at the moment?
00:26:17These are blue collar dudes from the Northeast of the UK.
00:26:21Maybe they're working in London or something.
00:26:23This is not, Epstein is not supposed
00:26:25to sort of cross their threshold.
00:26:27And it's obviously hit a limit at a volume
00:26:29where they think, holy shit,
00:26:31like this is so much Epstein stuff.
00:26:34I've just seen, I'm now convinced
00:26:37that he's playing Fortnite in fucking Israel.
00:26:39I don't know what to believe anymore.
00:26:41And that's literally reality apathy.
00:26:43And it was so funny to see that message come in
00:26:46and think that this is the overwhelm of information
00:26:49and conflicting points of view going in opposite directions,
00:26:52literally happening in front of my eyes.
00:26:54- Yeah, and I think one of the challenges going forward
00:26:59is gonna be trying to convince people
00:27:01that it's actually worth pursuing the truth.
00:27:04I think more than actually convincing them
00:27:05of any particular truth,
00:27:07just convincing them of the value of truth
00:27:09is gonna be extremely important
00:27:11because we're essentially entering a world of virtual reality.
00:27:15You can essentially create your own reality now.
00:27:18You can do it both figuratively
00:27:19through social media echo chambers,
00:27:21but you can also do it literally
00:27:23by essentially just sequestering yourself in your bedroom
00:27:28and living your entire life through your headset
00:27:31or your laptop screen or whatever,
00:27:34and just using AI to just generate whatever you want,
00:27:37whatever reality you want.
00:27:40This is not far away.
00:27:41I mean, there's recently been, I think it's Seedance,
00:27:44this new Chinese sort of video generation tool
00:27:48which is just insane. - Seedance.
00:27:50- Seedance, yeah.
00:27:51I think it's called Seedance, yeah.
00:27:53- Is that like Sora, it's Chinese Sora?
00:27:56- Much, much, much better than Sora.
00:27:58It's like a whole generation ahead of Sora,
00:28:01ahead of Veo 3,
00:28:03ahead of all the best frontier models in the West.
00:28:06This is something completely wild.
00:28:08I think China has got an edge in video generation
00:28:11because they don't have copyright laws,
00:28:13or at least they don't really care about copyright very much,
00:28:15whereas the West has an advantage in text-based generative AI
00:28:20because they don't have censorship laws.
00:28:22So there's this trade-off.
00:28:23- Whichever market has the poorest protections
00:28:27will get the most progress.
00:28:28- Yeah, basically, yeah,
00:28:29because that seems to me to be the bottleneck.
00:28:31- Oh, funny.
00:28:32Yeah, so you can make, oh, is that who made,
00:28:35I saw a pretty famous Dragon Ball Z recreation 3D thing.
00:28:40Have you seen this, is that made from that?
00:28:44Okay, I know exactly what you mean.
00:28:46And I saw it last night.
00:28:47And I remember thinking, fucking hell, that's really good.
00:28:51And they're still trying to use cell shading to make it,
00:28:54it's not supposed to look like people.
00:28:56It's supposed to look like cartoon people, but it's in 3D.
00:28:59And it's significantly better than Dragon Ball Z
00:29:03from a design standpoint.
00:29:04So yeah, I wondered what that was.
00:29:07Is that Sora?
00:29:07That seems really good.
00:29:08Is that fucking Nano Banana 5?
00:29:10What the fuck is going on?
00:29:11But it was this new thing that you're talking about.
00:29:13- Yeah, I think it's from ByteDance,
00:29:15which is the company that created TikTok and--
00:29:17- CapCut.
00:29:18- Yeah, basically CapCut and all the rest of that stuff, yeah.
00:29:21But it's wild.
00:29:22So we're basically entering this kind of virtual reality age
00:29:26where people can essentially create their own reality.
00:29:29Whatever they want to believe,
00:29:30they can make it at least seem true enough
00:29:33by curating information online.
00:29:37So yeah, we need to teach people to actually value truth
00:29:40as a species if we want to actually progress.
00:29:43- What was that line around dead internet theory,
00:29:47people being worried that all of the content on the internet
00:29:50is just going to be made by robots,
00:29:53we're worried about the fact that unthinking,
00:29:58replicative automatons are going to be producing
00:30:01most of the information that we see online.
00:30:04The future that we fear will come to pass
00:30:08has already come to pass
00:30:10because most people blindly just repost
00:30:12what they see in any case.
00:30:13Like we're worried about the fact
00:30:15that these disembodied fucking AIs are posting stuff.
00:30:19Meanwhile, someone that doesn't read an article
00:30:23or watch a video outside of the first 15 seconds
00:30:27decides to spew the half-baked opinion, which isn't theirs.
00:30:31You know, they're being marionetted
00:30:32by the few original thinkers that came before them
00:30:35and now just saying the nearest close, what was it?
00:30:40The new hill to die on that they've just decided
00:30:46they plant this flag and like this five minute old opinion
00:30:49is the new thing that they're going to wrap
00:30:52their entire identity around.
00:30:53You're already doing, the dead internet theory
00:30:57has been here since social media was here.
00:30:59People were unthinking in the way that they reposted
00:31:02and commented on stuff.
00:31:03They weren't being subtle and nuanced.
00:31:06And now all that you're worried about,
00:31:08it's the exact same as people being worried
00:31:10about self-driving cars that are significantly safer
00:31:14than humans are, but they've got a combination
00:31:16of naturalistic fallacy and some weird preference
00:31:19that they'd rather die by a human driver
00:31:21than be saved by a robot.
00:31:23And it's kind of the same, well, you know,
00:31:24I'd rather be lied to by an unthinking human idiot
00:31:29than convinced by an unthinking robot super genius.
00:31:33- Yeah, I mean, few people are willing to admit
00:31:36the similarities between humans and chatbots.
00:31:39You know, like there's a lot of people saying,
00:31:41oh, you know, well, you know,
00:31:44these chatbots aren't intelligent,
00:31:45they're just predicting the next token.
00:31:46But then you consider, you know, what are humans doing?
00:31:49You know, a lot of the time,
00:31:51they're just predicting the next token too.
00:31:53They're just, you know, they're just regurgitating,
00:31:55you know, what they've heard and kind of just developing
00:31:58explanations about the world based on that.
00:32:00You know, going by vibes is probably
00:32:02how people would describe it today.
00:32:04So, you know, I think, yeah, one of the good things
00:32:07about sort of the whole AI age is it's really allowed us
00:32:11to understand that a lot of what we thought were
00:32:14unique to humans are actually just basic algorithms,
00:32:16you know, just ways that we organize information,
00:32:18the way that we generate beliefs.
00:32:21A lot of it is, you know, people don't really understand
00:32:22what they're saying.
00:32:23They're just kind of regurgitating what they heard.
00:32:25And that is essentially what a chatbot does in a sense.
00:32:28And so it helps us to really understand how automated
00:32:32so much of our belief formation is.
00:32:35It's why we need to, you know, have more agency.
00:32:37And actually,
00:32:40if we want to be distinguishable from chatbots,
00:32:43then we need to actually sort of,
00:32:45we need to strengthen the one thing that we have
00:32:47that chatbots can't replace, and that's agency.
00:32:50The ability to actually act independently
00:32:54and to actually think about what you're doing
00:32:56rather than simply reacting to your circumstances.
00:32:59- 1% rule.
00:33:02In online communities, around 1% of users
00:33:04produce almost all of the content.
00:33:06As such, what you see online is not representative
00:33:08of humanity, but merely a loud, obsessive,
00:33:10and often narcissistic, psychopathic, and low IQ minority.
00:33:14Social media is literally a freak show
00:33:17and consuming only content that reinforces your views
00:33:20is intellectual incest, producing beliefs
00:33:23that are increasingly frail and deformed.
00:33:25- Yeah, so, you know, when I go online on social media,
00:33:31I often can sometimes, well, I sometimes feel disheartened.
00:33:35You know, it can, I think we've discussed this before,
00:33:38where, you know, you go on social media
00:33:39and you just see just loads of just crap on your timeline
00:33:43and just the most ill-informed opinions
00:33:45and people getting outraged over just nonsense,
00:33:48and it kind of can destroy your faith in the human race.
00:33:53I think Sam Harris, I think the reason why he left Twitter,
00:33:56I think he did describe it that way.
00:33:58He said that when he was on social media,
00:33:59it made him hate humanity.
00:34:01You know, and I can sympathise with that.
00:34:02- Yeah, you referred to it as the most pathological type
00:34:06of telepathy you can imagine, where all he could hear
00:34:09were the worst of everybody else's thoughts.
00:34:11- Yeah, and, you know, and sometimes, you know,
00:34:14it can sort of just really dishearten you.
00:34:16I have a few friends who, you know, on social media,
00:34:18and they sometimes have long breaks
00:34:20because it just completely just really demoralises them
00:34:24when they think, oh, this is what humanity is, you know,
00:34:26all this noise, like all this completely irrational noise
00:34:30just being thrown everywhere.
00:34:31But I think it's always helpful to remember
00:34:35that what you're seeing online
00:34:36is not actually representative of humanity.
00:34:39It's representative of the loudest
00:34:42and often the most obnoxious humans on the planet.
00:34:45And there's a lot of research to support this.
00:34:46You know, there's pretty consistent findings
00:34:49which find that people who are high in certain
00:34:52dark tetrad traits, particularly narcissism and psychopathy,
00:34:57tend to use social media more,
00:34:59but also they tend to engage
00:35:01in online political participation a lot more as well.
00:35:03They tend to engage in sort of online debates
00:35:06and things like that a lot more.
00:35:08And then you also have people who are essentially cluster B,
00:35:13you know, people who are really dramatic.
00:35:14Again, narcissism comes up here.
00:35:16Also histrionic personality disorder.
00:35:19Naturally, you know, the people that are--
00:35:20- What's histrionic personality disorder?
00:35:23- So histrionic is basically when you're a drama queen.
00:35:25Basically, it's when, you know,
00:35:27you just want to draw attention to yourself
00:35:30by playing the victim or by, you know, just catastrophizing,
00:35:35just making out like everything's worse than it actually is,
00:35:38just through theatrical behavior, basically.
00:35:41And so naturally this is a good fit for social media,
00:35:45this kind of behavior, you know,
00:35:47because obviously if you want an audience
00:35:48and if you want to play theatrics,
00:35:50where else would you want to go
00:35:51than a place where everybody else is freaking out
00:35:54and everybody's looking to be freaked out, you know?
00:35:56So obviously social media attracts
00:35:58the absolute worst of the human race.
00:36:00It attracts the most impulsive, the most theatrical,
00:36:02the most narcissistic, the most psychopathic,
00:36:05the most low IQ.
00:36:07You know, these are the, you know, often the worst people.
00:36:09Not saying that there aren't good people in social media,
00:36:10of course there are, but when you look at it
00:36:12from a statistical point of view,
00:36:13you have over-representation
00:36:15of the worst elements of humankind.
00:36:17- I also imagine even if you have somebody
00:36:19who is compassionate and well-meaning and delicate
00:36:21and thoughtful and high IQ,
00:36:24they're operating in an environment
00:36:26where they regress to the mean.
00:36:28And the mean is mean, oddly enough.
00:36:30I had one.
00:36:31So for the people that haven't heard us do this before,
00:33:33most of the stuff is me shamelessly shilling Gurwinder's stuff
00:36:36and then he says it back to me.
00:36:37But sometimes I bring stuff from home
00:36:39and I've got some that I brought from home.
00:36:41So this one's kind of related.
00:36:42Recursive red pill learning.
00:36:44Most people get their information from the internet.
00:36:46The stories online which garnered the most attention
00:36:48are the most extreme,
00:36:50meaning that influencers' unrepresentative insights
00:36:52are being trained on other influencers'
00:36:54unrepresentative insights,
00:36:56leading to self-reinforcing antagonism between the sexes.
00:36:59And this came out of a quote that I saw online,
00:37:01which is, "Having a boyfriend is embarrassing now,"
00:37:04which was that Variety article
00:37:06that came out about six months ago,
00:37:07has the same energy as,
00:37:09"The Kardashians made skinny go out of style,"
00:37:12in that neither is true if you just go outside.
00:37:15So the loudest stories, the biggest stories,
00:37:19the ones with the most upvotes on Reddit,
00:37:21by definition are the ones
00:37:23that are the most attention-grabbing,
00:37:24which means that they're the most extreme
00:37:26or unrepresentative.
00:37:27And that means if you spend most of your time
00:37:30learning about the world through the internet,
00:37:32what you see is the least representative presentation
00:37:35of what reality is like over and over again.
00:37:38And it just retrains you to expect that as normality.
00:37:42- Yeah, and I've seen this play out in real time
00:37:45because I've been on social media since around 2014.
00:37:50And in that time,
00:37:51I've stuck with pretty much the same group
00:37:53of sort of mutuals, mostly.
00:37:55And I've actually witnessed an interesting pattern,
00:37:58which is that the people who spend the most time online
00:38:01have become more unhinged and more extreme in their beliefs.
00:38:04This is something I've personally witnessed.
00:38:06I know that this is N equals one,
00:38:08but it's more compelling to me than studies
00:38:10because it's something I've literally witnessed happen
00:38:12in real time.
00:38:14And I think this is also supported by research as well.
00:38:18Some of this I wrote about in an article called "Dramageddon"
00:38:22about people talking about a civil war.
00:38:26And this talk has been going on since around 2021,
00:38:29like really serious talk.
00:38:31People like Elon Musk have sort of promoted this idea
00:38:34that there's gonna be a civil war
00:38:36between the left and the right in the US.
00:38:38And some people have said it will probably happen
00:38:41in Europe as well.
00:38:42But it's mainly, it seems to be much more in the US
00:38:44because the US is a lot more politically polarized
00:38:47than Europe is in general.
00:38:50And basically this idea is that a lot of these people
00:38:55think there's gonna be a civil war
00:38:56for precisely the reasons that you gave,
00:38:57which is that what we see is that what goes most viral
00:39:01are the so-called scissor statements,
00:39:03what Scott Alexander called scissor statements,
00:39:06which are statements that are deliberately designed
00:39:08to create debates, create arguments, basically.
00:39:13And this is one of the reasons why the media now,
00:39:18what they seek to do is they don't seek
00:39:20to just tell you things that are true.
00:39:22They seek to actually create statements or news reports
00:39:26that will divide people because when they do that,
00:39:30the two sides will argue over that issue.
00:39:32And in so doing, they will help that thing go viral.
00:39:36So for example, if you're the New York Times
00:39:40and you wanna go viral, how do you go viral?
00:39:42You're not gonna go viral by telling the truth.
00:39:44If you just state facts like about some sort of reporting,
00:39:48you're not really gonna go viral most of the time.
00:39:50But what will go viral is if you make a divisive claim,
00:39:54something that's gonna split the internet into two.
00:39:56So something like, oh, white people are privileged,
00:40:00are too privileged.
00:40:01If you say something like that,
00:40:03that's gonna divide the internet in half.
00:40:05You'll have half of the people be like, yeah,
00:40:07oh, white people are too privileged,
00:40:09we need to do something about it.
00:40:10And then you'll have the other half people say,
00:40:11no, no, no, no, this is all nonsense.
00:40:13This is based on false studies, bad studies,
00:40:16all this stuff.
00:40:17And so then they'll argue over it and in arguing over it,
00:40:20they're gonna make it go viral
00:40:21because then it's gonna appear on everybody's timeline.
00:40:23And then people are gonna be writing a Substack post about it.
00:40:25They're gonna be making videos about it.
00:40:28And so this all helps the original claim to go viral.
00:40:32And so this is the sort of tragic system in which we're in,
00:40:35in which just stating true things does not go viral,
00:40:39but dividing people,
00:40:40saying things that are gonna divide people does.
00:40:43And this is why I think so many people still believe
00:40:45that there's gonna be a civil war in the US,
00:40:47even though there's just, when you look at reality,
00:40:50there's just no inkling of this whatsoever.
00:40:52The polarization does exist,
00:40:53but the polarization exists amongst the top 1% of people
00:40:57on social media who are mostly engaged in politics.
00:40:59It doesn't really exist very much in the wider world.
00:41:03- In other news, Shopify powers 10%
00:41:08of all e-commerce companies in the United States.
00:41:11They are the driving force behind Gymshark, Skims,
00:41:13Alo and Neutonic, which is why I partnered with them.
00:41:16Because when it comes to converting browsers into buyers,
00:41:19they are best in class.
00:41:20Their checkout is 36% better on average
00:41:23compared to other leading commerce platforms.
00:41:25And with Shop Pay, you can boost conversions by up to 50%.
00:41:28They've got award-winning support
00:41:29there to help you every step of the way.
00:41:31Look, you are not going into business to learn how to code
00:41:35or build a website or do backend inventory management.
00:41:38Shopify takes care of all of that
00:41:40and allows you to focus on the job that you came here to do,
00:41:43which is designing and selling an awesome product.
00:41:46Upgrade your business and get the same checkout
00:41:48that I use with Neutonic on Shopify.
00:41:50Right now, you can sign up for a $1 per month trial period
00:41:54by going to the link in the description below
00:41:56or heading to shopify.com/modernwisdom, all lowercase.
00:42:00That's shopify.com/modernwisdom
00:42:02to upgrade your selling today.
00:42:04- Eustress. People have more comforts and conveniences
00:42:07than ever, yet reports of unhappiness are at an all-time high.
00:42:10One reason is that discomfort isn't an obstacle to happiness,
00:42:13it's the path to it.
00:42:15For it's only by enduring struggles
00:42:17that we develop the resilience necessary
00:42:18for lasting contentment.
00:42:20And you had a fucking slammer
00:42:21that I've been thinking about so much.
00:42:23Automate only the skills you're willing to lose,
00:42:26that those two feel like they're pretty related.
00:42:29- Yeah, so I mean, we've been told,
00:42:33again, this is another sort of error
00:42:35that the social sciences have for a long time propagated,
00:42:40which is that if somebody is exposed to stress,
00:42:43then it's bad for their health.
00:42:45It can cause trauma or whatever, that horrible word.
00:42:48But I mean, when you actually look at the,
00:42:52not just the data,
00:42:53but when you just look at pretty much all of human history,
00:42:56it's clear that stress can be very beneficial.
00:42:59Not all stress, there's a certain kind of stress
00:43:02and that's called eustress.
00:43:04And so eustress is basically the stress that challenges you,
00:43:08that basically forces you to adjust,
00:43:10that forces you to improve, basically.
00:43:15It's not like the stress of being online
00:43:20and being constantly exposed to just horrific news
00:43:24from around the world.
00:43:25That's bad stress
00:43:26because you can't really do anything about that.
00:43:29If you're stressed because your feed is filled
00:43:32with horrific news stories from around the world,
00:43:36that's just bad stress.
00:43:37It's just gonna stress you out.
00:43:38You can't do anything about it.
00:43:40So it's pointless.
00:43:41It's pointless stress.
00:43:42It's pointless suffering.
00:43:44Good stress is when you can do something about it.
00:43:46So it's stuff like if you've got a date, for example,
00:43:51if you have a date with a girl, that's stressful
00:43:57because now you've gotta be the best version of you.
00:44:00You've gotta impress that girl.
00:44:02So you're under a lot of stress.
00:44:04But that forces you to become better.
00:44:06It's a challenge and you have to meet it.
00:44:08And what happens is that in so trying to meet that challenge,
00:44:13you become a better person.
00:44:15It helps you both at a psychological level,
00:44:19but also at a physiological level.
00:44:20It's hormetic stress.
00:44:22So hormetic stress is stress
00:44:24that sort of makes you adapt, basically.
00:44:27It makes your body adapt to it.
00:44:28And constant stress of that kind
00:44:30is really, really good for you.
00:44:32And the research is very clear on this,
00:44:34but also ordinary human experience is clear on this as well.
00:44:37Anybody who's lived on this earth knows
00:44:40that you need a little bit of stress now and again,
00:44:41just to sort of push you forward and get things going.
00:44:45And so this whole thing that we've been told
00:44:47by a lot of people, which is that we need to minimize stress
00:44:51because then we'll live longer or whatever,
00:44:52that's actually, it's not really true.
00:44:54It's half true.
00:44:56Bad stress is bad for you.
00:44:57It will reduce your lifespan probably.
00:44:59But we need to constantly expose ourselves to discomfort
00:45:04if we want to be able to be happy
00:45:06because happiness is dependent on having a resilient mind.
00:45:11You cannot be happy unless you have a strong mind
00:45:14because you have to be able to weather all the slings
00:45:17and arrows and the vicissitudes of life.
00:45:21Like they will be constantly throwing things at you.
00:45:23Life will constantly be knocking you astray from your course.
00:45:27It will constantly be throwing,
00:45:29just a lot of unexpected things are gonna be happening.
00:45:32If you're only happy when things are going your way,
00:45:36you're not gonna be happy most of the time.
00:45:38And so you have to cultivate the strength.
00:45:39And that essentially comes from exposing yourself to stress.
00:45:42The more stress of that kind, not stress, but eustress,
00:45:45the more eustress you expose yourself to,
00:45:48the more resilient your mind becomes
00:45:50and the more you are able to stay happy
00:45:51no matter what life throws at you.
00:45:54- What about automate only the skills you're willing to lose?
00:45:57- Yeah, so this is basically the same principle.
00:46:00So stress is also a form of learning.
00:46:03It's how you learn, right?
00:46:05I always say that you can rent wisdom,
00:46:10but you can only purchase it with pain, right?
00:46:15So what I mean by that is you could tell me something,
00:46:18you could give me some modern wisdom, right?
00:46:21And I will be like,
00:46:22oh, okay, yeah, that's a really cool way of living,
00:46:24maybe I should do that.
00:46:25And I'll try it a couple of days
00:46:27and then I'll forget it exists
00:46:28and I'll just carry on with my life as it was.
00:46:30But if I learn that same lesson through hardship,
00:46:33if I suffer, if I'm exposed to stress
00:46:36and I have to adopt that out of necessity,
00:46:40then it becomes integrated into me
00:46:42and then it becomes a habit.
00:46:44It's something that I'll always remember
00:46:47because the pain engraved the lesson into my brain.
00:46:52And so stress can also be a form of learning.
00:46:56And one of the things with automating things
00:46:58is it completely reduces the friction, it reduces the stress.
00:47:01You no longer need to engage in any kind of discomfort
00:47:04because you just get things done automatically for you.
00:47:06And so you don't learn as a result of that
00:47:08because the pain, the stress
00:47:10is a necessary component of the learning.
00:47:12You're not gonna remember the lesson
00:47:14unless you really suffer or expose yourself
00:47:16to some kind of stress that forces your body
00:47:18to internalize the lesson.
00:47:20- Have you looked at that research,
00:47:25maybe Harvard, maybe MIT,
00:47:28about students that use LLMs to help them
00:47:32with learning and writing and the differential
00:47:35in terms of how much they can recall afterward?
00:47:38- Yeah, yeah, I recall this study, yeah, I think.
00:48:41So this basically found that essentially
00:47:45LLMs can cause brain rot, basically.
00:47:49I think that was like one of the sort of clickbait titles
00:47:52that was given to the study that LLMs cause brain rot.
00:47:55So yeah, I mean, it's the same principle, basically.
00:47:58Like when you are outsourcing your abilities to an LLM,
00:48:03there's no incentive for your body or your brain
00:48:09to learn the lessons, right?
00:48:11Because it's like what Plato said in "Phaedrus"
00:48:16where he was talking about, one of his sort of concerns
00:48:23was he was writing at a time when sort of paper and pen,
00:48:27or parchment and pen, were becoming common.
00:48:29So this was the AI of his age.
00:48:31And he lamented that paper was gonna,
00:48:35or parchment was gonna destroy people's memory
00:48:38because if they could write things down,
00:48:40then they would have no incentive to remember it.
00:48:42And I mean, I don't know how true that is,
00:48:45but I think that there is a certain sort of analog
00:48:48with what we're seeing today,
00:48:49which is there's a thing called the Google effect.
00:48:52Now this is, it's not a robust finding,
00:48:55but I think the finding does exist.
00:48:59I think the finding is true,
00:49:00but it's probably smaller than reported.
00:49:01It's probably overstated.
00:49:03But the Google effect is this idea
00:49:05that if you can just kind of Google anything,
00:49:07then there's no need for you to remember facts, basically,
00:49:11because your mind has essentially been extended
00:49:14to your screen.
00:49:16So that's now functioning as your memory.
00:49:18Your laptop screen, your phone screen
00:49:20is basically your memory now.
00:49:21So your actual memory doesn't really feel the need,
00:49:25as it were, to kind of remember anything.
00:49:27So, I mean, again, the research on this is a bit shaky.
00:49:31I don't want to say that this is genuinely a thing
00:49:33because it's contradicted by some of the studies,
00:49:36but some studies have found that this is the case.
00:49:38So I don't know with this ChatGPT thing
00:49:41if it really does cause brain rot in the same sense,
00:49:44'cause it's only one study.
00:49:45And I'm very, very sort of wary of single studies now
00:49:48because of, of course, we've got a replication crisis.
00:49:51A lot of studies are not replicated now, so.
00:49:53But what I will say is one thing that we know for sure
00:49:57is if you don't use it, you lose it.
00:49:59This is a fact that's beyond dispute.
00:50:02It's true of your body.
00:50:04It's true of your brain, right?
00:50:05If you, you know, there's recent research
00:50:08which only came out, I think, yesterday,
00:50:09which found that people whose brains are active in late life,
00:50:14so from the ages of 50 to 80,
00:50:17they're much less likely to develop Alzheimer's
00:50:20and other forms of dementia.
00:50:22So if, you know, if you basically engage in things
00:50:25like video games, board games like chess,
00:50:29if you write and read a lot,
00:50:31if you keep your brain active in your sort of 50s,
00:50:34then your chances of developing like dementia
00:50:36are much lower apparently.
00:50:38And this is apparently like a pretty robust
00:50:40longitudinal study.
00:50:41So, and again, this fits, this is not just an isolated study.
00:50:45This fits with all the other research
00:50:46that has been done on this topic.
00:50:48Like the more actively you use your brain,
00:50:51the stronger your brain becomes, you know,
00:50:53although it's not technically a muscle,
00:50:54it functions like a muscle in that respect.
00:50:56And so, I mean, one of my big fears about AI
00:51:01is not that the machine is gonna go conscious,
00:51:04it's gonna become conscious.
00:51:05It's that it's actually gonna steal
00:51:07our consciousness away from us
00:51:09by essentially just causing us to outsource
00:51:12all of our agency, our intelligence, you know, to it,
00:51:16and causing our own brains to atrophy, so.
00:51:18- I think I had a conversation with Cal Newport,
00:51:21the "Deep Work" man, last week about a lot of this.
00:51:23Obviously his whole thing for 15 years now,
00:51:28since he wrote "So Good They Can't Ignore You"
00:51:31was how can you stand out
00:51:35in a field of relative equals?
00:51:39But I think his perspective, certainly my perspective now
00:51:41is that the field is getting worse and worse.
00:51:46The bar that you need to get over is becoming ever lower.
00:51:50You know, in order for you to get a partner at the moment,
00:51:55simply approaching somebody in person in the real world
00:51:59is a one in a thousand chance, as opposed to 50 years ago,
00:52:04that would be something that everybody was doing.
00:52:07And the same thing goes for
00:52:08what's the quality of your writing?
00:52:10- Well, what AI is enabling is velocity and quantity,
00:52:15but it's regressing to the mean with regards to quality
00:52:22and creativity and taste, especially.
00:52:24So if you can cultivate creativity, quality, work, writing,
00:52:29and good sense of taste,
00:52:32you are going to stand out even more.
00:52:34And you don't even need to cultivate it,
00:52:36you simply need to stop it from atrophying.
00:52:38If you can hold your level,
00:52:40if you can hold 2016 levels of focus
00:52:45and ability to write and overcome stuff,
00:52:46I mean, if there's somebody out there
00:52:48who's got sort of 2008 levels of non-distraction
00:52:52before Slack and before smartphones,
00:52:54you don't need to be better, you just need to not be worse.
00:52:58And that ability to kind of hold as the entropy,
00:53:02this sort of technological entropy of the system
00:53:04is trying to fucking compress you into dust.
00:53:07That to me is kind of hopeful.
00:53:09It's hope-inspiring. As a civilization gets fatter,
00:53:13not great for the civilization,
00:53:15I think it would be good if everyone were healthy,
00:53:17but it does make for a pretty uncompetitive environment
00:53:21if you are someone that is able to avoid getting fatter.
00:53:24That's good for you in as much as civilization
00:53:28and the people around you are kind of a bit of a competition,
00:53:30which they are, but the same thing goes
00:53:34for being able to read.
00:53:35Now, how long is it going to be before if we neural link in,
00:53:40we don't actually need to have the written word anymore.
00:53:43We don't need to have the spoken word anymore.
00:53:45And that all of these skills that will atrophy,
00:53:48eventually you may get into a world
00:53:49where that's so redundant that you don't actually want it,
00:53:52that there's better ways.
00:53:53But at the moment, we're in a transition period
00:53:55where you still need to be able to have the skills
00:53:58from the old world in order to have a competitive advantage
00:54:00in the new one.
00:54:01So yeah, I'm increasingly thinking now
00:54:05about what are the things that are non-fungible?
00:54:08What are the things that are only human?
00:54:10What are the things,
00:54:11and that's really where most of my attention
00:54:13should be focused on writing without using AI
00:54:16to help me with my research,
00:54:17on coming up with ideas, on developing taste,
00:54:20on trying to be creative, on giving myself space,
00:54:22because there's all of the market moves in the direction of,
00:54:25well, I can just publish more.
00:54:26If I publish more slop because I've been enabled
00:54:29by the magnifier that is LLMs,
00:54:34that is where the entirety of the market will move
00:54:37because it's the path of least resistance.
00:54:39Okay, well, what's the opposite of that?
00:54:41What's the more difficult choice?
00:54:43- Yeah, the secret I think to surviving the future
00:54:45is gonna be agency.
00:54:47Because as I said before,
00:54:48that is the one truly non-fungible thing.
00:54:50I think everything else is downstream of agency.
00:54:53I think what's gonna happen in the sort of AI age
00:54:56is that essentially humanity is gonna split in two.
00:55:00And I think I've made a reference to this before.
00:55:01I think we had a conversation in 2021 in which I spoke
00:55:04about this.
00:55:05But basically the analogy I use is a novel
00:55:08called "The Time Machine",
00:55:10which was written at the end of the 19th century.
00:55:12And it was basically the story of in the far future,
00:55:17humanity has sort of evolved into two subspecies.
00:55:21So you've got the Morlocks and the Eloi.
00:55:23And the Morlocks basically, they do all the work.
00:55:24They've maintained like all their faculties
00:55:27because they have lived lives of drudgery
00:55:30and they've passed this down
00:55:31from generation to generation.
00:55:34And they're in charge of all of the machinery basically.
00:55:36And they're constantly working
00:55:38and constantly improving themselves mentally and physically.
00:55:41And then you have the Eloi who were basically,
00:55:43they were the former aristocrats.
00:55:44They were the ones who had everything done for them.
00:55:47And as a result of this,
00:55:48all of their faculties have atrophied,
00:55:51so their bodies are like really thin and frail.
00:55:55Their minds, they've become very naive.
00:55:58They're basically like children.
00:55:59They've regressed into children
00:56:01and they're completely dependent on the Morlocks
00:56:03who do everything for them.
00:56:04And in the end, basically it turns out
00:56:06that the Morlocks have been farming the Eloi
00:56:08in order to eat them basically.
00:56:10And while they're doing this,
00:56:11they're just distracting the Eloi
00:56:13with all this like entertainment
00:56:14basically just to keep them placid.
00:56:17And I think that essentially we're gonna have something
00:56:19probably not as horrific as that, but something similar
00:56:22in the sense that we'll have a class system,
00:56:24a new class system.
00:56:25Well, we'll have high agency people
00:56:27whose agency is gonna be increased even more by AI.
00:56:31And then we'll have passive people
00:56:33whose passivity will be increased even more by AI.
00:56:36Because AI, the way I look at it,
00:56:37I don't look at it as artificial intelligence.
00:56:39I look at it as amplified intelligence.
00:56:42But as I say, it can also amplify stupidity.
00:56:45It amplifies, essentially it's an amplifier of everything.
00:56:49So if you're lazy, it will amplify your laziness.
00:56:52If you are highly agentic and conscientious,
00:56:55it will amplify those attributes as well.
00:56:58So what's gonna happen is the people who already have agency,
00:57:01they're gonna use AI to increase their options.
00:57:04They're gonna basically use it to do more.
00:57:07So they're gonna become even more agentic.
00:57:09And the people who lack agency,
00:57:11they're gonna use it to do things for them.
00:57:13They're gonna use AI to think for them,
00:57:15to basically outsource everything to it.
00:57:17So they're gonna get even less agency.
00:57:19So what we're gonna see is the compounding
00:57:20of both agency and its opposite,
00:57:22which is why I think there's gonna be this bifurcation
00:57:25of people who are high agency and low agency.
00:57:28We're gonna have extremely high agency people
00:57:30and extremely low agency people
00:57:32who will probably be the majority of humans in the future.
00:57:36Quite a scary prospect.
00:57:37- This episode is brought to you by Gymshark.
00:57:40You want to look and feel good when you're in the gym.
00:57:42Gymshark makes the best men's and girls' gym wear
00:57:46on the planet.
00:57:47Let's face it, the more that you like your gym kit,
00:57:50the more likely you are to train.
00:57:51Their hybrid training shorts for men
00:57:53are the best men's shorts on the planet.
00:57:56Their Crest hoodie in light grey marl
00:57:58is what I fly in every single time I'm on a plane.
00:58:00The Geo Seamless T-shirt is a staple in the gym for me.
00:58:02Basically everything they make.
00:58:03It's unbelievably well fitted, high quality, it's cheap.
00:58:06You get 30 days of free returns, global shipping,
00:58:08and a 10% discount site wide.
00:58:09If you go to the link in the description below
00:58:11or head to gym.sh/modernwisdom,
00:58:13use the code modernwisdom10 at checkout.
00:58:16That's gym.sh/modernwisdom and modernwisdom10 at checkout.
00:58:22All right, next one from me.
00:58:23The personal Tocqueville paradox.
00:58:25You will always think you suck.
00:58:27That's good.
00:58:28It's okay to suck compared to your standards.
00:58:30As you grow, so will your standards.
00:58:32It doesn't mean that you actually suck.
00:58:34This is similar to the Matthew principle of self-improvement.
00:58:38There's two types of people.
00:58:39Those who don't know how to improve their lives
00:58:41and those who don't know when to stop.
00:58:43But that personal Tocqueville paradox thing of
00:58:45I have standards, those standards continue to rise
00:58:51as my capacity rises.
00:58:53And now the standards always outstrip
00:58:57where I think my capacity is at.
00:58:59Well, if you didn't, you would never get any better.
00:59:01And it's kind of like hedonic adaptation,
00:59:04but for your skillset or like a habituation
00:59:07to what your performance level is.
00:59:08And the Tocqueville paradox, which I learned from you:
00:59:12as living standards in a society rise,
00:59:14people's expectations of those standards grow more quickly
00:59:18than the standards themselves can rise.
00:59:20So this is why, compared to Louis XIV,
00:59:24we have technology and a quality of life
00:59:26that he could not have believed.
00:59:28And yet we feel like quality of life
00:59:31is the worst that it could be,
00:59:33despite all of the material comforts and safety and medicine
00:59:36and access to the internet and air conditioning
00:59:38and fresh water and stuff that we've got.
00:59:40And I think the same thing happens
00:59:42with regards to personal growth as well.
00:59:43You just continue to outstrip your own standards
00:59:46over and over again with where you want to be.
00:59:49- Yeah, absolutely.
00:59:51I always think of regret as a sign of progress.
00:59:55A lot of people think regret is a bad thing.
00:59:57I don't, I actually think regret's a good thing.
00:59:59Because what it shows is that you've grown basically.
01:00:02Because if you're looking back
01:00:03and you're seeing an idiot in the past,
01:00:05then that's a sign that you have grown as a person.
01:00:08You basically have different standards of behavior
01:00:10than you had when you committed whatever act
01:00:13you are regretting.
01:00:15And so I think so many of these kinds of problems
01:00:18are really just base rate fantasies.
01:00:20You need to understand that your own standards have risen.
01:00:25And that's why when you look back and you think,
01:00:27"Oh, okay, this person wasn't the person that I wanted to be."
01:00:31That's because you are now a new person.
01:00:33You wouldn't be able to do that
01:00:34if you were the same person in a sense.
01:00:36And again, our sort of expectations for what is good
01:00:41do always increase as we improve.
01:00:46And we have to manage that.
01:00:48We have to always account for that.
01:00:49Because if we don't, we're essentially living
01:00:52on some sort of weird treadmill basically.
01:00:57We're basically on a hedonic treadmill.
01:00:59The way that I like to look at things
01:01:02is to try to look at objective metrics
01:01:04rather than subjective metrics.
01:01:07Because subjective metrics are always moving around.
01:01:09They're always, they're very malleable.
01:01:12And on a bad day, you might have certain expectations
01:01:16and then on a good day, you have different ones.
01:01:18So I think looking at objective metrics
01:01:21is always much, much better.
01:01:23So for example, if you wanna look at,
01:01:26they could be really shallow ones.
01:01:27Like as a writer, if you're a writer like me,
01:01:30it would be like how many likes do I get
01:01:32on my Substack post?
01:01:33Or it could be something a bit more sort of in depth
01:01:36like looking at who likes the piece?
01:01:44Is it just like sycophants who like your article
01:01:48or is it actually other people?
01:01:49Do people that you'd normally disagree with politically,
01:01:51are they liking your writing?
01:01:53Because if they do then that's a sign
01:01:54that you've really written something good.
01:01:56So there's many metrics you can use.
01:01:58And again, if you're using subjective metrics,
01:02:02it's like trying to navigate
01:02:03by the light of a shooting star.
01:02:05You're gonna be all over the place.
01:02:08So you have to have fixed points.
01:02:09You have to fix things that you are aiming for.
01:02:12And that way you can objectively measure
01:02:15where you're going.
01:02:17Then your own standards are not really gonna matter too much
01:02:19because you've got objective metrics fixed in place.
01:02:23- Rothbard's law.
01:02:24If a talent comes naturally to someone,
01:02:26they assume it's nothing special
01:02:27and instead try to improve at what seems difficult to them.
01:02:30As a result, people often specialize in things
01:02:33that they're bad at.
01:02:34- We've spoken about this one before.
01:02:37This was on the last episode we did.
01:02:39- Yeah, it just relates to these two so much, I think.
01:02:44It's so good.
01:02:45I have a friend, I think I told this story last time,
01:02:47Ryan Long, wonderful at doing comedy sketches
01:02:51and just so fantastic.
01:02:53But because that comes easily to him,
01:02:54he's decided that other art forms are more elevated.
01:02:59He was sort of blinded to the,
01:03:01there is this natural assumption
01:03:03that if something is worthwhile, it's going to be difficult.
01:03:06And that I wrote this essay a couple of months ago
01:03:10about the difference between inputs, outputs, and outcomes.
01:03:14So inputs is sort of time spent.
01:03:17Outputs is work done and outcomes is real world results.
01:03:22And people love to focus on the first two, not the third one,
01:03:25because you never have to ask the question of effectiveness.
01:03:27But this, the Rothbard's law thing
01:03:29actually plays a role in this too,
01:03:31because the outcome focused assessment of your own work
01:03:36it forces you to look at your assumptions
01:03:41and maybe go, oh, actually I have a natural talent
01:03:46at something and this sort of strange pattern whereby
01:03:50I assume that I'm not supposed to achieve things
01:03:54without sweat and pain and discomfort and agony.
01:03:58Maybe that's wrong.
01:04:00Maybe that isn't something that I should try
01:04:03and build my entire worldview around.
01:04:05- Yeah, absolutely.
01:04:07But I think one of the problems
01:04:08and that's sort of highlighted by Rothbard's law
01:04:11is that often really the issue is that we just never try
01:04:15in the first place to do something that we're good at
01:04:17because we assume by default that
01:04:21it's just basically a pretty easy thing
01:04:22that anybody can do, right?
01:04:24So what I would say is to overcome that
01:04:28is to just do what you love, right?
01:04:30I know it sounds a bit corny, right?
01:04:32But ultimately I found that that is the best heuristic
01:04:35for you when you want to try and work out
01:04:36what you want to do, do what you love.
01:04:39And the reason for that is because you will,
01:04:40even if you're not good at it,
01:04:42the fact that you enjoy doing it
01:04:44shows that you will be motivated to do it.
01:04:46You'll be motivated to get better at it.
01:04:48And obviously because our brains are neuroplastic,
01:04:51if you keep doing something, you will get better at it.
01:04:56And so I think even like I would rather do something
01:04:59that I'm bad at, but which I enjoy
01:05:01than do something that I'm good at, but which I don't enjoy.
01:05:04Because you've got to bear in mind,
01:05:05you're going to do this for the rest of your life, right?
01:05:07This is going to be your life.
01:05:08Like this is going to be the thing
01:05:10that essentially you get out of bed for each morning.
01:05:12So if you're getting out of bed and you're like,
01:05:14"Oh, I've got to do this."
01:05:16That's not a life because you're going to be,
01:05:18most of your life is going to be that.
01:05:20But if you're getting out of bed and you're like,
01:05:21"All right, okay, I've got a hard challenge.
01:05:24This is a really hard challenge.
01:05:25I don't know how I'm going to do it,
01:05:26but I'm loving the fact that I get to tackle it."
01:05:28That's how you want to live
01:05:29because then your fun is going to be the motivation.
01:05:33And that is going to ensure
01:05:35that you will get better at that thing.
01:05:36And so I think that's really the way around Rothbard's law,
01:05:40just to do what you enjoy, forget what you're good at.
01:05:43It doesn't matter.
01:05:44If you're young enough, or even just young enough
01:05:46in spirit, you don't even have
01:05:47to be physically young.
01:05:49You keep doing something and if you're determined,
01:05:52if you really enjoy it, you will get better at it.
01:05:56- There's an interesting challenge I think that people face
01:06:00with believing that their accomplishments
01:06:04are as big as they are.
01:06:06You know, there's certainly some people out there
01:06:07who are BPD narcissist, full of ego, whatever.
01:06:12I think so many people, especially in the modern world,
01:06:15are just chronically uncertain.
01:06:17Am I okay?
01:06:18Is what I'm doing good?
01:06:20How much more do I need to be until I can rest?
01:06:23I always think about that scene from Avengers Endgame
01:06:26where Thanos has done the snap and he's got this cabin
01:06:28on a planet that overlooks a lake and he comes
01:06:31and he puts his helmet down and then he sits in this seat.
01:06:34He sits down in this sort of rocking chair
01:06:36and he makes this noise and it's kind of like satisfaction
01:06:40but it's much more like exhaustion and I often think
01:06:43about this assumption that at some point there will come
01:06:47a time, the provisional life or deferred happiness syndrome
01:06:52or the arrival fallacy, this sense that at some point,
01:06:55but there's a personal growth version of this too.
01:06:57There's a personal growth version of at some point
01:07:00I will have done the growing and the learning
01:07:03and I will be able to rest.
01:07:04Well, I don't think that you're ever gonna stop learning
01:07:07and growing and I think that you would probably not enjoy
01:07:09your life if you were to do that.
01:07:11But also what that means is you need to enjoy some
01:07:14of whatever it is that you want to do now
01:07:17because it will just be this.
01:07:19It is just going to be this conveyor belt
01:07:21up until the end of time.
01:07:24- Yeah, so Naval Ravikant has a brilliant quote about this,
01:07:29which is, "If you can't be happy with a coffee,
01:07:32"you won't be happy with a yacht," basically.
01:07:36And it's just a really great sort of quote
01:07:39because it sums up pretty much everything
01:07:41you've been describing, which is people are always looking
01:07:43for this moment where everything is gonna be perfect.
01:07:46They're always chasing this idealized version of reality
01:07:50where they will have attained all the skills
01:07:52that they want to.
01:07:53They will have gotten all the things that they want to
01:07:55and then they will finally be happy.
01:07:57But ultimately, as we spoke of before,
01:08:01real happiness ultimately comes from the resilience
01:08:03of your mind.
01:08:05If you can find happiness in just something as simple
01:08:08as a coffee, then that is enough.
01:08:10Then that means you will be happy later on
01:08:14when you have even more.
01:08:16Then there needs to be this kind of baseline
01:08:18that you're willing to be happy at.
01:08:21So you need to be happy even if you have nothing
01:08:24because if you're tying your happiness to something,
01:08:28everything is transient, everything can be broken,
01:08:31everything can be destroyed in this world.
01:08:33And if you tie your happiness to that thing
01:08:36and that thing gets destroyed,
01:08:38you're gonna lose your whole purpose of existence.
01:08:41So the only thing that is gonna survive all of the slings
01:08:44and arrows of life is to tie your happiness
01:08:48to just the basic fact of existence,
01:08:50just the fact that you are alive
01:08:52and you get to live what is essentially
01:08:55such an improbable life.
01:08:56You know, there's this crazy sort of statistic,
01:08:58which is that if you look at genetically the number
01:09:00of people that could have been born,
01:09:02the chances of you being born are like one in N
01:09:07where N is greater than the number of atoms
01:09:10in the universe, right?
01:09:13So it's extraordinarily like improbable for us
01:09:17to even be here right now talking.
01:09:19And this is assuming that we were essentially
01:09:22selected randomly from the genetic lottery.
01:09:27But it's so improbable that we're even here.
01:09:30So I try to find happiness in the most basic things
01:09:34because then if you can do that,
01:09:35then everything else that you get
01:09:37is just gonna be a bonus, right?
01:09:39But if you tie your happiness to something
01:09:41that you haven't yet achieved,
01:09:43then your entire life's journey up until that point
01:09:45is gonna be miserable.
01:09:46And then you can't even be sure
01:09:47that when you attain that thing,
01:09:48it's actually gonna be as good as you thought it was.
01:09:51Because often we inflate our hopes and dreams beyond reality.
01:09:55So what we think is gonna make us happy,
01:09:57when we finally get it, it doesn't actually make us happy.
01:09:59And this has happened to pretty much everybody.
01:10:01You know, so everybody will recognize this.
01:10:03So you've gotta, I think if you wanna be happy,
01:10:05you've got to be happy no matter
01:10:06what the external world is like.
01:10:09You know, you have to cultivate internal happiness.
01:10:11You have to have that happiness with a coffee
01:10:13and then you'll be happy with a yacht.
01:10:16- Original position fallacy: far leftists
01:10:19favor planned economies because they imagine themselves
01:10:21as the planners, not the planned.
01:10:24Far rightists favor a return to feudalism
01:10:26because they imagine themselves as the lords,
01:10:29not the peasants.
01:10:30Many delusional worldviews
01:10:31stem from main character syndrome.
01:10:34And I had this from three or four years ago,
01:10:37one of our first episodes, the alpha history fantasy.
01:10:39Modern men who are angry at a world
01:10:41they feel has rejected them,
01:10:42mistakenly believe that they would have done better
01:10:44in medieval times.
01:10:46They are somehow adamant
01:10:47that the chance of them being Genghis Khan
01:10:49is greater than the chance of them being
01:10:50cannon fodder peasant number 1,373
01:10:54whose favela was sacked and destroyed.
01:10:56- Yeah, yeah.
01:10:58So yeah, so the original position fallacy
01:11:03really has its origins in the work of John Rawls.
01:11:06John Rawls was like a liberal philosopher, basically,
01:11:08a political philosopher.
01:11:10And so his argument was that basically people,
01:11:15when they think of like future states,
01:11:17they tend to assume that they're gonna be
01:11:19amongst the elites, basically.
01:11:21So, and this is true
01:11:23whether you're on the left or on the right.
01:11:25If you're on the left,
01:11:26you think you're gonna be one of the planners.
01:11:28If you're on the right,
01:11:30you think you're gonna be one of the nobles, right?
01:11:33History, again, history has shown this to be completely false.
01:11:35So for example, if you look at all the communist revolutions
01:11:38that occurred in the 20th century,
01:11:40whether you're looking at Stalin, Mao,
01:11:44Pol Pot, Ceausescu, all of these people,
01:11:48one of the first things that they did
01:11:49was to either imprison or murder the intellectuals, right?
01:11:54So the elites basically--
01:11:55- Is that because they were the ones
01:11:56who could have come up with ideas
01:11:58to reverse their proposed direction for the civilization?
01:12:03- Basically, yeah.
01:12:05So if we take one of these examples,
01:12:07so if we look at Pol Pot.
01:12:08So Pol Pot wanted to basically reset history to year zero.
01:12:15And he wanted nobody to remember anything
01:12:17from before year zero.
01:12:19Like for him, that was literally the beginning of time.
01:12:21So he wanted to completely wipe out
01:12:23all traces of the past beyond year zero.
01:12:27And one thing he knew about intellectuals
01:12:29was that they read books and that they wrote.
01:12:32And obviously writing and reading
01:12:33are essentially the society's memory.
01:12:36So if you can eliminate all the intellectuals,
01:12:39then you eliminate society's memory, basically.
01:12:41You wipe society's memory and you can start fresh.
01:12:44You can create a fresh new society without any bourgeoisie,
01:12:47without any of the ideas of capitalism
01:12:50to pollute the modern world.
01:12:52- Would die along with the intellectuals.
01:12:54- Exactly, and there's an irony
01:12:55because there were a lot of intellectuals
01:12:57that were supporting Pol Pot.
01:12:59They were some of his fiercest defenders.
01:13:02They were the guys that were advocating,
01:13:04like writing the propaganda for him.
01:13:06They were the ones who were like...
01:13:07And this is not just with Pol Pot.
01:13:09This is with all the communist revolutions.
01:13:11Even Western intellectuals, many of them...
01:13:13There was one guy, I've forgotten his name,
01:13:14but he was a Western intellectual.
01:13:15He went to Pol Pot.
01:13:17He was one of the biggest cheerleaders for the Khmer Rouge.
01:13:20And he went to basically have a meeting with Pol Pot.
01:13:25And he ended up getting assassinated.
01:13:27And nobody knows who killed him,
01:13:28but it was probably on Pol Pot's orders.
01:13:30But a lot of the left-wing intellectuals believed
01:13:34that if they were to create a socialist society,
01:13:38that they would be at the top of society.
01:13:40They would be planning things.
01:13:42Everything would go according to their vision of society.
01:13:45And that's why it's such an intoxicating vision.
01:13:47That's why academics and other elites
01:13:50will tend towards these views of society.
01:13:55And they want either a socialist republic,
01:14:00if they're on the left,
01:14:02or if they're on the right,
01:14:03they'll probably advocate for something.
01:14:04It could be like neo-monarchy
01:14:06with the Mencius Moldbugs, Curtis Yarvins,
01:14:08whatever, of the world,
01:14:09who believe that they would be...
01:14:10I'm sure Curtis Yarvin believes that
01:14:12if there was a right-wing revolution,
01:14:14that he would be at the right-hand side of the monarch.
01:14:17He would be the advisor.
01:14:18He would be the Svengali.
01:14:19But again, usually it's the revolutionaries
01:14:25who end up getting murdered themselves.
01:14:27This is true of the French Revolution as well.
01:14:31The biggest advocates of the French Revolution
01:14:33ended up being the first people to get guillotined,
01:14:36or at least they did eventually get guillotined.
01:14:38So all of this stuff...
01:14:41Again, so this is probably going off on a tangent anyway,
01:14:43but basically going back to the original idea.
01:14:44So it was originally John Rawls' idea.
01:14:47I kind of adapted it to extend it to the left and right.
01:14:51But his idea was just generally that people
01:14:54will tend to adopt whatever state
01:14:57that they think is gonna benefit them.
01:14:59They will tend to advocate
01:15:00for whatever state is gonna benefit them.
01:15:01And the solution that he proposed
01:15:05was what he called the veil of ignorance.
01:15:08And I think we might've covered this before,
01:15:10but basically the veil of ignorance is his belief
01:15:14that the best way to create a society
01:15:17is to begin by imagining
01:15:21that you are gonna be assigned at random
01:15:24a position in the world that you advocate for.
01:15:27So if you advocate for a socialist or a communist country,
01:15:33you can't do that with the assumption
01:15:35that you are gonna be the planner.
01:15:37You are gonna be the chairman of the party
01:15:39or anything like that.
01:15:41It's gotta be the assumption
01:15:42that you will be assigned a place
01:15:43within that state at random,
01:15:45because then this will motivate you to then hedge
01:15:50and ensure that every person in that state
01:15:53is well looked after, basically.
01:15:55So this is obviously coming from his left liberal perspective.
01:15:58- To optimize for the sort of highest average life quality
01:16:02as opposed to your selected fortunate quality.
01:16:06- Yeah, exactly, yeah.
01:16:08So this was his way of advocating for liberalism
01:16:10because that's essentially what liberalism does.
01:16:12Liberalism is based on the idea
01:16:14that you wanna ensure that everybody in society--
01:16:17- Redistribution.
01:16:18- Yeah, redistribution, but to an extent.
01:16:20It's obviously it's not the same as a socialist country,
01:16:23which is a socialist country
01:16:24would be complete redistribution or near total.
01:16:28Whereas liberalism is a sort of middle ground
01:16:31between socialism and sort of free market capitalism,
01:16:36like completely laissez-faire capitalism.
01:16:39So it's basically the idea that liberals
01:16:44want to maximize freedom,
01:16:46but they consider freedom to also be freedom
01:16:50from, for example, poverty or from oppression
01:16:53by higher classes of people.
01:16:56So they're similar to libertarians
01:16:59in the basic sense that they value liberty
01:17:01more than anything.
01:17:02It's just that liberals
01:17:03tend to have a slightly different definition
01:17:05of what liberty means.
01:17:07But for libertarians, liberty is literal.
01:17:09It's literally just freedom to do what you want.
01:17:12Whereas liberals, depending on the specific brand
01:17:16of liberalism, it might be the John Stuart Mill
01:17:18or the John Locke kind of liberalism
01:17:20where your liberty ends at the point
01:17:24at which it does harm to somebody else
01:17:25where it basically encroaches on their liberty sort of thing.
01:17:28- What about the coyotes law thing?
01:17:32Don't give the government a power
01:17:34you wouldn't want your political enemies to wield
01:17:37because one day they may well be in charge of it.
01:17:40- Yeah, yeah.
01:17:41So this is the sort of preventative
01:17:45to the original position fallacy.
01:17:47This is what I advocate for personally.
01:17:49I think that the best way to determine
01:17:51what policies to support are the ones
01:17:54that would not be harmful
01:17:58if the government were to be taken over
01:18:00by somebody that you despise basically,
01:18:03by the worst government that is possible in your country.
01:18:06So if you're on the left,
01:18:10then you should advocate for policies
01:18:12that would not harm your interests
01:18:14or the interests of those you advocate for
01:18:16if the government were suddenly to become right wing
01:18:19and vice versa.
01:18:20So I think it's a pretty sort of straightforward
01:18:23common sense rule
01:18:24because I think one of the problems with people
01:18:26is that they tend to think about the short term
01:18:29at the expense of the long term.
01:18:30This is one of the fundamental problems with human beings
01:18:33and it extends to politics as well.
01:18:35People tend to only,
01:18:36they tend to imagine that whoever they're supporting
01:18:39is gonna be in power forever.
01:18:40And this is why when I see people like right wingers,
01:18:43for example, on Twitter,
01:18:44actively suppressing and censoring left wingers
01:18:49after advocating for free speech for so long,
01:18:52I just think, well, you're just shooting yourselves in the foot
01:18:54because this is gonna be used against you.
01:18:56The apparatus you're creating
01:18:58is gonna be used against you.
01:19:00So for example, if we go with,
01:19:02if Trump, for instance, were to pass laws
01:19:08which were to make it illegal for people to criticize him,
01:19:11this is obviously a hypothetical situation.
01:19:13This is not something he's actually done,
01:19:14but this is a hypothetical.
01:19:16You would see people on the right supporting it.
01:19:18A lot of people on the right would support it.
01:19:19They'd be like, yes, stick it to the left.
01:19:23Yeah, trigger the libtards and all this stuff.
01:19:25Yeah, and they'll be cheering.
01:19:26But then Trump's not gonna be in power forever.
01:19:29And then you're gonna have probably a Democrat in charge
01:19:31and he's gonna have, now he's gonna have the power
01:19:33to do exactly to the right what Trump was doing to the left.
01:19:36So it's basically like the leopards eating your own face
01:19:39kind of thing where a lot of this stuff can backfire
01:19:42if you don't think about it on a long enough timescale.
01:19:45- In other news, this episode is brought to you
01:19:48by RP Strength.
01:19:50This training app has made a huge impact
01:19:53on my gains and enjoyment in the gym
01:19:54over the last two years now.
01:19:56It's designed by Dr. Mike Israetel
01:19:57and comes with over 45 pre-made training programs,
01:20:01250 technique videos.
01:20:02Takes all of the guesswork out of crafting
01:20:04the ideal lifting routine by literally spoon feeding you
01:20:08a step-by-step plan for every workout.
01:20:10It guides you on the exact sets, reps, and weight to use.
01:20:14Most importantly, how to perfect your form
01:20:16so every rep is optimized for maximum gains.
01:20:19It adjusts your weights each week based on your progress
01:20:22and there's a 30 day money back guarantee.
01:20:24So you can buy it, train with it for 29 days.
01:20:26And if you do not like it,
01:20:27they will give you your money back.
01:20:29Right now, you can get up to $50 off the RP Hypertrophy app
01:20:32by going to the link in the description below
01:20:34or heading to rpstrength.com/modernwisdom
01:20:38and using the code modernwisdom at checkout.
01:20:40That's rpstrength.com/modernwisdom
01:20:43and modernwisdom at checkout.
01:20:46You see the same thing culturally, I suppose,
01:20:51as well as systemically or in terms of policies.
01:20:56So for instance, BLM rioting and pushing as hard as it did,
01:21:01it would surprise me, maybe I'm wrong,
01:21:07but 2020 into 2021, January 6th,
01:21:10was sort of the year of the riot.
01:21:12And I get the sense that the tone had already been set
01:21:16by something that seemed to be swept under the rug.
01:21:18It was done by the mostly peaceful and quiet protesters
01:21:22that I think legitimated a degree of retribution,
01:21:27even if it was only sort of in the minds of people
01:21:30that decided that that was the way that the world works,
01:21:32that one stupid action deserves another stupid action.
01:21:36You see this too with the way that people behave,
01:21:41the sort of language that people use online.
01:21:43Well, if your president gets shot at,
01:21:47then maybe their president can get shot at.
01:21:49And if you use a knife, then we can bring a knife too.
01:21:53And maybe we'll bring a gun and then you bring a gun
01:21:55and then someone brings a bazooka
01:21:56and it's just this tit-for-tat escalation, forever.
01:21:59It's kind of like, what's that?
01:22:01Isn't there a law?
01:22:03Doesn't somebody have an idea?
01:22:04I think Elon's talked about this.
01:22:05Well, over time, because laws get instantiated
01:22:10and rarely repealed, eventually everything
01:22:15will be made illegal, that there will be a law
01:22:18that stops you from doing everything
01:22:19because you creep this forward one step at a time.
01:22:23Well, you shouldn't drive when it's this wet with that car,
01:22:27then with a different car, then with any car,
01:22:28then when it's a bit less wet, then when it's dry,
01:22:30then when it's, you just end up litigating your way
01:22:33out of civilization.
01:22:35And this is kind of the same sort of thing
01:22:38that if you allow this behavior and then the behavior
01:22:41can come back in a little bit more from the other side
01:22:43and then the other side and then the other side.
01:22:44It's this game of ever-escalating tennis.
01:22:47- Yeah, I mean, so there is a concept
01:22:53which relates to this called reciprocal radicalization,
01:22:57which is basically where it's like a game of brinkmanship
01:23:00where you have one group who advocate for something
01:23:04which the other side then feels entitled to match.
01:23:07And then they'll escalate it even more.
01:23:10And then it basically becomes a repeating pattern.
01:23:13So it's like the left and the right
01:23:17almost have this symbiotic sort of relationship
01:23:20where the excesses of one group will fuel the excesses
01:23:23of the other group, the opposing group.
01:23:25And they're kind of like what is known as a mise en abyme,
01:23:30which is when you have two mirrors facing each other
01:23:33and they kind of infinitely reflect each other.
01:23:35- Okay. - It's like
01:23:37they're constantly reinforcing each other in that sense.
01:23:40So it's not just with the left and right.
01:23:42You also see this amongst terrorists and governments as well.
01:23:45So what will happen is that you have terrorists
01:23:46who will commit an act of violence
01:23:48and then the government will respond to that
01:23:50by having a crackdown and by tightening laws.
01:23:53And then the terrorists will use this
01:23:54as an example of the government being tyrannical.
01:23:57And so that would justify further action
01:23:59against the government.
01:24:00And then the government will use further action
01:24:02to justify their own further action by saying,
01:24:03oh, these terrorists are even more dangerous now
01:24:05so we have to enact even tighter laws.
01:24:07And so it's like an ever tightening sort of situation
01:24:10where the excesses of one group
01:24:12fuel the excesses of the other group.
01:24:14And ultimately the only way out of this
01:24:16is long-term thinking again.
01:24:17So this is, again, it's short-term thinking.
01:24:19It's when people are engaging
01:24:22in the sort of the satiation of their own impulses
01:24:25rather than actually engaging in long-term thinking
01:24:28about the consequences of their actions.
01:24:30It's first order thinking.
01:24:32They're only thinking about the immediate consequences.
01:24:35They're not thinking about
01:24:35the consequences of the consequences,
01:24:37let alone the consequences of the consequences
01:24:39of the consequences, which is what you really need to be doing
01:24:42when you're in the political game.
01:24:44So, yeah.
01:24:44- There's that short-term, long-term thing again.
01:24:49There's a similarity with Amara's law.
01:24:52We tend to overestimate the short-term impact of new tech
01:24:54and underestimate the long-term impact
01:24:57because hype inflates expectations
01:25:00and thus disappointment and thus skepticism.
01:25:03As such, it's possible for AI to both be a bubble
01:25:06and the most transformative tech since fire.
01:25:09- Yeah, so this is an idea that's illustrated
01:25:11by something called the Gartner hype cycle.
01:25:14So if you go on Wikipedia, it will tell you
01:25:16that the Gartner hype cycle is pseudoscience.
01:25:18It's not supported by evidence.
01:25:19This is nonsense.
01:25:20The Gartner hype cycle is not supposed to be
01:25:23a scientific sort of like study of what actually happens.
01:25:27What it's supposed to be is a general rule of thumb
01:25:29and it does fit most major
01:25:34technological developments.
01:25:35One of the problems with Wikipedia is that it
01:25:37often straw-mans ideas
01:25:39when trying to discredit them.
01:25:40So I wouldn't pay attention to the Wikipedia article
01:25:43of the Gartner hype cycle.
01:25:44Basically, what the Gartner hype cycle states
01:25:47is that when you have a new technology,
01:25:50you have a massive surge in hype, right?
01:25:54Where everybody's incentivized to sort of just kind of
01:25:58get on the hype bandwagon, basically.
01:26:00Because it's a new technology and people are speculating,
01:26:03they're spitballing.
01:26:05They're speculating about where this could go
01:26:08and people get excited about it.
01:26:09People write click bait articles about it.
01:26:11And so this obviously inflates people's expectations.
01:26:14And so then the next stage of the hype cycle
01:26:17is where people start to realize, hang on a second,
01:26:21the hype was hype.
01:26:23They start to sort of realize that the reality
01:26:26of the new technology is not quite what people were saying
01:26:29it was gonna be.
01:26:30And this causes a kind of backfire effect
01:26:34where people temper their expectations by over-correcting.
01:26:38So what they do is they assume that
01:26:40because where the technology is actually headed
01:26:43is slightly different from
01:26:46where the hype claimed it was headed,
01:26:48they were wrong about the technology completely.
01:26:51And therefore the technology is worthless.
01:26:53So then you get people now from the opposite side
01:26:56arguing for the opposite thing, saying, it was all hype.
01:27:00Humans are stupid, don't listen to humans.
01:27:02This technology is just gonna fizz out.
01:27:04It's just crap.
01:27:06So people will naturally react very strongly
01:27:08by over-correcting, that's what humans tend to do.
01:27:10So you get a lot of articles arguing for the opposite.
01:27:13But then what will happen is everybody will go,
01:27:14oh, okay, well, yeah, so the hype was just crap.
01:27:16So let's just get on with our lives.
01:27:18And they'll forget about the technology.
01:27:20And then it's when they forget about the technology,
01:27:23that's when the technology will start to change the world.
01:27:26Because even though the technology
01:27:28is no longer in popular discourse,
01:27:30it has been adopted by the sort of people
01:27:33at the frontier of development,
01:27:35the industries where it can actually be used.
01:27:37And these are usually not exciting industries.
01:27:39They're usually things like doing financial wizardry,
01:27:47which is not really something that interests most people.
01:27:50So it will usually have very limited visibility
01:27:54for a long time.
01:27:55But then the developments in those industries
01:27:57will gradually compound
01:27:59until we have something that is really, really amazing.
01:28:02And AI is a great example of this.
01:28:04So one of the main pioneers of AI
01:28:12is a guy called Marvin Minsky.
01:28:13He was a major figure in the development of neural networks.
01:28:17I think it was in the 1970s.
01:28:18He said that in around seven to eight years,
01:28:23we will have human level intelligence
01:28:25in neural networks, basically.
01:28:27He said something like that anyway.
01:28:28And obviously this is completely absurd
01:28:30because by the 1980s,
01:28:32we had really, really basic neural networks.
01:28:35And that continued into the '90s
01:28:37and everybody had kind of by then
01:28:38just forgotten about the hype.
01:28:40Everybody was like, ah,
01:28:42this whole neural network stuff's crap.
01:28:44Nothing's gonna happen.
01:28:45It was all just hype.
01:28:47Everybody forgot about it
01:28:48apart from a small number of researchers
01:28:51and a small number of people
01:28:52who were using convolutional neural networks
01:28:55to do things like imaging and things like that.
01:28:57And then what happened is suddenly you have ChatGPT, boom,
01:29:01in like 2022.
01:29:03And this seemed to have come out of nowhere,
01:29:05but it didn't actually come out of nowhere.
01:29:07The technology for the transformer architecture
01:29:10was actually developed by Google Brain.
01:29:12And this was a few years before OpenAI
01:29:15sort of adopted it for ChatGPT
01:29:17and actually began to develop it themselves.
01:29:20But before that, nobody really cared.
01:29:24For 30 years, nobody really cared in the mainstream
01:29:27about neural networks.
01:29:28So this is a good example of it.
01:29:29But the thing is that the Gartner hype cycle continues.
01:29:31So it's not just you have this one hype cycle
01:29:34and then it's over.
01:29:35It often repeats itself.
01:29:37So we're gonna see it again
01:29:39with things like world models now, I think, where there's-
01:29:43- What are world models?
01:29:44- So world models are like a stepping stone towards AGI.
01:29:47A world model is where you have things like physics
01:29:51implemented into your LLM.
01:29:53It's not really an LLM anymore
01:29:55because it can do so many other things.
01:29:57It's more like a video model,
01:29:58but it's a video model that actually has real world physics.
01:30:01And at the moment, Google is probably best placed for this
01:30:03because they have all the data.
01:30:05They've got the real-time data through search.
01:30:08They've got video data through YouTube.
01:30:10And then they've got like,
01:30:11they've got spatial data as well through Google Street View
01:30:15and all that kind of stuff.
01:30:16So they have actually got the best world model
01:30:18at the moment called Genie, Genie 3.
01:30:21But basically, a world model is when an LLM
01:30:24or an AI can literally model the world.
01:30:29That's why it's called a world model.
01:30:30It can model the world.
01:30:31So it can understand things like physics.
01:30:33So it can understand collisions.
01:30:35It can understand gravity.
01:30:36It can understand the way that fluids move,
01:30:40like water and things like that.
01:30:41And we have like a kind of,
01:30:43we have a simulacrum of that in video generation,
01:30:45but video generation models don't understand physics.
01:30:47They're just copying the physics of films
01:30:51and other stuff like that.
01:30:53Whereas a world model genuinely understands the physics.
01:30:56And so that's the first step towards creating AGI
01:30:59because then you can actually activate AI
01:31:01in the physical reality.
01:31:03And this is probably gonna be the next hype cycle.
01:31:05It's already begun.
01:31:06There's been a lot of hype around Genie 3.
01:31:09What will probably happen is we'll have something
01:31:10called the trough of disillusionment,
01:31:12which is the next stage of the Gartner hype cycle.
01:31:15And then when everybody's forgotten about world models,
01:31:17we'll start to see real world models emerge.
01:31:19- Wow.
01:31:20There's a, I was looking at,
01:31:22I've spoken to a lot of behavioral genetics guys
01:31:27and girls on the podcast.
01:31:28I've got Kathryn Paige Harden coming back on
01:31:29for her new book next week.
01:31:31And I'd always wondered,
01:31:34there's an equivalent basically with,
01:31:36over time, things changing.
01:31:38And the Wilson effect feels like a biological equivalent
01:31:43of what we're talking about with regards to the hype.
01:31:45So this is from you.
01:31:47Heritable traits like IQ and personality
01:31:49become more heritable with age because as you mature,
01:31:52you become more independent and free to be who you really are.
01:31:56Many heritability studies find that nurture's influence
01:31:59is stronger only because they never see
01:32:01that nature's influence is longer.
01:32:04- Yeah, so historically, the social sciences
01:32:09and the field of genetics have pretty consistently
01:32:13underestimated the heritability of a lot of traits.
01:32:17And just to give a very recent example,
01:32:18I think just a couple of days ago,
01:32:20there was a new study published,
01:32:23which I retweeted onto my timeline, which basically shows,
01:32:26so initially there was the belief that heritability
01:32:30of lifespan is between 20 and 25%.
01:32:34And this new study has found that it's actually closer to 50%.
01:32:39And this is a pretty important,
01:32:40this is obviously a pretty important finding
01:32:42because this is the heritability of your lifespan,
01:32:44how long you're gonna live.
01:32:46And so there's been a massive underestimation of lifespan
01:32:50in terms of the heritability of it.
01:32:52And I think this is not fully explained
01:32:54by the Wilson effect,
01:32:55but I think the Wilson effect is a contributor to this.
01:32:58And basically it's because studies tend
01:33:01to be quite short term,
01:33:02genetics studies tend to be quite short term.
01:33:04So they will tend to,
01:33:05obviously it's very hard to track a human being
01:33:08throughout their entire life.
01:33:10So usually longitudinal studies in genetics
01:33:13will tend to sort of follow people for a few years.
01:33:16So usually three years, five years,
01:33:19and that's not enough time to really understand
01:33:22the effects of these genes
01:33:24because a lot of these genes only become apparent
01:33:27later in your life.
01:33:28People tend to sort of,
01:33:31there's a kind of,
01:33:33what happens is that there's a masking effect.
01:33:36So early in your life,
01:33:37the effects of genes are masked by your upbringing,
01:33:40by your environment.
01:33:42So for example,
01:33:43if you are genetically predisposed to love reading,
01:33:47but in your life,
01:33:49your parents never buy you any books
01:33:51and instead they buy you a PlayStation, right?
01:33:54You're gonna spend your childhood playing PlayStation
01:33:57instead of reading books,
01:33:57which is what you really love to do.
01:33:59It's only when you get older
01:34:01that you're able to follow your own natural inclinations,
01:34:04which are books.
01:34:05And so it's only when you're older
01:34:07that you have the power to buy books
01:34:09and therefore it's only when you're older
01:34:10that your genetic predisposition to books becomes apparent.
01:34:15And so this is a very simple example,
01:34:17but this is very common I think now in a lot of studies
01:34:20where there's a lot of reassessment that needs to be done
01:34:24due to these studies being so short term.
01:34:27We really need to study people
01:34:29at different stages of their life.
01:34:30We need to study them when they're children,
01:34:32we need to study them when they're adults,
01:34:33and we need to study them when they're elderly
01:34:35in order to actually have a good understanding
01:34:37of the influence of genes versus environment.
01:34:40- I saw there was a line from you, an Emerson one.
01:34:44"People do not seem to realize that their opinion of the world
01:34:47"is also a confession of their character."
01:34:49And Dylan O'Sullivan,
01:34:51that I know we're both fucking huge fans of.
01:34:53- Yeah, he's great, yeah.
01:34:54- He's so good, dude.
01:34:57He says, "Nothing gives you a clearer look into someone
01:35:00"than how they misinterpret things.
01:35:03"Every misinterpretation is a confession."
01:35:05And it feels like Emerson and Dylan
01:35:08are kind of agreeing with each other here.
01:35:09Their opinion of the world is a confession of their character
01:35:12and their misinterpretation of the world
01:35:14is also a confession.
01:35:16- Yeah, so to give you another quote from Naval,
01:35:20I think he said something like, "It's almost always possible
01:35:25"to be both honest and optimistic."
01:35:28So what I find is that if you are optimistic
01:35:34it's not necessarily because you're deluded.
01:35:36It's often just because of your personality,
01:35:38because you choose to see the good
01:35:40rather than choosing to see the bad.
01:35:42It's often just a choice.
01:35:43It is literally just often a choice.
01:35:45It's something that has really sort of become
01:35:49an important force in my life now.
01:35:51This understanding that I can actually choose
01:35:53how I perceive things.
01:35:55I can choose whether I see things as a good or a bad thing,
01:35:58depending on the facts that I select
01:36:00and the way that I interpret them.
01:36:02And I'm aware that, yeah, okay,
01:36:03this often requires me to ignore certain things,
01:36:06but we're always ignoring things anyway.
01:36:08So it's not like I'm doing anything wrong here.
01:36:11Attention is selective.
01:36:12Attention is like empathy.
01:36:13It's a spotlight.
01:36:14You shine it on some things and by doing so,
01:36:16you cast everything else in darkness.
01:36:18And so when you are pessimistic,
01:36:20this is not a sign that you see reality more clearly
01:36:22as a lot of pessimists like to believe.
01:36:24It's actually a sign that you're choosing
01:36:26to shine your spotlight on shit rather than on diamonds,
01:36:30to put it simply.
01:36:32You have a choice where you shine your spotlight.
01:36:35And ultimately it's a case of what are you looking at?
01:36:38What are you perceiving?
01:36:39When you see something, what details are you picking out?
01:36:42And this is why when I see miserable people now,
01:36:45I don't see realists.
01:36:47I just see miserable people.
01:36:49I see people who are unhappy inside
01:36:52who are essentially externalizing their unhappiness
01:36:55by choosing to see the absolute worst in everything.
01:36:59And this is why I don't have much tolerance now
01:37:01for people who just keep complaining about things
01:37:04because to me, that's just a way
01:37:07to dig your hole deeper, basically.
01:37:09You're just making life worse for yourself
01:37:11by choosing to see the worst.
01:37:12But there's no solutions in complaining.
01:37:15If you just keep complaining,
01:37:16all you're doing is you're convincing yourself
01:37:18that the world is bad.
01:37:19And the world doesn't need to be bad.
01:37:21You don't need to lie to see the good.
01:37:23- But surely in some situations, things are bad.
01:37:26Is it not fair to accurately represent that and reflect it
01:37:29so that maybe people try to change the thing that's bad
01:37:33and shouldn't be bad?
01:37:34- No, no, you should always recognize
01:37:36that things can be improved, always.
01:37:37Yeah, absolutely.
01:37:38But this doesn't mean that you should focus on the bad
01:37:40and have the bad as the only thing to focus on.
01:37:43There's always two sides to the story.
01:37:45So the way I look at it is yes,
01:37:47you should always be cognizant of problems.
01:37:52I'm not saying that you should not see problems,
01:37:54that you should choose not to see problems.
01:37:55You should see the problems.
01:37:57But instead of focusing on complaining about them,
01:37:59you should try to focus on solutions instead.
01:38:02What can I do to make this better?
01:38:04So this actually fits in with another of my ideas.
01:38:07So there's a concept called the Stockdale paradox.
01:38:10And the Stockdale paradox is quite an interesting one
01:38:12because it's basically taken from a guy
01:38:16called James Stockdale.
01:38:18He was an admiral, right?
01:38:19And he survived nearly eight years of torture and isolation
01:38:22in the Hanoi Hilton, basically, which was so-called
01:38:25because it was one of the most brutal camps
01:38:27in North Vietnam during the Vietnam War.
01:38:30He was basically a POW for quite a long time.
01:38:33Yeah, so for eight years.
01:38:33And so basically, so what he observed while he was there
01:38:38was that there were people who were optimists
01:38:41and there were people who were pessimists.
01:38:43And both groups ended up suffering hard.
01:38:47And many of them died very early.
01:38:49So the optimists would basically believe
01:38:52that they were gonna be released from the jail by Christmas.
01:38:55And then when Christmas didn't come, it would be Easter.
01:38:57And then Easter didn't come.
01:38:58So they would keep hoping.
01:38:59And eventually their hope just ran out
01:39:00and they just kind of gave up on life.
01:39:02And some of them, they just lost the will to live
01:39:05because they had hoped and their hopes had been destroyed.
01:39:08But then on the other hand, there were the pessimists
01:39:10who were people who just kind of believed
01:39:12that their situation was completely irredeemable
01:39:16and there was no hope in the first place.
01:39:18So obviously they had no motivation to improve.
01:39:21What Stockdale found was that what got him through the eight years
01:39:25was not by being an optimist, not by being a pessimist,
01:39:30but actually by practicing a kind of optimistic pessimism.
01:39:35And essentially,
01:39:39the key to achieving this kind of paradoxical state of mind
01:39:43is to accept that bad outcomes are indeed a real possibility,
01:39:47but rather than let that possibility crush your hopes,
01:39:50you can develop hope in your ability
01:39:52to deal with those problems by preparing for them.
01:39:55So by acknowledging and confronting
01:39:57the harshest potential outcomes,
01:39:59you make them less of a problem
01:40:01and less of a reason for negativity.
01:40:03So what I'm saying is basically healthy optimism
01:40:08arises through a kind of practical pessimism.
01:40:10It's not the blind idealism
01:40:12that everything is always gonna turn out fine,
01:40:14but rather the self-belief that you can deal with things
01:40:17no matter how they turn out.
01:40:19And that's essentially what confidence is.
01:40:21Confidence is not the belief
01:40:22that everything is gonna be all right.
01:40:24Confidence is the belief
01:40:25that you will be able to handle things
01:40:27even if they're not okay, right?
01:40:30So you'll always be able to deal with the eventualities.
01:40:33And the way that you do that
01:40:34is by acknowledging the worst case scenario,
01:40:37but preparing for any scenario, essentially.
01:40:40So you don't necessarily have to be pessimistic.
01:40:43You don't have to be optimistic.
01:40:44You fuse the two.
01:40:45So this is obviously, this is a bit separate
01:40:46from the idea of seeing the beauty in things,
01:40:49but this is obviously a very healthy attitude to have
01:40:52with regards to just glass half full
01:40:57or glass half empty.
01:40:58Just understand that the glass is half, right?
01:41:02That's it.
01:41:03It doesn't need to be half full.
01:41:04It doesn't need to be half empty.
01:41:05It's just half.
01:41:06- Well, George's line from the agency book
01:41:10is some people look at the glass and see it as half full.
01:41:13Some look at the glass and see it as half empty.
01:41:16What you should do is realize that you are the tap,
01:41:19and that's his line around agency,
01:41:22which is wherever this is, you can actually pour into it.
01:41:26Yeah, the Stockdale thing's interesting
01:41:31because I can see how people preparing
01:41:36for the bad would quite easily cause them
01:41:40to tumble down the rabbit hole of ruminating about it
01:41:42and worrying about it and woe is me and concept creep.
01:41:45And now I've got a pathology
01:41:46and I've got multiple personality disorder.
01:41:49That is the genesis of it.
01:41:52- Ultimately, it comes down to
01:41:55how you interpret it, right?
01:42:00So there is, yes, you could just go down this rabbit hole
01:42:04where you're just constantly thinking the worst case scenario
01:42:06but that's only gonna happen if you haven't found a solution.
01:42:10Yeah, if you have a solution,
01:42:12if you have developed a solution to the worst case scenario,
01:42:14then it's no longer gonna really weigh on your mind
01:42:17because you already have the solution.
01:42:19And that's ultimately what I do.
01:42:20If I was to say, come on this podcast
01:42:24and I would have the worst case scenario
01:42:26where I would say something,
01:42:29let's say I said the N word or something like that.
01:42:32If you're somebody who is really anxious,
01:42:35you'd be worrying about something like that for so long.
01:42:38You'd be like, oh my God, what if I say the wrong thing?
01:42:41That would just cause you to be a nervous wreck
01:42:42and it would probably just make for a very bad episode.
01:42:45But if you have a solution,
01:42:47if you actually have trained yourself
01:42:50to not engage in these kinds of intrusive thoughts
01:42:53that might cause you to say those words
01:42:55or if you have a way to sort of style it out,
01:42:58then it's not gonna be a problem.
01:43:01It's the same with anything.
01:43:03Anxiety is really a result of you not having a solution
01:43:08to the worst case scenario.
01:43:10But as long as you have that solution,
01:43:11you're not gonna have the...
01:43:12I mean, you might still have anxiety
01:43:14if you're a neurotic person.
01:43:17But the thing is you can--
01:43:18- Well, you're not gonna have any less
01:43:20if you've got a solution.
01:43:21And what's that line about anxiety hates a moving target,
01:43:24action is the antidote to anxiety.
01:43:26- Yeah, that's it.
01:43:28- Yep, yep, yep.
01:43:29Look, Gurwinder, dude, you're a legend.
01:43:31I appreciate you coming on, this always rules.
01:43:33It's one of my favorite episodes.
01:43:34Where should people go to check out all of the stuff
01:43:37that you've got going on?
01:43:38- Yeah, so main place is my blog,
01:43:40which is just gurwinder.blog.
01:43:42And you can also find me on Twitter @G_S_Bhogal.
01:43:47Or just type my name into Google
01:43:49and I'm sure it will come up, yeah.
01:43:51- Heck yeah. - Yeah, and cheers.
01:43:53Been a pleasure, yeah, thank you.
01:43:54- It's always a good one.
01:43:56Keep writing 'cause we got more to talk about.
01:43:58- Oh yeah, yeah.
01:43:59Congratulations, you made it to the end of an episode.
01:44:03Your brain has not been completely destroyed
01:44:04by the internet just yet.
01:44:06Here's another one that you should watch.
01:44:09Go on.

Key Takeaway

Human nature is defined by a series of paradoxical traits where our greatest virtues, like empathy and intelligence, can lead to tribalism and atrophy unless tempered by individual agency and a commitment to objective truth.

Highlights

The Oxytocin Paradox reveals that empathy is often selective and tribal, meaning high compassion for an in-group frequently correlates with extreme cruelty toward out-groups.

The Rumpelstiltskin Effect explains how naming a psychological problem provides a sense of control, though it risks pathologizing normal human emotions like sadness.

The '1% Rule' on social media implies that online discourse is dominated by a loud, often narcissistic minority, creating a distorted view of actual human nature.

The 'Sloppaganda' and 'Reality Apathy' concepts highlight how AI-generated content and information overload erode the value of truth and social trust.

Eustress (beneficial stress) is presented as a necessary path to resilience, contrasting with modern efforts to eliminate all discomfort which leads to psychological fragility.

The Wilson Effect suggests that heritable traits like IQ and personality become more prominent as individuals age and gain the agency to express their true nature.

The 'Original Position Fallacy' describes how people support radical political shifts by mistakenly imagining themselves as part of the new elite rather than the oppressed masses.

Timeline

The Oxytocin Paradox and Selective Empathy

Gurwinder Bhogal introduces the idea that oxytocin, the 'love hormone,' actually drives tribalism and in-group loyalty rather than universal compassion. He uses the metaphor of a spotlight to explain that when we empathize deeply with one group, we effectively cast all others into darkness, often leading to spite toward perceived enemies. Examples include the high support for political violence among supposedly 'empathetic' social justice activists on platforms like Blue Sky. The discussion touches on real-world conflicts, such as the Israel-Palestine issue, to show how selective empathy fuels global cruelty. Ultimately, the speakers argue that the world needs less tribalism and more recognition of shared humanity rather than just 'more empathy.'

The Rumpelstiltskin Effect and Pathologization

This section explores the Rumpelstiltskin Effect, the psychological phenomenon where naming a problem makes it feel manageable but can also lead to an 'excuse culture.' By labeling a personality trait like shyness as 'social anxiety disorder,' individuals may feel a temporary sense of relief but might also outsource their agency to their biology. The speakers discuss 'concept creep,' where definitions of racism or trauma expand as objective instances decrease, keeping the demand for these labels high. They cite statistics showing that 4.4% of the global population has an anxiety disorder, totaling 359 million people. The conversation warns that medicalizing normal human sadness as clinical depression can prevent people from taking the necessary actions to improve their lives.

Malingering, Sloppaganda, and the Death of Trust

The dialogue shifts to 'malingering,' noting that 20% to 40% of students at elite American universities now register as disabled to gain advantages like extra exam time. This trend creates a cynical culture where those with genuine, invisible disabilities are treated with increasing skepticism by society. The speakers then tackle 'Sloppaganda,' the rise of AI-generated misinformation that prioritizes engagement through rage-bait over factual accuracy. This leads to 'reality apathy,' where the cost of finding the truth becomes so high that people simply stop trying and believe whatever 'stinks the least.' They conclude that while society can survive without complex truths, it cannot function without the fundamental glue of social trust.

The 1% Rule and Social Media Pathologies

Bhogal explains the 1% Rule, which states that a tiny, unrepresentative minority produces the vast majority of online content. This minority often scores high in 'dark tetrad' traits like narcissism and psychopathy, turning social media into a 'freak show' of extreme opinions. The 'Recursive Red Pill' effect is discussed, where influencers train their views on other influencers' unrepresentative insights, leading to a self-reinforcing loop of gender antagonism. Scissor statements—claims designed specifically to divide people—are used by media outlets like the New York Times to ensure virality at the expense of social cohesion. The speakers emphasize that the political polarization seen online rarely reflects the reality of the general population.

Eustress, Resilience, and the Dangers of Automation

The concept of 'Eustress' is introduced as the beneficial, challenging stress that forces personal growth and psychological resilience. Bhogal argues that we should only automate the skills we are willing to lose, as the friction of learning is what actually engraves wisdom into the brain. He cites the 'Google Effect' and recent studies on Alzheimer's to suggest that a lack of mental exertion leads to cognitive atrophy or 'brain rot.' The future may see a bifurcation of humanity into 'High Agency' and 'Low Agency' groups, similar to the Morlocks and Eloi in H.G. Wells' The Time Machine. Those who maintain their ability to focus and think independently will hold a massive competitive advantage in an increasingly passive world.

Personal Growth and the Tocqueville Paradox

The 'Personal Tocqueville Paradox' explains why high-achievers often feel like they 'suck' because their standards rise faster than their actual capacity. Bhogal reframes regret as a positive sign of progress, indicating that the person has outgrown their previous, less-informed self. They discuss Rothbard’s Law, which observes that people often specialize in things they are bad at because they undervalue their natural, effortless talents. To combat the 'arrival fallacy'—the belief that happiness is a destination—they emphasize finding joy in simple things like a cup of coffee. This internal baseline of happiness ensures that external success is viewed as a bonus rather than a requirement for a meaningful life.

The Original Position Fallacy, the Wilson Effect, and Amara's Law

The discussion then turns to the 'Original Position Fallacy,' where people support radical ideologies like communism or feudalism because they assume they would be the ones in power. In reality, revolutions often consume their own 'intellectual' cheerleaders first, as seen in the French Revolution and the Khmer Rouge's 'Year Zero.' The 'Wilson Effect' is also explained: as we age, our genetic traits (nature) become more influential than our upbringing (nurture). Finally, they discuss 'Amara's Law,' the observation that we overestimate a technology's short-term impact and underestimate its long-term impact, and suggest that AI is currently in a hype phase headed for a 'trough of disillusionment.' This swing between overestimation and skepticism is a repeating pattern in how humanity adopts transformative new technologies.

The Stockdale Paradox and Practical Pessimism

The final section details the Stockdale Paradox, named after Admiral James Stockdale, who survived roughly eight years as a POW by balancing unflinching realism with faith that he would ultimately prevail. Stockdale observed that blind optimists died first because their specific hopes (home by Christmas, then by Easter) were repeatedly crushed, while pure pessimists lacked the will to continue. The ideal mindset is 'practical pessimism': acknowledging the harshest possible outcomes while maintaining total confidence in one's ability to handle them. Bhogal asserts that anxiety is often just the absence of a plan for a worst-case scenario; once a plan is in place, the fear dissipates. The episode ends with a call to cultivate individual agency as the one human trait that cannot be outsourced in the AI era.
