#2204 - Matt Walsh

Released Wednesday, 18th September 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:01

Joe Rogan podcast, check it out! The

0:04

Joe Rogan Experience. Train

0:06

by day, Joe Rogan podcast by night,

0:08

all day! Alright,

0:12

hello. What's happening? Hey,

0:15

great to be back. Thanks for

0:17

having me. Your movie is really

0:19

funny. It's really funny. I was by myself,

0:21

laughing out loud hysterically today. I

0:24

watched it in the sauna, I watched it in the

0:26

gym. I watched it, it

0:29

was, it's one of the best comedies

0:31

I've seen in a long time. There's so

0:33

many moments that are so uncomfortable. That

0:36

means a lot, appreciate that. That's what we're

0:39

hoping for. The

0:41

Robin DiAngelo one, where you gave that

0:43

guy money for reparations and you got

0:45

her, she thought it was uncomfortable. Yeah,

0:48

that was kind of a... When

0:52

we had the idea for the film to talk

0:54

about race, we knew we needed to get Robin

0:56

DiAngelo. I didn't think we'd get her, because I

0:58

figured she'd be a lot more... Savvy? Yes,

1:01

savvy and cautious. But

1:04

apparently she has no idea what's happening outside of

1:07

her bubble at all. So

1:09

she didn't know who I was. I gave her my name, and she

1:11

had no clue. Wow. But

1:15

we kind of went into that

1:17

knowing what the end was supposed to be, if we could

1:19

get her. We came up with that idea, we went to

1:22

a bar the night before the

1:24

interview, and we came up with this idea, could

1:26

we get her to actually pay reparations to

1:28

Ben, our black producer? And

1:32

we had to kind of talk him into it. And

1:35

it was really just like, in

1:38

real time I was there for about two hours, and

1:40

it was an hour and a half of the most

1:42

mind-numbing conversation, where I'm just... None

1:45

of that's in the movie, because it's just me, like, fluff

1:48

questions. And I'm repeating back to her

1:50

own ideas, so she knows that I'm a safe person. It's

1:53

a safe space, and then you've got to build to it, and build to

1:55

it, and build to it, and then finally you get to a point where

1:57

you can do something a little weird that

2:00

she'll probably go along with it. And she did. I

2:04

mean, you saw we go through a whole, we have a

2:06

whole series of exercises we wanna do with her. And

2:08

she went, she did it. She was game. So

2:14

that was, and that was

2:16

one of the first things we filmed. So after we got that, we

2:18

knew that, okay, we have a movie here.

2:20

I feel like you got your money's worth with her, seeing as it's

2:22

$15,000, but I

2:24

feel like you got robbed by the lady that got

2:26

upset about the mascot. 50

2:30

grand, you barely got anything out of her. Yeah,

2:32

that was, well, part of the point of the

2:34

movie is that's why we put the price tags

2:36

on the screen. We

2:38

want people to see how absurd it is. So in

2:40

a certain way, it was like, the

2:45

higher they quoted the price, we said, great, we'll

2:47

pay that. Because we want this

2:49

in the movie. Right. Because

2:51

if all these people had said, oh yeah, I'll do it for

2:53

free, or I'll do it for 200 bucks, just pay my travel,

2:55

doesn't really make the point. But

2:58

they all were quoting exorbitant prices,

3:00

and she was the most. And

3:04

then she basically said almost nothing, but it

3:06

was okay. No

3:09

one ever found out who the identity of the mascot

3:11

is? No, I don't think so. It would

3:14

have been hilarious if it was a person of color. Well,

3:16

it almost certainly was. It almost certainly was, because

3:18

if it was a white guy, they would have

3:20

thrown him under the bus. Yeah, they

3:22

would have. So the fact that they didn't, yeah,

3:25

it's probably some Hispanic kid or something. And

3:28

you gotta imagine, you can't see real good

3:30

with that fucking costume. You ever put a

3:32

mascot costume on? I haven't,

3:34

but I can tell that there's little eye

3:37

slits. You can't even see what's below you.

3:39

Exactly. Duncan

3:41

and I did a whole podcast where we pretended to

3:43

be furries. We lasted, every

3:46

podcast we do, we dress up. We'll dress up

3:48

like Star Wars people or whatever, spaceship

3:50

people. We did a podcast as furries.

3:53

We kept the helmets on for maybe five minutes. We're like,

3:55

I can't fucking do it. And we both took them off.

3:57

We're like props to the furries. If you could run around

3:59

with this thing on, this is hard to do. Like

4:01

you can't see shit. You can't breathe. So

4:03

the idea that he missed those kids is

4:06

like. The furries are doing a lot more than running

4:08

around in those things too. So that's the. They are.

4:10

They are. I think they designed special ones for that.

4:12

Yeah. I don't even wanna know. But. Like

4:15

a hatch. But that's, it's actually a perfect example of

4:17

what these people do, these race

4:19

hustlers, that something happened, it

4:22

was a little bit unpleasant. Yes.

4:24

Not a big deal. There's a million ways to interpret that.

4:27

It's just a normal human thing that happens in the world.

4:29

Things happen that are a little bit unpleasant. You're disappointed. Your

4:31

kid didn't get a high five. Okay, it happens. But

4:34

for them, they have one

4:36

lens for seeing the world. And the lens

4:38

is through this left wing racial ideology. So

4:40

everything that happens is colored by that. Right.

4:43

And everything is understood through that lens. So

4:46

anything, I mean, you think

4:48

about Michelle

4:51

Obama when she was first

4:53

lady. She had multiple stories that she would

4:55

tell about as first

4:57

lady being discriminated against because

4:59

of her race, allegedly. And one of them

5:01

was, she was

5:03

in line for ice cream or something and someone cut in front

5:05

of her. And she told

5:08

this story in some interview, this very dramatic

5:10

story about, well, they didn't see her cause

5:12

she's black. And meanwhile,

5:14

it's like, we've all been cut lady. People have

5:16

cut in front of all of us. It's just

5:18

that if it happens to me at Walmart, I

5:20

don't think of it racially. I just think, oh,

5:23

this person's an asshole. Exactly. But for her, it's

5:25

all racial. So that's kind of. But that's a

5:27

crazy one to say that someone cutting in front

5:29

of you, a selfish act is somehow racist. Just

5:31

because, that's like looking for racism everywhere.

5:35

That kind of situation is so normal. That's

5:38

so normal that some dick cuts in front of you. Right,

5:40

exactly. It's an unpleasant thing that

5:42

happens to all people. And

5:45

if you're not in the kind of

5:47

race hustle or bubble, you don't see it

5:50

that way. But that's. But it's interesting that nobody

5:52

wants to call that out. Nobody wants to be reasonable. Nobody

5:55

wants to say, well, is that like, you just say,

5:57

oh, wow, you have to listen to it. That's part

5:59

of the problem. Like you can't say, are

6:01

you sure that's racist? Cause then you're a

6:03

racist apologist and then you're racist

6:05

by proxy. Yeah. And how do you know, so how

6:07

do you know what's in that other person's mind? How

6:10

can you ascribe motives to them? This,

6:12

it drives me nuts that this is what, this is

6:15

what we do now, where if someone does something or

6:17

says something, someone else is offended by

6:19

it, that person who's offended

6:21

gets to decide what

6:24

the intent was behind the other person's

6:26

action, to the extent that if

6:28

the other person says, this was my

6:30

intention, I'll tell you what it was. They don't

6:32

get to have a say in the intentions behind their own actions.

6:35

They are suddenly not authorities in

6:37

their own behavior. Exactly. This

6:39

other person who was the offended party gets

6:42

to inform you what you meant by that thing,

6:46

which is really what the, I

6:48

mean, the movie is called Am I Racist? But in

6:51

reality, there's

6:54

only one person who can answer whether you're a racist person

6:56

and that's you. And if

6:59

you don't think that you're racist, then

7:02

you aren't because racism is a thought process. And

7:04

if it's not in your head, then you're not

7:06

racist. You might

7:09

have stereotypical views about people of other

7:11

races, everybody does to some extent. You

7:14

might think things that are even insulting about

7:16

people of other races, but it's not racist

7:18

because racist means you hate people

7:20

of other races or you think they're inferior to you. But

7:24

you could be not a racist person

7:26

and think that whatever, Asians

7:28

are bad drivers. You could think that that

7:31

stereotype is true. Whether it's true or

7:33

not, you just happen to think that that's a true thing about

7:35

this group. Doesn't mean you hate them, doesn't mean that you think

7:37

that they're inferior. You

7:40

can say frat boys are annoying and not hate

7:42

men. Exactly. Yeah. Exactly.

7:45

And most of the time, these stereotypes, they

7:47

didn't just fall out of the sky, like

7:49

they're grounded in something. If they did, no

7:51

one would, it wouldn't make any sense. Right.

7:55

And nobody would be offended. That's the thing, nobody would

7:57

be offended by a stereotype. Right. That

7:59

really was not true at all.

8:01

You're only offended because it rings true at

8:03

least a little bit because otherwise

8:05

it would just be absurd.

8:08

Which is why when you get, I mean in the movie we

8:10

go, there's a section where

8:12

we go kind of outside this bubble and

8:14

we go down, we talk to bikers at

8:16

a biker bar in the

8:18

south, we talk to the poor black

8:21

community in New Orleans. And

8:24

the only reason we did that was

8:26

just, let's find people who are not,

8:29

they probably didn't go to college so they

8:31

didn't get brainwashed there. They're not getting the

8:33

corporate DEI seminars. They're not reading

8:36

Robin DiAngelo or any of these people. What

8:39

do they think about this stuff? Are they worried

8:41

about systemic racism? Do they see everything as racist

8:43

all the time? And what we found is no,

8:45

they're just not even, they don't even

8:47

speak that language. When you say the

8:49

term systemic racism to them, they say,

8:51

well what do you mean by that? What is that? This

8:53

was something that, look, people are always concerned

8:55

about people being racist, but there's something that

8:57

happened in this country somewhere around 2012ish where

8:59

things really,

9:03

really ramped up. And it

9:05

just became

9:08

much more of a subject, a

9:10

subject that was like constantly around

9:13

worrying about racial bias and it

9:17

ramped up, right? It ramped up till you get

9:19

to the point where you do have some of

9:21

these race hustlers that are saying

9:24

everyone's racist. You must confront

9:26

your unconscious bias and you're

9:28

just constantly hearing about it.

9:32

I think you're right that it was around 2012. BLM

9:35

came into formation in 2013, I think that was

9:38

the Trayvon Martin thing. So

9:42

it's not a coincidence that it seemed like race

9:44

relations in this country were improving decade after decade.

9:47

They weren't perfect, but it seemed like they were

9:49

pretty good. Much better than the sixties. Yeah, the

9:51

nineties. I grew up in the nineties. It was

9:53

not perfect, but I

9:55

grew up in a diverse area. I went to public

9:57

school, a lot of people with different ethnicities and races.

10:00

We weren't talking about racism all the time. It was

10:02

basically fine. And

10:05

then something happened in the

10:08

middle part of the first decade of

10:10

the 2000s where it seemed like

10:12

things started backsliding. And that's right at

10:14

a time when Barack Obama was elected.

10:17

And that's not a coincidence. Like a lot of

10:19

people have noticed that, it's odd that we had

10:21

a black president and then all of a sudden,

10:24

now we're having race riots again. And

10:26

I think the reason is that when

10:29

you elect a black president, I didn't like Obama,

10:32

I didn't vote for him. I think his policies are terrible.

10:35

But you would think that at least one positive

10:37

you could draw from that is that, well, at

10:39

least that means that systemic racism is

10:42

not a problem in this country anymore. I mean, if

10:44

a black guy could rise to the

10:46

top of the system and run it, then

10:49

clearly the system is not racist against black

10:51

people. And in fact, he was

10:53

overwhelmingly voted into that position by Americans. Which

10:57

is true, so that is evidence that America

10:59

isn't systemically racist against black people, but the

11:01

race hustlers don't want us

11:03

to draw that conclusion. They're worried that we'll look

11:05

at Obama as president and say, okay, well, racism

11:07

isn't a big issue anymore. And

11:10

that's a problem for them because there's a lot

11:12

of power, money and influence to be found in

11:14

the racist, racism narrative. So they had to kind

11:16

of like double up on their efforts

11:18

to convince us that America is actually racist,

11:21

which is why during Obama's term,

11:23

that's when we started getting all these race hoaxes

11:25

and the race riots and BLM. That's

11:28

when things like people started talking about microaggressions and all

11:30

this kind of nonsense, because they needed

11:32

to tell us that, yeah, you might think

11:35

that this issue is kind of solved now, but it's

11:37

not. Racism is actually worse than

11:39

you ever imagined. It's lurking everywhere. And

11:41

now we're at a point, yeah,

11:44

and then not long after that, they started

11:46

tearing down Confederate Civil War monuments

11:48

and stuff, stuff that's

11:50

been there for like 100 years, which was

11:52

always weird because 100 years ago, people

11:55

could whatever, walk by a Robert E. Lee

11:57

monument and not care. It wasn't a big deal to

12:00

them, black or white. Now

12:02

all of a sudden, it's

12:04

a bigger deal to us than it was to people

12:07

whose parents fought. They had grandparents who

12:09

fought in the Civil War or died

12:11

in the Civil War. They

12:14

were okay with it. And yet

12:16

for us, what, the wounds of the Civil War

12:18

are fresher or more raw for us than they

12:20

were for people a century ago? It makes no

12:22

sense. How are we less able to be

12:25

objective and non-emotional about the Civil War

12:27

than people who had family

12:29

members? I mean, slaves, ex-slaves were still

12:32

living back then. Well,

12:34

I think it's because it's just like a

12:36

religious ideology. Like when the Taliban started blowing

12:38

up those ancient statues of Buddhas. Do you

12:40

remember that? Yeah. Because like

12:43

they could, like they destroyed

12:45

things that were a part of human

12:47

history that we would have studied for

12:49

thousands of years. And they destroyed them

12:51

because they didn't go along with their

12:53

religious ideology. And I think part of

12:56

the woke thing is this

12:58

religious ideology that has

13:01

to be followed. And you cannot stray

13:03

from the lines. You have to stay

13:05

inside whatever this ideology is promoting and

13:07

telling you what to do. And

13:10

one of the things was that you had

13:12

to take down all these statues of terrible

13:14

people. And I remember Trump saying at the

13:17

time, well, the problem with that is like,

13:19

eventually they're going to take down George Washington

13:21

and everybody thought he was crazy. Like that's

13:23

a crazy thing to say. But once they

13:26

got past Civil War people, then they got

13:28

to who owned slaves and then

13:30

they got to taking down, they wanted to take

13:32

down statues of Thomas Jefferson and eventually did

13:34

get to George Washington. Yeah. And that was

13:36

always, it was always going to go that

13:38

way because George

13:40

Washington, the founding fathers owned slaves. Not

13:44

only that, but they were rebels

13:47

rebelling against a governmental

13:50

authority. And if they had

13:52

lost, they all would have been hanged

13:54

as traitors and that's how they'd be remembered.

13:57

Thankfully they didn't. But

13:59

it's actually not that far

14:01

of a leap to go from one to the other. And

14:04

of course the issue is that everybody

14:07

who lived on earth prior

14:09

to about, certainly

14:12

prior to 100 years ago, is

14:15

racist by our standards today, every single one.

14:18

There was no one who lived on earth 100 years ago who

14:20

we would not consider racist anywhere

14:22

of any race. If you

14:24

go back 200 years or

14:26

earlier than that, almost

14:29

everybody either owned slaves

14:31

or was OK with slavery as an institution. You

14:34

go back 500 years and

14:37

there was nobody on the planet who

14:40

considered slavery to be wrong fundamentally.

14:42

They might have had issues with how slaves

14:44

are treated in some context, but it

14:47

took like thousands of years for it to ever

14:49

even occur to a single human on earth that

14:51

slavery is actually fundamentally wrong, which

14:55

is a crazy thing. And that's actually an interesting

14:57

thing you could talk about and think about. Like,

14:59

why is that? How could it be that it's

15:01

so obvious to us but some of

15:03

the greatest minds of history, they never thought of it. But

15:06

we can't talk about that because we have to talk

15:08

about slavery and racism as if they're exclusively white Western

15:11

phenomena. Well, I've

15:14

had friends that have a different perspective on

15:16

the Obama situation. And my friend

15:18

Willie was talking to me about this. And he was

15:20

saying that what happened was when you look, one

15:25

thing that we can be sure of is that

15:27

racism is real. There are real racists in this

15:29

country. There's real anti-Black racists, anti-Asian

15:31

racists. There's certain people that have hateful

15:34

ideology in this country, just a certain

15:36

percentage of them in the world. So

15:39

those are real. And when Obama

15:41

became president, those people became more

15:43

emboldened. And he said that he

15:45

saw a lot more of that

15:47

online and a lot more attacks,

15:50

and especially in uncensored

15:52

online forums like 4chan and

15:54

places where you can kind of get

15:56

away with saying whatever the fuck you want. He said he

15:58

saw a lot more of that on the streets

16:01

and he said this is probably why he

16:03

believed Michelle Obama didn't want to run for president

16:05

because she experienced so much of that hate

16:07

while they're in the White House. Forget about hate

16:09

for their policies and

16:12

what you think about them as

16:14

president and first lady

16:16

but the racism hate. So

16:18

his perspective as a black guy was like

16:20

you had to be a black person to realize

16:22

how angry people were that there was a black

16:25

guy who was president because that was real too.

16:28

It was real that race

16:30

relations in America had changed radically

16:32

since the 1960s certainly since the 1920s

16:34

and 30s and

16:37

over the years has kept getting

16:39

better but in his mind there

16:42

was something that happened where when Barack

16:44

Obama got into the White House that

16:47

the real hardcore racists got very vocal

16:49

and he experienced it and

16:51

I think this is akin

16:54

in some ways to what's going

16:56

on with anti-semitism online because

16:59

I think there's always been a certain amount

17:01

of people in this country and in the

17:04

world that are like deeply anti-semitic and

17:07

they just don't like Jews and

17:09

when something happens where all the

17:12

sudden now it's okay to criticize

17:14

Jews because of Israel's position in

17:16

Gaza and what they've done now

17:18

you see anti-semitism just pop out

17:20

of the woodwork. I think

17:22

there's something like that where people feel

17:25

emboldened to talk about things so like

17:27

maybe we just don't have an accurate account of

17:29

how fucked up some people are but

17:32

the general population and whether you're

17:34

conservative or whether you're liberal

17:38

everybody kind of agrees that racism

17:40

is a stupid thing there's amazing

17:42

people of all ethnicities and colors

17:45

and you should judge people like

17:47

Martin Luther King said by the content of their

17:49

character. We all agree with that, but there's

17:53

a certain amount of people that are

17:55

always going to be racist but when

17:57

you start looking for it everywhere and

18:00

And saying everything is racist, first of

18:02

all, you are, it's an

18:04

insult to real racism. It's

18:06

an insult to the people that are the

18:08

victims of real racism. When you consider microaggressions

18:10

or cutting in line in front of you

18:12

to get ice cream, there's

18:15

people that are real victims of racism. And

18:18

pretending that everything is racist just

18:20

minimizes that and in fact probably

18:22

makes more people racist. It's going

18:24

to make a bunch of dumb

18:26

liberals drop to their

18:28

knees or give you money for reparations but

18:30

it's going to make a bunch of other

18:32

people really resentful and it just polarizes us

18:34

and drives people further and further apart. It's

18:38

just genuinely stupid. It's

18:40

a self-fulfilling prophecy and I think

18:43

that's true what

18:46

he said about, I'm

18:49

sure that when there's a black president that we

18:51

know there are real racists out there, they're anti-black,

18:53

they're anti-white racists but they're

18:55

out there and social

18:58

media was also really coming online around that

19:00

time so people had a forum to express

19:02

this kind of stuff. And

19:05

anonymously. Anonymous. And so yeah, those

19:07

people come out of the woodwork. I'm sure that did happen, I

19:09

don't deny that. The difference though

19:11

is that that

19:13

kind of racism is personal

19:16

and individual. It's

19:18

not systemic. It's not in the

19:20

system. And also

19:24

it's absolutely rejected by

19:27

society. It's absolutely rejected by polite society. So

19:29

there's a reason why they had to go to 4chan

19:31

or whatever to express those views because you can't come

19:34

out in public and say it and if you do

19:36

it, it'll be like the end of whatever your career

19:38

is, it's probably the end of it. And

19:40

that's kind of, that's

19:43

the most, when it comes to, as you said, there's never

19:45

going to be a time when there's no racists in the

19:47

world. So the

19:49

most you can do is, okay, we're not going to have

19:52

this stuff systemically. The system's going to treat

19:54

everybody equally. Great. So we've

19:56

got to go to the list. We've already done that. Actually,

19:58

we've gone too far because we have affirmative action where... Now

20:00

you're discriminating against white and Asian people, but so

20:03

anti-black racism is out of the system. Fantastic,

20:05

that's good. It's

20:08

not accepted by mainstream society. Great.

20:11

And then, so that's kind

20:14

of it. I mean, what else can we

20:16

do with this? You can't get inside people's

20:18

hearts and make them not feel things. Those

20:20

people are gonna be out there. They

20:22

know that it's not accepted in mainstream

20:24

society. And I kind

20:26

of think you could sort of

20:28

move on from it culturally to

20:30

other issues. It's not a major

20:33

issue anymore, but

20:36

they won't allow it. And you're right that

20:38

then it's got this pendulum thing where, okay,

20:41

well, if you go after white people and

20:43

you demonize them relentlessly,

20:46

then you do it practically from birth now through the

20:48

school system, some

20:51

of those white people are gonna end up being

20:53

stricken by guilt, and they're gonna walk around

20:55

feeling like they're guilty for something. That's the

20:57

white guilt liberal thing. But

20:59

then you're gonna have others who kind

21:02

of become exactly what you accuse them of being because they're

21:04

like, oh, you know what? If you're gonna

21:07

call me racist anyway, then you know what? Fine.

21:10

And there's gonna be resentment that builds up and then you actually

21:13

create more of it, which I think they're happy about. If

21:16

actual racism is increasing in society, I don't know if

21:18

it is or not, but I think

21:20

the people that call themselves anti-racist are quite

21:22

happy about that. Well, business is booming. But

21:24

the other thing is like, think about Robin

21:27

DiAngelo, who you said just lives in her

21:29

own bubble and really didn't know who you

21:31

were and didn't catch on at any point

21:33

in time that any of this

21:35

stuff was ridiculous. Like, these

21:38

people, if that's all

21:40

you think about and that's all you, like, I

21:42

have friends that live in California and

21:44

every now and then I'll talk to them and

21:47

some politics issue will come up. And

21:49

they give me this fucking CNBC, they

21:53

give me this MSNBC, this fucking

21:55

propaganda viewpoint on something that's so

21:57

wrong, just so... and I just

22:00

go, okay, I can't, like

22:02

you're in. You're

22:04

in your bubble, there's no real

22:06

discourse. There's

22:08

no real, there's no discussions

22:11

about whether or not what these people are saying

22:14

is correct. It's just, you're a part of this

22:16

tribe and this is what you believe. And

22:19

I think that's the case with these anti-racist people

22:21

too. Some of them

22:23

might be like just hardcore grifters. Like

22:25

they could be playing three-card monte

22:28

or they could just get corporations to give them

22:30

money by saying that everybody's

22:32

racist. There's some people that are definitely like

22:34

that. But there's other people that are just,

22:37

that's their friend group. Like that's

22:39

their social circle. Their social circles, all people

22:41

believe this stupid shit and they all yap

22:43

it to each other and they say it

22:45

like it's a mantra and they pray five

22:48

times a day with it. You

22:50

know, it's really like a religious thing. I think it is

22:52

like a, yeah, I think you're exactly right about that. That's

22:54

why for me, the more, so

22:56

the grifters that are getting paid, why

23:00

they're doing it. Like they're getting paid. A lot

23:02

of them. A lot of them

23:04

are. And even when they're not, there's still power

23:06

and influence and they're being consulted

23:09

as kind of these moral gurus, which

23:11

is very, strokes the ego.

23:14

That's rewarding for the, the more interesting thing

23:16

is what about the people who

23:19

go to those people and

23:21

consult them as moral gurus? I mean, in the movie

23:23

we have this race to dinner where you got these

23:25

white women who sit around

23:27

a table and they invite these other two women,

23:30

Saira Rao and Regina Jackson, to

23:32

come to dinner. They pay them to come to dinner

23:34

and call them racist for two hours. And

23:37

it's like, why would you subject yourself

23:39

to that? It's so, it seems

23:42

like the most miserable experience to

23:44

volunteer to be broken down and insulted and degraded,

23:46

which is what happened to these women. I mean,

23:48

I saw it. They were, it's like two hours

23:50

of them just getting, you're

23:53

racist, you're racist, you're racist. They

23:55

had to go around the table, confess their racist

23:57

sins. And

23:59

then they all, each go and they say what their

24:01

racist sin, like what's a racist thing you've done recently,

24:04

they all confess and I'm

24:06

listening to it it's like none of you have actually done

24:08

anything racist. I listen to all your stories, none

24:10

of that is racist. There's a woman

24:12

who said that she's married to a black guy and

24:16

she said, yeah, he's

24:18

loud and she tells him to quiet down sometimes. What

24:22

wife has not said that to her husband? I

24:26

get that once a month. Right, so what, do they

24:30

think my wife is racist? She could be. She

24:33

could be. She's sexist against me. Yeah.

24:35

She's sexist against me. So what are

24:37

they getting out of it? Well they're getting

24:39

out of it first of all they're terrified of

24:41

being called racist so they jump the gun so

24:44

they head it off at the pass, like, I'm gonna

24:46

make sure I'm not racist so I'm gonna become

24:48

an anti-racist. You know I talked

24:50

about this before but when my kids were young

24:52

like my youngest

24:54

was pretty young when they

24:57

started doing this anti-racism thing at

24:59

the school where they said it's

25:02

not enough to be not

25:05

racist. This is actually right after we left

25:07

so it's right after like the George Floyd

25:09

things popped off. They said it's not good

25:12

enough to not be racist you have

25:14

to be anti-racist. You're talking about some

25:16

of these kids in

25:18

that school are six. Like

25:21

what are you saying? It's not enough. What

25:23

are you saying? You saying a six year

25:25

old has to be an anti-racist? Can't they

25:27

just play with their toys? Can't

25:30

they just go to the park and hang

25:32

out with their friends? Can't they just play

25:34

sports? Can't they just enjoy each other? Six

25:36

year olds don't give a fuck what color

25:38

somebody is. They don't. They all

25:41

just play together. They just want to play

25:43

with the people who are nice to them

25:45

and who they have fun with and laugh

25:48

with. And here you've got some fucking grifter

25:50

who latches themselves onto some school system that's

25:52

filled with all these terrified

25:54

liberals that are just terrified of being

25:57

called out for anything, and all

25:59

the rules are changing and everybody's

26:01

like, oh! And so they

26:03

bend the knee. They bend

26:05

the knee. And with kids, it's so

26:07

insidious because, yeah, kids don't

26:10

care about race. They

26:12

notice it though, which is fine.

26:15

But then you give them like this complex

26:17

from such a young age, which is

26:20

so unnecessary. And that's why, I mean, I remember

26:22

when my oldest daughter was five, we were at the

26:24

mall or something and a black

26:27

family walked by. And she pointed

26:31

at them and said, why

26:33

are people black? Why

26:36

is their skin like that? She wanted to

26:38

know, why does skin color exist? How

26:42

do some people have different skin color than other people? And

26:45

of course, I told her,

26:47

to be polite, we don't point at people in public.

26:49

So I told her that. But then we

26:51

talked about it. It's OK to wonder

26:54

that. It's OK to notice that. I

26:56

think with these anti-racist people, if

26:58

I was listening to them, I should have, this

27:01

would have been an opportunity for me to give her

27:03

a whole lecture about racism and make her feel really

27:05

bad for noticing that and asking about it. And

27:09

then you create this complex. And

27:12

yeah, fast forward 20 years, and she's one

27:14

of these women at a race

27:16

to dinner. Exactly. It's

27:18

awful. But it's a

27:21

very potent thing. I

27:23

mean, white guilt, the fear

27:25

of being called racist. It's hard for me to understand

27:27

because I get

27:29

called racist all the time, 50,000 times a day.

27:33

And it just rolls off my back. I don't care because

27:35

it's just it's unknown. It doesn't mean anything. It doesn't

27:37

mean anything. But for

27:39

you and I, it doesn't mean anything. But for

27:41

a lot of normal people, especially. It's a

27:43

death sentence. Right, to be called that. It's

27:46

the worst thing in the world. They're

27:48

terrified of it. They'd

27:50

literally rather be called anything than

27:53

racist. Yeah. And then, so

27:55

for those kinds of

27:57

people, when the

28:00

threat... when being called racist

28:02

is a threat, you can get them to do anything.

28:05

And we've, I mean, spoilers

28:07

or whatever, but in the movie,

28:10

the last thing in the movie when I do

28:12

my own anti-racist workshop with these people, and they're

28:14

all real people, and

28:17

we get them to join in on some

28:19

things that are really like morally

28:21

repugnant because

28:24

they're terrified of being called racist publicly. They

28:27

can't stand that thought. And

28:29

the other thing that happens with kids is if

28:32

you have a thing like you're telling the

28:35

kid they have to be anti-racist, well, some

28:37

kids are going to use that as a

28:40

platform to increase, you

28:42

know, whatever social cred that they have, and

28:44

they get feedback from it. It's

28:47

positive feedback, and they get very vocal, and the more

28:49

vocal, the more people are impressed, and the more work

28:51

they do, the more people are going, you're doing great

28:54

work. And then you get what's essentially like

28:56

the racial version of Greta Thunberg. Like

28:59

what is that lady? That lady's moral

29:01

outrage at what have you done? How

29:03

dare you? And everybody's like, yes, we

29:06

like what you just did. And so now you

29:08

do it all the time. And so now somehow

29:10

or another, a 16-year-old kid travels all over the

29:12

world telling everybody they're bad, flying

29:15

around in jets, telling everybody they're bad

29:17

for ruining the environment. And she gets

29:19

to feel... Morally

29:21

superior, morally superior, virtuous. And for

29:23

a child to be in a

29:25

position where they become virtuous is,

29:28

you know, they love that. They

29:30

love that. To be in a position where they can

29:32

lecture adults. Yes. Or adults are looking

29:35

to them as authorities. Yes. College

29:37

kids love to do that. The moment they're out of

29:39

their house, the moment they don't have their parents telling

29:41

them what to do anymore, now they can tell other

29:43

people what to do. And it's just like, it's

29:45

one thing that you see online from

29:48

people who have been bullied in the past. People

29:52

that have been picked on and fucked with,

29:54

boy, they like to do it to people,

29:57

like online, on Twitter mobs. They like to

29:59

jump in. And I know a lot of people

30:01

that have, I've known a lot of people that have

30:03

engaged in these things. I've known them personally.

30:06

These feeble, weak, terrified

30:08

men, and they say

30:10

the most heinous things

30:12

about people, like uncharitable,

30:14

not knowing like what kind of

30:16

response these words are going to

30:18

have in that person. And

30:21

they bully these people because they've been hurt. You

30:23

know, it's that hurt people, hurt people thing. That's

30:26

what it is. They don't think it's

30:28

as bad as bullying, like in real life,

30:30

bullying is terrible. You're going to hit somebody? How dare

30:32

you, you fucking monster? Well, you're emotionally

30:34

scarring people online every day and you

30:36

think you're doing it through this. It's

30:40

like one of the things, Elon's

30:42

talked about this, that one of

30:44

the things that Woke does, it

30:46

allows really mean people. This ideology

30:48

allows really mean, shitty people to

30:50

have a virtuous way of expressing

30:52

that. Yeah, I

30:54

think that's right. And also the internet,

30:58

I mean the whole idea that the internet isn't real, you

31:00

hear it all the time. That's why I hate

31:02

when people say, well, Twitter isn't real life. And

31:05

I understand what's meant by that when people say that, but

31:07

it actually is real life because these

31:10

are human beings who are communicating with each

31:12

other. Now, there are bots too, but if

31:15

you're a human being on Twitter saying something,

31:18

that's real life. It's not fake. This isn't

31:20

happening in some kind of dream world.

31:24

But then people think that, well, okay, if I just say

31:26

this on Twitter, I put it in a YouTube comment section

31:29

and it's this heinous, awful thing. It

31:31

doesn't count. It doesn't mean I'm a bad person because it's not real

31:33

life. Which is like, that's like

31:37

writing on a piece of loose-leaf paper, calling someone a

31:39

piece of shit and handing it to them. And

31:42

then they get mad at you and you say, hey man,

31:44

it's the paper, it's not real life. It

31:46

just happened on the paper. It's

31:49

a method for communicating. And

31:51

so I think people have been conditioned

31:54

that in this world, it's like a moral exception.

31:56

So you can do and say whatever you want

31:59

and you don't have to feel bad about it. And then it

32:01

turns people into sociopaths after a while I think.

32:03

I think it does too. And I also think

32:05

it ramps up anxiety in a huge way for

32:07

the people that are actually engaging in it. You

32:10

know, the people that actually do it, I

32:12

think they're just fully anxious all day long.

32:14

And I think it's terrible for mental health.

32:17

Even if you're like quote unquote winning these

32:19

verbal battles online that you're engaging in, I

32:21

think it's terrible for everybody. It's

32:23

really terrible for the people that are

32:25

just like all day long negative. Like

32:27

they're arguing with people. Like why

32:29

do you want that in your life?

32:31

That's a very unusual position to be

32:33

in where all day long you're in

32:35

conflict. That's only in war. In

32:37

the real world, most of the day,

32:39

there's no conflict. That's why conflict is

32:42

so uncomfortable because it's so unusual. If

32:44

you're used to conflict with people all the time and you see

32:46

some guy and he's like, fuck you. No, fuck you. But

32:49

if you're not used to someone saying, fuck you. And then all of a

32:51

sudden, hey, fuck you. And you're like, what? Like you're

32:53

terrified. You're freaked out. Like what's going on? Oh my

32:56

God, this is conflict. The kind

32:58

of conflict, verbal conflict that people engage

33:00

in online all day long has the

33:02

same sort of effects on your psyche.

33:04

You are perceiving the world to be

33:06

this. This is one of the things

33:08

that's so polarizing about this particular election,

33:10

right? That people are willing to

33:12

accept propaganda because it feeds into

33:15

their view of the world, which

33:17

is that they're engaged in this

33:19

moral battle, good

33:21

versus evil. And both sides think they're good and

33:23

both sides think the other side is going to

33:25

be the end of the world. And

33:28

it's accentuated heavily by mentally

33:31

ill people that are on

33:33

Twitter all day long. Yeah,

33:36

I'm one of them. So

33:38

he's fine. But

33:40

I mean, I am guilty of some of this. I

33:43

do. I'm on it way too much, first of all, but

33:46

but then I have my excuse, which is that it's

33:48

part of my job. Your job. I

33:51

do often think if I didn't do this for a living at all, I

33:54

don't think I'd be on any of this stuff. I think

33:56

I'd be off everything. Yeah. If I was

33:58

not a quote unquote public figure, I would be

34:01

off everything. Because I don't know if I have

34:03

a problem. If I go on vacation or something and I'm

34:05

taking time, I have no issue putting it down. I

34:07

have no compulsion to look at it. In fact, I

34:09

have to, when I come off vacation, it's effort to

34:11

get back, it's like, okay,

34:14

I got to get back into this again. It

34:16

takes me a couple days, and then after a couple days, now it's a compulsion

34:18

again. But I have to

34:20

reignite this weird compulsion to constantly look at

34:23

my phone. I have a

34:25

problem, too, in that I'm a comedian and

34:27

that I'm also a gold miner, right? So

34:29

what that means is when I'm going through my

34:31

newsfeed, my newsfeed is the thing I'm the most

34:34

addicted to. I'm mining for gold. Like, what's going

34:36

on here? What'd they do? They did what? They

34:39

fucking what? And I need those.

34:41

Those are really important to me. Because

34:43

those can be my next hour of

34:45

standup. Those can be, they're chunks. And

34:48

it's not every day. It's like, I can go

34:50

through 30 days of nonsense and just not one

34:52

thing. But every now and then, there's a chunk

34:55

of gold in there. I'm like, oh, I got

34:57

one. And then I put that in my notes

34:59

and I justify endless

35:02

scrolling to get to those gold

35:04

nuggets. But if you didn't do any of this

35:06

for a living, if you just worked at Lowe's or something and you

35:08

know, do you think you'd still be? The

35:10

problem is I'd still be me. And I still have this

35:14

really intense curiosity. I'm

35:17

really curious about all kinds of things. There's so

35:19

many subjects I'm really, really interested in. I

35:22

mean, I would for sure still be paying

35:24

attention to science

35:26

issues and space travel and

35:29

new discoveries in the universe. And there's a

35:31

bunch of stuff that I would just be,

35:34

ancient history, ancient civilizations. I would be, there's

35:36

no way I would not be fascinated by them because they

35:38

almost have nothing to do with my job. Yeah,

35:40

I think I would be the same, but I don't

35:42

think I'd feel the need to, I

35:45

would like to absorb all that interesting information, but I

35:47

wouldn't feel the need to say, hey world, here's what

35:49

I think about this. Right. I would just absorb it.

35:51

The problem is if you do and you do it

35:53

just once and then you get feedback and then people

35:55

say, hey, I really like what you posted. And like,

35:57

oh great. And then all of a sudden you're

35:59

connected. And then you're like looking for this

36:01

feedback so you're trying to post things to get

36:03

likes and you're trying to post things to get

36:05

reposts and get comments, and you're engaging in the

36:07

comments, and now... now you're fucked. Now

36:10

you're locked into this weird ecosystem

36:12

with these people you don't even know. They might

36:14

be all stupid. They might be all,

36:16

you know, really annoying people that you would avoid in real life.

36:18

Like, if you work with them, you're like, oh, there's Tom, and

36:21

get the fuck out of there, and you go to the other

36:23

side of the office. But now

36:25

you're engaging with them. People that you

36:27

avoid having conversations with, you are now,

36:30

like, in mortal combat

36:32

with words on Twitter. It's

36:35

fucking stupid. And not only that, but their

36:37

engagement with you is cheap.

36:40

They don't care that much. So even

36:42

if someone gives you positive feedback and they say, oh, that was

36:44

a great tweet, they've forgotten about it

36:47

two seconds later. Right, they're just scrolling, you're

36:49

just the latest thing they saw, and then they're scrolling

36:51

and they've already forgotten about it. They don't care if

36:54

they cuss you out because they're mad at you, same

36:56

deal. They forgot about it two seconds later. So, yeah,

36:59

I guess it could be kind of intoxicating. You get

37:01

the engagement, but then it

37:04

doesn't matter. And that's

37:06

one of the things that makes it so toxic,

37:08

is how sort of nihilistic it all is. That's why...

37:12

this never was an issue before, but now I feel like when

37:14

I go on social media, I'm

37:17

constantly seeing these horrific

37:20

videos of people dying. Like,

37:23

snuff films are

37:25

all over social media now,

37:27

and it feels like a relatively recent

37:30

development. And that's

37:33

really horrible, what it does. I don't even think

37:35

we quite understand what it's doing to our minds.

37:37

I actually think we are all traumatized by

37:39

it. I don't use the word trauma loosely, but

37:42

what's traumatizing is not only are you seeing somebody die,

37:45

but it's the context. It's like, you

37:49

see this horrible video, someone just got shot, and

37:51

then you keep scrolling, and

37:54

a second later you're reading

37:56

something about whatever, you know, celebrity

37:58

news, or you're watching a cat

38:00

video. Right. So it's like

38:02

this, it's this horrific human thing that happened, but

38:05

for you it's just content, you absorb it that

38:07

way. And

38:09

I don't know, after a while of

38:11

just absorbing human suffering in this way,

38:13

it's gotta mess with your mind.

38:16

Of course it does. I mean,

38:18

you're the product of what you take in, even

38:20

if that information is like low

38:22

impact, it's not the same impact as being there

38:24

when the hit men show up and gun the

38:26

guys down in front of the cafe. I've

38:29

seen these videos where it's just mass

38:31

shootings. This one video I saw

38:33

the other day of some gang violence situation,

38:35

these guys drove by, gunned these guys down,

38:37

and then the guys started shooting back and

38:39

they were all shot while they're shooting back,

38:41

and then the car backs up and then

38:43

they gunned them down more. Yeah, same. It's

38:45

fucking crazy. Same that way. But it's not

38:47

the same as being there. If you were there, that would haunt

38:50

you for the rest of your life. If you were across the

38:52

street and you watched that happen, you watched these people die, it

38:54

would haunt you for the rest of your life. But

38:56

it just, you get a little blip. Instead of getting 100% dose,

38:58

you get a little 1% dose, a

39:01

little 1% dose, and you get them all day long. And by

39:03

the end of the day, you're just like, what the fuck is

39:05

the world? Yeah. But

39:08

it's the thing, it kind of should haunt you for the rest

39:10

of your life. Right. It's a horrible

39:12

thing to see. But it's like Twitter in

39:14

that it's not a full experience. Like

39:17

the full experience, if you were having the kind

39:19

of exchanges that some people have with each other

39:21

where they're just ruthlessly insulting and shitty to people,

39:24

if you were having those in person, there's a

39:26

high probability that that's gonna lead to violence. Actual

39:29

violence. Like if two men are in

39:31

a room and one man starts insulting

39:33

this other person really viciously

39:36

and talking about their life and their family and all

39:38

kinds of crazy shit that people do online, there's

39:41

a probability, it's more than 0% that

39:44

this is gonna result in violence. But

39:47

there's zero possibility of it online.

39:49

It's just a free shot. And

39:52

that's a part of the problem as well, is that it's

39:55

not a real human interaction.

39:58

So you're getting like these little doses. of

40:00

shittiness from people, but you're not getting this one

40:02

burst where you and this guy are about to

40:04

throw down because he's like,

40:06

he's insulting you to the point where like this

40:09

person is actually dangerous. Like this is actually,

40:11

this person hates me. Like this could

40:13

be a real bad situation here. And

40:15

I think much like that

40:17

exists on Twitter where you have these little

40:19

shitty interactions, it's like 1% of real hate

40:22

and it just adds up over time. That's

40:24

the same thing as seeing violence, seeing

40:26

all these executions, seeing all these botched

40:29

robberies, seeing all these people that get

40:31

murdered in some third world

40:33

country. You

40:35

just get a little tiny piece of it

40:38

all the time and it normalizes it. It's

40:41

probably really, really bad for us.

40:45

Do you pay attention to what people say about you online? No. You

40:48

never do that? No. You never

40:51

search for Joe Rogan and never? Nope. Nope.

40:54

Shouldn't do it. Yeah. It's

40:56

not good for you. It's not good for you because I've done it on occasion.

41:00

It's nothing but if you want

41:02

to just destroy your self

41:04

image, you could do it pretty quickly. Well that's what

41:06

they want. That's what people want to do when

41:08

they say things like that. Like this is my opinion. And

41:10

you know, a lot of it is like really out of

41:12

line. Like a lot of it is

41:14

just like the worst possible, like I said before, like

41:17

the least charitable takes, the least

41:19

nuanced, this ridiculous caricature of

41:21

a human being just to

41:24

try to demonize

41:28

them to make yourself look

41:30

virtually or virtuously superior.

41:33

It's just dumb. It's a dumb way for people

41:35

to communicate and the kind of people that do it are

41:38

all losers. There's no like really

41:40

exceptional, fascinating people that engage in that kind

41:42

of stuff. Well the thing that gets me,

41:44

I don't mind, when people insult

41:46

me, I don't care. I'm

41:49

used to it. It's the lies.

41:51

Like when I see something about myself that's

41:54

just a straight up lie. Totally

41:56

made up. And then it picks up

41:58

traction and people are... Sometimes it could be

42:00

someone photoshopping a tweet that I never said

42:03

or whatever, anything. That

42:06

stuff still

42:08

bothers me. Then

42:11

I try to tell myself, it shouldn't bother me. But at the same

42:13

time, it should. It's normal for a

42:15

person to be bothered when you're

42:18

being lied about and other people are believing

42:20

a lie. I think it's a normal human

42:22

reaction that I'm like, I don't want... That's

42:24

not fair. It's not true. You

42:26

could attack me for things that I really have said and done. You

42:29

can't do that. That's not true. But

42:33

then at a certain point, you just have to sort of give

42:35

into it and realize this is the way the internet

42:37

works, I guess. Well,

42:39

it's also who knows who's doing it. And at

42:42

this point in time, we have to accept the

42:44

reality of propaganda. And

42:46

that there... We've talked about

42:48

this ad nauseam, but I'll say it again.

42:50

There was an FBI, a former analyst did

42:53

some sort of a study on Twitter where

42:56

he was estimating the amount of bots

42:58

versus... This is right around the time when Elon was

43:00

saying that it's more than 5%. He

43:02

said he thinks it's about 80%. He thinks

43:05

80% of the accounts. Yeah, 80% of the

43:07

accounts are fake accounts, which

43:09

just stop and think about if you're

43:11

in a country. Let's

43:14

imagine you want the politics of

43:16

America to swing in a certain

43:19

direction because we most certainly

43:21

do this in other countries. We

43:23

don't have to educate people on

43:25

the long history of interventionist foreign

43:28

policy where we have gone in

43:30

and installed new leaders

43:32

of countries and organized

43:35

all kinds of shit. So we

43:37

do it, and we do it, and we know they do

43:39

it. But isn't it like the cheapest way to do it?

43:41

Wouldn't it be to do it on social media? And

43:43

if you did it, why would you do it like one account?

43:45

Why wouldn't you have a million accounts? I would have a million

43:48

accounts. Like you just got to get a computer

43:50

that keeps making new accounts. When

43:52

you run a program, it's not the most difficult thing

43:54

to do. For people that know how to actually code

43:56

operating systems, you don't think there's someone out there that

43:58

can code a computer program. that can

44:01

operate millions of different Twitter accounts

44:03

and you run it through some

44:05

sort of AI

44:07

that you've developed, some large language model, on

44:09

things to say about MAGA or things to

44:11

say about abortion or things to say about

44:13

Conservatives or things to say about liberals and

44:16

you put a fucking American flag and you're

44:18

a little bio and or you put a

44:20

pronoun thing he her zeezer whatever

44:22

it is and then you just

44:25

flood the internet with fake anger

44:27

and fake discourse and

44:29

you lie about people, and

44:32

anytime there's a post about anything controversial, you

44:34

insert something in there that gets people even

44:36

more riled up. You could

44:38

get people... you could swing the vote, you could

44:40

swing the vote one way or another, especially

44:42

with fence-sitters, with people that are

44:44

not sure, like, I don't know, is Trump really

44:46

the answer? And then you get online and you

44:49

see all this hateful shit, or you

44:51

might get on a MAGA forum, you know: there,

44:54

they are eating cats. He

44:56

was telling the truth. ABC's biased. And you could

44:58

swing it one way or the other. And I

45:00

think they're all trying to manipulate it, all

45:03

these foreign governments. And I think internally in

45:05

the United States, I'm sure there are groups

45:07

that are doing it too, that are manipulating

45:09

things in one way or the other in

45:12

a disingenuous way, because it's available. And

45:15

I don't know how to stop it. I think the only

45:17

way for you to not personally be

45:19

really affected

45:22

by it is you have to understand that it

45:24

exists, and then you have to recognize

45:27

that, you know, some of these takes are not

45:29

even real human beings. So instead of saying, Jesus

45:31

Christ, people who think that way... go, maybe

45:34

not. Like, maybe this isn't... maybe there's a few

45:36

people who think that way, but you're being led

45:38

to believe that it's a huge movement of people

45:40

when it might not be. But the problem is,

45:42

even if it's

45:44

fake, people are so stupid that even if

45:46

it's a fake thing, that becomes

45:48

a bit of a movement online. With fake...

45:50

dumb people will jump in there, and then

45:53

it'll become a real thing. Yeah,

45:55

like, you're aware of the

45:57

free bleeding movement that 4chan

46:00

pushed. Yeah, I

46:02

think I heard of that. It became kind of

46:04

real, didn't it? It became real, that's what I'm

46:07

saying. Or flat earth is the same thing. It

46:09

became a joke if people were fucking around at

46:11

first. We've known the earth has not been flat

46:13

for a long ass time and

46:15

then... But now that's totally real, right? Now

46:17

it's totally real. Now there's massive groups of

46:19

people that think the earth is flat. Which

46:21

isn't... I can't. I can't. I don't

46:24

know how that... Yeah. Yeah, you can't. But

46:26

the thing is, that's how dumb people are.

46:29

That you can have a fake thing and

46:31

say it enough times and enough people jump

46:33

in and be on board with it

46:35

and then it becomes a real thing. And then you don't

46:37

even have to use propaganda anymore.

46:39

These morons are doing it for you. The

46:42

thing that gets me about the flat earth thing is... Because

46:44

I didn't realize that it was a real thing until, I don't know, a

46:46

few years ago. I did. I

46:49

posted something about it. And

46:51

all these comments from real

46:54

people that... What gets

46:56

me is... Well, you have the people that say, yeah, I think

46:58

the earth's flat. And that's... You're just really stupid. But

47:01

I was more fascinated by the

47:03

80% of people who...

47:06

80% of the flat earth crowd.

47:08

80% of them, their take was, well, I'm not saying

47:14

the earth is flat, but I'm

47:17

open to it. I'm open to the

47:19

possibility. I get it if

47:22

you're just completely stupid and you got sucked into

47:24

this cult thing. But what I

47:26

don't get is how can you be on the

47:28

fence about the shape of the earth?

47:31

Well, it's just people that really are not

47:33

educated. That's number one.

47:35

And people that believe that there's

47:38

a collusion that's so large that

47:40

all of the space agencies from

47:42

Japan, from China, from Russia, all

47:45

of them are liars. That

47:47

all of them are colluding together to hide

47:50

the true shape of the earth. Because if

47:52

we really knew the earth is flat, then

47:54

it always is connected

47:57

to some sort of a Bible thing. it's

48:00

the firmament and they believe that we're

48:02

hiding the fact that God is real

48:04

and somehow there's some mass

48:07

conspiracy that all these world governments and

48:09

every person that ever was involved in

48:11

the space agencies, they've all hid from

48:13

us. Yeah. And

48:17

the moon landing, you're not a... you

48:19

believe in the moon landing, right? I used to believe in the

48:21

moon landing. You don't anymore? I had a joke in my act

48:23

about it that before COVID, I would have told you vaccines are

48:25

the most important invention in human history and after COVID, I'm like,

48:28

I don't think we went to the moon. Yeah,

48:30

I know that was in your act, but you actually think

48:32

that. I think there is a non-zero possibility that

48:34

we did not go to the moon. Oh my gosh.

48:37

I know. Why do you think we went to

48:39

the moon? Because it's exactly what you just said about... there's

48:42

a lot of reasons, but the

48:44

main thing is what you just said about the earth. The

48:48

vastness of the conspiracy that would be

48:50

required to fake that, it's

48:52

so vast that it's just... it's

48:55

a lot more incredible to believe that we faked it

48:57

than to believe that we just went. And going to

48:59

the moon, don't get me wrong... it's a massive achievement. But

49:04

I think it's the greatest human achievement of all time. But

49:07

even so, to fake it would be

49:12

even more massive. Because

49:12

not only would you need all of

49:14

these space agencies and all the different

49:16

whatever people and American institutions to be

49:18

colluding, but you'd also need foreign

49:20

governments, including adversarial foreign governments, who

49:23

at this point certainly would know we faked it and for

49:26

some reason haven't blown the lid on it. So

49:28

they're letting us take this achievement that

49:30

they know... why haven't the Russians come

49:33

out and said... All those things you're saying

49:35

are true. I don't argue with any of the things

49:37

you're saying. But one of the things that

49:39

I think you have to consider is if

49:41

it's not possible for human beings

49:44

to safely go through the Van

49:46

Allen radiation belts and out into

49:48

deep space without much protection and

49:50

face the temperatures that are on the surface of

49:52

the moon, which get up to 250 degrees and

49:54

250 degrees below zero in the shadows. There's

50:00

no environment there. It's hostile,

50:02

beyond belief. Micro meteorites

50:04

are flying into the moon all the time. They're

50:07

flying through space all the time. We've

50:09

never had a single biological

50:11

organism go out into deep space,

50:13

pass the Van Allen radiation belts, and then

50:16

come back to Earth and come back alive,

50:18

except human beings during the

50:20

Apollo missions. Every single

50:22

space station mission, every single space

50:24

shuttle mission, all of them are inside

50:26

350 miles from the Earth's surface. The

50:30

only time human beings have ever been past

50:32

that and through the Van Allen radiation belts

50:34

was the Apollo missions. We

50:36

were the only humans that were ever able to

50:38

do that. The Russians never figured out how to

50:40

do it. No one else figured out

50:42

how to do it, but the Apollo astronauts. We

50:44

did it seven times, six successfully, from 1969 to

50:47

1972. If

50:51

you said to me, do you think that they

50:53

could fake the moon landing today, I would

50:55

say no. I would say no, no, no, no. People

50:57

are going to be able to track it. It's

51:00

very easy. They have satellites.

51:03

They're going to know everything. But

51:05

in 1969, the technology was so

51:07

crude that when they first showed

51:09

the Apollo 11 landing, they

51:12

didn't even show a direct feed to the

51:14

networks. So like if you're on CBS News,

51:17

you don't get a direct feed. What you

51:19

do is you point a camera at a

51:21

projection screen. So that's why the film looks

51:23

so shitty. The camera is pointed to a

51:25

projection screen where you see the astronauts jumping

51:27

around on the moon. And you

51:30

see this weird, grainy

51:33

third generation image, right?

51:37

And we did it, and we have never done it since.

51:39

And we've always said we're going to do it, and

51:41

no one's ever even come close. No one's ever

51:43

even gone into deep space since 1972. We

51:46

also haven't been trying. We haven't been trying. But

51:49

we always talk about going back. George Herbert

51:51

Walker Bush talked about going back. George W.

51:53

talked about going back. They all talk about

51:56

going back, but nobody ever gets anywhere. Well,

51:58

I think that's because we lost

52:00

the spirit and

52:02

hunger for discovery. We didn't just lose that. We

52:04

lost all the technology from the Saturn V rocket.

52:07

They don't even have that anymore. In fact, they

52:09

don't even have the original film. They

52:11

erased all the original footage of the Apollo

52:13

missions. So you just have copies of everything.

52:15

You could develop the technology again. You can

52:17

do all that. Sure you could. If

52:20

you can get through the Van Allen radiation

52:22

belts into deep space with human beings and

52:24

have them safely come back. But I think

52:26

what you're describing to me, all that does

52:29

is highlight how incredible the achievement was. If

52:32

they did it. Right. If they did

52:34

it. Well, here's the main point. There's

52:37

no evidence because saying

52:40

that it was a hoax is

52:42

an assertion of it's not just

52:45

denying an event. You're asserting a whole

52:47

other event that you say happened instead.

52:50

And there is evidence that we

52:53

went to the moon. Now someone

52:55

who's a skeptic might say, it's not enough

52:57

evidence or it's not good evidence. There's like

52:59

evidence. There's eyewitnesses. There's people that went and

53:01

came back and told us. There's

53:04

footage. There's a lot. There is

53:06

evidence. But there's no evidence of the

53:08

hoax. No one has come and said,

53:10

here's my affirmative evidence that

53:12

this hoax happened. It's

53:14

never happened as

53:17

far as I'm aware. No one's ever provided that evidence. I

53:19

see what you're trying to say. The

53:22

evidence that they went to the moon,

53:24

there's a bunch. There's moon rocks. That's

53:27

one. There's lunar reflectors that they placed

53:29

on the moon. That's another. And

53:32

there's a couple of problems with those. First

53:34

of all, the Soviets put laser

53:38

reflectors on the moon as well. And

53:40

also, the moon itself, in many places where

53:42

you shine lasers on it, it bounces back

53:45

by itself. The reflective quality of the moon,

53:47

the reason why the moon is so

53:49

bright and white in the sky when the sun hits it, there's

53:51

a certain amount of, you get a

53:53

certain amount of bounce back off of different things

53:56

with lasers. There's

53:58

some photographs that are interesting. What

54:00

was it, the India one? What was the

54:02

one where they got the most high-resolution

54:05

photos of the lander? Wikipedia

54:07

right now, of all third-party evidence

54:09

of the Apollo missions, one

54:12

of the things that's interesting is they gave a moon rock

54:14

to... was it

54:16

a prime minister of Holland? Is that what it was?

54:18

Which one was the moon rock they gave? It turned

54:20

out to be petrified wood. So

54:22

the Apollo astronauts gave a moon

54:24

rock to some foreign dignitary, and

54:27

it turned out to be a piece of petrified

54:29

wood. They

54:31

do have samples of moon

54:34

rocks that came from the moon But

54:36

we also have those on Earth. In fact, Wernher von

54:38

Braun in 1968,

54:41

I believe, went to Antarctica. There's all

54:43

these photographs of him in Antarctica. Antarctica is

54:45

a great place to find moon rocks because

54:48

Antarctica is just a gigantic sheet of white,

54:50

and you can spot the meteorites on the

54:52

ground. So this is the photo. And

54:55

this is from... what is this from? What

55:00

is the... So

55:03

this is the Indian Space

55:05

Research Organisation, I don't know how to

55:07

say that word, Chandrayaan-2

55:09

orbiter captured

55:12

images of NASA's Apollo 11 and

55:14

12 landing sites and lunar modules

55:17

from a hundred-kilometer altitude. Apollo

55:19

12 image: astronaut boot tracks are still

55:21

even visible. Due to the recent interest in

55:23

another post I shared, decided

55:25

to download and view the raw

55:28

imagery. So that looks like there's

55:30

some kind of thing on the

55:32

moon. It's pretty good evidence. It is

55:32

evidence that something's on the moon. It's not evidence that

55:34

human beings went to the moon. See,

55:36

we have things that are on the moon. We

55:38

have things on Mars right now. Well, I think

55:40

it's things that we shot into

55:42

space, for sure. Yeah, but it's evidence. It's

55:45

not proof in and of itself, but it is

55:47

evidence Listen, I'm not saying we didn't go to

55:49

the moon What I'm saying is the subject is

55:52

complex and it's not even a little complex. It's

55:54

really complex. There's a documentary called A

55:56

Funny Thing Happened on the Way to the Moon. This

55:58

guy, Bart Sibrel, he's been obsessed... He was

56:00

a guest on the show too. Been obsessed about this his

56:02

whole life and absolutely believes that we never went to the

56:04

moon. And there's

56:07

enough shit that you go, okay, if he's

56:09

right about any of these things, it's weird.

56:12

One of the things was some of the photographs of the

56:14

moon, they ran through one of those AI

56:17

detectors that can tell you whether or

56:19

not something's false or artificially generated. And

56:23

it showed different images from, I think it

56:25

was a Chinese satellite of the moon. They

56:27

said this is legitimate, but then it got

56:29

to these Apollo images and they said these

56:31

have been doctored. This AI program, they ran...

56:33

Who's a Pussa? This AI program that they ran... All the images

56:35

or a few of them? A few of them. But

56:38

then other ones were found to be authentic. I don't think so.

56:40

I think they only ran a few images through. See

56:42

if you can find those Jamie. Find what they

56:44

did. Again, this is not saying that we didn't

56:47

go to the moon. It could be. And

56:49

this was a fact with the Gemini 15 program

56:52

where Michael Collins, there was a

56:54

photograph of Michael Collins that they

56:56

took in one

56:58

of his training exercises where he

57:01

had those packs

57:03

that they put on where they can move around while they're doing

57:05

moonwalks, not

57:08

moonwalks, spacewalks outside of where they're connected

57:10

by a tether. And

57:12

he was in this harness and

57:14

manipulating this device. And what they

57:16

had done is taken a photograph

57:18

of him training and then someone,

57:21

probably some overzealous PR person, had

57:23

taken that photograph and then blacked

57:25

out the background and tried

57:28

to pass it off as a

57:30

really clear photograph of him out

57:33

there on a spacewalk, which is probably very difficult

57:35

to get. You'd have to have another person at

57:37

the camera frame it right. They had this photo.

57:39

They're like, look, he did it. Let's just

57:41

pass this off as the real thing. Which

57:43

is, you're also talking about the Nixon administration

57:46

where they were just full of shit constantly.

57:49

Yes. Yeah.

57:53

So there's different video where they

57:55

ran it through and they

57:57

said it was real, and that it was from

58:00

the Chinese program, but

58:02

when they ran the American ones, the American images,

58:04

they said that they were doctored. Again,

58:06

it doesn't mean that we didn't

58:09

go to the moon, but it does mean, okay,

58:11

there's that. That's weird. Have

58:13

you ever seen the Apollo 11 post-flight press conference? Yeah.

58:16

It looks like a hostage video. It looks like

58:18

a bunch of guys who don't want to be

58:20

there. They look real fucking nervous and they look

58:23

real deceptive. If you watch that video, it's weird.

58:26

But I think that's just, that's the temperament it requires

58:28

to do something like that. It's crazy. It's

58:31

almost suicidal to go to the moon. You

58:34

have to be barely even a

58:36

human psychologically to

58:38

do it. That to

58:40

me is just like, and they just went through this whole experience, but

58:42

who knows what that does to the human psyche? To

58:46

even just be in the vastness of

58:48

space, even on the space station, I

58:50

feel like that would change me as

58:52

a person. Not necessarily for

58:54

the best. There's actually a psychological condition

58:56

that they talk about, this sort of

58:58

understanding that we're all connected. It's

59:02

akin to a religious experience that many astronauts get

59:04

when they go up to the space station and

59:06

look down at the earth and go, oh my

59:08

God, what are we doing? We're all together in

59:11

this thing and we're so alone in the universe.

59:13

For us to be fighting over these trivial differences

59:15

and these stupid lines in the dirt that we

59:17

draw, when we are just clinging

59:19

to this ball in the middle of everything. So

59:23

then what would you say, or someone who is a

59:26

full-on believer? Not a full-on believer.

59:28

Someone who is a full-on believer in the moon hoax.

59:31

What would they say to my

59:33

other point that there

59:36

is evidence we went to the moon, you can try to nitpick

59:39

the evidence. There is zero evidence

59:41

of a hoax because that's a whole other

59:43

event that would have had to have happened.

59:46

There is no evidence at all, not one

59:48

sliver of evidence ever of that hoax having

59:50

ever happened. But

59:53

I think that's a weird way to frame it. Is

59:55

there evidence of a hoax of

59:57

the JFK assassination? That... Lee Harvey

1:00:00

Oswald acted alone? Do you think there's evidence? Well,

1:00:03

but the event itself being that

1:00:05

JFK was killed happened. Right,

1:00:08

but that's not the conspiracy. So the

1:00:10

conspiracy is, did he act alone? And

1:00:13

is there evidence that he didn't act alone? What do you think?

1:00:18

I'm very skeptical that he acted alone, yeah.

1:00:20

Right. But I don't know what happened.

1:00:22

I don't know exactly what happened, nobody does. Exactly. Same

1:00:25

exact perspective. Same exact perspective

1:00:27

about this moon thing. It

1:00:30

may have happened, but this

1:00:32

was a time of deep deception in

1:00:34

the American world. This is a time

1:00:36

after Operation Northwoods, this is a time

1:00:38

after the Kennedy assassination, this is a

1:00:40

time, I mean, this is

1:00:43

a weird fucking shaky time in terms

1:00:45

of propaganda. This is after Eisenhower warned

1:00:47

about the military industrial complex. This is

1:00:49

like, there was a lot of deception.

1:00:51

Gulf of Tonkin incident. There's a lot

1:00:53

of open deception that the American

1:00:55

people were being subject to. And then there's this

1:00:57

Cold War between us and Russia. This

1:01:00

space war for superiority. We

1:01:03

wanted it so bad, we brought in Nazis. But

1:01:06

I think, I mean, the JFK is an

1:01:08

interesting example. Because

1:01:10

yes, there are things that I'm skeptical of that are claimed that

1:01:12

I don't really have evidence that the thing didn't happen, or that

1:01:15

it didn't happen the way they say, but I'm still skeptical. So

1:01:17

I get that. But it

1:01:19

feels different to me because the JFK

1:01:21

assassination did happen. The question is,

1:01:23

how did it happen? But

1:01:25

if we're going to assert that a

1:01:28

major historical event, and probably

1:01:30

the greatest, the most significant historical event in history, or one

1:01:32

of them, did not happen at all,

1:01:35

no one did it, then like I

1:01:37

said, so what you're actually

1:01:39

claiming is that some other thing, they

1:01:42

went somewhere and they pulled off this hoax and they planned

1:01:44

it and they did, like an event happened where they were

1:01:46

faking it. And so

1:01:48

what I would want to see,

1:01:51

has anybody come out, any whistleblower ever

1:01:53

to say, hey, I was involved in

1:01:55

the shoot or I'm in Hollywood,

1:01:57

I talked to a guy who was there. Well,

1:02:00

that's not even evidence. Real

1:02:03

evidence would be some sort of documentation, some

1:02:05

sort of a way to go

1:02:07

over like there's a

1:02:12

binary code that shows the distance between

1:02:14

the Earth and the lunar module at

1:02:16

every stage of the journey. But

1:02:19

that's missing. That stuff's

1:02:22

missing. All the tracking data, they can't find

1:02:24

it. All the original footage

1:02:26

is missing. And that could just be people who

1:02:28

are really bad with historical items. That's possible. But

1:02:32

to say that faking

1:02:34

the moon landing would be a bigger

1:02:36

achievement than actually going to

1:02:38

the moon, I would say

1:02:41

only if people could actually go to the moon. So

1:02:43

here's the question. Can we really, everyone

1:02:46

wants to dismiss it, can we really

1:02:48

send a biological entity into space, go

1:02:50

through that radiation, which is thick,

1:02:54

covering the Earth and have it come back alive? Well,

1:02:58

supposedly, this is

1:03:00

the only time people had done it. And supposedly,

1:03:02

the way they did it was by going through

1:03:04

the top area of the Earth where

1:03:08

the Van Allen radiation belts, it's like kind

1:03:10

of like a doughnut that covers the Earth.

1:03:13

It's not uniform. And there's

1:03:15

an area at the top where you can go out. But

1:03:18

according to Bartz and Braille, they didn't go that way

1:03:20

because you would have had to launch from Antarctica to

1:03:22

do that. It's not really possible that that happened, that

1:03:24

they went that way. So

1:03:26

he thinks that if they did go through that,

1:03:29

there are no other examples of living things

1:03:34

that have done that and come back alive. And they've

1:03:36

known that this is an issue. They've known that this

1:03:39

Van Allen radiation belt, which is this

1:03:42

band of heavy radiation that covers the

1:03:44

Earth and protects us, they've

1:03:46

known that it's out there because they tried to blow

1:03:48

it up once. There was a

1:03:50

thing called Operation Starfish Prime where they launched

1:03:54

one of several nuclear bombs

1:03:57

into the radiation belt to try to blow

1:03:59

a hole through it, and

1:04:01

unfortunately... that happened in '66, '67 maybe,

1:04:03

was Starfish Prime,

1:04:05

but it had the opposite

1:04:11

effect. Why did they do that? Just for shits and giggles? They

1:04:13

wanted to see what happens. Well, they

1:04:15

were they had so much power and you know

1:04:17

you've got nuclear bombs and you can't blow people

1:04:19

up but you're still doing studies so they're doing

1:04:21

tests all throughout Nevada and they I mean that's

1:04:23

what killed John Wayne John Wayne got cancer because

1:04:25

he was working on a set doing a Western

1:04:28

right next to where they were blowing up nuclear bombs.

1:04:30

Like 200 people on

1:04:33

his set got cancer. "Starfish Prime:

1:04:35

high-altitude nuclear test conducted by the United

1:04:37

States, a joint effort of the Atomic Energy

1:04:39

Commission and the Defense Atomic Support Agency,

1:04:41

July 9th, 1962." So this is like while Kennedy

1:04:46

was in office. They were trying to figure

1:04:48

out how to... "We will get to the

1:04:51

moon, not in this decade but in the

1:04:53

other," or whatever he said. High-altitude

1:04:55

nuclear test. So the thing it did,

1:04:58

unfortunately, was it supercharged

1:05:00

the bands and

1:05:02

made them have much more radiation.

1:05:05

Not only that, it blew

1:05:07

out power in some parts of Hawaii,

1:05:09

I think. I think it cooked

1:05:11

a few satellites. Right, we

1:05:13

talked about this the other day. It cooked a few

1:05:16

satellites. Okay, so can I ask this though? Do

1:05:18

we have examples? We're

1:05:21

saying, well, we don't know if a human can go

1:05:23

through the... Right. I would

1:05:25

say, well, we do know, because they did. If they

1:05:27

did. Well, but

1:05:29

if we're gonna... We know they sent people into

1:05:31

near-Earth orbit. That's a fact. We know that. Well,

1:05:34

how do we know that, if we don't know...

1:05:36

I mean, maybe this... Because we actually can see

1:05:38

that. We can see where they launched.

1:05:40

You can follow the trajectory, you can know

1:05:42

about the propulsion units that they use, you

1:05:44

know about what they were trying to accomplish,

1:05:46

and you could watch it. So my question

1:05:48

is, have we tried to

1:05:51

send humans through the radiation belt and

1:05:53

not been able to? Has that

1:05:56

happened? They never even tried that. They

1:05:58

just did it. Right.

1:06:01

That's what's even crazier. But how do we know they did

1:06:03

it? The

1:06:05

only time they did it, the last time they did it, was

1:06:07

1972. You

1:06:10

don't think that's a little weird? Not

1:06:12

really. No, no, no, listen, listen. Even

1:06:14

if they did go to the moon, let's say, I'll

1:06:16

say they went to the moon, it's fucking weird. Everything

1:06:20

from 1969 is easier, cheaper, and

1:06:22

faster to reproduce today except

1:06:24

the moon landing, except space travel. I just don't think

1:06:26

there's a will. But we thought, what? I don't

1:06:29

think there's a will. How much fucking resources there is

1:06:31

on the moon? Do you know how many valuable minerals

1:06:33

are on the moon? And trillions of dollars of things

1:06:35

that are very difficult to find in the United States

1:06:37

are on the moon? I don't think the American ...

1:06:39

The people, in the

1:06:42

1960s, the American people cared deeply about

1:06:44

going to the moon. I don't think

1:06:46

that these days, I think we should

1:06:48

care about that, but most people don't care about ... If

1:06:51

we found out that we didn't have to

1:06:53

dig for lithium, that we

1:06:55

could just go to the moon and pull giant

1:06:58

chunks of it out and not have slave labor

1:07:00

and no one has to feel bad about using

1:07:02

your iPhone. You don't think that

1:07:04

they would do that? Of course they would

1:07:06

do that, if they could. If you could have a

1:07:08

mining station on the moon, no problem at all, totally

1:07:11

safe, of course they would do that.

1:07:13

Yeah. And I think ... It only takes two weeks

1:07:16

to get there. People mine in

1:07:18

northern territories, like people mine

1:07:20

in Canada, in these horrible conditions, fucking

1:07:22

freezing cold out. It

1:07:24

probably takes a lot of time to get

1:07:27

to a point where you can do that

1:07:29

consistently. Right, but it's so valuable. The

1:07:31

idea that they wouldn't do that and they haven't done

1:07:33

anything even remotely close to that since 1972 is weird.

1:07:38

I agree that it's ... I

1:07:40

mean, weird makes it sound necessarily nefarious.

1:07:42

I think it's ... No,

1:07:44

just weird. It's very

1:07:47

unusual. It's unusual technologically. It's

1:07:49

incongruent. It's incongruent with technological

1:07:51

progression. We

1:07:53

have that with everything else. Everything else. Phones

1:07:55

are in your fucking pocket now and they

1:07:58

have more computing power than the entire cluster

1:08:00

that they used to launch

1:08:02

the Apollo program. The

1:08:04

Apollo program was a fucking giant room full

1:08:07

of computers. Your phone is significantly

1:08:09

more powerful than that. Everything

1:08:11

else got better, except that. We thought that people were going

1:08:13

to be going to space all the time. You ever watch

1:08:15

that TV show Space 1999 when you were a kid? You're

1:08:19

younger than me. There was a stupid show called Space

1:08:21

1999 and they thought, boy, by 1999 we'll

1:08:24

be flying around spaceships and people will be living on

1:08:26

the moon. Every time

1:08:28

they've done in the past, after the

1:08:31

moon landings, every time they did any

1:08:33

sort of science fiction movie, it always

1:08:36

involved colonies already established on the moon

1:08:38

and on Mars and people traveling.

1:08:40

Because we thought that was going to

1:08:42

happen. Orville and Wilbur Wright, right? Think

1:08:45

about the launch of the first

1:08:47

airplane and then the launch of

1:08:49

the Apollo program. It's only like

1:08:51

60 years. It's

1:08:53

kind of crazy. The launch

1:08:55

of the first airplane ever

1:08:58

and dropping nuclear bombs out

1:09:00

of an airplane is only like, what is it,

1:09:02

50 years? I think it's something

1:09:04

like 50, something kooky. Yeah. 50,

1:09:07

60 years. That's nuts. I agree.

1:09:09

But then now you have supersonic jets, like 100 years later. Now

1:09:12

you have insane capabilities of

1:09:15

like Air Force fighter jets.

1:09:17

Unbelievable power and maneuverability

1:09:19

far beyond anything anybody would

1:09:22

have possibly imagined when Orville

1:09:24

and Wilbur had that stupid

1:09:27

fucking bird looking flimsy thing.

1:09:29

So everything progresses technologically, except

1:09:32

... Well, but

1:09:34

here's what I would say to that. Except traveling

1:09:36

to other planets. I would say two things. Number

1:09:38

one, I think that it just, it does

1:09:40

take ... Yeah, we're kind

1:09:43

of spoiled by the fact that there was

1:09:45

this burst of incredible technological advancement. In everything.

1:09:48

Right. In automobiles. It doesn't necessarily

1:09:50

... Not every facet of technology is going

1:09:52

to continue at that pace forever into an

1:09:55

infinity. So I think it does take,

1:09:58

especially if you take a historical perspective, a longer term

1:10:00

historical perspective, it just takes a while to get from one

1:10:02

thing to the next. It hasn't even been that long. I

1:10:04

mean, 1969 was not that long ago from the historical perspective.

1:10:11

And especially if you want to do the next thing, I mean,

1:10:14

what's the next thing? The next thing is to go to Mars.

1:10:17

Most people agree. That's

1:10:19

so much farther, exponentially farther away,

1:10:21

and harder to do. And

1:10:24

so if it takes decades more to figure out

1:10:26

how to do that, that doesn't seem that crazy

1:10:28

to me. And the second thing I'll say

1:10:30

is that I do think, I get

1:10:32

your point about resources on the moon, there's a reason to go back.

1:10:34

I agree, practically

1:10:37

speaking. But it's

1:10:39

just true that it requires a

1:10:42

society that deeply values

1:10:45

exploration for its own sake and is willing

1:10:47

to make the sacrifices, is willing

1:10:49

to send people

1:10:51

off to do things just for the sake of

1:10:54

exploration, knowing that they might die. I think we

1:10:56

have almost no appetite for that

1:10:58

now. Maybe

1:11:00

the Challenger explosion was, you could

1:11:03

point to that as the time when we sort of

1:11:05

just, we have no appetite for people. We

1:11:07

don't want people to die for this anymore. I see what you're saying. Here's

1:11:09

the problem with what you're saying. The American people don't

1:11:12

get a say in whether or not we do things.

1:11:14

Like, they don't get a say in whether or not we make

1:11:16

a space shuttle. They don't get

1:11:19

to decide whether or not we establish a

1:11:21

new space station. No one

1:11:23

talks about it. They just do it. We

1:11:25

barely get a say in how much money goes to Ukraine.

1:11:28

Yeah, but it's got to be funded. Right, but how much is

1:11:30

funded to go to Ukraine? All of a sudden, they

1:11:32

had $175 billion plus to fund this proxy war. Who

1:11:39

decided that? It wasn't the American people.

1:11:42

It wasn't, but politicians are the ones who

1:11:44

decided. People vote for those politicians. Unfortunately, there

1:11:46

are a lot of Americans who are basically

1:11:49

okay with sending money to Ukraine, which they

1:11:51

shouldn't be. It's insane. I agree with you.

1:11:53

The point is, is that if you

1:11:55

had a skillful politician who got on

1:11:57

television and explained that we have found

1:14:00

terrified of getting under

1:14:03

the sights of the intelligence agencies. And if

1:14:05

you have top-secret clearance, if you're involved in

1:14:07

some sort of a project... like, look at

1:14:09

the Manhattan Project. People kept their fucking mouths

1:14:11

shut. They knew they

1:14:14

were working on something of importance that

1:14:16

was above and beyond their need to yap

1:14:18

about shit. But for the moon

1:14:20

landing you would need way more people involved,

1:14:22

I don't know, more institutions, because you actually

1:14:24

have a real space program. So the space

1:14:26

program is not fake, right? So let's just

1:14:28

assume I'm a non-believer. I

1:14:30

would tell you that the space

1:14:33

program was absolutely real. The Saturn

1:14:35

V rocket was absolutely real. The

1:14:38

modules, the way they were able to parachute

1:14:40

down into the ocean, a hundred percent real. They

1:14:42

did go into space. But how far

1:14:44

did they go? This is the real question. And Bart

1:14:48

Sibrel, the guy who made this

1:14:50

documentary, he asserts that they went

1:14:52

somewhere into Earth's orbit, like,

1:14:55

you know, in space, but not through

1:14:57

the Van Allen radiation belts,

1:14:59

and not to the surface of the

1:14:59

moon and back, and that they had

1:15:01

video footage that they had done in

1:15:03

some scenario,

1:15:06

some people think it's in the Nevada desert,

1:15:08

who knows what it is. But they had this footage of

1:15:10

people bouncing around, and they said they got it on

1:15:13

the moon, and then they brought this back.

1:15:15

Yeah, but does he have any evidence of that

1:15:17

event occurring? What would he say?

1:15:21

"Well, I know they only went so far and came

1:15:23

back because of this"? Well, he

1:15:25

has a bunch of different things, and one

1:15:27

of them is the one that's like very

1:15:32

hotly debated, and it's the different

1:15:35

light sources in the photographs. So a lot of

1:15:37

the photographs from the surface of the moon have

1:15:39

intersecting light, intersecting shadows.

1:15:42

So you have a shadow that's going this way and another shadow

1:15:44

that's going that way, indicating more than one light

1:15:47

source, or a close-by light source that's,

1:15:49

you know, coming in, not something that's, you

1:15:51

know, thousands, millions of miles away,

1:15:53

like the Sun. There's

1:15:55

those there's the photographs. There's the photographs that

1:15:57

run through AI he has this other video

1:16:00

of what looks like them filming

1:16:02

the earth through one of the

1:16:04

round portal windows with everything blacked

1:16:06

out in the cabin and

1:16:09

then they pull down the things that

1:16:11

were blocking off the other light sources

1:16:13

and the cabin floods with light and

1:16:15

it looks like they're in near-earth orbit

1:16:17

and it's very confusing because you're like

1:16:19

well what is that video what exactly

1:16:21

is going on there because if they

1:16:23

really are in deep space and they

1:16:25

really are filming this small image of

1:16:27

the earth because that's all they can

1:16:30

see from 200,000 miles out well why

1:16:34

when they take those things down does it

1:16:37

look like the whole cabin is filled with

1:16:39

light why is it why does it look

1:16:41

exactly like they're in near-Earth orbit?

1:16:43

But that still goes back... Have you ever seen it, the

1:16:47

specific... You want to see it? Sure, for

1:16:49

shits and giggles. Yeah, because we're in

1:16:51

the middle of this stupid conversation. It's a fun one.

1:16:54

it's one of the most fun of all

1:16:56

conspiracy theories because if they did it

1:16:59

wow first of all if

1:17:02

they killed the president wow and they it seems

1:17:04

like they kind of did that so if

1:17:06

they did this too like what else

1:17:09

did they do like what other hoaxes were played

1:17:11

on on the American people if this is real

1:17:14

that's why it's fun I'm not saying it's real but

1:17:17

it is is a fun one it's

1:17:19

not as simple as the earth is

1:17:21

flat that's a stupid one but this

1:17:23

is a fun one this is a

1:17:25

fun one because you're you're dealing with

1:17:27

the kind of power with complete control

1:17:29

over the media complete control over newspapers

1:17:31

and what they reported the interest of

1:17:34

you know national security the

1:17:36

Cold War with Russia the space

1:17:38

war with Russia we

1:17:40

wanted it so bad we brought in some of

1:17:42

the most heinous human beings that

1:17:44

have ever lived to run our

1:17:46

NASA program yeah it's not as

1:17:49

it's not as dumb as flat earth but it

1:17:51

does remind to me it reminds me of to

1:17:53

me it's in the vein of like Sandy

1:17:56

Hook was a hoax no no

1:17:58

no, that's heinous. Let's not... Morally,

1:18:00

not more... Let me show you the video. Let me show

1:18:02

you the video. Jamie, you got that Funny

1:18:05

Thing Happened on the Way to the Moon? No, but I

1:18:07

can't... Is he hiding it? I know

1:18:10

it's available. Sure, sure. I'm

1:18:13

not even... I have to figure out exactly what the video is I'm

1:18:15

looking for. Proof. I

1:18:17

know, I'm trying to... I'm digging through. I pick

1:18:19

a video, I try to find it, it's not in there, I have to

1:18:21

find another video. It's not like... Okay, you'll

1:18:23

find the exact name of the video. Okay,

1:18:26

you'll find it. He'll find it. Once he does...

1:18:28

Again, I'm not saying we didn't go. I'm saying

1:18:31

this is a fun one. And it's a weird

1:18:33

one There's there's a lot

1:18:35

of weirdness to it. Isn't it similar? Okay,

1:18:37

because a lot of this comes from... It's

1:18:40

such an incredible

1:18:42

feat, that's so difficult to do, that it's hard

1:18:45

to believe anyone actually did it. Sure. Which...

1:18:47

I can understand that mentality.

1:18:50

But the thing is, you can go back in history, and

1:18:54

you could look at... For

1:18:56

the sake of discovery and exploration, you can look

1:18:58

at what other men have done hundreds of years

1:19:00

ago that arguably is more

1:19:02

impressive than going to the moon. Like what?

1:19:06

I mean, you

1:19:08

name it. Take any famous

1:19:10

explorer from like the 1500s to the 1800s,

1:19:16

and whether it's Magellan or James Cook or

1:19:18

Christopher Columbus or any of them. What

1:19:18

they were able to do, navigating this vast

1:19:22

ocean, going to places, having no modern technology

1:19:24

at all. Being able to go

1:19:26

from their starting point, hit some little tiny

1:19:28

island somewhere, and then go around, navigating a

1:19:30

world that they don't even know what it looks

1:19:33

like. They have no maps, they have no GPS.

1:19:35

They have nothing at all. I

1:19:37

cannot conceive of how

1:19:40

they could have ever done that. I don't know how

1:19:42

in the world, not knowing what the

1:19:44

world looks like, having no map, having no GPS,

1:19:46

having no modern navigation whatsoever, how in the world

1:19:48

could you possibly get on a ship,

1:19:50

you know, launching out of France or Portugal

1:19:52

or wherever, and make

1:19:54

it anywhere across the ocean. How could you...

1:19:57

I don't know how you could. It's incredible,

1:20:01

but it's incredible, but we know that it

1:20:01

happened. Okay, it's incredible. But it

1:20:03

doesn't compare, because they do it now

1:20:05

easily. So anybody can

1:20:09

get in a ship right now and travel you

1:20:11

can get a small boat That

1:20:13

you have enough resources and you have enough gas

1:20:16

you could travel through these routes. You can

1:20:18

do it. It took them hundreds of years, but

1:20:20

you can do it right now. So it's way

1:20:22

easier to do now, right? So it's something that

1:20:24

they did that's incredible. No doubt

1:20:27

no argument. But it's something that could be reproduced

1:20:29

today easily. But at

1:20:31

least possibly. I wouldn't say easily, it's

1:20:33

a task. But it also took centuries to get to

1:20:35

the point where it could be done easily. But

1:20:38

it got better right after each one did

1:20:40

it, because they had maps now. And then

1:20:42

they also used their sextants and they understood

1:20:44

constellations in a way that most people don't

1:20:46

today. And sextants, if you actually use

1:20:49

them correctly and you understand which way the

1:20:51

tides go and which way the

1:20:53

water currents are going, which way the flow

1:20:55

is happening. They had a deep

1:20:57

understanding of the currents of the earth. They knew

1:21:00

travel lanes, and they knew

1:21:02

which ways they could go with ships. So

1:21:04

applying that to the open ocean applying that

1:21:07

to these continents that they weren't even sure

1:21:09

were there. It was very iffy, very dangerous,

1:21:11

very courageous. But once they did it, then

1:21:13

everybody else could do it easier and then

1:21:16

they started doing it better and better and

1:21:18

then people started coming to America and then

1:21:20

Buh-buh-buh-buh-buh and now here we are and now

1:21:22

anybody can get in a boat. Anybody with

1:21:24

enough resources can have a boat

1:21:26

that can travel those routes. No

1:21:29

one can just say I want

1:21:31

to go to the moon today and get

1:21:33

their private Mooncraft and fucking shoot

1:21:35

off into the atmosphere and land on the

1:21:37

moon. So no

1:21:40

one's done that since 1969. That's

1:21:43

a recent occurrence in

1:21:45

terms of, like, human history. But

1:21:47

not technologically. The technology from 1969

1:21:49

is not even close. It's like cave people shit compared

1:21:52

to what we have today. So

1:21:54

you really can't compare the

1:21:56

courageous amazing deeds of these

1:21:58

early explorers From

1:26:00

200,000 miles away and even though

1:26:03

they are filming it by blocking out all

1:26:05

the lights and filming it through this window

1:26:07

That actually is the earth.

1:26:09

That's actually what it looks like when you're in

1:26:12

deep space. You could say that too. You

1:26:15

just don't know. And it's

1:26:17

it's hard to figure out what's what it's hard

1:26:19

to figure out what's what when you see a

1:26:21

video like that You just go hmm. Okay. What

1:26:24

is that? And

1:26:28

I don't think it's impossible to

1:26:30

fake people going to the moon. I think

1:26:32

it'd be very difficult. It would require

1:26:35

a lot of people to be on board, but

1:26:37

I also think it could be compartmentalized. The

1:26:39

people that make the rockets, they're

1:26:41

designing. What you're doing is you're making a

1:26:44

specific part and this guy's making another part and

1:26:46

you have the engineers put this thing together. And

1:26:49

you launch this thing into space. The people that

1:26:51

would have to know are the people that are

1:26:53

actually charting the trajectory of the

1:26:55

Apollo mission, the people that are

1:26:57

actually talking to the astronauts and explaining to them

1:26:59

what to say during the press conference, the

1:27:02

people that are engineering the

1:27:04

whole thing. And you could probably get

1:27:06

away with doing something like that with a few hundred people

1:27:08

and you could get a few hundred high-ranking

1:27:11

people that have top-secret clearance to keep their

1:27:13

mouths shut. You could. I would just

1:27:15

need I Would

1:27:18

need some kind of solid evidence of

1:27:20

that to believe that's true. Yeah, me

1:27:22

too. To me, there

1:27:24

are some things that we call conspiracy theories that

1:27:26

I think are, you know, clearly true.

1:27:28

There are some things that we call conspiracy theories that I think

1:27:31

are maybe true. But

1:27:34

there are conspiracy theories that to

1:27:36

me are just that, they're

1:27:38

not even theories

1:27:40

really. They're just kind of like fanciful

1:27:44

whatever projections.

1:27:48

And the ones that I don't find

1:27:50

convincing are where they usually

1:27:52

start with: there's

1:27:54

a so-called official narrative of a thing that happened

1:27:56

mm-hmm. There's a couple of things

1:27:59

about what actually happened that are

1:28:01

kind of weird. And

1:28:03

we look at that and go, that's a little bit weird. And

1:28:06

then the conspiracy theorists, in that case, they come in

1:28:08

and they find these little tiny

1:28:10

cracks, if you want to call it, and then inside

1:28:13

the cracks, they shove this whole like Hollywood

1:28:16

cinematic narrative that they have created

1:28:18

to explain what's actually like

1:28:20

a pretty tiny crack. You don't need

1:28:22

this whole thing to explain that. So

1:28:25

with the moon thing, I mean, one of the first, weird

1:28:30

aspects of the moon landing that I think

1:28:32

started kind of the conspiracy theories about it

1:28:34

was the flag,

1:28:36

the fact that the flag's moving in the picture. And

1:28:39

so yeah, it's like, when you look at that, you don't really understand, you look at

1:28:42

that, and that is weird, because there's no wind on the moon. But

1:28:44

then you understand that, okay, for example,

1:28:46

when you put the flag down, it

1:28:48

creates reverberations, it makes the flag

1:28:51

move, it's gonna move for longer because

1:28:53

there's no gravity. So

1:28:55

there's an explanation for that. But

1:28:58

if you're the conspiracy theorist, then

1:29:00

you take the flag moving and you just, you're

1:29:02

like, nope, the whole thing

1:29:04

is bunk. Have you ever seen the

1:29:06

video footage of the astronaut hopping by

1:29:08

the flag and the

1:29:11

breeze of him hopping by makes the flag

1:29:13

wiggle? He

1:29:15

doesn't touch the flag at all, the flag is

1:29:17

completely stationary, and the astronaut hops by the flag,

1:29:20

and as he hops by the flag, the flag

1:29:22

wiggles. Okay, are we saying that wouldn't happen

1:29:24

on the moon? No, it wouldn't, there's no air. Yeah,

1:29:28

okay, I haven't seen that. Oh, we'll show it to you.

1:29:32

It's weird, listen, what

1:29:34

you're saying is entirely correct. Everything

1:29:36

you're saying is entirely reasonable and correct if they

1:29:38

actually can get through the

1:29:41

Van Allen radiation belts. If they can, this

1:29:43

is stupid, this whole thing's stupid. But if they can't really

1:29:45

do that, and they never have done that, and the only

1:29:47

time they say they've done that is these missions, it gets

1:29:50

real weird. And

1:29:52

since they haven't done it since then, it gets real weird. And

1:29:55

it's not just that, there's other

1:29:57

video, it's not just the one where the guys. So

1:32:00

he's gonna hop by. Okay.

1:32:05

See that? Yeah, but he could have

1:32:07

hit the flag. Yeah, but he didn't. Look, look at the distance. Look

1:32:09

how far away he is from it. Pull

1:32:13

it back again. Oh. See

1:32:17

where he is? So he's in

1:32:19

front. He's way in front of

1:32:21

that thing. He hops by and it wiggles. I

1:32:24

don't know. He's in the suit. The suit's pretty clunky. Yeah,

1:32:26

but he's not close to it. Look at the perspective. Let's look

1:32:29

at it in slow motion. So watch, he

1:32:31

hops by. It just wiggles in the breeze.

1:32:34

That's a breeze, dude. So

1:32:36

that might not have actually happened on the moon, okay? That

1:32:38

might be footage that they filmed in Nevada desert. And the footage they

1:32:40

got on the moon got all fucked up. And so they tried to

1:32:43

pass that off on people and they thought that no one would know.

1:32:46

It doesn't necessarily mean we didn't go to the moon.

1:32:48

But that does look weird. And

1:32:50

it's just not, it's not one thing. If that

1:32:52

was the only thing, you'd be like, oh, well, I'm gonna go to the moon. It's

1:32:54

not one thing. If that was the only thing,

1:32:56

you'd be like, oh, well, who

1:32:59

knows? But there's a lot of them. He

1:33:02

could have hit it. I mean, he's close. It's

1:33:04

possible. It doesn't look like he hit it. It looks like a

1:33:06

breeze. Yeah, but

1:33:08

then the other part of this is that they, so what?

1:33:10

The people that went through all the struggle to fake the

1:33:12

moon landing, how would they

1:33:14

miss these things? Well,

1:33:16

I don't think they thought people would catch it.

1:33:18

First of all, you're dealing with a time where

1:33:21

there's no VHS tapes. You're dealing with no internet,

1:33:23

right? So you show it on television once. You

1:33:26

get to choose what gets shown and what doesn't get shown. You film

1:33:28

a bunch of shit. That's how

1:33:30

they got that footage of them inside the craft, filming through

1:33:32

that circular hole. Because they don't

1:33:34

air everything on television, but you have archives. So

1:33:37

you have all these archives and these kooks go through the archives.

1:33:40

They find things like that.

1:33:42

Okay, but that doesn't even mean that that was

1:33:44

actual moon footage. That could

1:33:46

have been some of the training footage. I'll tell

1:33:48

you what would convince me, not that it's a fake,

1:33:50

but at least would make me open to it. One

1:33:53

thing that would shake my faith considerably in the

1:33:56

moon landing, if Elon

1:33:58

Musk were to come out and say,

1:34:00

yeah, I don't know about this moon landing thing. Then,

1:34:02

okay, fine. Because, and I'm not saying this is my

1:34:04

whole reason for believing it happened, but Elon Musk, first

1:34:07

of all, if the moon landing was fake, he knows it was, he knows

1:34:09

it was fake. Sure. He's the richest

1:34:11

man in the world. He's shown zero

1:34:14

concern for, you know, propping up

1:34:16

official narratives at all. Right. So

1:34:18

he's a guy that would know if it's faked,

1:34:22

would, there'd be no reason for him to continue

1:34:24

that narrative if it was fake. In

1:34:26

fact, he could even say, you know, they faked it. I'm gonna do it for

1:34:28

real. I'll be the first one to go to the moon because they faked it.

1:34:32

And he hasn't said that. So I also find that to be pretty

1:34:35

compelling. The fact that he, as someone who would know,

1:34:37

let's say the problem is that you and

1:34:39

I, most people that talk about this, we have no like

1:34:41

direct access to knowledge

1:34:43

about space. This is all being given to us by

1:34:46

other people. So

1:34:48

you got to go to people that are actually working with

1:34:50

this stuff. And

1:34:54

so the fact that he has no time for this theory

1:34:57

at all, I also find to be persuasive.

1:34:59

It's good. It is persuasive, definitely.

1:35:01

But also he has a contract

1:35:03

with NASA and he has to

1:35:05

be very careful about what he says and does.

1:35:07

And for him to say something incredibly insane, like we

1:35:09

never went to the moon, even if he believes

1:35:11

it, that would be a big

1:35:14

risk with zero reward because there's no way to

1:35:16

prove, as you've said, there's no way to prove

1:35:18

that we didn't go to the moon. And

1:35:21

to say that we didn't go to the moon is

1:35:23

a kook take. Like that's, what the fuck is wrong

1:35:25

with you? You could say stupid things

1:35:27

like that when you're a comedian who's a podcast host. But

1:35:30

if you have contracts with

1:35:32

NASA and you run SpaceX and you are

1:35:34

legitimately making some of the greatest breakthroughs in

1:35:36

space travel that human beings have ever known,

1:35:39

like what they're doing with those Falcons, when

1:35:41

they have them land, fucking insane. Insane, come

1:35:43

back and land. We've never been able to

1:35:45

do that before. And it's all because of

1:35:47

Elon. I mean, if

1:35:50

he really is going to get people to Mars,

1:35:54

something is going to be addressed

1:35:56

eventually as to, you

1:35:59

know, if, if. If they do it and they pull it off

1:36:01

and it's easy and comfortable, okay, we

1:36:03

probably did it in 1969. If

1:36:06

they go to the moon and there's no problem going through

1:36:08

the Van Allen radiation belts with no particular

1:36:11

insulation other than what the spaceship had,

1:36:13

maybe. Yeah, they probably did

1:36:15

it. Well, I will say, I

1:36:17

don't even, the moon landing hoax idea

1:36:19

is, it's barely even a kook take

1:36:21

anymore. I think it is, but you're

1:36:24

probably in the majority with your take on

1:36:27

it. The

1:36:29

last time I talked about this publicly,

1:36:31

I got absolutely ripped

1:36:33

to shreds. Of course. I mean,

1:36:35

it felt like 99% against, and it's going to

1:36:38

happen again in response to this

1:36:40

conversation. 99% against you? Yeah.

1:36:42

Against your take. So most people think that we

1:36:44

didn't go to the moon. It

1:36:47

seems- Maybe that's your followers, bro. I

1:36:49

think if you get the overall internet, it would go

1:36:51

the other way. The overall internet,

1:36:54

most people would think you're a kook for

1:36:56

even entertaining the idea that we never went to the

1:36:58

moon. Maybe, but it seems

1:37:00

like it's shifting drastically. A lot of that is

1:37:02

people just have lost all faith in our institutions,

1:37:04

which I understand. Yes. So people

1:37:06

are, I mean, that was kind of the point of your bit,

1:37:09

that people are, once you see that this is a lie, this

1:37:11

is a lie. Yeah, yeah,

1:37:13

exactly. That is happening, and I'm totally sympathetic to that

1:37:15

part of it. But

1:37:19

I just think that the moon landing is, there's

1:37:21

a lot of good evidence for

1:37:23

it. And also, this

1:37:26

is an emotional argument. Yeah, it's an American thing. It's

1:37:29

one of our greatest achievements as Americans. Sure. You

1:37:31

got to pry that from my cold, dead hands. I

1:37:34

mean, you got to

1:37:36

really show me something to make me

1:37:38

willing to give that up. I would tell you that

1:37:41

one of our greatest achievements is faking the

1:37:43

moon landing. It

1:37:45

could be. I think it's an amazing

1:37:47

achievement. I think it's an amazing achievement.

1:37:50

It's akin to turning Kamala Harris into

1:37:52

the most compelling presidential candidate since Barack

1:37:54

Obama. There's things that they

1:37:56

can do with propaganda and spin that are

1:37:58

truly amazing. To make her become

1:38:00

this like celebrated character when just a few

1:38:02

months ago everybody was upset that she was

1:38:04

on the ticket and oh my god if

1:38:06

Joe Biden dies and she becomes president people

1:38:08

are freaking out now all of a sudden

1:38:10

everybody's like yes she should be president. That's

1:38:12

also wearing off though I mean that's... You

1:38:14

think so? I don't think so. She

1:38:17

doesn't have, she had, they were able to

1:38:19

make her into a sensation a political sensation

1:38:22

for about

1:38:24

a month I don't

1:38:26

think she has that anymore I don't think

1:38:28

people are. Because you can hype

1:38:30

somebody up and you can turn them into the

1:38:33

next political savior through really good branding. They

1:38:35

did that with Obama. But you

1:38:38

got to have something. They at least

1:38:40

have to have charisma. I mean, Obama had charisma. So

1:38:43

you at least have to have that with a... If you

1:38:45

have a politician who has charisma then the media can come

1:38:47

in they can do the rest and they can turn you

1:38:49

into... Well she certainly has charisma when she has planned speeches

1:38:51

and she gets to read off a teleprompter and maybe that

1:38:54

thing in her ear. What do you think about that? I

1:38:56

think that's legit. It could be.

1:38:58

You see the company has responded? To

1:39:02

the... Yeah. What did they say? They said

1:39:04

they definitely didn't deny it, and they

1:39:06

said it looks very close to like what

1:39:08

our device is and I go to their

1:39:11

website, it might be on their website, they

1:39:13

might have. Somebody sent me

1:39:15

something and I just

1:39:18

looked at it briefly. I'm like, oh, this will probably come

1:39:20

up today. I want to see it in real

1:39:22

time because whatever the website is

1:39:24

of the company that makes that thing

1:39:26

they've apparently addressed it on the website

1:39:32

but is that illegal? I

1:39:34

don't know if it's illegal but it's incredibly

1:39:37

unethical. Unethical for sure but also

1:39:39

if they pulled that off with

1:39:41

earrings, like, fucking amazing. And

1:39:45

it would explain, because she stayed

1:39:47

on script really well, amazingly well, amazingly

1:39:49

well. Does

1:39:53

it say anything about the presidential debates? The

1:39:57

company's definitely responded. Maybe it wasn't their website, maybe

1:39:59

it was social... media. What is the name of the

1:40:01

company? Okay,

1:40:04

Google Nova audio earrings

1:40:07

response to presidential debate.

1:40:13

Nova audio earrings response to

1:40:15

presidential debates.

1:40:20

It might have been a troll. That's

1:40:22

why I wanted to see it in real time to find out what

1:40:24

the fuck it is. But see if there's a website where

1:40:27

they responded because I think they did

1:40:30

respond. I

1:40:32

could find it. I know I saved it. Company

1:40:37

says Kamala's earrings strikingly

1:40:39

similar to its Bluetooth device. Okay,

1:40:42

there it is. Strikingly similar

1:40:45

to its Bluetooth device

1:40:47

offers to make ones

1:40:50

for Trump. Imagine if

1:40:52

Trump starts wearing earrings. First

1:40:55

of all, that would never work because you can't tell him what to

1:40:57

do. Yeah, he would never. He's free balling. We

1:41:01

do not know whether Mrs. Harris wore

1:41:03

one of our products. The resemblance is

1:41:06

striking and while our product is not

1:41:08

specifically developed for the use at presidential

1:41:10

debates, it is nonetheless suited

1:41:12

for it. Okay,

1:41:15

there you go. To ensure a

1:41:17

level playing field for both candidates, we are currently

1:41:19

developing a male version and will soon be able

1:41:21

to offer it to the Trump campaign. The choice

1:41:24

of color has been challenging though as orange

1:41:26

does not go well with a

1:41:28

lot of colors. That

1:41:34

company's funny. They're funny. That's

1:41:36

a funny company. I

1:41:38

would buy their shit. Bulletproof earrings for

1:41:40

Trump. Yeah, right? Yeah,

1:41:43

I mean, how crazy is the

1:41:45

conspiracy theory that he didn't actually get shot? That

1:41:48

he cut his ear like a pro wrestler.

1:41:51

Yeah, that's another or that it was shrapnel,

1:41:53

which to me would make even less sense

1:41:55

because yeah, it's a very minor injury because

1:41:57

he just got nicked. But

1:41:59

if it's shrapnel, you would expect, you know,

1:42:02

the marks all over his face. Right.

1:42:05

Or not. You know, the thing is

1:42:07

shrapnel could be a small piece of shrapnel. You

1:42:09

know, shrapnel's not uniform, right? So if it hits

1:42:11

a railing, which apparently there is some shot, there's

1:42:14

some video footage of, because I think

1:42:16

there was nine shots fired total. Yeah.

1:42:19

Was that what it was? Something like that.

1:42:21

Something crazy like that. What about that though?

1:42:23

Trump sustained a two-centimeter-wide gunshot wound

1:42:25

to his ear. Okay. The

1:42:28

thing is, ears heal pretty quick. Yeah, and two centimeters.

1:42:30

I mean, you can't see it. Yeah. I

1:42:33

still have holes from when I got my ears pierced. Oh yeah, but

1:42:35

that's different. That's a hole. This

1:42:37

is a scratch. Like, ears, like, I've gotten my

1:42:39

ears fucked up a bunch of times from jujitsu,

1:42:41

and they heal pretty quick. Like,

1:42:44

foreheads heal quick, ears heal quick. Things

1:42:47

around your mouth heal really quick. There's

1:42:49

parts of your body that have a lot of

1:42:51

blood vessels, and they heal pretty quick. He's old,

1:42:53

which is odd for him not to have a

1:42:56

scar, but it's not inconceivable that it could just

1:42:59

scratch the surface, and that would cause a lot of

1:43:01

blood. Like if you get a forehead cut, forehead

1:43:04

cuts are crazy. It just pours blood on your

1:43:06

face. But if you get a cut like on your

1:43:08

knee, it doesn't even drip. You

1:43:11

have to have a real cut on your knee to

1:43:14

be dribbling blood down your shin. The

1:43:16

forehead is filled with blood vessels,

1:43:18

as I think are the tips of the ears.

1:43:21

So I think it would bleed a lot, and

1:43:23

it might be a minor injury that bleeds a

1:43:25

lot, and it could heal in

1:43:27

a few days. Also it wouldn't even ... Even if

1:43:29

he didn't get hit with a bullet, which he did, but if

1:43:32

he didn't, it doesn't make a difference.

1:43:34

He still got shot at. It doesn't

1:43:36

change what happened. People behind him, one

1:43:38

guy died, and other people got grievously

1:43:41

injured, terribly injured to the

1:43:43

point where it's going to affect them for the rest of their life. The

1:43:46

more bizarre thing about the shooting is that

1:43:48

it's only been two months since it happened,

1:43:51

right? Two months. Not

1:43:54

even two months. It's been like a month and

1:43:56

a half, and we've

1:43:58

moved on like it never happened. It never

1:44:00

happened in two weeks. In two weeks they

1:44:03

stopped talking about it. It's had no political

1:44:05

impact whatsoever. Nuts. He

1:44:08

got no polling boost from it. Reagan

1:44:10

got like 12 points briefly. Which just shows you the

1:44:12

polls are full of shit. Probably.

1:44:16

Yeah, full of shit. I mean they are

1:44:18

full of shit, but also it would not

1:44:20

shock me if, because we're so easily distracted,

1:44:23

if people really did just forget and don't care a week later,

1:44:25

two weeks later. Well, as long as it's not

1:44:27

in the news, and it's not in the news, you

1:44:30

don't care about it. Also, there was

1:44:32

no press conference. So that's

1:44:34

kind of crazy. There was no disclosure

1:44:36

of all the information about this young

1:44:39

man's prior history, what led him to

1:44:41

this. They

1:44:43

went to his apartment and it was

1:44:45

professionally scrubbed. There was no silverware in

1:44:47

his place. There's also

1:44:49

this bizarre thing where there's a, you

1:44:52

know how they get ad data where you could track

1:44:54

where phones have been? Just one

1:44:56

phone was going from outside of the

1:44:58

FBI office in Washington, DC to

1:45:01

where this kid is multiple times.

1:45:05

So how did this kid get these explosive devices?

1:45:07

How did he get up on the roof? How

1:45:11

did they not flag him? And you see a

1:45:13

guy walking around with a rangefinder a half

1:45:15

an hour before the event. That guy is

1:45:18

going to jail. Like what are you talking

1:45:20

about? There's two reasons for a rangefinder. You're

1:45:22

trying to shoot something or you're using it

1:45:24

for golf. If you're not playing

1:45:26

golf, then you're trying to shoot something. That's the

1:45:28

only other reason for a rangefinder. Yeah. And

1:45:31

that was like three hours before. Yeah. They knew

1:45:33

about that kid. They were

1:45:35

aware that he was there. He somehow or another

1:45:37

got on the roof with a rifle. The whole

1:45:39

thing sucks. It stinks to high

1:45:41

heaven. And then they cremate him. He's

1:45:44

gone. They get his body. Someone

1:45:46

snatches his body like five or

1:45:48

six days after the event. And 10 days

1:45:51

later, he's cremated. The whole

1:45:53

thing is nuts. Like who is this kid? How

1:45:55

did he do this? Why did some 20-year-old kid

1:45:57

take shots at the president? Why didn't

1:45:59

he have a scope? This is one where

1:46:01

I'm totally open to conspiracy theories only because there's

1:46:03

not even, like, an official narrative

1:46:05

for it. They've basically told

1:46:07

us nothing. So we're left

1:46:09

to fill in the blanks. Not only that, think about how

1:46:12

perfect it would have been for a

1:46:14

plan to assassinate someone if you

1:46:16

do get this lone crazy kid

1:46:19

You give them whatever. I mean, there's been no

1:46:21

toxicology examination of his body that's been released, right?

1:46:23

So who knows what the fuck this kid's on

1:46:25

if you're gonna try to convince someone to go

1:46:28

shoot the former president, you'd probably dope him up

1:46:30

with some crazy shit, right? And then that would

1:46:32

be in his system, and then you would

1:46:34

be able to trace it. Okay, how

1:46:36

do you get this? Let's talk to all

1:46:38

the people that are on his cell phone all the

1:46:40

people that are in his email. Let's investigate and find

1:46:42

out where the fuck you got this stuff that he's

1:46:44

on when he shoots at the president. You don't hear

1:46:46

a peep out of that. So this

1:46:49

guy somehow or another figures out how

1:46:51

to get on the roof take these

1:46:53

shots and Then they

1:46:55

kill him now if he shot and

1:46:57

hit Trump if Trump didn't turn his

1:46:59

head of that pivotal moment What they're

1:47:01

talking about and it's a headshot Trump

1:47:03

is dead the world's in chaos and

1:47:05

this kid's dead Seconds later

1:47:07

and then it's like that crazy

1:47:10

kid who shoots the president and that's

1:47:12

it and then Okay.

1:47:15

Now who's gonna run as a Republican

1:47:17

when the world's in chaos? Yeah,

1:47:20

it would have been a perfect plan if

1:47:22

that kid just pulled it off. Yeah,

1:47:26

I mean and I just I

1:47:28

can't. I think about,

1:47:31

when I do, like, a college speech, we'll have a

1:47:33

little, we'll have a few security guys there. And

1:47:37

there's no way, if someone showed up with a rangefinder,

1:47:41

they would not get in the building. Anyone that

1:47:43

looks vaguely suspicious with any kind of bag is getting

1:47:45

flagged. And I got, like, you know, three or four

1:47:47

guys and I'm just a guy. I'm

1:47:49

like, just a guy giving a speech at a college. That

1:47:52

never would have happened. It could not have happened. They

1:47:54

would have flagged it, especially three hours ahead of time. So

1:47:57

how does that happen with the President of the United States or a former

1:47:59

president? kind

1:52:00

of story from their perspective, because

1:52:02

they look at him as a monster, this

1:52:04

monstrous figure, and the

1:52:07

media deliberately created this. They gave him all

1:52:09

the attention, they sucked all the oxygen out

1:52:11

of the room for every other

1:52:13

candidate, because this is the guy they

1:52:15

wanted, and they thought we were

1:52:17

gonna annihilate him, there's no way he's gonna win a general

1:52:19

election, and of course he won. And

1:52:22

I think that's one of the reasons why ever

1:52:24

since then, they haven't been able, the media, they

1:52:26

just can't, they hate him

1:52:28

with an extra passion, that they have

1:52:30

not had even for other Republican presidents, and I

1:52:33

think a lot of it is, it's

1:52:35

like this, they're projecting, because

1:52:38

they realize that they did this, and

1:52:41

they just can't get over it, I think. Well

1:52:43

there's definitely this over correction. Robert

1:52:46

Epstein talked about that, Robert

1:52:48

Epstein's done all that work on Google, and

1:52:51

these ephemeral instances

1:52:54

of interacting with Google, where it shows

1:52:56

you with search results, and with news

1:52:58

stories that get brought to your feed,

1:53:00

they're temporary, you don't record them, so

1:53:02

he records all these, and

1:53:05

what he has found through his

1:53:07

research is that, especially with people that are

1:53:09

on the fence, like people that are 50-50, you

1:53:12

can swing 50-50 to 90-10, like

1:53:16

people that don't know who they're gonna vote for, you

1:53:18

can make it 90-10, just

1:53:20

through these interactions with Google.

1:53:23

It's really shocking. What do you mean 90-10 in

1:53:25

what way? 90-10, like say if you want

1:53:28

Hillary to win, or you want Trump to win, whatever

1:53:31

candidate you choose, if you manipulate the

1:53:33

search results, if you manipulate just the

1:53:35

fill-in, the suggestions is

1:53:38

Matt Walsh A, and then it just fills it

1:53:40

in. Just through that,

1:53:43

just through the suggestions, they can manipulate

1:53:45

it to a significant difference for people

1:53:47

that are on the fence, that are

1:53:49

independents, or that are undecided. And

1:53:51

he said you can take 50-50 and turn it to 90-10, which

1:53:54

is fucking stunning. Terrifying.

1:53:58

It's terrifying, it's unregulated. And one

1:54:00

of the things that happened was after Trump won in 2016,

1:54:03

there was some sort of a

1:54:06

meeting at Google where they were openly talking about

1:54:08

this. And they were

1:54:10

talking about, we can't let this happen again,

1:54:12

which is such a crazy thing to say,

1:54:14

that we can't let the people decide who

1:54:16

they want to be president again.

1:54:19

If that is what they said, if that is

1:54:21

what they... And let's find out what the actual

1:54:24

quote was. I could see

1:54:26

how someone would say that if they

1:54:28

worked at an insurance company and they're

1:54:31

a die-hard Democrat, blue no matter who,

1:54:33

and they were like, we can't let this

1:54:35

happen again. I could see how

1:54:37

you say that if you're just an individual voter who doesn't

1:54:39

really have an impact. But if you're

1:54:41

someone who can shift undecided voters from 50-50

1:54:44

to 90-10, as Robert Epstein is

1:54:47

alleging, if that's true, that's

1:54:50

a crazy thing to say. Because you're

1:54:52

deciding, you're going to decide the result

1:54:54

of the election. And you don't give

1:54:56

a fuck about debate and free speech

1:54:58

and people being able to decide for

1:55:00

themselves because you think that you're right.

1:55:03

And you think everybody else should agree

1:55:05

with you. You also think that you

1:55:08

are... You've told yourself that you are

1:55:10

the vanguard protecting democracy and

1:55:12

our way of life. Which is crazy. Which is

1:55:14

insane. Of course, the idea that you have to

1:55:16

prevent people from voting for a certain guy in

1:55:18

order to protect democracy is

1:55:21

nuts. It's so nuts.

1:55:25

But that's what they actually believe. And

1:55:27

when you tell yourself that, you convince yourself that, well,

1:55:30

this is for their own good. These

1:55:32

people are silly. They don't know. They're

1:55:34

dumb. They're bigoted. They don't understand what

1:55:36

they're doing. And so for their own good, we have to prevent them.

1:55:38

We have to do whatever we can to prevent this. When I talk

1:55:40

to some of my hardcore lefty friends that

1:55:43

are still left in LA that I was telling

1:55:45

you about before, they say we. They say we

1:55:47

all the time. We have to win this.

1:55:49

They say that all the time. We can win if this happens. They

1:55:52

say that kind of shit. And they talk

1:55:54

about it like they're talking about the Dodgers. They

1:55:56

really do. They talk about it like they're talking

1:55:58

about our team. And they're connected

1:56:00

to all these other people in their

1:56:02

community and they're all on this team

1:56:04

and it's weird, man. It's a weird

1:56:06

little hack that it's just like hypnosis.

1:56:09

It's weird that you can just do

1:56:11

that to people. It's weird that you

1:56:13

can get people to just ideologically be

1:56:15

captured and join this team and lose

1:56:17

all ability to look at things objectively

1:56:19

and then just understand nuance and understand

1:56:21

the influence of propaganda and like, how

1:56:23

many people are spending money on this?

1:56:25

And like, why does all the news

1:56:27

have this one specific narrative and then

1:56:29

Fox News is a totally different, what is going

1:56:32

on here? And nobody does that. And

1:56:34

not only can you get them

1:56:36

to, obviously they hate Trump, but to also demonize

1:56:40

half of

1:56:43

the country's population. I mean, there was just, I

1:56:46

think it was MSNBC yesterday,

1:56:49

one of these pundits was

1:56:51

talking about Trump and said, well, he's despicable,

1:56:53

he's terrible, but his supporters are too,

1:56:56

he said. And they went

1:56:59

on about how terrible his supporters are,

1:57:01

which is you're talking about tens of millions of Americans.

1:57:03

The basket of deplorables. Right. But

1:57:06

we take it for granted now, but 15

1:57:08

years ago, that would kind of be unthinkable. You

1:57:11

wouldn't do that. You say whatever you

1:57:13

want about a politician, you hate them, they're terrible, but it

1:57:15

was just generally understood that you don't use that

1:57:17

language to talk about all the people voting for

1:57:20

them. These are American citizens. I remember when Mitt

1:57:22

Romney and Barack Obama debated, it was

1:57:25

the most cordial, professional,

1:57:28

respectful discussion of the issues and who could

1:57:31

do a better job. Kind of

1:57:33

amazing. Kind of amazing that

1:57:35

that was, what was that, 2012? Kind

1:57:38

of amazing. And I

1:57:40

don't mind, because you can go back farther

1:57:43

in American history and you can find, like

1:57:45

back to the beginning and they're in Congress, like beating each

1:57:47

other over the head with fireplace pokers

1:57:49

and that sort of thing. There's

1:57:53

an argument to be made for that kind of, it certainly

1:57:55

makes C-SPAN a lot more interesting, but

1:57:58

that shows a certain passion for the

1:58:00

issues, I suppose. Sure. But

1:58:03

that's, it's, what we have now is different from

1:58:05

that. It's much more, I mean, there've

1:58:09

been multiple cases recently of

1:58:11

congressional hearings where

1:58:13

they start screaming at each other. Marjorie Taylor

1:58:15

Greene, AOC. Marjorie Taylor Greene and AOC, and

1:58:17

who's the other one? Jasmine Crockett, I think.

1:58:21

And it's like a Waffle House. It's like, you know.

1:58:24

Just no respect for each other, but also

1:58:26

no dignity at all, no class.

1:58:29

No respect for the position. Right. Like

1:58:31

you can't be yelling out, oh baby girl.

1:58:33

Like, you're a congresswoman. This is crazy. And

1:58:36

they're making fun of each other's wigs. Google

1:58:38

versus Trump leaked video reveals executives' negative

1:58:41

reactions to Trump's 2016 election victory. So

1:58:43

what is the actual quote? I didn't

1:58:45

see the actual quote that we were

1:58:47

trying to find. There was stuff

1:58:49

said that they weren't happy. And then

1:58:51

this got, so this was a confidential

1:58:53

video that got released via Breitbart in

1:58:55

2016. So he's saying here, hold

1:58:58

on. He's saying here, most people

1:59:00

are pretty upset and sad because of the election. Imagine

1:59:03

that, most people. Like, how do you know? Myself

1:59:07

as an immigrant and a refugee, I certainly

1:59:09

find this election deeply offensive. And

1:59:11

I know many of you do too. I think

1:59:13

it's a very stressful time and it conflicts with

1:59:16

many of our values. So scroll up, what

1:59:18

else does he say? He

1:59:20

also, he then added too, like he hopes that

1:59:23

there might be, I don't know where to find where it was. This might

1:59:25

have been right here. Yeah, less convinced. He

1:59:27

said, I find many things Trump has done very

1:59:30

offensive. I don't have very

1:59:32

high hopes, but he could do anything. You have

1:59:34

no idea. Maybe he will do something great. Who

1:59:36

knows? Take a little bit of wishful thinking. So

1:59:38

Google pushed back that there wasn't any bias discussed

1:59:40

in the meeting. Well, that's

1:59:42

bias right there, saying that most of us are

1:59:44

upset, right? For over 20 years, everyone

1:59:47

at Google has been able to freely express their opinions

1:59:49

at these meetings. Nothing was said at that meeting or

1:59:51

any other meeting to suggest that

1:59:53

any political bias ever

1:59:55

influences the way we build or

1:59:58

operate our products. So this is

2:00:00

Google's official statement. So what else did

2:00:02

he say though? Because the thing that Robert was

2:00:04

alleging that he was saying we're gonna make sure it doesn't

2:00:06

happen again. I couldn't find that quote. I watched a little

2:00:08

bit of the video with closed caption. Scroll back up so

2:00:10

I could just read all those quotes. Someone

2:00:15

else that they're giving a quote of, not him.

2:00:18

Mm-hmm. I think a lot

2:00:20

of us agree this election is particularly hard. He said there was a

2:00:22

lot of rhetoric. There

2:00:26

was a lot of rhetoric. Yeah. Well,

2:00:28

that's what elections are.

2:00:30

One of the

2:00:34

things that's always interesting to me is that they're

2:00:36

so desperate to stop Trump and

2:00:38

they act

2:00:40

like it's, you know, the future

2:00:42

of the planet hangs in the balance. Meanwhile they

2:00:46

still own everything. I mean they

2:00:49

own all the institutions. Google you

2:00:51

know the federal government. So

2:00:54

the truth is that Trump could get into

2:00:56

office. This happened last time.

2:00:58

He was in office for four years. They act like it's the end of

2:01:00

the world. Then he's out of

2:01:02

office and they basically reverse everything he did

2:01:04

in like two seconds. A

2:01:06

couple executive orders, whatever, and most

2:01:08

of it never took hold anyway because

2:01:10

the bureaucracy is entirely aligned against Trump.

2:01:13

So that's the problem, is

2:01:16

that even when Trump gets in

2:01:18

there, he's handicapped in

2:01:20

his ability to do anything because

2:01:22

the entire federal government he

2:01:25

might be at the top of it but everybody underneath

2:01:27

him despises

2:01:29

him and they're all leftist. So and

2:01:33

they could just reverse it the second that he

2:01:35

leaves and yet they still act like if he's

2:01:37

in there that it's the end of the

2:01:39

world. They still can't. You'd think they'd almost have

2:01:41

an attitude. They're like yeah whatever fine. Let them

2:01:43

have it for four years. Yeah it still won't

2:01:45

matter because we're still gonna be in charge of

2:01:47

everything. Did you see

2:01:49

the conversation where this woman was talking

2:01:51

to someone from

2:01:54

Trump's team, saying she was

2:01:56

worried that he was going

2:01:58

to weaponize the judicial system

2:02:01

once he got into office, that if he

2:02:03

got into office, he would weaponize the judicial

2:02:06

system and go after his enemies. Oh,

2:02:08

wow. And he says, like, what- I can't imagine. What

2:02:11

are you saying? For you to say that and

2:02:13

asking whether or not Trump would do that,

2:02:15

you have to acknowledge the

2:02:18

fact that that's absolutely happening to him

2:02:20

right now. And then she

2:02:23

tries to push back against it, and he does

2:02:25

a brilliant job of explaining how she's incorrect. Hey,

2:02:27

let me- I'm going to find this, Jamie, because

2:02:29

this is a good one. Unless

2:02:32

you could find it. But it's kind of

2:02:34

crazy, like, to see this conversation take place,

2:02:36

because you're just like, what? Like,

2:02:38

how are you even- how are you

2:02:40

so blind to what's absolutely happening that

2:02:42

you could even say that? I'm

2:02:46

going to find it. God damn it. It's

2:02:49

so hard to find things that you save on

2:02:51

these little social media platforms. See

2:02:54

if you can find it, Jamie. Was it from

2:02:56

the debate? No, it was from a conversation

2:02:58

between someone in the Trump administration, someone on

2:03:00

his team, and I

2:03:04

know I could find it if you just give me a second. That

2:03:06

Trump is going to weaponize the- Yeah, that's

2:03:08

her argument, is that Trump is going to

2:03:10

weaponize the political system.

2:03:12

And, you know, it's the guy saying, how

2:03:14

are you even saying that without admitting

2:03:16

that he- that they're doing that

2:03:19

right now to him? God

2:03:21

damn it, I'm not going to find it. I don't know where I

2:03:24

saved it. Sorry.

2:03:27

Every time I type it in, all I see is stuff about Trump saying

2:03:29

he would use- I

2:03:31

know, but that's because of Google, and that's why What's

2:03:34

His Face is correct. God,

2:03:37

I know I saved it. Shit. Of course, the

2:03:39

funny thing is that he definitely will not do

2:03:42

that. No, well, he didn't do that when he

2:03:44

was in office. He could have done that to

2:03:46

Hillary. Right. He said he thought it

2:03:48

would be a bad look. Yeah, yeah, he ran

2:03:50

on lock her up, and he didn't. I mean, that's the thing.

2:03:52

They always- their criticisms of Trump, one of the

2:03:54

reasons they don't really land,

2:03:56

is that they're not

2:03:59

hitting him. in the places where, like they

2:04:01

don't even understand what his weaknesses actually are. They

2:04:04

try to make him out to be some kind of

2:04:06

dictator. That's the opposite. If anything, he has the opposite,

2:04:09

if anything, he has the

2:04:12

opposite flaw, that he's actually

2:04:14

hesitant to wield power even in times when

2:04:16

he should. So,

2:04:20

you know, if anything, that should be the criticism, that

2:04:22

you should use your power more, but he's

2:04:25

probably the least dictatorial, you

2:04:28

know, presidential candidate we've ever had. Yeah,

2:04:32

probably, when you think about what he actually did when

2:04:34

he was in office. But

2:04:36

that's why it gets weird. It's like, because they

2:04:38

can say something and it can

2:04:40

be not true, but yet enough people repeat

2:04:42

it and then it just becomes a narrative

2:04:45

that everyone just, I mean,

2:04:47

like, it's true that he's

2:04:49

a convicted felon now, but is it true that

2:04:51

it makes any sense? No, for you to say

2:04:53

that he's a convicted felon, like, okay, right, but

2:04:55

what did he do? Do you know what he

2:04:57

did? What he did is a misdemeanor. And

2:04:59

it also, it had lapsed the,

2:05:04

whatever the fuck it is, where you... Statute

2:05:07

of limitations, thank you. And

2:05:10

there's 34 counts for bookkeeping. Right,

2:05:13

but they don't know what he did. The people

2:05:16

saying that, they don't know what he did. They don't

2:05:18

care. Right, they don't care. So they just repeat that

2:05:20

thing that he's a convicted felon. I

2:05:23

can't find this goddamn thing, it's driving me nuts. Because

2:05:26

it was really interesting. I hate when I save something and I

2:05:28

don't know where I put it, but I know I do. And

2:05:31

the funny thing is, these are also

2:05:33

people who otherwise would say that the

2:05:35

court system is entirely corrupt, that

2:05:38

just because you're a convicted felon, it really doesn't

2:05:40

mean anything at all, necessarily. Right. But

2:05:44

in this case, they put a lot of stock in it. Well,

2:05:47

it's just, we're in the

2:05:49

weirdest time where people are willing to believe

2:05:51

bullshit. It's not as simple as being able

2:05:53

to recognize bullshit. They recognize it, they have

2:05:56

it right in front of them, and they're

2:05:58

willing to believe it because it's- more

2:06:00

convenient to believe it. Yep. All

2:06:04

right, I can't find it. I'm giving up right

2:06:06

now. Damn

2:06:10

it. Well

2:06:12

they said stuff like that for sure. Yeah, I

2:06:15

know they have, but this one was really interesting

2:06:17

because you see this guy combat it and

2:06:19

the way he combats it is so interesting to

2:06:22

see her squirm because yeah, that's

2:06:24

exactly what they're doing. I mean it's not a

2:06:26

terrible crime that he committed and you're making it

2:06:28

seem as if it's something

2:06:30

that he deserves to be in jail for the

2:06:32

rest of his life for and that's crazy. That's

2:06:34

a crazy thing to say and that might actually

2:06:36

happen if he doesn't become president. If he doesn't

2:06:38

become president, they might actually lock him up for

2:06:41

25 years for that, which is

2:06:43

essentially the rest of his life will be

2:06:45

behind bars at Rikers. Yeah,

2:06:47

I kind of go back. That's the conventional

2:06:49

wisdom, at least the people I talk

2:06:51

to that they say, well, if Trump doesn't win, he's going

2:06:53

to jail and so he's got a lot on the line

2:06:55

here. I

2:06:58

kind of think, are they really like, maybe

2:07:01

it's naive of me to think, but would they do

2:07:04

that or would they rather just, he

2:07:07

loses, Kamala wins

2:07:09

and then they can let, they'd want

2:07:12

Trump to just fade into obscurity and never talk about him

2:07:14

again. I don't know. I

2:07:18

would tend to think that that would be that they'd prefer that, but

2:07:21

probably not. Yeah, you don't know.

2:07:25

It's so hard to tell what people would

2:07:27

or wouldn't do today. It's

2:07:29

just the whole country seems so

2:07:31

committed to their side and

2:07:34

I don't know what a solution to that is and

2:07:37

I don't know how we get past

2:07:39

this and whether

2:07:41

Trump wins or loses. What happens?

2:07:43

What happens next? There's certainly a real

2:07:46

thirst for vengeance. They

2:07:48

want revenge on him. I think that's what

2:07:50

it comes down to. Whether

2:07:53

it helps them politically or not because I think that's the problem

2:07:55

is that if they, if

2:07:57

Kamala wins and then they really go after Trump and try to... put

2:08:00

him in jail and if they actually do put him in jail, I

2:08:02

don't see how it helps them politically. I think

2:08:04

that's just going to radicalize people on the right even more than

2:08:06

they already are. It will radicalize people on the

2:08:08

right, but people on the right. For good reason, by the way.

2:08:11

I'm radicalized. It doesn't

2:08:15

help them politically, but I think they just, he

2:08:17

has to pay the price for defying them for

2:08:20

so many years. But if he

2:08:22

does get in office, then it gets very

2:08:24

interesting. Because then it's like, what

2:08:27

can he do now? How much different is his take on

2:08:29

it now? Because one of the things that he said is

2:08:31

the first time he got in, he didn't know anything

2:08:33

about governing. He's like, I had to find people and I

2:08:35

picked some of the wrong people, but

2:08:37

I know better now and I

2:08:39

could do a better job of it now, which

2:08:41

kind of makes sense. Because if I wanted to

2:08:43

talk to him, one of the things I really

2:08:45

want to ask is what is it like? When

2:08:47

you actually get in there, they don't think you're

2:08:50

going to be in there, and now all of

2:08:52

a sudden you're the actual president. What is the

2:08:54

resistance like? What are the communications like? What can

2:08:56

you say about how you have

2:08:58

these conversations with these people and how

2:09:00

you govern? How you

2:09:02

get things done? Yeah. How much power do

2:09:05

you actually have? What is it actually like? Because we

2:09:07

all have this sort of mystical view

2:09:09

of what it's like to be the actual president, but

2:09:12

very few people and only one ever that's

2:09:14

not a part of the system has ever

2:09:16

snuck through and attained that

2:09:18

position. It's only him. Yeah, I'd

2:09:20

be interested to hear his answer to that. It

2:09:22

wouldn't surprise me if in a weird way when

2:09:25

you become president, you

2:09:28

feel very powerless once you're sitting

2:09:30

there because you realize that you're

2:09:32

overseeing this just gigantic mammoth

2:09:36

thing that's just

2:09:39

so unwieldy. There's

2:09:42

no way to really control it. Yeah.

2:09:46

And especially in his case, you've got so

2:09:48

many people within his own administration plotting against

2:09:50

him. Is there a

2:09:52

lot of people within his own administration now you think plotting

2:09:54

against him or only then? Do you

2:09:56

think it's just like backstabby politics, that's just

2:09:58

what they do period? I

2:10:00

mean at the time there certainly was, but I

2:10:02

think he also made some bad

2:10:05

choices in personnel. He made a lot of really bad ones.

2:10:07

I mean, bringing guys like John Bolton in, there's

2:10:12

no chance that you're not going to be undermined by somebody like that

2:10:16

Then especially you know in Trump's first

2:10:18

campaign in 2016, drain

2:10:21

the swamp was one of the... it was build the

2:10:23

wall and drain the swamp and lock her up you

2:10:25

know, the two, three big things. And

2:10:29

you don't, we don't hear drain the swamp nearly as much anymore.

2:10:31

In fact, I don't think I've heard it. It's

2:10:33

never said anymore, right? Which is

2:10:36

unfortunate because that is actually the first

2:10:38

thing that needs to happen if he gets in there. And

2:10:40

what I would love to see is that, okay,

2:10:43

he's in there now. He got back in. They tried to

2:10:45

stop him, they tried to kill him, they tried to put him in jail. He still

2:10:47

got in there. He's not

2:10:49

getting reelected. This is it for him. Four

2:10:52

more years. He's out of politics forever after

2:10:54

that. And

2:10:56

I'd love to see him just say, I got

2:10:59

nothing to lose now. They're gonna put me in jail when I'm

2:11:01

out of office. You know, so I got nothing to lose. I

2:11:03

don't care what these people say, and just

2:11:05

ruthlessly push your agenda through no matter how much

2:11:07

they complain about it. I'd love to see him do

2:11:10

that. I think that they're assuming

2:11:12

he will, which is why they're so desperate to stop him. But

2:11:17

I don't know, it didn't happen the first time, you know,

2:11:19

so I hope

2:11:21

it does. What's also gonna

2:11:24

be very interesting to see is what

2:11:27

do they do to try to prevent this

2:11:29

from happening in the future? Because one of

2:11:31

the things that has been discussed is cracking

2:11:33

down on misinformation and that free

2:11:35

speech doesn't include misinformation. Which

2:11:38

is a wild thing to say after what we

2:11:40

just went through with COVID, where what people were

2:11:42

saying was misinformation turned out to be a hundred

2:11:44

percent true and not just about COVID, but about a

2:11:46

bunch of things. The Hunter Biden laptop story. There's quite a

2:11:49

few different things you could point to. Like,

2:11:51

who the fuck gets to decide what's misinformation? Only

2:11:55

the government? You guys, the people that

2:11:57

have lied about basically everything, like

2:11:59

this? This is a crazy thing to say, and to

2:12:02

be running on that and to get people to

2:12:04

support that. The lack

2:12:06

of understanding of what it means to

2:12:08

be able to freely express ideas and

2:12:10

communicate and whistleblowers, whistleblowers from corporations that

2:12:12

are telling you about something they're doing

2:12:14

that's illegal, whistleblowers from government agents that

2:12:16

are telling you they're spying on you

2:12:18

when it's illegal, all that shit. To

2:12:21

have that big filter

2:12:24

through the government is an insane position

2:12:27

and yet that's something that they talk about and this is

2:12:29

something bizarrely that the

2:12:31

left supports. Well

2:12:33

even if, because even if it is mis, most

2:12:35

of the stuff they call misinformation isn't but in

2:12:38

the case when there's something that is mis, it's

2:12:40

just not true, plenty of that goes around

2:12:42

the internet. That's still free speech too. You

2:12:44

have the right to say things that are not, as long

2:12:46

as you're not slandering somebody, you

2:12:48

have the right to make claims about the world that don't

2:12:50

happen to be true. So

2:12:53

the idea that that doesn't qualify as free speech

2:12:55

is of course absurd and then,

2:12:57

but then who, then that also

2:12:59

requires some central authority to be

2:13:01

the arbiter of what is true and what is not.

2:13:03

Exactly. And it's like a

2:13:06

childish view of truth and lies.

2:13:08

It's childish because one of the only ways that people

2:13:10

find out if something is correct or not is let

2:13:13

someone say something that's incorrect and then someone who knows

2:13:15

a lot more comes along and corrects them. Right.

2:13:17

Yeah, that's, that's how it works. You know,

2:13:20

like I had Terrence Howard, you know

2:13:22

Terrence Howard? The actor, brilliant guy, but

2:13:25

wrong about a lot of the things that he thinks he's right about.

2:13:28

I brought him in with Eric Weinstein and

2:13:30

Eric Weinstein, who's a genius,

2:13:32

like a legitimate genius and a

2:13:34

mathematician, explained to him, like, very patiently

2:13:37

and carefully, this is why you're wrong and

2:13:39

this is what you need to know. And

2:13:41

you've got some good ideas, but you're off

2:13:43

on all these different things. I'm an actual

2:13:46

expert and let me help you

2:13:48

out here. And so anybody who saw Terrence Howard

2:13:50

talk on the first podcast had this idea. Like,

2:13:53

oh wow, maybe he's right about all these things. Anybody

2:13:55

who saw the second podcast with

2:13:57

Eric where Eric clearly corrects him.

2:14:00

and actually knows what he's talking about. He's

2:14:02

a brilliant guy. Now

2:14:04

you have a, that's what free speech is supposed

2:14:06

to be about. That's what it's supposed to be

2:14:08

about. An actual expert comes in, corrects everything, and

2:14:10

then you have this look at it, like, okay,

2:14:12

now I see. Now it's been, but it's not

2:14:14

silence Terrence Howard because he doesn't know what the

2:14:16

fuck he's saying. No, it's like let him talk.

2:14:19

Now let someone who really knows what they're talking

2:14:21

about explain to him why he's wrong. That's

2:14:23

the benefit of free speech, and everybody

2:14:25

listens to that, has a better understanding

2:14:27

of all these different really weird, complex

2:14:29

things that they're discussing that maybe otherwise

2:14:31

you would never have illuminated in

2:14:34

that way. You'd never really be able to understand it.

2:14:37

That's why I think the free speech thing is, people

2:14:39

act like it's a complicated, it's

2:14:42

a complicated subject. Where do you draw the line?

2:14:44

What is free speech? What

2:14:47

qualifies and what doesn't? I don't think it is that

2:14:49

complicated really. I think it's just, you

2:14:52

should have the right to express

2:14:54

whatever your opinion happens to be. Everyone

2:14:56

should be able to say their opinion, their

2:14:58

point of view. Wrong

2:15:01

or right, reprehensible or not.

2:15:04

They should be able to say it. Yeah, you

2:15:06

can't defame someone. You can't threaten to

2:15:08

kill somebody, but those aren't really opinions.

2:15:11

That's different. If it's just your opinion about what's happening

2:15:13

in the world, it should

2:15:15

be allowed. And it should be allowed legally, it

2:15:17

should be allowed on every social media platform. I

2:15:20

think it's kind of simple actually to

2:15:23

differentiate between that and the, because yeah, there's certain

2:15:25

kinds of speech that should not be allowed. We

2:15:27

all understand that. Yeah, it's complicated and this childish

2:15:29

idea that you just hand it over to the government to

2:15:31

clean it up, that's not the answer. It

2:15:34

is complicated. There are gonna be people that say a

2:15:36

bunch of things that aren't true. But

2:15:38

the way to combat that is not put the

2:15:40

government in charge of what's true. Especially

2:15:43

when they've been wrong so many times. Or

2:15:45

they just out and out lied so many times. That's a

2:15:47

crazy position for the left to take. The ones who are

2:15:50

supposed to be the party of science and reason, the

2:15:52

ones who are supposed to be the most educated.

2:15:54

It's just a bizarre perspective, just because you don't

2:15:56

want Trump to win. Just you

2:15:58

don't want this to happen again. And they

2:16:00

hate speech too is the other label they use to... You

2:16:07

can't say... Tim Walz said this recently about

2:16:10

free speech. He said, well, of course you can't. Hate

2:16:13

speech and misinformation doesn't count. Well,

2:16:15

what is hate speech? Hate speech is just, you're

2:16:18

expressing that you hate something. People

2:16:21

hate things. It's

2:16:23

legitimate. There are some things we should hate. So

2:16:27

the idea that it's automatically illegitimate

2:16:29

to express a view if it's

2:16:31

communicating hatred is of course ridiculous.

2:16:33

It is ridiculous, but it's also

2:16:35

a really goofy label that you

2:16:38

can slap on basically anything. Like

2:16:40

hate speech can get to the point where if

2:16:42

you call Caitlyn Jenner, Bruce Jenner, that's hate speech,

2:16:45

right? That's dead naming. Dead naming falls

2:16:47

under hate speech. And so

2:16:49

what are you saying? You can't do that?

2:16:51

Like, well, that's fucking ridiculous. I can call

2:16:53

him a cunt, but I can't say his

2:16:55

name is Bruce. That's insanity. Like

2:16:58

what world are we living in where you

2:17:00

can decide what someone can and can't say

2:17:02

by a label? It's such a wide

2:17:04

net you're casting. You

2:17:07

know, hate speech, it's

2:17:09

like completely subjective. Anyone can decide what's

2:17:11

hate speech. Right. And

2:17:15

it implies that all hatred is automatically bad, or at

2:17:17

least it puts the people in power in a position where

2:17:19

they can decide what kind of things you're allowed to

2:17:21

hate and what you're not. Right, and

2:17:23

it makes things all equal too. Something

2:17:26

like very benign versus something truly awful.

2:17:29

It's all under this one stupid umbrella of hate

2:17:31

speech. Yeah, hate

2:17:33

crimes too, the same thing. Where do you think we would

2:17:35

be if Elon hadn't bought Twitter? Different

2:17:39

world, right? Yeah, that's, I

2:17:42

think Elon Musk is,

2:17:46

he is actually preserving free speech. One

2:17:49

of the main people preserving free speech in America right now. And

2:17:53

going into space. So it's always funny to me

2:17:55

when people, when the

2:17:57

left tries to nitpick and needle at them, it's like this is

2:17:59

the best... He is one of the most significant human beings on the

2:18:01

planet right now. And literally one of

2:18:03

the most significant human beings historically ever. Right.

2:18:06

He's like a Nikola Tesla type character that people are

2:18:08

gonna be talking about 100 years from now. And

2:18:11

he'll, you know, SpaceX will launch a rocket and it'll

2:18:13

blow up or something. Stephen

2:18:15

King was making fun of him. Right, yeah, someone

2:18:17

like Stephen King, like, rocket blew up.

2:18:19

It's like, dude, your rockets

2:18:22

don't blow up because you don't build them.

2:18:24

I mean, you know. Not only that, like

2:18:26

he made this tweet about how it damaged

2:18:28

the ionosphere when it

2:18:30

blew up. But do you know that that

2:18:32

like heals up in like 40 minutes? Yeah.

2:18:35

He didn't even bother looking into that. Like

2:18:37

every time they punch a rocket through that

2:18:39

shit, it damages it. But it heals. It's

2:18:42

like you punch a hole through a cloud. And

2:18:45

a lot of times when they say that the rocket malfunctioned or something, it

2:18:47

is actually doing exactly what it was supposed to do, this is a

2:18:49

test run or whatever. But it's-

2:18:51

Yeah, they have to test tolerances and parameters. I

2:18:54

mean, they have a lot of them blow up. That's

2:18:56

what you have to do until you get one that doesn't blow up.

2:18:59

Yeah. And we

2:19:01

just need people in the world. This is very

2:19:03

much the, it's like the Teddy Roosevelt man in

2:19:05

the arena, you know, speech.

2:19:09

You need people in the arena who are actually trying

2:19:11

to do stuff, do important things.

2:19:14

You need people like that. And of

2:19:17

course, social media gives a platform for people who are

2:19:19

like not doing anything at all to

2:19:21

just sit and snicker at the few people in

2:19:23

the world who are trying to achieve something. Yeah,

2:19:26

but that's okay. That's okay too. That's their free

2:19:28

speech. You know, let, you know, if that's

2:19:30

what Stephen King wants to do today, let

2:19:32

them go. Who cares? You

2:19:34

know, it's interesting to watch. It's all of

2:19:36

it is interesting to watch. You

2:19:39

know, there's a lot of people out

2:19:41

there that are fools and they serve

2:19:44

as education to

2:19:46

others. You see the folly in

2:19:48

their actions and behaviors and how stupid they look

2:19:50

and how ridiculous this whole thing is. And

2:19:53

it's there for you. You learn from those

2:19:55

people. You

2:19:57

have a better understanding of human behavior. You have

2:19:59

a better understanding that people are capable

2:20:01

of being really interesting,

2:20:04

intelligent people, but also being buffoons at

2:20:06

the same time. And

2:20:09

that we're all subject to all

2:20:11

these various influences, and especially

2:20:13

through the use of social media, which just,

2:20:15

like I said before, it's an anxiety creating

2:20:17

machine. And there's so many

2:20:19

of these people that are attached to it that

2:20:21

are so deeply rooted in these online conversations and

2:20:24

so disconnected from the natural world. And

2:20:27

it's odd. It's odd to watch, but they're

2:20:29

there for you. They're there for an education,

2:20:31

an understanding, a greater understanding of

2:20:33

the weird nuances of human thinking.

2:20:37

Because that's genuinely what this whole thing is

2:20:39

all about. All the ideologies and all the

2:20:42

left and the right and the immigrants are great and

2:20:44

immigrants are terrible and they're eating ducks. All

2:20:47

of it is just human thinking, trying to

2:20:49

figure out what's the correct and incorrect way

2:20:51

that we all cohabitate and what's the best

2:20:54

way for all of us to sort of

2:20:56

get along. Yeah. I mean, that's the

2:20:58

catch-22 with social media because it could be, if

2:21:00

you use it exactly the right way, it does

2:21:03

give you access to all

2:21:05

these human beings and the way that they're thinking about things, which

2:21:09

could be quite enlightening. But

2:21:12

most people don't use it the right way. And also... You

2:21:15

have to use it the right way. And you can. This

2:21:17

is also why, in my opinion, none of my kids

2:21:19

have smartphones or social media. They're going to

2:21:21

get bullied. Well, they're

2:21:23

not on social media. They're going to get bullied

2:21:25

by... How old are your kids? Oldest

2:21:27

is 11. That's young

2:21:29

enough. They shouldn't have social media. Yeah, I agree with you

2:21:31

there. But as they get into the high school

2:21:33

ages, I think it's

2:21:35

a new world. We're navigating it. They should learn how

2:21:38

to navigate it too. I think

2:21:40

it is very addictive, but also there's

2:21:42

people that know how to walk away

2:21:44

from it and know how to self-regulate. And I think

2:21:46

that's a valuable skill that I think everyone's going to

2:21:48

have to learn. Yeah, I think what

2:21:50

you... I mean, we haven't quite decided when

2:21:53

we're going to introduce this stuff to the kids. But once

2:21:55

again, at a certain point, yeah, I don't want them to

2:21:57

be 18. It's their first time ever holding

2:21:59

a cell phone. Because then you're just, then they

2:22:01

have no idea, we haven't given them the tools

2:22:03

to understand how to use this stuff, like the

2:22:05

emotional and intellectual tools.

2:22:09

So you gotta introduce it at some point, but most

2:22:13

kids today, I don't know what the latest figure is, but

2:22:16

millions of kids today have smartphones by the time they're like

2:22:18

eight or nine years old. A lot

2:22:20

of my kids' friends, when they come

2:22:22

over, they're eight, nine, 10 years old and they've got phones, I

2:22:25

just think it's like, it

2:22:28

can only harm them. You understand, as a parent, you're

2:22:30

giving them something, at this age, they cannot use it

2:22:32

appropriately or correctly, they don't have the tools for it,

2:22:35

they're not old enough. It cannot

2:22:37

help them in their life, it can only harm them,

2:22:39

it can only do damage to them. I

2:22:41

think we're gonna look at it 20, 30 years from now, the

2:22:46

same way we look at people smoking. I

2:22:48

think we're gonna think, what we're doing, what we're

2:22:50

doing giving kids those goddamn phones, what did we

2:22:53

do? Like, we don't

2:22:55

even know what the kids of today who are on

2:22:57

the internet, who are subject to the same sort

2:22:59

of horrific images that you and I are talking about earlier. What

2:23:02

is that doing to people long term? I

2:23:04

never got exposed to anything like that when I was seven. How

2:23:07

many kids are getting exposed to murder videos

2:23:09

when they're 10 years old? Probably

2:23:11

quite a few. Pornography. Yeah, oh, that's the

2:23:14

craziest one, right? Because that was a hard

2:23:16

thing to get. It was

2:23:18

difficult. When I was a boy, we'd find

2:23:20

magazines in the woods. You knew a

2:23:22

guy who had a VHS tape, oh my God,

2:23:24

it was crazy, no one can find it. Now,

2:23:27

kids have it on their phones and it's instantaneous,

2:23:29

you have 5G on your phone. You can go

2:23:31

to any porn site any time you want. Yeah,

2:23:34

I mean, the average age of first exposure to pornography

2:23:36

now is like, I

2:23:38

think it's around 10 years old. I mean, it depends on,

2:23:40

I guess, what study you look at, but it's young. Yeah.

2:23:43

And it's not just, because, yeah, people sometimes will dismiss

2:23:45

the harms of it, but because they'll say, they'll

2:23:48

say, oh yeah, I found a Playboy under my

2:23:50

dad's mattress or whatever. Not the same. It's not

2:23:53

at all the same. I mean, the kind of

2:23:55

thing you're being exposed to, how often you're being

2:23:57

exposed to it, how ubiquitous it is now, how

2:23:59

readily available it is. It's not at all the

2:24:01

same. You know, we had the guys on from that

2:24:03

Chimp Crazy show, you know, that new

2:24:05

show on HBO where the people have pet chimps?

2:24:08

No. Crazy. It's the

2:24:10

same guys who did Tiger King. Oh,

2:24:12

and it's amazing. It's on Max, what

2:24:14

used to be HBO. And

2:24:16

one of the things they said is the

2:24:18

chimps get addicted to pornography. Really? Yeah, they

2:24:21

get addicted to pornography, and they watch it

2:24:23

all the time. Like, these certain chimps that

2:24:25

get older, they give them iPads, they give

2:24:27

them phones, and they show them, you know,

2:24:29

they get on the internet, and if someone

2:24:32

shows them pornography, they get addicted to pornography.

2:24:34

That's crazy. That's crazy. And they start

2:24:37

sexualizing human beings. Well,

2:24:39

that kind of goes to show there's something primal about, even just

2:24:41

the, well, obviously pornography, but

2:24:44

the phone, even

2:24:46

like my kids, my two-year-old twins,

2:24:50

they don't have phones, obviously, but there's

2:24:52

just something about the phone itself. Even if it's off,

2:24:55

they just like the cool object. Yeah. Yeah. And with

2:24:57

our kids, you know, they don't have tablets

2:24:59

and stuff, but if we go on a long car

2:25:01

trip, it's the one time we make an exception. If

2:25:04

we're going on like a 20-hour car trip, just

2:25:06

so we don't, just for our own sanity, we'll let them

2:25:08

have tablets in the car, just games and books and stuff.

2:25:10

But then when we get wherever we're going,

2:25:13

we take the tablets away. You don't get them anymore. But

2:25:16

there's, there's like a detox period of like two

2:25:18

or three days where they're jonesing for it.

2:25:20

They're constantly, they're constantly asking for it,

2:25:22

and once you get past that, they're fine again.

2:25:24

But there's a real, it's like there's something, it

2:25:27

creates this compulsion, and

2:25:29

kids take to it really quickly, and

2:25:32

it just becomes, it's like, it becomes

2:25:34

another limb for them, part of

2:25:37

them somehow, really quickly. Yeah,

2:25:39

it's weird. And the addictions

2:25:42

to phones, which we all have,

2:25:44

then the addictions of social media,

2:25:47

which a lot of people have, and

2:25:49

then you get these weird insulated

2:25:51

groups that live in echo chambers. And

2:25:53

that's, I think, like one of the

2:25:55

things you highlight the most about this

2:25:57

show, this Am I Racist? film that you

2:26:00

made is like the struggle

2:26:02

sessions where these people, like the

2:26:05

first scene where you, before

2:26:08

they know who you are, where you're

2:26:10

going and it's sitting there and talking to

2:26:12

these people about these things. Like who are

2:26:15

you? Like where do you live? How

2:26:17

do you think like this? Like what is

2:26:19

going on in your life that you've been

2:26:21

exposed to this version of the world that

2:26:24

seems so ridiculous to

2:26:26

someone who's not in that bubble? So ridiculous

2:26:28

that it seems fake. It seems like you're

2:26:30

doing like a Borat thing. Yeah. And

2:26:33

it's, yeah, we've gotten that with the movie. People ask, is

2:36:36

that all real, or did we stage it? It's all

2:26:38

totally real. We didn't script any of it. And

2:26:40

that in particular is a, yeah, it's like a

2:26:42

support group for white people who

2:26:44

are struggling with their white grief because

2:26:47

they, they have privilege and they're grieving

2:26:49

their whiteness and their privilege. And

2:26:52

there's this woman, Breeshia Wade, I think

2:26:54

is the name. The

2:26:57

black woman, she'll do these

2:26:59

sessions with white people where she'll kind of like talk them

2:27:01

through their whiteness, and people

2:27:03

pay money to go and

2:27:05

sit around and, and talk to her.

2:27:07

And that was

2:27:09

another one that was like an hour and a half, two hours in

2:27:12

the room in real time. When

2:27:14

did they start figuring out who you were? At

2:27:19

some point, midway through, they started, well,

2:27:21

they started looking at me strange because I

2:27:23

was intentionally making it really awkward just

2:27:26

cause it was funny. But

2:27:29

then I, as you can see in the movie, I got,

2:27:31

I get, I get emotional cause I'm on my own journey

2:27:33

of self discovery. And I had to leave cause there's one,

2:27:35

one rule that all these people have, we ran into this

2:27:38

multiple times is if you're

2:27:40

white, you're not allowed to cry in front of black

2:27:42

people cause that's white tears and

2:27:44

you can't shed white tears around black people cause

2:27:46

white tears are manipulative. So

2:27:48

in this place that she had a cry room. She said, if

2:27:51

you get emotional, you have to cry. Go to, we have a

2:27:53

room, get away from us and go cry over there.

2:27:57

So I, at one point I left to the cry room cause I was,

2:27:59

I was getting emotional. It's

2:28:01

a very emotional experience to confront my whiteness. And

2:28:06

I guess while I was gone, they

2:28:08

started talking to each other, like, who is this guy?

2:28:11

They looked it up and they were googling. And

2:28:14

then I came back. The whole thing had changed. The

2:28:16

tone had changed. They kicked me

2:28:18

out. It's a great scene. They call the cops. Well, when

2:28:20

the guy is saying, he's trying to hold your hand, trying

2:28:22

to grab you, and you're like, I do not consent to be

2:28:24

touched. He's like, I'm not going to touch

2:28:26

you. I'm just going to answer your questions. Come on, come on,

2:28:28

answer your questions. He's going to answer your questions. Like, what kind

2:28:30

of answers are you going to give me, buddy? That

2:28:33

guy, too, I can say. One

2:28:36

of the people in that group

2:28:38

was a professional cuddler, actually. It's

2:28:41

not in the movie, but we just know that about

2:28:43

them. They get paid to cuddle with people? Yeah,

2:28:45

one of the people in the group. Jeez. A

2:28:48

cuddlist is what they call it. So these are...

2:28:51

They're fringe people. We thought about...

2:28:54

Somehow we could put up a lower third on

2:28:57

the screen to... But

2:28:59

it seems fake. It's like people don't...

2:29:01

Really, professional cuddler, come on. But

2:29:04

this is too crazy. These people exist out there. This is

2:29:06

a world that they live in. They

2:29:08

go to events like this, and they're very... They

2:29:12

have a lot of guilt for the fact that they're

2:29:14

white. There

2:29:17

were people crying in the circle. I mean, they

2:29:19

were getting really emotional talking about it. There's

2:29:22

the part where they... She says,

2:29:26

think about being white. What

2:29:28

emotions come up when you think about being white? And

2:29:31

then everyone goes around, and they're like, oh,

2:29:33

I have revulsion. I just feel... I

2:29:36

cringe. I feel cringe. Really? This

2:29:38

is you. You're talking about yourself.

2:29:42

It's just... It's sick. It's a

2:29:44

sickness. Yeah. Well, it

2:29:47

was very funny at the end, too. We

2:29:49

tried to get people... Spoiler alert. Tried to

2:29:51

get people to self-flagellate. Yeah.

2:29:58

And just a few people were like, that's

2:30:00

it. I'm out. Like slowly you lost like

2:30:02

a bunch of people over the course of

2:30:04

it. I didn't think, I mean we had

2:30:06

that plan as our last exercise when we

2:30:09

bring the whips out and we

2:30:11

debated like is anyone really gonna take a whip? I didn't

2:30:13

think anybody would. I

2:30:16

thought that this would be, because we needed an end for the

2:30:18

scene and so I thought I'd bring the whips out and everybody

2:30:20

would leave. And

2:30:23

so then that would be the end and then that would be,

2:30:25

that's our, because it's a narrative, we're trying to tell a story. So

2:30:27

that would be the thing that shows me that

2:30:30

I've gone too far. But then they start taking the

2:30:32

whips. And I'm like, I

2:30:34

don't think I can actually have you beat yourself right

2:30:36

now, like liability, I don't know if I can do that. So

2:30:40

I was not expecting that. We lost a lot of people and

2:30:42

you asked who's the most racist person in the room? Yeah.

2:30:46

There was really the, right

2:30:49

before that, I mean, spoilers I guess,

2:30:51

but when

2:30:54

I'm berating my racist uncle. And

2:30:59

well, I don't know, you gotta watch it.

2:31:01

You gotta watch it. Yeah, that was for

2:31:03

me, the most shocking, making

2:31:05

the movie, the most shocking thing to

2:31:07

me that happened, that really took

2:31:10

me aback, was in that moment and the way

2:31:12

they responded to it, which

2:31:14

I was not expecting. And it's

2:31:16

kind of dark. Yeah, they got aggressive with them.

2:31:18

Yeah. Yeah. It's

2:31:21

a great movie, man. And it's

2:31:23

just like What Is a

2:31:25

Woman?, in sort of the same vein of just,

2:31:28

it almost feels like satire, but

2:31:30

you realize it's not. It's just ridiculous.

2:31:33

But you do a great job. And you do a really

2:31:36

good job of staying calm and

2:31:38

deadpanning, because I don't

2:31:40

have that skill. I

2:31:43

would not be able to hold it together. I would have

2:31:45

to start laughing. At some point in time, I would crack.

2:31:48

It's just, I wouldn't be able to not enjoy it

2:31:50

in the moment to the point where

2:31:53

I would go. That's the thing, you don't enjoy

2:31:55

it in the moment because it's actually, it's really

2:31:57

unpleasant in the moment you're in this environment with

2:31:59

these. insane people. Yeah.

2:32:01

It's exhausting listening

2:32:04

to this. How did you develop that

2:32:06

skill to do that though? Because

2:32:10

that's a skill. I

2:32:12

don't know if I... I

2:32:16

can't say I did anything to develop it. It's more just...

2:32:20

I just know what... We're making

2:32:22

a movie, so I'm aware of that the whole time. Obviously

2:32:25

if the cameras weren't rolling, I wouldn't be

2:32:27

reacting the same way. So

2:32:30

I'm just kind of keeping that in the back of my mind. This

2:32:33

is what we need for this scene. But

2:32:35

the main thing is we want to... With both movies. The

2:32:38

whole point is to create

2:32:41

an environment where the

2:32:43

other person feels comfortable saying what they actually

2:32:45

think and what they really believe and

2:32:48

doing what they would really do. And

2:32:50

that means not reacting. If you

2:32:52

laugh at them, they clam up. If

2:32:55

you argue with them, if you

2:32:57

show any real skepticism, they

2:33:00

clam up. They're not going to tell you what they really

2:33:02

believe. And then it's a boring movie because all you're getting

2:33:05

are the talking points. And that's especially the

2:33:07

case we found in this... That

2:33:09

was the case with What Is a Woman?, when we were talking to the trans activists.

2:33:12

But in this, when you're talking

2:33:14

to the race hustlers, they've been

2:33:16

doing it for a lot longer. The race hustle's

2:33:18

been around a lot longer than the trans hustle. And

2:33:21

they're pretty good at what they do. And they're

2:33:25

usually pretty sensitive to detecting

2:33:29

when someone's being skeptical. And if they get

2:33:31

that, then they kind

2:33:33

of go into a different mode. And they go into this kind

2:33:35

of HR, DEI mode where

2:33:37

everything's very sanitized, very surface

2:33:40

level. They're not going

2:33:42

to tell you this stuff about how all white

2:33:44

people are inherently racist. They're not going to get

2:33:46

into the really brutal, terrible racist stuff. So

2:33:50

we just thought, making this movie: how can we just

2:33:53

create an environment where they'll really be

2:33:56

themselves? The kookiest

2:33:58

version. And all that

2:34:00

required was just kind of being a blank slate and

2:34:03

asking questions. With this one, it required more

2:34:08

affirmatively

2:34:08

agreeing with them and

2:34:10

demonstrating that I'm fully on board

2:34:12

with this. There's a feel that

2:34:14

you have when you go into

2:34:16

it. If I was

2:34:18

there and I didn't know you, I'd be like,

2:34:21

I think this guy is fucking around. There's just

2:34:23

an edge, just a touch of it, just a

2:34:25

touch of it that makes it even funnier. Because

2:34:27

you're hanging in there and you're being deadpan.

2:34:30

But there's some moments where one of

2:34:32

my favorite moments was you asked Robin

2:34:35

DiAngelo what mansplaining was. And

2:34:38

then when she gave you a definition,

2:34:40

you mansplained her, you corrected her. And

2:34:44

she didn't even pick up with what you just did. Yeah.

2:34:48

I was proud of that. It's very

2:34:51

subtle. It's very subtle. I was stretching

2:34:53

when I was watching that, just laughing

2:34:55

really loud. Oh

2:34:58

my god. But the thing is, she did kind of.

2:35:01

And I think you can see it on camera. In the room, I

2:35:03

could tell. She

2:35:05

was kind of like, what did you just do? She

2:35:07

was trying to figure out in her head. I think she was trying to

2:35:10

sort through it. Is

2:35:12

this? But I think

2:35:14

she just couldn't. The

2:35:19

possibility that she was

2:35:21

in the room with someone who doesn't already agree

2:35:23

with her about everything, it's

2:35:25

unthinkable to her. She couldn't fathom it.

2:35:30

That's probably the first time in like 20 years

2:35:32

that she'd been in a room with someone who doesn't agree with her

2:35:35

on everything. Has she responded to the movie at all? No,

2:35:39

she took down her Twitter page. So

2:35:42

most of the people in the movie have

2:35:45

taken down their Twitter pages, deleted them. So

2:35:48

they're kind of, they're

2:35:51

going into a bubble somewhere. I mean, the truth

2:35:53

is there's not a lot they can say because

2:35:56

listen, if we deceptively edited it, if

2:35:59

we pulled any. like that, they'd happily come out and

2:36:01

say that. But they know that we didn't. Everything that's

2:36:03

in there is what they said. We didn't change anything.

2:36:07

It's all real, and they know that. So what

2:36:10

can they say? And especially in Robin DiAngelo's case,

2:36:14

she goes in a direction. She's

2:36:18

willing to do some things that are

2:36:20

quite embarrassing for her. But

2:36:24

we didn't put a gun to her head. We didn't force her.

2:36:27

So what can she say? Well, listen, man,

2:36:30

congratulations. It's really funny. It's great. And I

2:36:32

think it's a great way to expose how

2:36:34

ridiculous some of this shit is. You can

2:36:36

expose it by being angry and yelling and

2:36:38

arguing with people on Twitter. But to do

2:36:41

it the way you did it and just

2:36:43

make it a hilarious hour and a half

2:36:45

movie is really good. So

2:36:47

kudos. Thank you, man. Appreciate it. Congratulations.

2:36:49

All right. Tell everybody where they can

2:36:51

see it. It's on dailywire.com. Actually,

2:36:54

it's in theaters. Oh, it's in theaters. It's in theaters right

2:36:56

now. Oh, nice. Nice. You can

2:36:58

get tickets at amiracist.com. Nice. So

2:37:00

we're trying to get it

2:37:02

out to make it available to whoever wants

2:37:05

to see it. It's very funny, folks. All

2:37:07

right. Thank you, Matt. Appreciate it. Thank you.

2:37:09

Bye, everybody. Bye.
