#2206 - Chamath Palihapitiya

Released Wednesday, 25th September 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

Joe Rogan podcast, check it out! The

0:02

Joe Rogan Experience. Train by day,

0:04

Joe Rogan podcast

0:06

by night, all day! Talkin' with you

0:08

at live is like, that will live in

0:10

infamy. It is the best

0:13

clip because he's like a totally different person.

0:16

Well it's what he really is. But

0:18

he really is. Yeah, it's like the Ellen thing, you know? I

0:21

mean he really did lose his shit there. Oh,

0:24

it looked like weirdly. You know?

0:27

I get the Christian Bale one because he's

0:29

in character, his intense scene. Some

0:32

guy's fucking around in the background, like, God

0:34

damn it, stop fucking around! I get that.

0:37

He's in this frenzy of this intense scene.

0:40

But what is fucking, what is Bill doing? Republican

0:45

talking points on Fox News. That was a

0:47

different part of Current Affair, right? No,

0:50

it was before. It was Current Affair. It's

0:52

when he's doing like, you know, like, a lot of

0:54

things. It was Current Affair, right? It's when he's doing

0:56

like, gossip and stuff. Oh, that's right! He

0:59

was a gossip guy! He was like an

1:01

Entertainment Tonight type guy. Exactly. Inside

1:03

Edition. One of those deals. Inside Edition. Oh,

1:06

is that what it was? That's what it's called.

1:08

Inside Edition. Yeah, and fucking those things. They never

1:10

go away. It's such a weird environment, the left

1:12

and right. There's

1:15

no like, centrist news source

1:17

on television. There's no like,

1:20

this is probably what's going on, news

1:22

source. Yeah. It's always one or the

1:25

other, and it's like you're living in a bipolar

1:28

person's brain, you know? I

1:30

think like, part of what's

1:32

happened is we used

1:34

to have news, and you

1:36

could make a good living in news,

1:38

and you know, journalists were really

1:40

sort of the top of the social hierarchy in

1:42

some way, shape or form, because they were this

1:45

check and balance. And

1:47

then somewhere along the way, this business model focused

1:50

people on clicks, and nobody

1:52

told the rest of the world that

1:54

the underlying incentives were going to change.

1:57

And so that's where you find yourself, where...

8:00

pushes them to a boundary

8:02

that they didn't know was possible, you're teaching them

8:04

stuff, that's really cool. So

8:06

I understand what the intent is, but

8:09

then the byproduct is there's

8:11

a small group of folks that get shut out,

8:13

and then that person that could be that

8:16

Steve Jobs-like person, that Elon Musk-like

8:18

person, is held

8:21

a little bit back. And I think that that

8:23

hurts all of us. So you've got to find

8:25

a way where we're doing just a

8:27

little bit better. Well, isn't that

8:30

the part of the problem with eliminating

8:32

gifted classes? Right, there's talk, I think

8:34

they're doing that in New York, is

8:37

that where they're doing that? Find out

8:39

if that's the case. There's some where

8:41

there's this hot controversy about eliminating the

8:43

concept of gifted classes. But

8:46

the reality is, there's some people that are

8:48

going to find regular classes, particularly mathematics

8:50

and some of the things, they're going to

8:52

find them a little too easy. They're

8:55

more advanced, they're more advanced students. And those

8:57

students should have some sort of an option

8:59

to excel. And it should

9:01

be inspiring, maybe intimidating, but

9:03

also inspiring to everybody else. That's part

9:05

of the reason why kids go to

9:07

school together. Look how

9:09

hard she works. She works so much harder than

9:12

me. Look how much she's getting ahead. Fuck, I

9:14

got to work harder. And it really does work

9:16

that way. That's how human beings in

9:18

cooperation, that's how they grow

9:21

together. And I think that

9:23

it used to be the case that if you went

9:25

to in high school. This

9:27

episode is brought to you by blinds.com. Do

9:30

you know the right window treatments aren't

9:32

just about privacy? They could

9:34

actually save you some serious cash in your energy

9:36

bills too. But is it really

9:38

worth the hassle? It's a

9:40

lot of waiting for some pushy

9:43

salesperson to come to your house

9:45

with an overpriced quote. Sucks, right?

9:47

Well, say goodbye to all that

9:49

nonsense because blinds.com has revolutionized the

9:51

game. blinds.com lets you do a

9:53

virtual consultation with their award-winning design

9:55

experts whenever you have time. No

9:57

pushy sales rep in your home.

12:00

said he took them. That's okay.

12:02

What's wrong with that idea? Just nothing

12:04

wrong. It sounds optimal. It sounds pretty

12:06

reasonable. It sounds great. It's

12:09

just a matter of resources and

12:11

then also completely revamping how you teach

12:13

kids. This is my

12:15

gripe with this whole ADHD

12:18

thing. I've talked

12:20

to many people who have varying opinions on

12:22

whether or not that's an actual condition or

12:24

whether or not there's a lot of people

12:26

that have a lot of energy and you're

12:28

sitting in a class that's very boring and

12:30

they don't want to pay attention to it.

12:32

Instead, you drug them and you

12:34

give them medication that is essentially

12:36

speed and lets them hyper focus on

12:38

things. Now all of a sudden, little

12:41

Timmy's locked on. It was really

12:43

just the medication that he needed. I

12:45

think for a lot of those kids,

12:47

if they found something that was really

12:50

interesting to them, maybe they're really bored

12:52

with this, but they're really excited by

12:54

biology. Maybe there's something that

12:56

resonates with their particular personality and what

12:59

excites them. They could find a pathway.

13:01

Instead, we have this very

13:05

rigid system that wants to

13:08

get children accustomed to the idea

13:10

of sitting still for an

13:12

hour at a time over and over

13:14

and over again throughout the day being

13:16

subjected to people who aren't necessarily that

13:18

motivated or getting paid that well. Well,

13:21

we're going to probably talk about AI today, but

13:23

let's just touch on this just in this one

13:26

second. We

13:29

are going to create computers

13:33

that are able

13:35

to do a lot of the rote thinking for

13:37

us. What

13:39

that means is, I think, the

13:43

way that humans differentiate ourselves is

13:45

that we're going to have to

13:47

have judgment and taste. Those are

13:49

very defining psychological characteristics, in my

13:51

opinion. What that

13:53

means is if you go back to

13:55

how school is taught, what you

13:57

said is very much what the

14:00

world is going to look like in 30 years. In

14:02

30 years where you have a PhD

14:05

assistant that's in your pocket that can

14:07

literally do all of the memorization,

14:09

spell checking, grammar, all of the

14:12

fact recall for you, teaching

14:16

that today is probably

14:18

not going to be as important as interpreting

14:20

it. How do you teach kids to learn

14:22

to think, not to memorize

14:24

and regurgitate? So we have

14:27

to flip, I think, this education system. We have

14:29

to try to figure out a different

14:31

way to solve this

14:33

problem because you can't set

14:36

children in this generation of our kids

14:40

to go and have to compete with a computer.

14:45

That's crazy. It's crazy. That's crazy. That's how you

14:47

make a Drake song in three minutes. The computer

14:49

is going to win. So what

14:51

can't the computer do is, I think, maybe

14:53

a reasonable question. And I think the computer,

14:56

in a lot of cases, can't

14:59

express judgment. It'll learn, but

15:01

today it's not going to be able to, the same

15:03

way that humans can. It's

15:05

going to have different tastes, right?

15:07

So the way that we interpret things,

15:09

the same way that you motivate people, like

15:11

all the psychology, all these things that are

15:14

sort of like these softer skills that allowed

15:16

humans to cooperate and work together, that

15:18

stuff becomes more important when you have a

15:21

fleet of robots. And

15:23

so if you go all the way back to school, today

15:27

the school

15:29

system is unfortunately in a

15:32

pretty tough loop. Look, teachers,

15:36

I think, are going to become the

15:40

top three or four important people in

15:43

society. And the

15:45

reason is because they are going to

15:48

be tasked with teaching your kids and my

15:50

kids how to think, not to memorize. Don't

15:52

tell me what happened in the war of

15:54

1812. You can just use

15:58

a search engine or use

16:00

ChatGPT and find out the answer. But

16:03

why did it happen? What were the motivations?

16:05

If it happens again, what would

16:07

you do differently or the same? And

16:10

those kinds of reasoning and judgment things, I

16:12

think, we're still far ahead of those computers.

16:14

So the teachers have to teach that, which means you have

16:17

to pay them more, you have to put them in a

16:19

position to do that job better. And then

16:21

back to what you said, you know,

16:23

in my, I've lived this example of

16:25

ADHD in my family. One

16:27

of the kids was diagnosed with it. And

16:31

unfortunately, what happens is the system a little bit

16:33

closes in on you. So on the one side,

16:35

they give you a lot of

16:37

benefits, I guess. I put it in quotes

16:39

because you get these emails that say

16:41

if they want extra time, if they want this,

16:44

if they want, you know, they'll give

16:46

you a computer, for example, to take notes so that

16:48

you don't hand write. So

16:50

those feel like aids to help you. Right.

16:54

But then on the other side, you know, one

16:56

person was very adamant like,

16:58

hey, you want to medicate. And

17:02

my ex-wife and I were just like, under

17:04

no circumstances are we medicating our child. That

17:07

was a personal decision that we made with

17:09

the information that we had knowing that specific

17:11

kid. All kids are different, so

17:13

I don't want to generalize. And

17:16

then the crazy thing, Joe, what we did was we took the

17:18

iPad out of the kid's hand. And

17:21

we said, you know, we had these

17:23

very strict device rules, and

17:26

then COVID turned everything upside down. And

17:29

you're just surviving. You're sheltering and

17:31

playing. Right. Five kids running around.

17:34

They're not really being, you know, taught by

17:37

the schools. The schools won't convene the kids.

17:41

And so what do you do? You just hand them the device. Everything

17:44

was through the device. The

17:46

little class they got through the device, the way

17:48

that they would talk to their friends through the

17:50

device. So it reintroduced itself in a way that

17:54

we couldn't control. And

17:56

then we saw this slippage. And

17:58

then what we did was we just... just drew a bright red line

18:00

and we said, we're taking it out of your hands. No

18:03

more video games, no

18:05

more iPad, we're gonna dose

18:07

it in very small doses. And he had

18:09

an entire turnaround. But then

18:12

here's what happened. I

18:14

took my eye off the ball a little bit this summer,

18:16

because it was like he had a great

18:18

year, he reset his self confidence

18:20

was coming back. I was like, man, this is

18:22

amazing. And then I do the

18:24

thing that, you know, a lot of people would do, oh here,

18:26

you can have an hour. Ah

18:28

yeah, it's fine, you know, talk to your friends, you know. And

18:31

then it started again. And then again, now we just

18:33

have to reset. So at least

18:35

in our example, what we have found, and

18:38

I'm not, it may not apply to everybody, but for us,

18:42

him not being bathed

18:44

in this thing, had

18:48

a huge effect. Playing basketball outside,

18:51

you know, roughhousing with his brothers,

18:54

you know, having to talk to his friends, having

18:56

to talk to us, watching movies,

18:59

you know, or we would just sit around, because by the

19:01

way, what I

19:03

noticed was like, my kids had a hard time

19:06

watching movies or

19:09

listening to songs on Spotify for the

19:11

full duration. They'd get to

19:13

the hook and they'd be like, forward, next. And

19:16

they'd be like, you know, they'd watch like eight minutes next. And I

19:18

was like, what are you guys doing? Like,

19:20

this is like enjoying the fullness. They

19:24

couldn't even sit there for three and a half minutes. So

19:27

what at least, you know,

19:29

my son was learning was, right,

19:31

to just chill a

19:34

little bit, be there, be able to watch the show.

19:36

And these shows move at a glacial

19:38

pace relative to what they're used to

19:41

if they're playing a video game. Or TikTok.

19:43

Or TikTok. Yeah. Yeah, because

19:46

TikTok, they're like this, boom, boom, boom, boom. And

19:49

it's helped. It's not a cure. But

19:53

it just goes back to what you're saying, which is like, if

19:56

you give parents options, I

20:01

heard this crazy stat, I don't know if this is true. If

20:04

you take your devices away from a kid, the

20:07

kid will feel isolated from

20:09

their other students. The

20:11

critical mass, I don't know if this is true or not,

20:13

but it's what I was told, so I'll go with it,

20:16

was that if you get a third of the

20:18

parents, so like in

20:20

a class of 20, if you get a third of

20:22

the parents to agree as well, no

20:24

devices, the kid feels

20:26

zero social isolation because

20:29

it becomes normative. It's normal.

20:32

You got a flip phone and you're texting

20:34

like this to your parents or you're calling.

20:39

I don't know, it may be worth trying. There was

20:41

a crazy thing, I don't know if you can find

20:43

this, but there was a crazy thing, Eton College, which

20:46

is like the most elite,

20:49

if you will, private school in

20:51

the UK. It's kind of

20:53

where all the prime ministers of the United

20:56

Kingdom have matriculated through Eton College, so it's

20:58

like high school, fancy high school. They

21:02

sent a memo to the parents for

21:04

the incoming class and

21:06

the headmaster said, when

21:09

you get on campus with your child, we're

21:11

going to give you like what is basically

21:13

a Nokia flip phone. You

21:15

are going to take the SIM card out of

21:18

this kid's iPhone or Android, you're going to stick

21:20

it in this thing and this is

21:22

how they're going to communicate with you and communicate

21:24

amongst each other while they're on campus at

21:26

Eton. Wow. Mandatory.

21:28

Mandatory. I

21:31

thought this was incredible. I

21:33

don't know what the impact is, but

21:37

that takes a lot of courage and I

21:39

thought that's amazing. Well, it's

21:42

great because then if they're

21:44

communicating, they're only communicating. They're

21:46

not sharing things or Snapchatting

21:48

each other back and forth

21:50

and the addictive qualities of

21:52

these phones, which is if

21:54

you think about the course of human evolution

21:57

and you think of how we adapted to...

22:00

agriculture and civilization and we essentially

22:02

became softer and less muscular and

22:05

less aggressive like that

22:07

took a long time. A long time.

22:09

That was a long time. This thing

22:11

is hitting us so quickly and

22:14

one of the bizarre

22:16

things is it creates

22:18

a disconnection even

22:20

though you're being connected to

22:23

people consistently and constantly through

22:25

social media there's a disconnection

22:27

between human beings and normal

22:30

behavior and learning through interaction with

22:32

each other, social cues, all the

22:34

different things that we rely on

22:36

to learn how to be a

22:38

friend and to learn how to

22:40

be better at talking to each

22:43

other. I have a rule with

22:45

my oldest who's 15. He'll

22:48

call me, he'll call me or

22:52

even when I call him. It's

23:00

like this like it's like a

23:02

grunt greeting. Right, not a talk

23:04

anymore. And

23:06

I'm like hello. And

23:12

so I went through this thing where like I would just hang

23:14

up and I'm like you

23:16

know beep hang up and then he would call

23:18

me back. And

23:25

then finally I said I just

23:27

I just want you to have these building

23:29

blocks they may sound really stupid to you

23:31

right now but looking

23:33

people in the eye being

23:36

able to have a normal conversation

23:39

and be patient in that conversation is

23:43

going to be really valuable for you. People

23:45

will really be connected to you.

23:47

You may not feel that and you may think this

23:49

is like lame and stupid what I'm

23:51

telling you but I was like just try to

23:53

just try to do it. And then

23:56

what's so funny is like I

23:59

would tell this story about like, you know, our

24:01

kids go to like a, you know,

24:03

very well-meaning private school, right?

24:06

And I

24:08

almost think like sometimes like, again,

24:11

we're not teaching necessarily kids to think for

24:13

themselves. We're asking them to memorize a bunch

24:15

of things. And one

24:17

of the things that I worry that we've taught

24:20

our kids to memorize are like the

24:22

fake greetings and salutations. So on

24:25

the one hand, you have what's really visceral, which is,

24:27

oh. And then on

24:29

the other hand, you know, sometimes you'll see these

24:31

kids and they'll get introduced to somebody, hello, how

24:33

are you? It's great

24:35

to meet you. And I'm like, man,

24:37

this is the most, this is the

24:39

fakest thing I've ever seen. So you're at these two

24:41

ends of the spectrum. And

24:44

I would make fun of my kids sometimes because

24:46

like, you know, they would say thank you,

24:48

but they would say like, thank you, like the queen. They'd

24:50

be like, thank you. And I'm like,

24:53

what are you doing? Who taught you that? You

24:55

were taught at school to say thank you like that?

24:58

You could just say thank you. Right. Thanks.

25:02

I appreciate that. Just look somebody in the eye.

25:04

Thank you. But what concerns me is

25:06

as this tech gets more and more invasive

25:08

in terms of how human beings, particularly children

25:10

interface with it. And as it gets, I

25:12

mean, we're really, we would just be guessing

25:14

as to what comes out of AI and

25:17

to what, what kind of world we're even looking at in 20

25:19

years. It

25:21

seems like it's having a profound effect

25:24

on the behavior of human beings, particularly

25:26

young human beings and their development. How

25:28

old are you? I'm 48. I'm

25:31

57. So when I grew

25:33

up, there was zero of this.

25:35

And I got this slow trickle

25:38

through adulthood from when I

25:40

was a child, the VHS tapes and answering

25:42

machines, to the big tech. Yeah, you had

25:44

the rotary phone. Yes. Yeah,

25:46

exactly. So we went through the whole

25:48

cycle of it, which is really interesting. So

25:51

you get, you get to see this

25:53

profound change in people and what it's

25:55

doing to kids. And

25:57

you got to wonder, like, what is that

25:59

doing to the species? And is that going

26:02

to be normal? Is it going to be

26:04

normal to be emotionally

26:06

disconnected and very bizarre in

26:08

our person to

26:12

person interface? I think

26:14

that when technologies get going, you

26:17

have this little burst. It's

26:20

like these Cambrian moments. You get

26:22

these little bursts which are overwhelmingly

26:24

positive. I don't know what your

26:26

reaction was, but my reaction when I first

26:28

saw the iPhone, I was blown

26:31

away. And I

26:33

think the first four or five years was

26:35

entirely positive because

26:38

it was just so novel. You took this big

26:40

computer and we effectively shrunk

26:42

it to this little computer, made

26:44

it half to a third of the cost. And

26:47

lo and behold, supply demand, just

26:49

the number of computers tripled and

26:51

quadrupled and quintupled and so many

26:53

more people were able

26:55

to be a part of that economic

26:58

cycle, all positive. Then

27:00

you get a little dip. And the little dip

27:02

is when I think we lose

27:04

a little bit of the ambition of that first moment

27:08

and we get caught up in the economics of

27:10

the current moment. What I mean by that

27:12

is the last five or

27:14

10 years, I think why

27:16

you feel this viscerally is we

27:19

haven't had a big leap forward from the

27:21

iPhone of really 2014, 15. And

27:24

I'm not picking on the iPhone. I'm just like

27:26

a mobile device. So what have you

27:28

had over the last 10 years? You've had an

27:30

enormous amount of money get created

27:33

by an enormous number of apps. And

27:37

the problem is that they are in a

27:39

cul-de-sac and so they'll just iterate in this

27:41

one way that they understand because

27:43

the money is really good, quite honestly.

27:47

And the incentives of the capital markets will tell you

27:49

to just keep doing that. But

27:52

then I think what happens is something shocks

27:55

us out of it and then we get the

27:57

second wave. So if you go all the way back

27:59

to look at the ... like the PC. The

28:02

first moment of the PC in the 70s and

28:04

the early 80s was incredible. You

28:06

had these people that were

28:08

able to take it and do all kinds

28:10

of really interesting things. It was pure. Then

28:14

you had sort of like the 90s and the early 2000s

28:16

and what was it?

28:18

It was duopolistic at

28:20

best, Microsoft and Intel.

28:22

What they were able to do was

28:24

extract a huge tax by putting all

28:26

of these things on folks' desks and

28:28

it was still mostly positive but

28:31

it was somewhat limited because

28:33

most of the spoils went to these two

28:35

companies and all the other companies basically

28:38

got a little bit run over. Then it

28:40

took the DOJ to step in in

28:42

2000 and try to course correct

28:44

that on behalf of everybody

28:46

basically. Then

28:49

what happened was the internet just exploded

28:52

and the internet blew the doors wide open and all

28:54

of a sudden if you had a PC, you didn't

28:57

have these gatekeepers. It actually didn't

28:59

even matter whether you were running on Intel

29:01

anymore. You just needed a browser. You didn't

29:03

need Microsoft Windows and

29:05

you didn't need Intel and

29:08

then just the internet just explodes.

29:12

We have a positive moment followed by

29:15

call it 10 or 15 years of basically

29:18

economic extraction and

29:21

then we have value. I think

29:23

today it's like we've invented something really

29:25

powerful. We've

29:28

had 10 or 15 years that were largely

29:30

economic and

29:32

again I think this is like the problem I'm

29:34

going to sound like every other nerd

29:38

from Central Casting from Silicon Valley telling you this

29:40

but I do think that there's a version

29:42

of this AI thing which blows the doors

29:45

wide open again. I

29:47

think we owe it to ourselves to figure out how

29:49

to make that more likely than

29:51

not likely. Well it seems it's

29:53

inevitable right? AI's emergence and it's

29:56

where it goes from here on is inevitable. It's

29:58

going to happen and we should probably try

30:01

to steer it at least in a way

30:03

that benefits everybody. And I agree with you.

30:05

There is a world I could see where

30:08

AI changes everything. And

30:10

one of the things that makes me most hopeful is

30:13

a much better form

30:15

of translation so that we'll

30:17

be able to understand each other better. Totally. It's

30:20

a giant part of the problem in the world

30:22

that's the Tower of Babel. So we

30:24

really can't communicate with each other very well.

30:26

So we really don't know what the problems

30:29

are in these particular areas or how people

30:31

feel about us, how we feel about them.

30:33

Can't empathize. Yeah, we can't. And

30:35

it's very easy to not empathize with

30:37

someone where you don't even know what

30:39

their letters are. Have you been

30:41

in a situation where you have a translator with a thing in

30:43

your ear? No. Empathy

30:46

zero. Because the problem is the person there

30:48

is giving it to you in a certain

30:51

tone because it's first person. Oh, I've had

30:53

that with many fighter interviews. I've had translators.

30:55

Yeah, but when you're here, it's very hard

30:57

to feel empathy for this person because it's

30:59

this person that you're focused on because you're

31:02

trying to catch it. Right. So

31:04

you hear the words. I think somewhat

31:06

of the meaning is a

31:08

little bit lost. Then you go back to this person

31:10

and you say something and they're in the same problem

31:12

that you are. So I agree with that.

31:14

The translation thing is cool. I think that there are ... There's

31:17

going to be some negative areas.

31:23

I think that there's going to be a lot of pressure

31:26

on certain jobs and we got to figure that out. So

31:28

it's not all roses. But some areas,

31:30

if you imagine them, I'll

31:32

give you a couple if you want, are just

31:35

bananas, I think. Okay. Okay.

31:39

So I'll go from the most likely to

31:41

the craziest. Okay. Okay. So

31:44

most likely today, do you know if

31:46

you know somebody that's had breast cancer, if

31:49

they go into a hospital,

31:51

a random hospital in America, and

31:54

the doctor says, we need to do a lumpectomy,

31:56

meaning we need to take some mass out of

31:58

your breast to take the cancer out. What

32:01

do you think the error rate today is

32:03

across all hospitals in America? It's

32:06

about 30%. Wow. And

32:09

in regional hospitals, so places that are

32:12

poor, right, or places that are

32:14

in far flung parts of the United States, it can

32:16

be upwards of 40%. This

32:19

is not the doctor's fault, okay? The

32:22

problem is that you're

32:24

forcing him or her to

32:27

look with their eyes into

32:29

tissue and try to

32:31

figure out, well, where is the border where the

32:33

cancer stops? So for

32:36

every 10 surgeries, what that means is, a

32:39

week later, so imagine this, you get a breast

32:41

cancer surgery, they take it

32:44

out, they send it to the pathologist. The pathologist

32:46

takes between seven and 11 days. So

32:49

you're kind of waiting. Seven

32:52

of the calls come back, you're

32:54

clean margins, you're great. Now go to the

32:56

next step. Three of

32:58

the calls, I'm sorry, there's still cancer

33:00

inside your body. Three. So

33:04

these women now go back for the next

33:06

surgery. But the problem is one

33:08

of those women will get another call that says, I'm

33:10

sorry, there's still cancer. And

33:13

so what is that?

33:16

That's a computer vision problem, right?

33:20

That's not necessarily a problem

33:22

that can't be solved literally

33:24

today. We have

33:27

models, we have tissue samples of

33:30

women of all ages, of all

33:32

races, right? So you have all of

33:35

the different boundary conditions you'd need to

33:37

basically get to a 0% error rate.

33:41

And what's amazing is that is now working its

33:43

way through the FDA. So call

33:45

it within the next two years, there'll

33:47

be an AI assistant that

33:50

sits inside of an operating room. The

33:53

surgeon will take out what they think is appropriate,

33:55

they'll put it into this machine, and it'll literally,

33:57

I'm going to simplify, but it'll flash red or

33:59

green. I mean, you got

34:01

all the cancer out. You

34:04

need to take out a little bit more just right over here. And

34:07

now you get it out and now all of a sudden instead

34:09

of a 30% error rate, you have a 0% error

34:12

rate. That's amazing.
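
The arithmetic behind that error-rate claim is simple enough to sketch. A minimal back-of-envelope, using only the ~30% re-excision figure and the "one of the three fails again" step described above; these are illustrative numbers from the conversation, not clinical data:

```python
# Back-of-envelope only: the 30% figure and the "one of the three
# fails again" step come from the conversation above; nothing here
# is real clinical data.
initial_surgeries = 10
reexcision_rate = 0.30                                   # ~30% of lumpectomies leave cancer behind

second_surgeries = initial_surgeries * reexcision_rate   # 3 women get the bad call
third_surgeries = 1                                      # one of those three fails again

total = initial_surgeries + second_surgeries + third_surgeries
print(total)   # 14.0 -> "fourteen surgeries for every ten surgeries"
```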

34:15

That's today because you have this computer that's

34:18

able to help you. And

34:20

all we need is the will and

34:23

the data that says, okay, we want to

34:25

do this, just show me that it works

34:28

and show me what the alternative would be if we didn't

34:30

do it. And the alternative turns out

34:32

to be pretty brutal. Fourteen

34:35

surgeries for every ten surgeries. I mean, that's not

34:37

what the most advanced nation in the world should

34:39

be doing. Right. Okay, so if

34:41

you do it for breast cancer, the reason why breast

34:44

cancer is where folks are focusing is

34:46

because it gets so much attention and

34:49

it's like prime time. But

34:51

it's not just breast cancer, lung

34:54

cancer, pancreatic cancer,

34:57

stomach cancer, colon cancer.

35:01

If you look at any kind of tumor, so

35:04

if you're at the stage where you're like,

35:06

we need to get this thing, this foreign

35:08

growing thing out of our body, we

35:12

should all have the ability to just do

35:14

that with 0% error and it will be

35:16

possible in the next couple of years because

35:19

of AI. So that's kind of like

35:21

a, that's cool and it's coming. I

35:25

think between years two and years five, you're

35:29

going to see this crazy explosion in

35:32

materials. And this is going

35:34

to sound maybe dumb, but I think

35:36

it's one of the coolest things. If you look

35:38

at the periodic table of elements, what's

35:42

amazing is like we keep adding. So

35:46

there's like 118 elements. We

35:48

actually just theoretically forecasted there's

35:50

going to be 119, so we created a little box.

35:54

It's going to be, it's like, it's theoretical, but it's

35:56

going to show up and they forecasted

35:58

that there's going to be 114. Okay?

36:02

So, the periodic table of elements,

36:04

quote-unquote, grows. But

36:06

when you look at the lived world today, we

36:10

live in this very narrow expression of all

36:12

of those things. We use the same few materials over

36:15

and over and over again. But

36:17

if you had to solve a really complicated

36:19

problem, don't

36:21

you think the answer could theoretically be in this?

36:24

Meaning, if you took, I'm going to make it

36:26

up, Selenium

36:29

and then doped it with titanium 1%,

36:32

but if you doped it with boron 14%,

36:35

all of these things are possible. It's like stronger

36:37

than the strongest thing in the world, and wow,

36:39

and it's lighter than anything. So

36:41

now, you can make rockets with it and send it all

36:44

the way up with less energy. It's all possible. So,

36:46

why haven't we figured it out? Because

36:50

the amount of energy and the amount of computers

36:52

we need to solve those problems, which

36:54

are super complicated, haven't

36:56

been available to us. I

37:00

think that is this next phase of AI. So

37:02

what you said, which is we're going to have

37:04

these PhD-level robots and

37:06

agents. In the next two to five years,

37:08

we're going to come up with all kinds of materials. You'll

37:12

have a frying pan that's nonstick, but doesn't have to heat

37:14

up. Oh, whatever you want.
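
To get a rough feel for why this kind of materials search has been bounded by energy and compute, here is a purely illustrative count of one tiny slice of the space: a host element plus two dopants at whole-percent levels, echoing the hypothetical selenium/titanium/boron recipe above. The 118-element count comes from the conversation; the rest is an assumption made up for illustration, not how real materials screening works:

```python
# Purely illustrative sizing of a "host + two dopants" search space.
# 118 elements is mentioned above; the whole-percent doping grid is
# an arbitrary simplification.
elements = 118
levels = 99                                            # 1% .. 99% doping, whole-percent steps

hosts = elements
dopant_pairs = (elements - 1) * (elements - 2) // 2    # unordered pairs of distinct dopants
candidates = hosts * dopant_pairs * levels * levels    # each dopant at some doping level

print(f"{candidates:,} candidate recipes")             # ~7.8 billion, before ratios, processing, etc.
```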

37:16

From the most benign to the

37:18

most incredible stuff, we'll just re-engineer

37:22

what's on earth. That's

37:25

going to be crazy. It's going to

37:27

be incredible. We all benefit from that. The

37:29

kinds of jobs that that creates. We

37:32

don't even know what job class that is to work

37:34

with selenium and boron. I'm making up

37:36

these elements, so please don't. So

37:39

the point is that there's ... So

37:41

that's like in the middle phase. So

37:44

our physical lived world is

37:46

going to totally transform. Imagine a

37:48

building that's made of a

37:50

material that bends. It

37:53

can just go like this and nothing changes to it. Why

37:56

would that be important? Well, if you want to protect yourself

37:58

from the crazy stuff. the

38:00

unpredictability of climate in

38:02

the areas where it's susceptible to that, maybe

38:05

you can construct these things

38:07

much cheaper. Well, earthquakes. Earthquakes.

38:10

You could construct more of them. Imagine in San

38:12

Francisco, you could build buildings that solve

38:15

the housing crisis, but do it in a way that

38:17

was cheaper because the materials are totally different and you

38:19

could just prove that these things are bulletproof. So

38:22

instead of spending a billion dollars to build a building,

38:24

because you got to go hundreds of

38:27

feet into the earth, you just

38:29

go 50 feet and it just

38:31

figures it out. So

38:35

that's possible and I think there will

38:37

be people that use these AI models to go and solve

38:39

those things. And

38:41

then after that, I think you get

38:43

into the world of it's not just robots

38:46

that are in a computer, but

38:48

it's like a physical robot. And

38:51

those physical robots are going to do things

38:53

that today will make

38:55

so much sense in hindsight. So an example, I

38:57

was thinking about this this weekend. Imagine

39:00

if you had a bunch of Optimus robots, like

39:03

Tesla's robot, and they

39:05

were the beat cops. They

39:08

were the highway patrol. Now

39:12

what happens there? Well first, you

39:15

don't put humans in the way. I

39:19

suspect then the reaction of those robots

39:21

could be markedly different. Now those robots

39:23

would be controlled remotely. So

39:25

the people that are remote now

39:28

can be a very different archetype. Instead of

39:30

the physical requirements of

39:32

policing, you now add this other layer,

39:34

which is the psychological elements

39:37

and the judgment. So

39:39

my point is that if you had robots

39:41

that were able to do the dangerous work

39:43

for humans, I

39:46

think it allows humans to do again, judgment,

39:50

those areas of judgment which are very gray and

39:52

fuzzy. It'll take a long

39:54

time for computers to be able to replace us

39:56

to do that. I really do think so. I

39:58

think the biggest thing... that we have

40:02

done as a disservice

40:05

to what is coming is some

40:08

folks have tried to say that AI is the end all

40:10

and be all. And I think the better

40:12

way to think about this is that you

40:15

know how you used to have to get

40:18

your spelling right in an email? And

40:20

now you just don't think about it because Gmail just fixes

40:22

it. It

40:24

up-levels us. You used to

40:27

have to remember the details of like

40:29

some crazy theory, random detail fact. Now

40:31

you can just Google it. So you

40:33

can leave your mind to

40:35

focus on other things, right?

40:38

The creativity to write your

40:40

next set, to think about the next interview, to

40:43

think about your business

40:46

because you're occupying less time with the

40:48

perfunctory stuff. I

40:50

think these models are doing that

40:53

and they're going to get complemented with physical models.

40:56

Meaning physical robots. And

40:59

they're going to do a lot of work for us that

41:01

we have not done. Or

41:04

today that we do very precariously. You

41:07

know like should a robot go in and save you from

41:09

a fire? I think it can probably do a pretty

41:11

good job. They'll have multiple

41:13

sensors. They'll have vision. They'll be able to

41:15

understand exactly what's going on. If

41:17

something is falling, they'll just be able to put their hand up

41:19

and just like stuff. You know what I mean? If

41:22

they encounter any person of any body weight, it's

41:25

no problem. Pick that person up.

41:27

Transport them. Again,

41:29

it allows humans to

41:32

focus on the things that

41:34

we're really, really differentiated at. I

41:38

do think it creates complications, but

41:40

we have to figure those out. So

41:44

that's like a short, medium, long

41:46

term. Well, I see what you're

41:48

saying in the final example as

41:50

the rosy scenario. That's the best

41:52

case option, right? That it gives

41:54

people the freedom to be more

41:56

creative and to pursue different things.

41:58

And I think... there's always going to be

42:01

a market for handmade things.

42:03

People like things that

42:05

like they're like an

42:07

acoustic performance. They like stuff where it's

42:10

like very human and very real. But

42:13

there's a lot of people that just want

42:15

a job. And these

42:17

people maybe just aren't

42:20

inclined towards creativity. And maybe

42:22

they're very simple people who just want a job and

42:25

they just want to work. Those

42:27

are the people that I worry about. I worry about them

42:29

as well. And I think that like I

42:33

didn't live in the agrarian economy nor in

42:35

the industrial revolution. So I don't know how

42:37

we solve this problem. But

42:39

we have seen that problem two times.

42:43

And each time we found a way.

42:47

And this goes back to sort of like news

42:49

and politics and like just working together.

42:52

But in each of those moments we found a

42:54

way to make things substantively

42:56

better for all people. Like

42:59

I saw this crazy stat in 1800. Do

43:02

you know how many people lived in extreme

43:04

poverty? How many? Whoa.

43:07

You know where we are today? Sub 10%. Single

43:10

digits. And it's

43:12

a straight line that goes like this. And that

43:14

was through an agrarian revolution. It was through the

43:16

industrial revolution. So it

43:18

is possible for humans to cooperate to

43:20

solve these problems. I

43:23

don't know what the answer is but I do think you are

43:25

right that it will put a lot of

43:27

pressure on a lot of people. But

43:30

that's why we got to just figure this out. What

43:32

are your thoughts on universal basic income

43:34

as a band aid to sort of

43:37

mitigate that transition? I'm

43:40

pretty sympathetic to that idea. I

43:43

grew up on welfare. So

43:46

what I can tell you is that there are a

43:48

lot of people who are

43:51

trying their best. And

43:53

for whatever set of boundary conditions, can't figure

43:55

it out. I grew up on welfare as

43:57

well. If

44:00

I didn't have that safety net, my

44:06

parents' struggles, I think

44:08

would have gotten even worse than what they were. So

44:12

I'm a believer in that social safety net. I

44:14

think it's really important. It's the best case scenario,

44:16

right? Because your parents worked their way out of

44:18

it, my parents worked their way out of it,

44:21

but some people are just content to

44:23

just get a check. This

44:26

is the issue I think that a lot

44:29

of people have, is that people will become

44:31

entitled and just want to collect a check.

44:33

If it's a substantial portion

44:35

of our country, like if universal

44:37

basic income, if AI

44:39

eliminates, let's just say a crazy

44:42

number, like 70% of the manual

44:44

labor jobs, truck drivers, construction workers,

44:46

all that stuff gets eliminated, that's

44:48

a lot of people without a

44:50

purpose. One of the

44:52

things that a good day's work and

44:54

earning your pay, it makes

44:56

people feel self-sufficient. It makes people feel

44:58

valuable. It gives them a sense of

45:01

purpose. They could look at the thing

45:03

that they did, maybe build a building or something like

45:05

that and drive their kids by, hey, we built that

45:07

building right there. Wow. It's

45:10

a part of their identity. If

45:12

they just get a check, and then what

45:14

do they do? Just play video games all day? That's

45:16

the worst case scenario, is that people

45:19

just get locked into this world of

45:21

computers and online and just

45:23

receive checks and have the

45:25

bare necessities to survive and are content with

45:27

that and then don't contribute at all. The

45:33

jobs that ... Let's put it

45:35

this way. If we were

45:38

sitting here in 1924, whatever, 100 years

45:40

ago, right in the midst of the

45:42

turn of

45:46

the Industrial Revolution, we

45:49

would have seen a lot of folks that

45:51

worked on farms and

45:54

we would have wondered, well, where are those

45:56

jobs going to come from? I

46:01

think that now when you look

46:03

back, it was like not

46:06

obvious, but you could see where the

46:08

new job classes came from. It's like

46:10

all of these industries that were possible

46:12

because we built a factory. And

46:14

a factory turned out to be a substrate,

46:17

and then you built all these different kinds of

46:19

businesses which created different kinds of jobs on top

46:21

of it. I

46:24

would hope that if we do this right, this

46:27

next leap is like that, where

46:29

we are in a period where

46:31

it's hard to know with certainty what

46:35

this job class goes to over here. But

46:38

I think you have a responsibility to go and figure

46:40

it out and talk it

46:42

out and play it out because

46:45

the past would tell you that

46:48

humans, when

46:51

they're unimpeded, have a really

46:53

good ability to invent these things.

46:56

So I don't know, maybe what

46:58

it is is by 2035, there's a

47:01

billion people that have traveled to

47:04

Mars and you're

47:07

building an entire planet from the ground

47:09

up. There'll be

47:11

all sorts of work to do there. What

47:14

kind of people are going to go first there? I

47:17

think that there'll be a lot of people that are frustrated

47:19

with what's happening here. Sure,

47:22

just like the people that got on the

47:24

Pinta, the Santa Maria and made their way

47:26

across the ocean. It all starts with a

47:28

group of people that are just like, I'm

47:30

fed up with this. But

47:33

to want to go to a place that

47:35

doesn't even have an atmosphere that's capable of

47:38

sustaining human life and

47:41

you can only go back every couple

47:43

of years, those people are going

47:45

to be psychos. You're going to have a completely

47:48

psychotic Australia on

47:50

meth. It's

47:53

like the worst case scenario of the cast

47:55

outs of society. Just

47:58

like what you say is it. It's so

48:00

true, but if you think about what

48:03

that decision looked like 400 years

48:06

ago when that first group

48:08

of prisoners were put on a boat and sent to

48:10

Australia, that's probably what it

48:12

felt like. Most people

48:15

on the mainland when they were like,

48:17

Jachao, were probably thinking, man, this is

48:19

insane. So

48:21

it'll always look like that. It'll be easier

48:23

to rationalize it in hindsight, but

48:25

I do think that there will be a lot of people

48:27

that want to go when it's possible to go. And

48:31

look, we're in the studio. We

48:35

could be anywhere. We could be in Salt Lake

48:37

City. We could be in Rome. We

48:40

could be in Perth. You

48:42

don't know. All the same. Especially

48:44

today. So you could be on Mars. Yeah,

48:46

you could. You wouldn't know. Yeah. That

48:49

could be the future. Instantaneous

48:52

communication with people on

48:55

other planets, just like you could talk to people

48:57

in New Zealand today. So that's

48:59

an amazing example

49:02

of an innovation in

49:04

material science that

49:06

we have been experimenting with for years.

49:08

So basically at the

49:10

core of what you just said is

49:12

a semiconductor problem. It's

49:14

a doping problem. Is

49:17

it silicon germanium? Is

49:19

it silicon germanium with something else? And

49:22

the problem, Joe, is to answer what you

49:24

just said is a tractable

49:27

problem that has been bounded

49:30

by energy and computers. And

49:33

we're at a point where we're

49:35

almost at infinite energy. And

49:37

at a point where we're almost at like, what

49:40

I say is very specific, which is

49:42

we're at the point where right

49:44

in the distance is the

49:47

marginal cost of energy is basically zero.

49:49

The marginal cost, meaning to generate the

49:51

next kilowatt, is going to cost like

49:53

sub a penny. Even

49:55

with today's stuff, you don't need nuclear. You don't need any of

49:58

that stuff. We're just on this trend line right now. And

50:02

because of AI, we're at the

50:04

point where to get an answer to a question,

50:06

super complicated, is going to be basically zero,

50:09

the cost of that. When

50:11

you put those two things together, what

50:14

you just said, we will be in

50:16

a position to answer. The world will be able

50:18

to say, oh, Joe, you want instantaneous communication between

50:20

here and Mars? We need to harden

50:23

these communication chips. We're going to build it with

50:25

this stuff, which we simulated on a computer. We

50:28

made it. It's shipping, we're done. Now

50:30

that will still take five to 10 years to do,

50:32

but my point is all

50:34

these things that sound crazy are not.

50:39

They're actually not that crazy. These

50:41

things are achievable technical

50:43

milestones. Everything

50:46

will boil down to a technical question that I think

50:48

we can answer. You want a hoverboard?

50:50

We could probably figure it out. Well

50:53

then also with quantum computing, and one

50:55

of the things about AI that's been

50:57

talked about is this massive

50:59

need for energy. They're

51:01

going, at least it's been proposed,

51:04

to develop nuclear sites specifically to

51:06

power AI, which is

51:08

wild. Yeah. I have

51:10

to be ... You got

51:15

to dance around this? No,

51:18

I'll tell you what I think. Okay,

51:25

well, maybe

51:28

before I give you my opinion, I'll

51:30

tell you the facts. Okay.

51:35

Today, it costs about four

51:37

cents a kilowatt hour. Don't

51:39

forget the units. Just remember the four cents concept.

51:43

Twenty years ago, it costs six or seven cents. If

51:46

you go and get solar panels on your roof, basically

51:49

cost nothing. In fact, you can probably make

51:51

money. It costs you like negative one cent

51:53

because you can sell the energy in many

51:55

parts of America back to the

51:57

grid. If

52:00

you look inside the energy market, the

52:03

cost has been compounding, and you

52:05

would say, well, how does this make sense? If

52:08

the generation cost keeps falling,

52:11

why is my end user cost keep going up? This

52:14

doesn't make any sense. When you look

52:16

inside, we

52:19

have a regulatory

52:24

burden in America that

52:27

says to the utilities, of which there are like less

52:29

than 2,000 in America. We're

52:32

giving you a monopoly, effectively. In

52:35

this area of Austin, you can provide all the

52:37

energy. Now, Texas is different, but I'm just using

52:39

it as an example. But

52:42

in return, I'm going

52:44

to allow you to increase prices, but

52:46

I'm going to demand that you improve the

52:49

infrastructure. Every few

52:51

years, you've got to upgrade the grid. You've got to

52:53

put money into this, money into that. Over

52:56

the next 10 years, we've got to put a trillion dollars, America

52:59

collectively, into improving the

53:02

current grid, which

53:04

I think will not be enough, because

53:06

it is aging, and most

53:09

importantly, it's insecure, meaning

53:12

folks can penetrate that, folks can hack it, folks

53:14

can do all kinds of stuff. Then

53:18

it fails in critical moments. I

53:20

think that in Austin, you had a whole bunch

53:22

of really crazy outages in the last

53:25

couple of years. People died. In

53:28

2024, that's totally unacceptable. I

53:33

think as people

53:35

decide that they want resilience, you're

53:39

going to see 110 million power

53:41

plants, which

53:44

is every homeowner in the United States. Everybody's

53:47

going to generate their own energy. Everybody's

53:50

going to store their energy in a power wall.

53:52

This stuff is going to become, I mean,

53:56

absolutely dirt cheap, and

53:58

it'll just be the way that

54:01

energy is generated. So you have

54:03

this, but this is not the whole

54:05

solution, because you still need the big guys to show up.

54:09

When you look inside of like the big

54:11

guys, so like now you're talking about these

54:13

2,000 utilities that need to spend trillions of

54:15

dollars, they

54:18

can do a lot of stuff right now to make

54:20

enough energy to make

54:22

things work. But when you look

54:24

at nuclear, I would

54:26

just say that there are two different

54:28

kinds of nuclear. There's the old

54:30

and the new. The old stuff, I

54:33

agree with you, it's just money and you can get it

54:35

turned back on. It's

54:37

a specific isotope of uranium, you can deal

54:39

with it, everybody knows in that world how

54:43

to manage that safely. But

54:45

then what you have are like these next generation things, and this

54:48

is where I get a little stuck, and

54:50

I'm not smart enough to know all of it, but

54:53

I'm close enough to be slightly ticked off by

54:56

it. There's a materials and

54:58

a technical problem with these things, and

55:00

what I mean is, back to materials. Some

55:04

of these next-gen reactors need

55:07

a material that will take you

55:09

like 50 years in America,

55:11

in the world, to harvest an ounce.

55:14

The only place where you can really get

55:16

it is the moon in sufficient quantity. Are

55:18

you really gonna, what I mean, that's how

55:20

it's gonna work? You're gonna go to the

55:22

moon, you're gonna go to the moon, you're

55:24

gonna harness this material, then you know

55:27

schlep it all the way

55:29

back to someplace in Illinois to make stuff.

55:31

I find that hard

55:33

to believe. What is the material? I

55:37

can find it, it's in an email that one of my

55:39

folks sent me, but it's like it's a certain

55:43

form of reactor that uses a

55:45

very rare material

55:48

to create the plasma energy that can generate

55:50

all of this stuff, and it's just very

55:52

hard to find on earth. So I kind

55:54

of scratch my head. What's the benefit of

55:56

this particular type of reactor? Enormous energy. So

55:59

like you know, a solar cell gets this

56:01

much energy, a nuclear reactor

56:03

does this, and this other thing does that, and

56:06

it's super clean. My

56:08

point is, these next-gen reactors, I think, have

56:11

some pretty profound technical problems that haven't

56:13

been figured out. I applaud the people

56:17

that are going after it, but

56:19

I think it's important to not oversell

56:22

that because it's super hard,

56:25

and there's still some profound

56:28

technical challenges that haven't been solved

56:30

yet. We just got past what's

56:33

called positive net energy, meaning ...

56:37

I'm making

56:40

up a number. A

56:42

hundred units of energy in, and at least

56:45

you try to get out 100.01, and we're

56:47

kind of

56:50

there. That's

56:52

where we are on these next-gen reactors.
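
For what "positive net energy" means in plain numbers, here is a minimal sketch using the speaker's own made-up 100-in / 100.01-out figures, not measurements from any real reactor; the ratio is what fusion discussions usually call the gain factor Q:

```python
# The 100 / 100.01 figures are the made-up example from the conversation,
# not data from any actual reactor.
energy_in = 100.0      # energy put into the reaction
energy_out = 100.01    # energy recovered

q = energy_out / energy_in        # gain factor: break-even is Q = 1
net = energy_out - energy_in

print(f"Q = {q:.4f}, net gain = {net:.2f} units")   # barely past break-even -> "we're kind of there"
```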

56:54

The old generation of reactors, I'm a

56:56

total believer in, and we should be

56:58

building these things as

57:01

fast as possible so that

57:03

we have an infinite amount of energy.

57:05

By the way, if you have infinite energy, the

57:07

most important thing I think that happens is you

57:09

have a massive peace dividend. The

57:13

odds of the United States going to war when

57:15

we have infinite energy approaches

57:18

zero. But isn't the problem

57:21

with introducing this to other countries, and

57:23

I believe it was India where they

57:26

introduced nuclear power plants, then they realized

57:28

very quickly they could figure out how

57:30

to make nuclear weapons from that. Yes.

57:33

Yes. When the uranium degrades, it can

57:35

be used as weapons-grade uranium. The

57:37

real problem would be if that is

57:40

not a small handful of countries that

57:42

have nuclear weapons, but the entire world,

57:44

it could get very sketchy. I

57:48

think you're touching what

57:51

I think, objectively to me,

57:55

is the single biggest threat facing

57:57

all of us today. I

58:09

escaped a civil war, so I've

58:11

had a lived experience of how

58:15

destructive war could be. The collateral

58:17

damage of war is terrible. Where were you?

58:20

In Sri Lanka. I

58:24

was part of the ethnic majority, Sinhalese

58:26

Buddhist. They

58:30

were fighting Hindu Tamil

58:32

minority. It was

58:34

a 20-year civil war. It flipped the whole

58:37

country upside down from an incredible place with

58:39

99% literacy to just a

58:44

struggling, developing third-world country. We

58:47

moved to Canada. We

58:49

stayed in Canada. My

58:53

parents do whatever they could. They

58:55

got run over by that war. They went from

58:57

a solidly middle-class

58:59

life to

59:01

my father had a ton

59:03

of alcoholism and didn't really work,

59:05

and my mother went from being a nurse to

59:08

being a housekeeper. It

59:10

was dysfunctional. It

59:14

really crippled, I think, their dreams for themselves.

59:17

They breathed that into their kids. Fine.

59:20

But that can't be the solution where hundreds of

59:22

millions or billions of people have to deal with

59:24

that risk. I

59:27

am objectively afraid that

59:30

we have lost the script a little bit. I

59:32

think that folks don't

59:34

really understand how destructive war

59:37

can be, but also

59:39

that there are not

59:42

enough people objectively afraid of this. That's

59:45

what sends my spidey senses up and says,

59:47

hold on a second. When

59:49

everybody is telling you that this

59:51

is off the table and not possible, shouldn't

59:55

you just look at the world around and

59:57

ask, are we sure that that's true? And

1:00:01

I come and I think to myself,

1:00:03

wow, we are at the biggest risk

1:00:05

of my lifetime. And

1:00:08

I think the only thing that is

1:00:10

probably near this is maybe

1:00:12

at some point in the Cold War, I don't know because I

1:00:14

was so young, definitely Bay

1:00:18

of Pigs, but

1:00:20

it required JFK to draw a hard line

1:00:22

in the sand and

1:00:24

say, absolutely not. So

1:00:28

will we be that fortunate this time

1:00:30

around? Are we going to find a

1:00:32

way to eliminate that existential risk? This

1:00:34

is why my

1:00:36

current sort of like vein of

1:00:38

political philosophy is mostly that, which

1:00:41

is like

1:00:43

the Democrats and the Republicans, there's

1:00:45

just so much fighting over

1:00:47

so many small stakes issues in the

1:00:49

sense that some

1:00:52

of these issues matter more or

1:00:55

less in different points, but there is

1:00:57

one issue above all which where if

1:00:59

you get it wrong, nothing matters. And

1:01:01

that is nuclear war. And

1:01:05

you have two and a

1:01:07

half nuclear powers now that

1:01:09

are out and about extending

1:01:12

and projecting their power into the world,

1:01:15

Russia, China, and

1:01:17

Iran. That

1:01:20

wasn't what it was like 10 years ago. That

1:01:23

wasn't what it was like 25 years ago. It wasn't even

1:01:25

what it was like four years ago. And

1:01:28

I just don't think enough people take a step back and say,

1:01:30

hold on a second. If this thing

1:01:33

escalates, all

1:01:35

this stuff that you and I just talked about won't matter.

1:01:39

Whether our kids are on Adderall

1:01:41

or not or the iPad, don't

1:01:43

give them so much Fortnite or

1:01:45

material science or Optimus.

1:01:48

It's all off the table because

1:01:50

we will be destroying

1:01:53

ourselves. And

1:01:55

I just think that that's tragic. We have

1:01:57

an enormous responsibility right now for... We

1:06:00

collectively don't understand that. We sweep

1:06:03

it under the carpet and we talk about

1:06:05

all the other things. And

1:06:07

I understand that some of those things, all

1:06:10

of those things, let's say, matter, but at

1:06:13

some point in time, nothing

1:06:15

matters. Because if you don't get this

1:06:17

right, nothing matters. And

1:06:20

I think we have to find a way of finding

1:06:22

people that draw a bright red line and

1:06:25

say, this is the line I will never cross

1:06:27

under any circumstance. And

1:06:30

I think America needs to do that first because

1:06:32

it's what gives everybody else the ability to

1:06:34

exit stage left and be okay with it. The

1:06:37

other problem that America clearly has

1:06:39

is that there's an enormous portion

1:06:42

of what controls

1:06:46

the government, whether you want to call it

1:06:48

the military industrial complex or military

1:06:51

contractors. There's so much money to

1:06:53

be made in pushing that line,

1:06:55

pushing it to the brink of

1:06:57

destruction but not past, maintaining a

1:06:59

constant state of war but not

1:07:02

an apocalypse. And as

1:07:04

long as there's financial incentives

1:07:06

to keep escalating and you're

1:07:09

still getting money and they're

1:07:11

still signing off on hundreds

1:07:13

of billions of dollars to funnel this

1:07:15

and it's all going through these military

1:07:18

contractors and bringing over weapons and gear.

1:07:21

The windfall is huge. The amount of money

1:07:23

is huge. And they do not want to

1:07:25

shut that off for the sake of humanity,

1:07:28

especially if someone can rationalize. You

1:07:30

get this diffusion of responsibility when there's a whole

1:07:32

bunch of people together and they're all talking about

1:07:34

it. Everyone's kind of on the same page and

1:07:36

you have shareholders that you have to represent. The

1:07:39

whole thing is bananas. So I think you just

1:07:41

said the key thing. This may be super

1:07:44

naive. But

1:07:47

I think part of the most

1:07:49

salvageable feature of the military

1:07:52

industrial complex is that these

1:07:54

are for-profit, largely public companies

1:07:58

that have shareholders. And

1:08:01

I think that if you nudge them

1:08:05

to making things that

1:08:08

are equally economically

1:08:10

valuable or

1:08:12

more, ideally more, they

1:08:14

probably would do that. What

1:08:17

would be an example of that

1:08:19

other than weapons manufacturing? What would

1:08:21

be equally economically viable? So

1:08:23

when you look at the primes,

1:08:27

the five big kind of like

1:08:30

folks that get all of the economic

1:08:34

activity from the Department of

1:08:36

Defense, what they act as is

1:08:39

an organizing principle for a bunch of

1:08:41

subs underneath, effectively. They're like a general contractor

1:08:43

and then they have a bunch of subcontractors.

1:08:47

There's a bunch of stuff that's happening in

1:08:49

these things that

1:08:52

you can reorient if you

1:08:54

had an economy that could support

1:08:56

it. So for example, when

1:08:58

you build a drone, what

1:09:01

you're also building is a subcomponent, a

1:09:03

critical and very valued subcomponent, all

1:09:06

the navigation, all the communications, all of it

1:09:08

has to be encrypted. You can't

1:09:10

hack it. You can't do any of that stuff. There

1:09:13

is a broad set of commercial applications

1:09:15

for that that are equal

1:09:17

to and greater than just the profit

1:09:20

margin of selling the drone, but they don't

1:09:22

really explore those markets. If

1:09:25

for example, we are multiplanetary,

1:09:27

I'll just go back to that example,

1:09:32

I will bet you those

1:09:35

same organizations will make two or

1:09:37

three times as much money by

1:09:40

being able to redirect that same technology

1:09:43

into those systems that you just described. Hey,

1:09:45

I need an entire communications infrastructure that goes

1:09:47

from Earth to the moon

1:09:50

to Mars. We need to be able to

1:09:52

triangulate. We need internet access across all these

1:09:54

endpoints. We need to be real time from

1:09:56

the get-go. There's

1:09:58

just an enormous amount. a

1:22:00

tactical unit and they have Belgian Malinois

1:22:02

and bulletproof vests and machine guns. The

1:22:04

whole thing's crazy and they get in

1:22:07

shootouts with the cartel in National Forest

1:22:09

Land because it's a misdemeanor

1:22:13

to grow pot illegally in a

1:22:15

state where pot is legal. So

1:22:17

California has legal marijuana. You could

1:22:19

go to any store, anywhere, use

1:22:21

credit cards. It's open free market.

1:22:24

If you follow the rules, you can open

1:22:26

up a store. But if you don't follow

1:22:29

the rules, you can sell it illegally and

1:22:31

it's just a misdemeanor. I wanted to learn

1:22:33

about marijuana, the market, but

1:22:35

you can't process the money, I think.

1:22:37

In some states. I know in

1:22:39

Colorado it was a real problem and in Colorado

1:22:41

they had to do everything in cash. Yeah, it's

1:22:44

like breaking bad bricks of cash and all of

1:22:46

these things. Well, they were using mercenaries. They're

1:22:49

essentially using military contractors

1:22:51

to run the money back

1:22:53

and forth to the bank because you had to bring

1:22:56

the bank money in bulk. So

1:22:58

you'd have a million dollars in an armored

1:23:01

car and a bunch of guys tailing the

1:23:03

car in front of the car and they're

1:23:05

driving into the bank and everyone knows there's

1:23:07

a million dollars in that car. So

1:23:10

you have to really be fortified. And

1:23:13

so it was very sketchy for a lot of people. I

1:23:15

don't know what the current condition in Colorado

1:23:18

is now. I don't know if they still

1:23:20

have to do it that way. A couple

1:23:22

of companies, I remember the reason I know

1:23:25

this is a guy came and

1:23:27

pitched me on some business and he

1:23:29

was the software for all that.

1:23:34

I think the company went public and I just

1:23:36

realized it just went sideways because nobody wanted to

1:23:38

touch it because they didn't want to build rails

1:23:41

for that economy, which didn't make

1:23:43

much sense seeing as, to me

1:23:46

at least, just because if the laws say it's legal

1:23:48

then it should all be treated equally. But

1:23:50

then the problem, I think I remember them telling

1:23:52

me, was that federally it's still gray. Yeah,

1:23:55

it's gray and they're trying to diminish that.

1:23:58

The latest step during the Biden administration is

1:24:00

to change it to a Schedule III, and

1:24:02

that's a proposal

1:24:04

that's going through. That would help, but really

1:24:06

it should be just like alcohol. It

1:24:09

should be something that you have to be 21 years old to

1:24:11

buy. You should have to have an ID, and we

1:24:13

should educate people how to use it responsibly and

1:24:15

we should also pay attention to whoever the fuck

1:24:17

is growing it and make sure they're not going

1:24:19

wacky, you know? Like, there's people that are

1:24:21

botanists that are out of their mind potheads that

1:24:24

are just 24/7 hitting bongs, and they're

1:24:27

making stuff that will put you on

1:24:29

Mars without Elon Musk. I remember

1:24:31

the

1:24:33

problem that somebody raised, I

1:24:36

read this in an article, was: you

1:24:38

need to make it more than

1:24:41

what it is, like more legal than it is today,

1:24:43

so that you can get folks to put like

1:24:45

some version of a nutritional label on the thing

1:24:47

and show intensity right because the

1:24:49

intensity is not regulated, right? Well, they

1:24:51

do regulate it in California if you

1:24:53

go to good places in California let's

1:24:55

say this is 39% THC which

1:24:58

is very high this is 37 this

1:25:00

is you know but then there's also

1:25:02

the problem with one thing that marijuana

1:25:04

seems to do to some people that

1:25:07

alcohol doesn't necessarily. Some people have a

1:25:09

propensity for alcoholism and it seems to

1:25:11

be genetic but

1:25:13

there's a thing that happens with marijuana

1:25:16

where people who have a tendency towards

1:25:18

schizophrenia marijuana can push them

1:25:20

over the edge and Alex Berenson wrote a great

1:25:22

book about this called Tell Your Children, and

1:25:26

I've personally witnessed people who've lost

1:25:28

their marbles and I think it's

1:25:31

people that have this propensity because

1:25:33

one of the things that I

1:25:35

think is beneficial about marijuana in

1:25:38

particular, and this is

1:25:40

one of the things that freaks people

1:25:42

out, is the paranoia, right? Well, paranoia,

1:25:44

I feel like what it is

1:25:46

is a hyper awareness and I

1:25:48

think it pushes down all these

1:25:50

boundaries that you've set up all these

1:25:53

walls and all these blinders so

1:25:55

that you see the world for what it really

1:25:57

is, and it freaks a lot of people out, but

1:25:59

what I think it does is it

1:26:02

ultimately makes you more compassionate and kinder

1:26:04

and nicer. And you realize like... In

1:26:06

the moment or afterwards? Afterwards.

1:26:08

Afterwards. I think it's a

1:26:11

tool for recognizing

1:26:13

things that you are

1:26:15

conveniently ignoring. And

1:26:17

you know, my friend Eddie told me about this

1:26:19

once. He was saying, if you're having a bad

1:26:21

time and you smoke marijuana, you're going to have

1:26:23

a worse time because you're already freaking out. You're

1:26:25

already freaking out about something. You know, if you're

1:26:28

going through a horrible breakup and you get high,

1:26:30

you're like, oh, no one loves me.

1:26:32

But if you're having a great time with

1:26:34

your friends, you'll probably just laugh and be

1:26:37

silly, right? Because you're not freaking out about

1:26:39

something. You probably... You're in a good place

1:26:41

mentally, which we should all strive to be

1:26:43

in a good place. I have this weird

1:26:46

psychosomatic guard

1:26:48

that developed. My father was an alcoholic

1:26:51

and I didn't drink at all

1:26:54

in my teens, in my 20s, and mostly in my 30s.

1:26:58

And then in my mid 30s, I started drinking

1:27:00

wine and I love wine and I think I

1:27:02

can handle it and I really enjoy it. I

1:27:04

love it. I do too. But

1:27:07

I cannot drink hard alcohol. The

1:27:09

minute that it touches my lips, I

1:27:11

get severe hiccups. I

1:27:14

mean, like debilitatingly bad hiccups.

1:27:16

Really? Any kind of alcohol.

1:27:18

Do you think it's psychosomatic? I think it's completely

1:27:20

psychosomatic because it makes no logical sense. If

1:27:23

the tequila touches my lip, I just start

1:27:25

hiccuping like crazy. And it's like this weird

1:27:27

protective thing that I think my brain has

1:27:29

developed because my dad used to drink some

1:27:31

stuff that would like make you

1:27:33

blind. Right? Like moonshine.

1:27:35

It was like 150 proof and the guy would just chug it.

1:27:40

I mean, he was... Well, I think there

1:27:42

are whiskey connoisseurs and there are... I

1:27:44

mean, there is like

1:27:46

scotch, like old scotch does have

1:27:48

a fantastic taste. It's got an

1:27:51

interesting sort of an acquired taste.

1:27:54

But there's real wine connoisseurs. Wine is

1:27:56

incredible. Wine is a different animal. The

1:27:58

flavor of wine... like

1:30:01

your dinner and I'm like

1:30:03

what is there a cap you know

1:30:05

and he's like, this time, no cap. And at first I was

1:30:07

like God I must have lost a lot of money. What

1:30:12

did they say? No cap?

1:30:15

For $19,000 why? So then I said fuck

1:30:18

it I'm just gonna try this so I went

1:30:20

to Caesars and it was like four

1:30:22

or five thousand dollars I mean I would never buy this

1:30:24

normally, but I got it cuz it was free.

1:30:28

Joe, okay, all

1:30:30

this buildup in my mind, this is, oh

1:30:32

my god this is gonna be ethereal it's gonna

1:30:34

be ambrosia. It was not ambrosia. Whereas

1:30:38

you can find these other ones that are made

1:30:40

by you know folks that just put their entire

1:30:43

lives into it. You taste

1:30:45

the whole story. I

1:30:47

just think it's incredible. It's a weird

1:30:49

status thing the expensive wine it's just

1:30:51

like Cuban cigars. It's really dumb. Yeah

1:30:53

it's a weird thing. The real skill

1:30:55

is being able to know price

1:30:58

to value, and when

1:31:00

you know it, it's so satisfying because

1:31:02

it's like oh this is just delicious and then

1:31:05

when your friends enjoy it they're like oh my

1:31:07

god this is delicious and I'm like yeah that's

1:31:09

80 bucks. Yeah. How? And

1:31:11

I'm like well it's very hard to find so then

1:31:13

the skill is like it's funny I'll

1:31:15

tell you this is how bad wine has

1:31:18

gotten for me meaning like I

1:31:20

love the story I love the people

1:31:22

I want to support the industry so I

1:31:25

went to register for an

1:31:27

alcohol license at the ABC

1:31:29

in California. Really? Because I was tired

1:31:33

and frustrated with trying to buy retail

1:31:35

because you have to go through folks that have their

1:31:38

own point of view and

1:31:40

I was like well if we just become

1:31:43

you know we as in like me and a

1:31:46

friend of mine and so we set up a thing

1:31:48

we set up a little I filed

1:31:50

the paperwork and

1:31:52

it's called like you know CJ Wine

1:31:55

LLC you know my friend

1:31:57

me and Joshua and

1:31:59

we're able to negotiate directly

1:32:02

with the wineries. And

1:32:04

we're able to get it from wholesalers in Europe or

1:32:07

in South Africa or in Australia.

1:32:10

And it just allows us to buy

1:32:12

a bottle, try it, see if we really like it. Thursday

1:32:15

nights at my house is always poker. We

1:32:18

serve it to our friends, if they like it, then we can

1:32:20

buy a couple cases, I can share with my friends and you

1:32:22

get it at wholesale. It's a great little hack. Is there

1:32:24

a limitation? Is there a certain

1:32:26

specific amount that you have to buy? I look

1:32:29

like a retail store. You

1:32:31

could be like Amazon. And so

1:32:33

a retail store could just buy a few bottles? They

1:32:35

could buy a case. They could buy a few

1:32:38

bottles. That's a little bit harder. So you

1:32:40

have to have a more personal relationship. But

1:32:43

then the really good stuff, you can buy a few cases

1:32:45

and then pass them on to your friends. And I don't

1:32:47

know. It's, I think, wine's incredible. And

1:32:50

with food is incredible. But when I hear people

1:32:52

that are going to open up their own wine

1:32:54

label, I'm like, oh, good Lord. How

1:32:57

much do you know about wine? Like,

1:32:59

oh, I'm going to start a wine business. Like, what?

1:33:02

I went to a couple of these wineries.

1:33:04

And I just asked them, just out of

1:33:06

like, just explain to me how

1:33:09

you got there. And all

1:33:11

I could think of was, man, this is

1:33:13

way too complicated. But these folks, it's like

1:33:15

animal husbandry. They're breeding this

1:33:18

vine with that vine. But then they're going to

1:33:20

take, you know, cleave off this little bit. So

1:33:22

it's like, it's a breeding program over 10

1:33:24

and 20 and 30 years. And

1:33:27

it's like, this is really

1:33:29

complicated. Oh, yeah. They do

1:33:31

weird stuff. Like, they'll splice

1:33:33

avocado trees with, what is

1:33:35

that nut? Pistachios.

1:33:38

So they'll take avocado trees and they

1:33:41

splice them with pistachios to make the

1:33:43

tree more sturdy. Like, you can take

1:33:45

two different species of tree. And

1:33:47

if you cut them sideways and splice them

1:33:49

together, they'll grow. A friend of mine

1:33:52

started a company that's making like potatoes. And

1:33:54

he makes like these ginormous potatoes like

1:33:56

this. It's an incredible thing because like

1:33:59

the yield is the

1:38:00

way my body responds. I can tell you how my body feels.

1:38:04

If I take a picture, I try to

1:38:06

work out, and I take pretty detailed readings: what's

1:38:09

my BMI, what's my muscle mass, what's my

1:38:11

fat percentage? And I always

1:38:13

take those readings right before

1:38:15

I go. And

1:38:18

when I look afterwards, and

1:38:20

I don't do anything when I'm there, I

1:38:22

swim in the sea when I can, when

1:38:25

I'm on vacation or whatever, I walk a

1:38:27

lot, but nothing else. No weights,

1:38:29

no nothing. My

1:38:33

muscle mass stays the same. My

1:38:36

fat percentage goes down.

1:38:41

I look healthier

1:38:43

and I feel

1:38:47

really great. And

1:38:49

all I do is I just eat what's in front

1:38:51

of me. I don't think about quantities, whatever. But when

1:38:53

I'm back in the United States, so I get to

1:38:56

be there, call it six weeks a year, but

1:38:59

when I'm back in the United States, I have to

1:39:01

go back on lockdown because like a

1:39:03

lot of people, I

1:39:05

had this thing, like if you look at a picture of me in

1:39:07

Sri Lanka, I look

1:39:10

like old Dave Chappelle. I

1:39:15

was like this, I was just a total stick

1:39:17

figure. Within

1:39:20

one year of being in North America, in this

1:39:22

case in Canada, when you look at the school

1:39:24

pictures, I was fat. I

1:39:27

couldn't explain it to you. Just the difference in the food

1:39:29

system. And my parents were making the same things because they

1:39:32

wanted to have that comfort of what they were used to.

1:39:35

So I don't know

1:39:38

if it was the food supply or not, but it

1:39:40

has to be. It has to be. It has to

1:39:42

be. Everybody says the same thing. And then my whole

1:39:44

family has struggled with it. So

1:39:47

I think that there's something, and then when I go now

1:39:50

to Italy as a different reference example,

1:39:53

it's like the best shape of my life. And I

1:39:55

do less. You feel completely different. Even when you eat

1:39:57

things like pizza over there, you don't feel like... you

1:40:00

ate a brick. I've eaten pizza

1:40:02

here and I love it, but when I'm over I'm

1:40:04

like, oh what did you do? What did you do?

1:40:06

Like you ate a brick. But over there it's just

1:40:08

food. It tastes great. The pasta

1:40:11

doesn't bother you, nothing bothers you. It's

1:40:13

just whatever they're doing. And

1:40:15

there's many things. Just one of them is they're

1:40:17

not using enriched flour and another thing is

1:40:20

they have heirloom flour so it hasn't been

1:40:22

maximized for the most amount of gluten.

1:40:24

I'm curious to see what Bobby does if

1:40:26

Trump wins, in this world of make

1:40:29

America healthy again. I don't exactly know what

1:40:31

his plans are. Yeah, what's

1:40:33

possible? Like how much can you really affect

1:40:35

with regulation? How much can you really bring

1:40:37

to light? And what are we gonna learn

1:40:39

about our food system? I mean even Canada,

1:40:42

if you could, one of the things about

1:40:44

the hearings that they just had was

1:40:46

they were comparing Lucky Charms that they sell

1:40:48

in the United States that are very brightly

1:40:50

colored versus Lucky Charms they sell in Canada.

1:40:53

Completely different looking product because in

1:40:56

Canada it's illegal to use those

1:40:59

dyes that we use ubiquitously. And

1:41:01

those dyes are terrible for you. We know they're

1:41:03

terrible for you. In Canada those are terrible for

1:41:05

you which is why they're illegal up there. The

1:41:07

food tastes the same. It still sucks. It's still

1:41:09

bad for you. It's still covered in sugar. But

1:41:11

at least it doesn't have that fucking poison that just

1:41:13

makes it blue or red. And it is

1:41:17

impossible like to teach my kids healthy eating

1:41:19

habits as a result of this. The food

1:41:22

in the United States, it's just... it's

1:41:24

everywhere, and it's beaten

1:41:26

into you that this

1:41:28

is a cheap way of getting caloric intake.

1:41:31

And it is full of just all

1:41:33

this stuff you can't pronounce. It's garbage.

1:41:36

Yeah, it's all garbage, and

1:41:38

it's so common. And then if you're in a

1:41:40

what they would call a food desert, if you're in

1:41:43

a place that only has fast food, like

1:41:45

my god like your odds of being metabolically

1:41:47

healthy if you're poor and you're living in

1:41:49

a place that's a food desert. It's impossible.

1:41:52

It's fucked. It's impossible. It's too hard and

1:41:54

it's also very expensive which is even crazier.

1:41:56

It's so expensive to eat well and

1:41:59

to eat, like, clean, and make sure

1:42:01

that you don't have any additives and garbage in your food.

1:42:03

Do you remember

1:42:06

in like the 90s and 2000s

1:42:08

when what we were told

1:42:10

was fat was bad? Yeah. And

1:42:13

like you would see sugar-free and I would

1:42:15

just buy it. Oh yeah sugar free is great.

1:42:17

Oh like sugar free I'm doing the healthy thing

1:42:19

here. This is great. Margarine. Margarine or then I

1:42:21

would see fat free and I'd be like oh

1:42:23

I'll do that. Yeah. And it turned out

1:42:25

all this stuff was just... Well it's

1:42:28

not even... it's such a small

1:42:30

number of people that affected that.

1:42:32

That's what's so terrifying. There's a

1:42:34

small number of people who bribed

1:42:37

these scientists to falsify data

1:42:39

so that they could blame

1:42:41

all these coronary artery diseases

1:42:43

and heart diseases

1:42:46

on saturated fat when it was really

1:42:48

sugar that was causing all these problems.

1:42:51

And we had a very dysfunctional understanding

1:42:53

of health for the longest time. The

1:42:55

food pyramid was all fucked up. The

1:42:57

bottom of the food pyramid was all

1:42:59

bread and carbs. Yeah

1:43:01

it's so nuts, and it just made a

1:43:04

bunch of like really sloppy humans and

1:43:06

you can see it in the photos of beaches from the 1960s versus

1:43:09

looking at people in the 2000s. Have you

1:43:11

had Casey and Kelly Means on? They're coming

1:43:13

on. Yeah. They're coming on. Yeah. They

1:43:16

have an incredible story. Should

1:43:18

I say it or we can just... Sure. Yeah. They

1:43:20

have this incredible story

1:43:22

that they tell about what happened

1:43:25

and what they say is in

1:43:27

the 80s when you had

1:43:29

these mergers, one of the

1:43:32

mega mergers that happened was a tobacco

1:43:34

company with a food company. There were two of them,

1:43:37

and a lot of the scientists started

1:43:39

to focus some of their energy on

1:43:41

taking that skill, I'll just put that

1:43:43

in quotes, of making something very addictive

1:43:46

and transposing it to food. It's

1:43:49

like okay if I'm at RJR and I'm used

1:43:51

to making cigarettes, how do

1:43:53

I think about structurally building up something

1:43:55

that wants you to

1:43:57

eat more but now instead of smoking, instead of a cigarette,

1:43:59

it's... Soda?

1:46:00

Yeah. And it was like 10% of

1:46:03

the total budget or? Something nutty like

1:46:05

that. It's like some ginormous amount of money

1:46:07

is just basically giving folks sugar

1:46:09

water. Yeah. And

1:46:11

you wonder why. Now the solution is just

1:46:13

to give everybody on the back end of

1:46:15

it Ozempic. It's also like,

1:46:18

let's be real. That's not food. Okay?

1:46:22

It's something you put in your mouth, but you can't

1:46:24

buy cigarettes with food stamps. All right? So

1:46:26

if you buy, you can't buy cigarettes with food

1:46:28

stamps, why should you be able to buy something

1:46:30

that's really bad for you? I mean, what would

1:46:32

change if we said food stamps, we're going to

1:46:35

actually increase the amount that you get, but we're

1:46:37

going to regulate what you can buy. And you

1:46:39

have to buy all the things from the outside

1:46:41

of the store. I don't even think you have

1:46:43

to regulate it. Like,

1:46:45

think of what has happened because of companies

1:46:47

like Uber Eats and DoorDash as an example.

1:46:51

What have they done? And I'll tell

1:46:53

you why I think this is important. Those

1:46:55

guys have gone out and Cloud Kitchens. There's

1:46:57

three companies. They have

1:46:59

bought up every single kind

1:47:02

of warehouse in every part

1:47:04

of every city and suburb in

1:47:06

America. And what they put in

1:47:08

there are these things that

1:47:10

they call ghost kitchens. So that

1:47:13

when you launch the app and a lot of

1:47:15

the times when you get a drink from Starbucks,

1:47:17

it's not coming from the actual Starbucks down the

1:47:19

street. It's coming from a ghost kitchen. Why? Because

1:47:22

they centralize all the orders and it creates

1:47:24

an economy of scale. Why

1:47:27

am I telling you this? I think that

1:47:29

there is a way for food stamps to

1:47:31

sit on top of that infrastructure and just

1:47:33

deliver food. But the problem

1:47:35

is people, especially people that don't

1:47:38

know or care, want that sugar

1:47:40

water. Well. You

1:47:43

know, like the choices are...

1:47:45

I understand. Yeah. You're

1:47:48

going to have people that choose that Big Mac because it

1:47:50

is delicious. Yeah. And I think that

1:47:52

they are delicious. And once a year I have a Big Mac. But

1:47:56

I think if you're going to tie it to something like

1:47:59

a government subsidy... But

1:50:01

now there's an industry that's making

1:50:03

$3 trillion by giving people these

1:50:05

GLP-1s. And the

1:50:07

problem is, just like every other

1:50:09

industry, once it starts making money, it does not

1:50:11

want to stop. And

1:50:13

by the way, I think that they should be allowed

1:50:15

to make money. But what I'm

1:50:18

saying is, in a free market, every

1:50:20

actor is allowed

1:50:22

to act rationally. And actually,

1:50:24

what you want is everybody to

1:50:27

have their own incentives and to act naturally.

1:50:29

That's when you get the best outcome. Because

1:50:31

if you're acting with some shadow agenda,

1:50:33

you're not going to necessarily do the right thing. So

1:50:35

my point is, in this example, the

1:50:39

government's job in this narrow

1:50:41

example is to get the

1:50:44

best healthcare outcome. Because if they're

1:50:46

doing any form of long-term planning,

1:50:49

it's pretty obvious. Like, we are hurtling

1:50:51

toward a brick wall on

1:50:53

this healthcare issue with respect

1:50:56

to people's health. You don't have a solution. The

1:50:59

only solution cannot be to medicate

1:51:01

and then triple and quadruple the

1:51:05

budget deficit that we already don't have a

1:51:07

path to pay down. Right. Well,

1:51:09

the only other thing that I could think is if

1:51:11

there was some sort of a way that

1:51:15

would be effective at establishing

1:51:18

discipline other than just promoting it.

1:51:20

Like, I

1:51:22

could conceive of, especially when you're

1:51:24

dealing with something like Neuralink, or

1:51:27

some sort of a new way

1:51:29

of programming the mind, where

1:51:31

it just changes whatever the

1:51:34

behavior pattern is that accepts

1:51:37

these foods as choices. Like,

1:51:41

lobotomize your appetite. That would

1:51:44

be a very dystopian place. Sketchy

1:51:46

to fucking be an early adopter. If

1:51:48

you want the subsidy, you need to

1:51:50

get this brain impact. It would not

1:51:52

be a good place. Oh, that would

1:51:54

be bad. That's worst case scenario. Best

1:51:56

case scenario is you just have a

1:52:00

national scale promotion of health

1:52:03

and wellness and abandonment of

1:52:05

this body positivity nonsense and

1:52:08

fat doctors and people are telling you

1:52:10

that every weight is a healthy weight

1:52:12

and all food is food and to

1:52:14

think otherwise is discriminatory, which you're hearing

1:52:16

from people. And by the way,

1:52:18

that stuff is funded and that's

1:52:20

what people need to know. That nonsense

1:52:23

is actually funded. They pay people to

1:52:25

be influencers and they're getting paid by

1:52:27

these food companies to say these nonsense

1:52:29

things that are scientifically

1:52:32

factually incorrect. They're not true.

1:52:35

It is not healthy in any way, shape

1:52:38

or form to be obese. And

1:52:40

when they tell you that it's healthy, that you

1:52:42

can be metabolically healthy and still be fat,

1:52:44

it's okay. It's not okay. It's

1:52:46

not okay. That's just not true. And is that

1:52:48

fat shaming? I don't know. You can

1:52:50

call it whatever the fuck you want, but it doesn't change what it does

1:52:52

to the human body. And it doesn't make

1:52:54

someone better if you don't

1:52:57

make them feel bad about being robustly

1:53:00

unhealthy. Well, it's an

1:53:02

enormous disservice to folks if we

1:53:06

don't expose an alternative

1:53:09

path. Okay, we're

1:53:11

spending this much money. We spend so much money

1:53:13

in all kinds of random stuff. Like, just a

1:53:15

simple example that we saw this past week. $50

1:53:18

billion spent between rural

1:53:22

broadband and chargers. We

1:53:24

have no rural broadband and we have

1:53:26

three chargers. No,

1:53:29

this is the data. That's

1:53:31

$50 billion. Okay,

1:53:34

that's not the $300 billion that... Explain what

1:53:36

you mean by that. In

1:53:42

Congress, when they come together to pass these

1:53:44

bills, sometimes what happens is there's a lot

1:53:46

of horse trading, right? And

1:53:48

you get what's called a Christmas tree bill, which

1:53:51

is like everybody gets to hang something off the

1:53:53

Christmas tree. And the

1:53:55

crazy part of the United States is these

1:53:57

little bubbles now here are $10 billion. The

1:58:00

cost of that would basically fall through the floor

1:58:02

if you put in an order for 50 million

1:58:04

of these units. Right. SpaceX

1:58:06

would make them for like eight bucks. Right.

1:58:08

You know what I mean? It's fast internet

1:58:10

too, which is even crazier. Yeah.

1:58:13

There's all of this stuff that

1:58:17

we should do. We just

1:58:19

need a few folks, I

1:58:23

don't know, that can either course correct or just can shine

1:58:25

a light on it. It's

1:58:27

like this thing where I'm so optimistic, married

1:58:31

to enormous fear and just like

1:58:33

a little... I

1:58:36

kind of go back and forth between these

1:58:38

things. Let me paint the ultimate dystopian solution.

1:58:42

The ultimate dystopian... Part of our problem

1:58:44

is we have corruption, we have what

1:58:47

you were talking about with deals, sort

1:58:49

of like the border wall deal had

1:58:51

money in it for Ukraine. There's all

1:58:54

these weird deals, those bills that don't

1:58:56

make any sense. How did you add

1:58:58

all this stuff? Why is this 2,000

1:59:00

pages? How many people who signed

1:59:02

it actually read it? AI

1:59:04

government. AI government solves

1:59:06

all those problems. AI government

1:59:08

is not corrupt. AI government just

1:59:10

works literally for the people. Instead

1:59:13

of having all these state representatives and all these

1:59:15

bullshit artists that pretend to be working on their

1:59:17

truck and they don't know what the fuck they're

1:59:19

doing, they're just doing it for an ad, you

1:59:21

don't have any of that anymore. Now everything's governed

1:59:23

with AI. The problem is who's controlling the AI

1:59:26

and is there some sort of an

1:59:28

ultimate regulatory body that makes sure that

1:59:30

the AI isn't biased or tainted? I

1:59:33

think there's a step before that which is a lot more

1:59:36

palatable. I think the thing with... I

1:59:39

thought about your version and

1:59:41

the problem that you state is the key

1:59:43

problem which is how is this model trained?

1:59:46

Who got their hands on that

1:59:48

core stuff, the weights and the values of that?

1:59:51

Who decides? At

1:59:53

some point there's going to be... Or not

1:59:56

they're going... There is already today in AI models

1:59:58

a level of human override. It's

2:00:00

just a natural facet of how these things are.

2:00:02

There's a way to reinforce the learning

2:00:04

based on what you say and what I

2:00:06

say. It's a key part of how an

2:00:09

AI model becomes smart. It

2:00:11

starts off as primordial, and

2:00:13

then Joe and Chamath and all these other

2:00:15

people are clicking and saying, yes, that's a

2:00:17

good answer, bad answer, ask this question, all

2:00:20

this stuff. Who

2:00:22

are those people? Right, and it could be gamed

2:00:24

as well, right? Like you could organize. I think

2:00:26

at scale, we haven't figured out ... We

2:00:29

haven't seen it yet. But it will be when

2:00:31

the incentives are that high. And

2:00:33

we've seen distortions, like the Gemini

2:00:35

AI that was asked to make

2:00:38

Nazi soldiers, and it made these multiracial

2:00:40

Nazi soldiers, and

2:00:42

that kind of stuff. Where it's just

2:00:44

like, who are the founding fathers? It's

2:00:46

all, here's a black guy, here's a

2:00:48

Chinese lady, like, okay, we get it,

2:00:50

you're not racist. But you're

2:00:52

being crazy. You're distorting

2:00:54

the past. Exactly. Specifically

2:00:57

history. One of them was like

2:00:59

a Native American woman was a Nazi soldier. It's like,

2:01:01

this is so nuts. So that

2:01:03

is a problem in that AI

2:01:05

is not clean, right? It's

2:01:08

got the greasy fingerprints of modern

2:01:10

civilization on it and all of

2:01:13

our bizarre ideologies. But there's

2:01:15

a step before it that I think can

2:01:17

create a much better government. So it's

2:01:19

possible today, for example, to understand

2:01:22

... Have you ever done a renovation on your house? Yes.

2:01:26

You make plans, you

2:01:29

go and your architect probably

2:01:31

pays an expediter to stand

2:01:33

in line in City Hall.

2:01:36

There's a person that goes and reviews that plan. They

2:01:39

give you a bunch of handwritten markups based

2:01:42

on their understanding of the building code. You

2:01:44

can't use this lead pipe, you need to use aluminum,

2:01:46

this window's too small, all this stuff. You

2:01:49

come back, you revise, you go do this two or

2:01:51

three times on average to do a renovation. And then

2:01:53

they issue your permits. Now

2:01:57

an AI can actually

2:01:59

just ingest all the... rules, know

2:02:01

exactly what's allowed and what's not allowed, and

2:02:04

it can take your picture and

2:02:07

instantly tell you, Joe, fix these 19 things. You

2:02:09

fix those things. You go to the city. You can

2:02:11

show that it maps to all the rules, so

2:02:14

you can streamline government. You can also point out

2:02:16

where they're making decisions that don't map to what

2:02:18

the rules say. That,

2:02:21

I think, is going to be a really important first

2:02:23

step because it allows us to

2:02:25

see where maybe

2:02:27

this administrative state has grown

2:02:29

unwieldy, where you got to

2:02:31

knock some of this stuff back and clean up some of

2:02:33

the cruft because there's rules on top of

2:02:35

rules and one conflicts with the other. I

2:02:38

bet you there are things on the books today that are

2:02:40

like that. A hundred percent. We have

2:02:42

no way of knowing. Right. You

2:02:45

know? But I do think an AI can tell you these

2:02:47

things and say, just pick which one. It's

2:02:49

A or it's B. I

2:02:51

think that starts to cut

2:02:54

back a lot of the

2:02:56

difficulty in just making progress.

2:02:58

Right. You know? One

2:03:01

of the things that I thought was

2:03:03

extraordinary that Elon was getting pushed back

2:03:05

on was his idea

2:03:07

of making the government more

2:03:09

efficient and that

2:03:12

auditing the various programs

2:03:14

and finding out how to make them more

2:03:16

efficient. A lot of people really freaked out

2:03:18

about that. Their main freak

2:03:20

out, the main argument from

2:03:22

intelligent people that I saw was, what are you going

2:03:24

to do? You're going to fire all these people that

2:03:27

are in charge of government. I

2:03:29

don't think the answer for ineffective

2:03:31

government is to let the same people

2:03:33

do the same thing just because otherwise you'd

2:03:35

have to fire them. That sounds insane.

2:03:37

Right. And to say

2:03:40

that the government is as efficient as

2:03:42

is humanly possible or even

2:03:44

close to it, no one believes that. No

2:03:47

rational person believes that. Everyone believes there's

2:03:49

bureaucracy. Everyone believes there's a lot

2:03:51

of nonsense going on. Everyone believes that. Look

2:03:54

at the difference between what Elon's

2:03:57

been able to accomplish with SpaceX versus

2:04:00

what NASA has been doing recently.

2:04:02

Look at the difference between what they're

2:04:04

able to accomplish with Starlink versus this

2:04:06

$42 billion program that yielded zero results.

2:04:09

Look at the difference between all these different

2:04:11

things that are done in the private sector

2:04:14

when there's competitive marketplace strategies. You

2:04:16

have to figure out a way

2:04:18

to get better and more

2:04:21

efficient and you can't afford to have a

2:04:23

bunch of people in your company that are

2:04:25

doing nothing and that are creating red tape

2:04:27

and making things harder to break. That's

2:04:31

bad for the business. That's

2:04:33

the argument for letting private companies take

2:04:36

over things. By the way, I think

2:04:38

that what, and just

2:04:40

to build on what you're saying, people

2:04:42

jump to this conclusion that like government

2:04:44

shouldn't exist. It's not some anarchic

2:04:47

thing where like government's actually very

2:04:50

important. They create

2:04:52

incentives and then those of

2:04:54

us in private industry go out and try

2:04:57

to meet those incentives or take advantage of

2:04:59

them. That's very normal. A

2:05:01

well-functioning government creates

2:05:03

very good incentives. An

2:05:05

incredible example of this is in the

2:05:08

1950s, do

2:05:11

you know what the GDP of Singapore

2:05:13

was? No. It

2:05:15

was the same as the GDP

2:05:17

of Jamaica. You fast forward

2:05:19

70 years and you understand what

2:05:22

good governance looks like. We actually

2:05:24

were talking about Singapore yesterday,

2:05:26

how extraordinarily efficient their recycling

2:05:29

program is. It's unbelievable.

2:05:31

I mean, it's really amazing

2:05:33

what they do. They

2:05:35

really recycle. They recycle how we think

2:05:38

we're recycling. They really do. They really

2:05:40

separate the plastic. They break it up.

2:05:42

They use it to make power. They

2:05:44

use it to make road materials. They

2:05:46

make building materials out of it. They

2:05:48

reuse everything. They were thrust

2:05:50

into a spit of land with

2:05:52

no natural resources. They

2:05:54

had to become incredibly well-educated

2:05:57

and industrious. You

2:06:00

know, Lee Kuan Yew was able

2:06:02

to create the right incentives for

2:06:05

government to do a good job. They pay their

2:06:07

civil servants incredibly well, but

2:06:09

then also for private industry to show up and do

2:06:11

the rest. And it works incredibly.

2:06:14

You can do that in the United States.

2:06:16

The thing that we would benefit a lot

2:06:18

from is if we

2:06:20

could just point out all the

2:06:23

ways in which there's either too many laws

2:06:25

or laws are conflicting, you

2:06:28

can at least have a conversation about batting those back. And

2:06:31

the second is if

2:06:33

you look inside of private

2:06:36

equity, there is one

2:06:38

thing that they do, which I think

2:06:40

the government would hugely benefit from, and

2:06:42

it's called zero-based budgeting. And

2:06:45

this is an incredibly powerful but boring

2:06:47

idea. What private

2:06:49

equity does when they buy a company, some

2:06:52

of them, the best ones, they'll

2:06:54

look at next year's budget. And

2:06:56

if they say, what should the budget be? Well,

2:06:59

guess what's going to happen, Joe, in your company? Everybody runs

2:07:01

and says, I need X for this, Y

2:07:03

for that, Z for this. And

2:07:05

you have this budget that's just ginormous. Instead,

2:07:08

what some of the best private equity

2:07:10

folks do is say, we're starting with

2:07:12

zero. Next

2:07:14

year's budget is zero. We're spending nothing.

2:07:16

Now, let's build it

2:07:19

back up meticulously, block by block.

2:07:22

So somebody comes in, okay, what is it exactly that you

2:07:24

want to do? I

2:07:26

want to build an interface that

2:07:28

allows it, eh, they start saying

2:07:31

something, like, no.

2:07:33

Okay, what do you want to do? I want to

2:07:35

upgrade the factory so that we can get a

2:07:37

higher yield. Okay, done, you're in. How much do you

2:07:39

need? Okay. One by

2:07:41

one by one. And if

2:07:43

you go and you do that inside the government,

2:07:46

what you probably would find is that same group

2:07:48

of people would probably enjoy their job a lot

2:07:50

more. Their

2:07:52

hands would be on the controls in

2:07:54

a much more directed way. We'd

2:07:57

spend a lot less because a lot of this stuff

2:07:59

probably just goes by the wayside and we don't even

2:08:01

know, you know, and

2:08:04

people would just be more able to go

2:08:06

and work. You could do what you

2:08:08

wanted to do. I could do what I

2:08:10

wanted to do. Elon could do what he wants to do.

2:08:13

There was a thing, I tweeted it out today. He

2:08:20

cannot get the

2:08:22

FAA to give him a flight

2:08:26

permit for Starship 5 and 6. So

2:08:29

they're waiting on dry docks, right?

2:08:31

They're slow rolling the approval, right?

2:08:36

It takes him less time to build

2:08:38

these Starships now than it does to get government

2:08:40

approval. That's what he said. Meanwhile,

2:08:43

the FCC, which is a sister

2:08:46

organization to the FAA, fast

2:08:48

tracked the sale of 220 radio stations

2:08:51

in part to some

2:08:53

folks that were foreign entities right

2:08:56

before an election that touched like 160 million Americans.

2:09:01

When you look at that, you would

2:09:03

say, how can some folks cut

2:09:06

through all the red tape and get an answer quickly? How

2:09:09

can other folks be waiting around

2:09:12

for something that just seems so obvious and

2:09:15

so exceptional for America? And

2:09:19

there's no good answer. I

2:09:22

don't know what the answer is. I

2:09:25

don't think any of us know. No. And then there's just

2:09:27

folks that are stuck in space. Meanwhile,

2:09:30

there's these two people stuck in

2:09:33

space. Yeah. And...

2:09:35

And Jamie said they were supposed to be there for how

2:09:37

long? Eight hours? Uh, yeah. They

2:09:40

were supposed to be there for eight hours. They're supposed to be

2:09:42

quick. They're supposed to be quick. And they've been

2:09:44

there for months. They're going to be there until February. That's

2:09:47

so insane. They're going

2:09:49

to be there until February. How terrifying must that

2:09:51

be? I mean, for

2:09:53

maybe you and me? Eight days. Eight

2:09:56

days.

2:10:00

I think I would freak out, 100%.

2:10:03

I think they would. How do you not? Well,

2:10:05

I read this article where they interviewed them. Now, this

2:10:07

could be the party line. I don't know, but they're

2:10:10

like, this is great, it's my natural place. Oh, Lord.

2:10:13

I can't believe that.

2:10:15

I had a friend of mine... That's what they say

2:10:17

to themselves to keep from going crazy. Well, yeah, a friend of

2:10:19

mine went to space the

2:10:21

founder of Cirque du Soleil, Guy Laliberté, and he

2:10:25

brought a super high... Google,

2:10:27

about, like, is it already over? Still

2:10:30

going? No, it's still going on; says

2:10:32

it's until February 21st. Yeah, February 20th.

2:10:35

Yeah, we're stuck in space. This is

2:10:37

like... it ended up spending... That's

2:10:39

just more AI. Wow,

2:10:42

yeah, it could be way more. It

2:10:45

could be way, way more. That's weird, that

2:10:47

AI... That's another flaw with AI, right? That

2:10:49

they would read it like

2:10:51

that. What... what the incentive is for AI to lie

2:10:53

to you about that? How does

2:10:56

AI not know it's not 2025 yet? No,

2:10:59

we're stuck in space until February of

2:11:02

2025. Well, that's just a straight-up

2:11:04

error. That's a weird error, though.

2:11:06

It is a weird one. Yeah, but these

2:11:09

poor people, you know. But

2:11:11

so my friend that was up there said it

2:11:15

was incredible. He has this funny story where he

2:11:17

was a smoker still is a smoker, but

2:11:21

This was like 20 years ago. So he was going

2:11:23

up on like a Soyuz rocket and he

2:11:26

shows up, I guess, in Siberia,

2:11:28

where they do the launches. And he

2:11:32

was really stressed out because he had to stop smoking. They're

2:11:40

like, oh, it's totally fine. Do they smoke in...? No,

2:11:42

no, no I'm saying on the ground while they were

2:11:44

training. Oh boy. So they go up. He does eight

2:11:46

days. He comes back down. He

2:11:49

took these incredible high-res pictures of like all the parts

2:11:51

of the earth. He said it was the most incredible

2:11:53

thing. But, you know, when you get back... He's like,

2:11:55

I was ready to get back. Did

2:11:57

you see this latest report? There's

2:12:00

a like real controversy about some

2:12:02

finding that the James Webb telescope

2:12:04

has discovered. And there's

2:12:06

some talk of some large object

2:12:09

moving towards us that's course

2:12:11

correcting. This is

2:12:13

the weird part about it. And

2:12:16

there's all these meetings, and so all

2:12:18

the kooky UAP people are

2:12:20

all over it saying disclosure is imminent. There's

2:12:23

a mothership headed towards us. So

2:12:27

it gets fun. I don't know what they mean

2:12:29

by course correcting. What does that mean? And how

2:12:31

do they know it wasn't impacted with something else

2:12:33

that diverted it? It could have

2:12:35

been that. It could have just been the gravitational fields.

2:12:37

It could have been orbital path. But

2:12:40

they're not telling anybody. There's something

2:12:42

going on. Do you

2:12:44

think they would tell people, imagine

2:12:46

if there was a giant chunk of

2:12:49

steel, of iron rather, that's headed

2:12:51

towards us. That's a great question. I think

2:12:53

the question is, what would we do if

2:12:56

we knew? Do we have the capability of

2:12:58

moving that thing? Would

2:13:01

the FCC wait five months to give

2:13:03

Elon to- I think you'd probably send

2:13:05

as many, but see, it's

2:13:08

all a physics problem at that point. It's

2:13:10

also a problem of breaking it up. If

2:13:13

it breaks up, then you have smaller pieces

2:13:15

that are hitting everywhere instead of one large

2:13:17

chunk. Isn't this like the

2:13:20

perfect reason why being multi-planetary just makes

2:13:22

a lot of sense? Sure.

2:13:25

For example, would you get on an airplane if

2:13:27

they said, hey Joe, this is the best airplane

2:13:30

in the world. It's the most incredible, it's the

2:13:32

most luxurious, it has the best weather,

2:13:34

you can surf, but there's only one

2:13:36

navigation system. And if it goes out, you'd

2:13:40

never do that. Would you ever

2:13:43

get on that airplane? No. No. I

2:13:46

think we owe it to ourselves to

2:13:48

have some redundancy. Yeah, but ultimately

2:13:50

I always wonder, the

2:13:53

universe sort of has these patterns

2:13:55

that force innovation and

2:13:58

constantly move towards further

2:14:00

complexity. If you

2:14:02

were going to have intelligent life that

2:14:04

existed on a planet, what better incentive

2:14:07

to get this intelligent life to spread

2:14:09

to other parts of the planet than

2:14:11

to make that planet volatile? Make

2:14:14

super volcanoes, earthquakes, solar

2:14:17

flares, all sorts of

2:14:19

different possibilities, asteroid impacts, all sorts of

2:14:21

different possibilities that motivate this thing to

2:14:23

spread. It's a way of saying, like, this

2:14:26

is fragile and it's not

2:14:28

forever, so create some

2:14:30

redundancy. I mean, I was

2:14:33

raised Buddhist. I'm not that religious in

2:14:36

that way, but I'm kind of weirdly

2:14:38

spiritual in this other way, which is I

2:14:41

do think the universe is basically, it's

2:14:44

littered with answers. You

2:14:47

just got to go and find out what the right questions

2:14:49

are. So to your point,

2:14:52

are all these natural phenomena on Earth?

2:14:56

The question is, okay, if

2:14:58

that's the answer, well, the question is like, do

2:15:01

we want to be a single planet species

2:15:03

or do we want to have some built-in

2:15:05

redundancy? And maybe

2:15:08

100 years from now that builds

2:15:10

on top of what happens in the next

2:15:12

five, we'll have discovered all kinds

2:15:14

of different planets. That's

2:15:16

an amazing thing. Unquestionably.

2:15:19

Unquestionably. We also

2:15:21

know that there's planets in our immediate

2:15:23

vicinity that used to be able to

2:15:25

harbor life like Mars. We know

2:15:27

that Mars was covered in water and Mars

2:15:29

had a sustainable atmosphere. So

2:15:32

we know that this is not just a

2:15:34

dream, that this is possible that what we're

2:15:36

experiencing here on Earth is temporary. And

2:15:39

if we get hit by something, but what we

2:15:41

know Earth was hit by a planet in

2:15:43

its formation. There was Earth one and Earth

2:15:45

two, the formation of the moon, the primary

2:15:47

theories that we were hit by another planet

2:15:49

and that's why we have such a large

2:15:51

moon. That's a quarter the

2:15:53

size of Earth. It's like keeping

2:15:56

our atmosphere stable and keeping our...

2:16:00

wild shooting gallery out there. I mean

2:16:02

it really is, and especially ours. Our

2:16:06

particular solar system has a massive asteroid

2:16:08

belt. There's like 900,000 near-Earth objects but

2:16:12

isn't that so like inspiring like this idea

2:16:14

of like Discovering

2:16:16

all these other questions that we

2:16:19

don't know yet to even ask

2:16:21

right? That is a life well-lived.

2:16:24

Yes. You know, that's the

2:16:26

most promising aspect of

2:16:28

a hyper intelligent AI in

2:16:30

my opinion. That

2:16:33

it'll be able to solve problems that

2:16:35

are inescapable to us and also Offer

2:16:38

us like real hard data

2:16:41

about how big of a problem

2:16:43

this is and when this needs to be solved by

2:16:45

and then come up with

2:16:47

actionable solutions. Yeah, and that seems

2:16:49

to be something that might

2:16:52

escape us as biological entities with

2:16:55

limited minds, especially if we're not working

2:16:57

together. And you could get AI

2:16:59

to have the accumulated mind

2:17:01

power of everyone, you

2:17:04

know, 10x. The

2:17:06

mental model is if an alien

2:17:08

showed up today, would

2:17:10

humans by and large drop all of their

2:17:14

internal issues and cooperate together?

2:17:20

Perhaps. Perhaps. I

2:17:22

would hope that the answer would be yes. It

2:17:24

would have to be something that showed such overwhelming

2:17:27

superiority that it shut down all of

2:17:29

our military systems and did so openly

2:17:31

to the point where we're like, we're

2:17:34

really helpless against this thing. Well, so

2:17:36

I think that one way to think

2:17:38

about AI is that it is a

2:17:41

supernatural system in

2:17:44

some ways. So if

2:17:47

we can just find a way to cooperate and

2:17:49

harness this and see the bigger picture, I think

2:17:53

we'll all be better off. Like, again,

2:17:55

killing each

2:17:57

other? It's

2:18:00

just so barbarically unnecessary.

2:18:04

It doesn't solve anything. All it does

2:18:06

is just makes more anger. It

2:18:09

creates more hatred because what's

2:18:12

left over is not positive. And

2:18:16

I think that we need to

2:18:18

be reminded of that somehow without

2:18:21

actually living the experience. Yes.

2:18:24

My hope is that one

2:18:26

of the things that comes out of AI

2:18:28

and the advancement of society through this is

2:18:31

the allocation of resources much more

2:18:33

evenly. And that we use

2:18:35

AI, as I was saying

2:18:37

before, the best way to keep people from entering

2:18:39

into this country is to make all the other

2:18:41

places as good as this country. As good as

2:18:44

this country. Then you

2:18:46

solve all the problems for everybody. And you don't

2:18:48

have this one place where you can go to

2:18:50

get a job or you go over there and

2:18:52

you get murdered. Well, so I

2:18:54

think that why are a lot of people coming

2:18:57

to America? A

2:19:00

lot of the reasons, some are clearly

2:19:02

political persecution, but a lot of the

2:19:04

other reasons are economic to your point. And

2:19:06

so if you can create economic abundance

2:19:11

generally in the world, that's

2:19:13

I think what people want. Most people want,

2:19:15

as you said before, a good job.

2:19:17

They want to come in and feel like they can point

2:19:20

to something and say, I made that, and feel proud

2:19:22

of that. They want

2:19:24

to hopefully get married, have some kids, have

2:19:26

fun with them, teach them what they were

2:19:28

all about. And

2:19:30

then it's our swan song, and we all

2:19:32

kind of, I don't know, get reborn

2:19:34

or not. Isn't it interesting that the

2:19:36

idea of people not getting together in

2:19:38

groups and killing people they don't know,

2:19:40

that's utopia. That

2:19:43

is some sort of ridiculous

2:19:47

pie in the sky vision of the

2:19:49

possibility of the future of humanity. That's

2:19:53

common in small groups,

2:19:56

like even in cities. There's

2:19:58

individual murders. And there's crimes

2:20:00

in cities, but cities aren't

2:20:02

attacking other cities and killing

2:20:04

everybody. So there's

2:20:06

something bizarre about nations, and

2:20:09

there's something bizarre about the

2:20:11

uneven application of

2:20:13

resources and possibilities and,

2:20:17

you know, your economic hopes,

2:20:19

your dreams, your

2:20:23

aspirations being achievable pretty

2:20:26

much everywhere. If

2:20:28

we did that, I think that might

2:20:30

be the way that we solve most

2:20:33

violence, or the most horrific nonsensical violence.

2:20:35

So, and you have this data point.

2:20:37

I said this before, but the most

2:20:40

important thing that has happened is

2:20:43

that in the last four or five

2:20:45

years, we have severely curtailed the

2:20:48

likelihood of war in

2:20:51

the nominal sense. I think Trump

2:20:53

was able to basically draw a hard red line

2:20:55

in the sand on that stuff. And

2:20:59

the underlying reason was because we had enough

2:21:01

economic abundance where the incentives to go to

2:21:03

war fell. We

2:21:06

had just a complete rebirth of

2:21:08

domestic hydrocarbons in America. Whether you agree

2:21:10

with it or not, my point is

2:21:13

it is quite clearly correlated

2:21:15

in the data. As

2:21:17

we were able to produce more stuff,

2:21:19

so economic abundance, we

2:21:22

had less need to go and fight with external parties.

2:21:25

So I do think you're right. Like

2:21:27

this reduces it down to we

2:21:29

need to find ways of

2:21:32

allocating this abundance more

2:21:34

broadly, to more countries. Meanwhile,

2:21:38

that one crazy thing that you can't

2:21:40

unwind and go back from, you can

2:21:42

just never go there. And

2:21:45

you just have to make sure nobody believes

2:21:47

that that is justified. Because

2:21:50

in a nuclear event, I think that

2:21:52

that's not what happens. Clearly.

2:21:55

I saw this brilliant discussion that

2:21:57

you had where you were explaining,

2:22:00

meaning that Trump

2:22:02

is the wrong messenger, but

2:22:04

many of the things that he did actually

2:22:07

were very positive. And

2:22:09

I think that is a

2:22:12

very difficult thing to describe.

2:22:15

It's a very difficult thing to express

2:22:18

to people because we're

2:22:21

so polarized, particularly with a

2:22:23

character like Trump, that's so

2:22:25

polarizing. It's very difficult

2:22:28

to attribute anything to him

2:22:30

that is positive, especially

2:22:32

if you're a progressive or if you're on

2:22:34

the left or if you've been a lifelong

2:22:36

Democrat or if you're involved in tech. Totally.

2:22:38

I mean, it's this

2:22:41

bizarre denial

2:22:43

of basic reality, the reality

2:22:45

of what you can

2:22:47

see based on what was

2:22:50

put in place, what actions were taken, what

2:22:52

were the net benefits? I've

2:22:57

always been a liberal, and

2:22:59

I think I should define what liberalism used to

2:23:01

mean. It used

2:23:04

to mean absolutely no war, and

2:23:06

it used to mean free speech, and

2:23:09

it used to mean a

2:23:13

government that was supportive of private industry. Try

2:23:16

your best, go out there, we'll look out for you. Come

2:23:19

back to us if things go haywire. That's

2:23:22

an incredible view of the world.

2:23:27

I think what happened was when I

2:23:30

was given a choice, I

2:23:32

would vote Democrat or I would support Democrats because

2:23:35

I thought that that's what they stood for. I

2:23:39

didn't really understand Trump. What

2:23:42

happened was I

2:23:45

got too caught up in

2:23:47

the messenger, and I didn't focus enough

2:23:49

on the message. I

2:23:51

didn't even realize that. I didn't realize it

2:23:54

in 2016, but I don't think many people

2:23:56

did. Then

2:23:58

in 2020, I got lost in it. But

2:24:02

probably by 21 or 22, I

2:24:06

started to see all this data and

2:24:08

I said, hold on, I am not being

2:24:11

a responsible adult the way that I

2:24:13

define responsibility. I am not looking

2:24:16

at this data from first principles and I

2:24:18

need to do it. And

2:24:20

when I did, what

2:24:23

I saw was a

2:24:25

bunch of decisions that

2:24:27

turned out to be pretty smart. The

2:24:29

problem is that because he's the

2:24:32

vessel, he turns off so many

2:24:34

people with his delivery. And

2:24:37

I think this is a moment where the

2:24:39

stakes are so high, you have to try

2:24:42

to figure out what the message is versus

2:24:44

what the messenger is saying. Or

2:24:47

look to somebody else that can tell you the

2:24:49

message in a way that maybe

2:24:52

will allow you to actually listen to it. That

2:24:55

could be JD Vance, it could be Elon Musk, it could

2:24:57

be RFK. Tulsi

2:24:59

Gabbard, there's all kinds of surrogates now

2:25:01

because I think that they

2:25:04

have realized that there's a lot of value

2:25:06

in these messages. We

2:25:08

need to have multiple messengers so

2:25:11

that folks don't get tilted and go upside down

2:25:14

the minute one person walks in the room. And

2:25:17

I had to challenge myself to go through that process

2:25:19

and at the end of it, I'm like, wow,

2:25:22

he's the only mainline candidate here that will

2:25:25

not go to war. And

2:25:27

just on that point, it's

2:25:30

like very unique times

2:25:32

create strange bedfellows. It's sort of like one

2:25:34

thing that kind of like always pops out

2:25:36

at me like, why are they working together?

2:25:39

Why are they cooperating? I always think like, what's

2:25:41

going on here? And when I saw

2:25:43

him and Bobby align, Bobby

2:25:46

has a very balanced

2:25:49

view of Donald Trump. Here's the

2:25:51

good, here's the bad, even now. Even

2:25:53

with everything that's on the line for Bobby

2:25:55

and Bobby's agenda, he's

2:25:58

quite honest about Donald Trump's positives

2:26:02

and negatives, but

2:26:05

they both get along. One

2:26:08

of the things, and probably the most

2:26:10

important thing, where they were

2:26:13

sounding the drum from day one is, under

2:26:16

no circumstance will the United States go to war.

2:26:19

I just think we should observe that. People

2:26:22

should have an opinion on that. He's

2:26:24

so polarizing that there's been two

2:26:27

attempted assassinations on him and no

2:26:29

one cares. He's like Neo in

2:26:31

The Matrix. He's like dodging

2:26:33

bullets. For now. You know?

2:26:36

Yeah. But listen, no one can dodge forever.

2:26:38

But the thing is, it's like no one seems to care that

2:26:40

the rhetoric has ramped up so

2:26:42

hard and has been so

2:26:45

distorted. The other thing that people need to,

2:26:47

I think, think about is the

2:26:50

domestic policy agendas of both the Democrats

2:26:52

and the Republicans are

2:26:55

within error bars. What I mean

2:26:57

by that is, when push comes to

2:26:59

shove, they both, whether

2:27:01

it's Kamala Harris or Donald Trump, they

2:27:04

have to work through a very

2:27:07

sclerotic Congress, which

2:27:09

means that very little will ultimately get done

2:27:11

if you just look at the track record

2:27:13

of all these past presidents. You

2:27:16

typically get one piece of

2:27:18

landmark legislation passed in your first two

2:27:21

years and

2:27:23

it all just gets unwound. It's

2:27:25

happened from Clinton onwards. Bush

2:27:28

had one bite at the apple. Obama had one bite

2:27:30

at the apple. Trump had one bite at the apple.

2:27:32

Biden had one bite at the apple. The

2:27:37

American political system has

2:27:40

a really incredible way of insulating itself.

2:27:45

If people would just take a step back and look at

2:27:47

that, a lot of

2:27:49

the policy agendas that both of

2:27:51

them espouse are going to

2:27:54

be very hard to get done. There'll be one thing,

2:27:57

maybe they both do something on domestic

2:28:00

taxation, maybe they

2:28:02

both do something on the border, but

2:28:05

the likelihood based on the past is

2:28:07

that they'll get one of these things done and then

2:28:09

not much will be done. This

2:28:12

is why I think folks then need to think

2:28:14

about, okay, what are the super-presidential

2:28:18

powers then where they can act

2:28:20

alone? One

2:28:23

area where they can act alone is they

2:28:25

can issue executive orders. That

2:28:29

can direct the behavior of

2:28:32

governmental agencies. Okay,

2:28:34

so people should decide what they think about that.

2:28:37

Do you want a muscular American

2:28:40

bureaucracy? Do you want a

2:28:42

more slimmed down one? Do you want

2:28:44

one that has bigger

2:28:46

ambitions, more teeth? Do

2:28:49

you want one that is zero-based

2:28:51

budgeted? They're pretty stark on those

2:28:53

things. Then

2:28:56

foreign policy, I think

2:29:00

one camp is much more in

2:29:02

the view that we are the world's policeman

2:29:06

and there's a responsibility that comes with that.

2:29:09

One says, we got a lot of problems

2:29:11

at home, we're not getting pulled into something abroad. I

2:29:15

think people need to decide about that. Other than

2:29:17

those two threshold issues, my honest opinion

2:29:19

is that we're within error

2:29:22

bars between

2:29:25

the two of them. One will cut taxes by

2:29:27

this much, one will increase taxes by that much.

2:29:32

There are real decisions that

2:29:34

have been made during the Biden administration

2:29:36

about the border that are affecting people.

2:29:39

Or lack thereof. I

2:29:41

think it's a decision. I don't think

2:29:43

it's a lack thereof, especially the flying

2:29:45

people in and the utilization of an

2:29:47

app to fly people in. That

2:29:50

seems insane. The whole

2:29:52

thing seems insane and I don't know what the

2:29:55

motivation is. I've talked to people that know a

2:29:57

lot about the construction business and they believe the

2:29:59

motivation is cheap labor. I think that's part

2:30:01

of it and that a lot of the problem

2:30:03

is in many industries

2:30:06

the lack of cheap labor and people that are willing

2:30:08

to do jobs. It's one of the things that I've

2:30:10

heard you know there's a lot of criticism about all

2:30:12

the Haitians that have moved to Springfield, Ohio. But one

2:30:14

of the positive things that I've heard from people that

2:30:17

live there is that these people are hard workers and

2:30:19

they're willing to do jobs that the other people weren't

2:30:21

willing to take on. So you

2:30:24

have pros and cons but you have this

2:30:26

incentivized effort to move people into this country

2:30:29

illegally which will undoubtedly bring in people that

2:30:31

you don't want here. Gang members, cartel members,

2:30:33

terrorists. That's real and we've documented that and

2:30:35

there's people that have been arrested that were

2:30:38

trying to come in that were terrorists and

2:30:40

there's people that have gotten through for sure.

2:30:42

I think that if I give both of

2:30:44

them the benefit of the doubt I think

2:30:47

both of them will have to act on

2:30:49

the border. I think

2:30:52

that Donald Trump has had a

2:30:54

clearer view of this issue for much longer.

2:30:56

I think that Kamala

2:30:58

has had to shift her position to make

2:31:00

herself more palatable to centrists. But

2:31:05

I do think that both of them

2:31:07

will probably have to act because I

2:31:09

don't think what's happening today is sustainable.

2:31:12

I don't think it is either but the

2:31:14

fear and Elon's talked about this the real

2:31:16

fear is that they're bringing these people in

2:31:18

to give them a clear path to citizenship

2:31:21

which will allow them to vote and then

2:31:23

you've essentially bought their vote. So

2:31:25

if the Democrats bring them in and

2:31:27

incentivize them to become Democrats and vote

2:31:29

and give them money which

2:31:32

they clearly are doing. They're giving them EBT

2:31:34

cards and they're giving them housing and they're

2:31:36

giving them things that they're not giving to

2:31:38

veterans and poor people in this country. That

2:31:40

seems to be an incentive to

2:31:42

get these people to want to be here and

2:31:44

also to appreciate the people that gave them that

2:31:46

opportunity. So essentially, in

2:31:48

swing states, and Ohio is

2:31:50

one of them,

2:31:53

if you can get a lot of people in

2:31:55

there and you've given them a better life because

2:31:57

of your policies, those people, if you give them

2:31:59

the opportunity to vote, especially if they're like

2:32:02

limited, low information

2:32:04

voters, they're going to vote for the

2:32:06

party that got them to America. I

2:32:10

mean, I don't

2:32:13

know whether it's a conspiracy

2:32:16

per se, but I do

2:32:18

agree with the outcome. I

2:32:22

remember very vividly, my

2:32:28

parents took out the whole

2:32:30

family, three of us, myself and my two sisters,

2:32:33

to Niagara Falls, and then we crossed the border

2:32:35

to Buffalo, and we

2:32:37

applied for refugee status in America as well.

2:32:42

We didn't get it, we were rejected. And

2:32:45

when we went back, we

2:32:47

got a tribunal hearing in

2:32:50

Ottawa, where I grew up. And

2:32:53

I remember that it was in front of this magistrate judge,

2:32:55

so the person comes in with the robes and the hair

2:32:57

and everything, and you sit there and they

2:32:59

hear- They have the wigs up there? All of it, yeah. The

2:33:04

wigs. And then

2:33:06

they sit there and they hear your

2:33:09

case out, and my father

2:33:11

had to defend our whole family. Our

2:33:16

life was like, here's what we did. And

2:33:18

I remember just crying from the

2:33:20

minute it started. That's all I

2:33:22

did the whole time. It's

2:33:24

seared in my mind, because your

2:33:26

life is right there. It's like a crucible

2:33:29

moment for your whole family. If they're like,

2:33:31

I don't buy it, off you go. We

2:33:34

go back and I don't know what would have happened. Fortunately,

2:33:38

obviously, it worked out. And

2:33:42

then you go through the process. I became a Canadian citizen.

2:33:44

Then I moved to the United States to get on a

2:33:46

visa. Then I become an American citizen. I

2:33:48

have an enormous loyalty

2:33:51

to this country. And

2:33:53

so when I think about Americans

2:33:56

not getting what they deserve before

2:33:58

other folks, it really

2:34:00

does touch me in a place ... I get

2:34:02

very agitated about that idea. It's

2:34:04

not that those folks shouldn't be taken care of

2:34:07

in some way, shape, or form, because I was

2:34:09

one of those people that

2:34:11

needed a little bit of a safety net. We

2:34:13

needed welfare. We needed the places to go

2:34:15

and to get the free clothes and all

2:34:17

that stuff. But

2:34:22

you have to sort of take care of all

2:34:25

of the people that are putting in the

2:34:27

effort and the time to

2:34:29

be here and follow the rules and stand

2:34:31

in line. When I came to the United

2:34:33

States, man, I came

2:34:35

on a TN visa. Every

2:34:38

year you had to get it renewed. You

2:34:40

had to show up. If the person that was looking

2:34:43

at you said, J'mon, you're gone,

2:34:45

Joe. Then

2:34:47

I had to transfer to an H-1B visa. My

2:34:50

company had to show that there wasn't

2:34:52

an American that could do this

2:34:54

job. Then

2:34:57

we were able to show that. I've

2:35:00

lived this experience of an immigrant

2:35:02

following the rules and

2:35:05

just methodically and patiently waiting and hoping, and

2:35:07

the anxiety that comes with that, because

2:35:11

it comes with tremendous anxiety. If you ask

2:35:13

people that were on H-1Bs in America, there

2:35:15

was a website. I don't even know if it exists

2:35:18

anymore, but we would check what

2:35:20

... because when you apply for a

2:35:23

green card, you get an

2:35:25

application date. Man,

2:35:27

I would sweat that website every other

2:35:29

week. Hey, did they update that? It

2:35:32

would be like four years in

2:35:34

the past and I'm like, I'm never going to get my green

2:35:36

card. My visa's going to expire. I'm going to have to move

2:35:38

back to Canada. But

2:35:40

I still play by the rules. I

2:35:44

just think it's important to recognize that there are a lot

2:35:46

of folks that play by the rules that are immigrants to

2:35:48

this country. There are a lot of people that were born

2:35:50

here that have been playing by the rules. I

2:35:53

think we owe it to them to do the right thing

2:35:55

for them as well. Then

2:35:58

try to do the right thing for someone else, the folks that are

2:36:00

coming across the border because they probably are,

2:36:03

some of them legitimately are escaping

2:36:05

some really bad stuff. Quite a lot

2:36:07

of them. Quite a lot of them. I'm sure most

2:36:09

of those people are people that just want a better opportunity. And

2:36:12

that's a great thing. And that's a great thing. But

2:36:14

you have to take care of all the people

2:36:16

here, especially the veterans, and especially

2:36:19

these people that have been struggling in these inner

2:36:21

cities that have dealt with

2:36:23

the redlining and all the Jim

2:36:25

Crow laws that have set them back

2:36:27

for decades and decades. And it's never

2:36:29

been corrected. There's never been any effort

2:36:31

to take these places that have been

2:36:34

economically fucked since the beginning

2:36:36

of the 20th century and

2:36:38

correct it. And instead,

2:36:40

you're dumping all this money into people that

2:36:42

have illegally come here. That

2:36:44

to me is where it starts looking

2:36:46

like a conspiracy. I think that as

2:36:49

long as people can explain what they're

2:36:51

doing for these other folks that you

2:36:53

just mentioned, I think

2:36:55

for a lot of people, for 50% of the population that leans

2:36:59

red on this topic,

2:37:01

you could at least explain to them. The

2:37:04

problem is that there is no explanation. There is a

2:37:06

$150,000 home credit that Gavin Newsom

2:37:08

was about to give. I

2:37:13

think he vetoed it. I could be wrong. He

2:37:15

did. It was wildly unpopular. But that bill somehow

2:37:18

gets to his desk. And is

2:37:22

there a bill that says we

2:37:24

should have better food for the food deserts? Did

2:37:26

that bill get passed? So

2:37:29

there's clearly a way for state

2:37:32

legislatures to do what's right on behalf of the

2:37:34

folks in their state. So

2:37:37

if we just had a little bit more balance, and

2:37:40

then if we were able to shine a light on those

2:37:42

things, a lot

2:37:45

of the people that live here that contribute would

2:37:48

feel better about

2:37:50

where things were going and wouldn't feel

2:37:52

like the whole thing is just rigged.

2:37:54

Right. That's one of the things

2:37:56

that people are so excited about with this Trump union

2:37:58

with Tulsi Gabbard and Robert F. Kennedy, is

2:38:01

that you're having these movements

2:38:04

that seem to be almost

2:38:07

impossible to achieve outside of

2:38:10

an outsider, like the Make America

2:38:12

Healthy Again concept. Like

2:38:14

what are you talking about? You're going

2:38:16

to go up against these companies that

2:38:18

have been donating to these political parties

2:38:20

forever and have allowed them to have

2:38:22

these regulations that are allowing them to

2:38:24

have these dyes in food that are illegal

2:38:26

in our neighboring Canada? Like what?

2:38:30

No one's done that before, right? That's very

2:38:32

exciting. Okay, so- But again,

2:38:34

messenger, message. Messenger, message,

2:38:36

just take a step back though,

2:38:40

and if you were just the

2:38:42

average Joe citizen, I

2:38:45

think an important thing to just notice is

2:38:49

why are all these totally

2:38:52

different people acting

2:38:55

as a team? I

2:38:58

just think it's an interesting thought question for ...

2:39:01

I don't have an answer, and I'm not

2:39:03

going to plant an answer, but

2:39:05

just ask yourself, like why are all of these

2:39:07

people cooperating? And

2:39:09

I think the

2:39:12

2024 election is

2:39:15

a lot about the traditional

2:39:18

approach to governance and

2:39:22

a very radical reimagining of

2:39:24

government. And I

2:39:26

think that's what effectively will get decided.

2:39:28

The traditional approach says we're going to

2:39:31

create robust policies, we're going to work

2:39:33

sort of top down this

2:39:35

muscular foreign policy, muscular domestic policy, the

2:39:38

government's going to play a large part

2:39:40

of the economy, and

2:39:42

we're going to try to right some wrongs. The

2:39:46

radical reimagining says we're going to

2:39:48

go back to a more founding

2:39:52

notion of this country. We're

2:39:56

going to have a very light governmental

2:39:58

infrastructure. are

2:42:00

nothing in comparison to the media's depictions of

2:42:02

him. Everybody's

2:42:04

got flaws. His, I

2:42:06

think, are... his

2:42:08

exist and are well described, but

2:42:11

I do think that they... I think

2:42:15

there's like a couple of good examples. You

2:42:17

know, one example that bothered

2:42:19

me was

2:42:21

the Charlottesville press conference.

2:42:26

When I first heard the media depiction of

2:42:28

it, I was really upset

2:42:30

because of what I thought he said. It

2:42:32

turned out he didn't say it. Exactly. In fact,

2:42:34

not only did he not say it, he

2:42:37

said the exact opposite. And then

2:42:39

I was really frustrated and a little bit

2:42:41

angry, because I thought: he

2:42:44

was never lying to me, the

2:42:46

filter was lying to me. And

2:42:49

I'm not paying for

2:42:51

those people to lie to me. I'm paying for them

2:42:53

to actually give me

2:42:55

the transcript of it so that

2:42:58

I can decide for myself. I think that's part

2:43:00

of a responsibility of being a cogent adult. And

2:43:02

the only repercussions of them lying is a lack

2:43:04

of trust that people have for them now. And

2:43:08

so then they make their own bed, you know, they

2:43:10

dig their own grave a little bit because it's the...

2:43:12

I think the trust in the mainstream media is the

2:43:14

lowest it's ever been. I think way

2:43:16

more people trust you, you know,

2:43:19

way more people trust us to tell

2:43:21

a version of what we think is happening because

2:43:25

you're not gonna lie and

2:43:28

you're like interested in just showing the clips

2:43:30

and then just debating. What

2:43:32

did he mean? What did he say? Why did he say this? Why

2:43:34

did he say that? By

2:43:36

the way, the same goes for Kamala because now, you

2:43:38

know, the domestic political machinery

2:43:41

is going to try to characterize

2:43:43

her as well. Cherry pick

2:43:45

comments she says. So my point is, I think

2:43:47

we have to suspend this. But it's not

2:43:49

balanced. Like particularly look at a

2:43:52

debate where they fact check

2:43:54

Trump multiple times, but they didn't

2:43:56

fact check her. I read

2:43:58

this.
