Mauritius Compliance - Satoshi Unmasked, End of .io, Megalopolis

Released Thursday, 10th October 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

It's time for Twig this week in Google.

0:02

Paris Martineau's here. Yay! She's back. Jeff Jarvis

0:04

is here also. We'll talk about two

0:06

big court decisions going against Google, one for

0:09

the App Store, and well, one

0:11

it's the DOJ saying, I

0:13

think we're gonna break Google up. We'll

0:15

see what happens with that. We'll also

0:17

talk about the states, more

0:19

than a dozen of them, suing TikTok.

0:22

And is Peter

0:25

Todd really Satoshi Nakamoto? I

0:28

have some thoughts, all

0:30

that coming up next

0:32

on Twig. Podcasts you love

0:35

from people you trust. This

0:38

is Twig. This

0:44

is Twig this week in Google episode 789

0:46

recorded Wednesday, October 9th, 2024.

0:52

Mauritius compliance. It's

0:55

time for Twig this week in Google, a show

0:57

where we talk about the latest news from

1:00

the Google verse, which covers pretty much everything

1:03

out there in the

1:05

internet cloud. Paris Martineau's

1:07

back, hallelujah. Yay! From

1:09

theinformation.com. She writes for

1:11

the weekend section talking

1:13

about apparently children's

1:16

flag football. So that's good. That'll be

1:18

exciting. You know,

1:20

only the important topics. She's

1:22

covering issues. Online child safety.

1:24

Yeah, issues with the youths and

1:26

online and all that. That's a good subject,

1:29

isn't it? Boy, these days. That it is.

1:32

Great. We missed you. Your friend Ed is

1:34

a character and a half. I

1:36

love that I've introduced Ed into the

1:38

twit universe. Oh my God. He

1:40

just can come on through

1:42

Twitter. I feel like, yeah,

1:45

he tweeted his way into my heart and

1:47

yours. Yeah. Your

1:49

nihilistic heart. That's Jeff

1:51

Jarvis, professor emeritus of journalism

1:54

at the City University of New York.

1:56

Well, we should say at the Craig

1:58

Newmark Graduate School. Have a

2:00

fantastic New York. Have a

2:03

fantastic New York. At the City University of New

2:05

York, emeritus. And Jeff has now created a kiosk

2:07

in his office. Well, the

2:10

web we weave is officially out. Get

2:13

your copy now. Buy the whole set.

2:16

He did a TikTok where he opened

2:18

the boxes. It's hysterical. Is that your

2:20

first TikTok or is that... No, I

2:22

did it before. It was

2:24

cute. It was really cute. Do a lot of dances on there.

2:27

So I have all the books. I do.

2:30

But the latest is this, the web we weave. Soon

2:33

to be a New York Times bestseller. Why

2:35

we must reclaim the internet from moguls and

2:37

misanthropes and moral panic. I got

2:39

my first review and it's a pan. Moral panic you say? Who

2:42

panned you? Oh, some old

2:45

biddy. Some old

2:47

biddy. Bad feeling about this. Hey,

2:50

I didn't realize we still had that. I thought we

2:52

left it at the studio. We apparently have 20 of them. I

2:55

think we realized. Oh yeah, you didn't know that. We

2:57

have versions. We have a bunch of

2:59

versions of that. It's just, I don't have a hot

3:01

key for it. So it's hard to hit on

3:03

time. You got to bring it up. Yeah.

3:07

All I have is John, John, Selena

3:09

going, Hey, hey, that's

3:11

the only thing I brought. Anyway,

3:14

well, you don't deserve bad reviews. Although

3:17

I would imagine the people who don't like it are

3:19

the people who are lobbying

3:21

for the shutdown of social

3:23

media. That's what it is. Yes.

3:25

Yes. Yes. This week, the what is it? 13

3:29

state attorneys general decided that they're

3:32

going to go after social media and shut

3:34

her down. Well, TikTok before

3:36

and TikTok's on the way out anyway. So

3:38

you're wasting your breath. They said

3:40

it's like it's like cigarettes. It's like

3:43

nicotine and cigarettes. This social media. It's

3:46

not causing cancer. You nitwits. No,

3:49

you're nitwits. Moral

3:53

entrepreneurs. Yeah, it was 13 states

3:56

and the District of Columbia sued

3:58

TikTok on Tuesday, arguing

4:00

that the company deliberately designed

4:02

the app to be addictive to children

4:05

and that it has misrepresented

4:08

the effectiveness of its content moderation

4:10

efforts to consumers. And cases like

4:12

this. This is kind

4:14

of advancing a novel legal

4:16

strategy. It's been finding purchase

4:18

in courts around the U.S.

4:21

lately, which is they're trying

4:23

to basically sidestep section

4:25

230 protections by using

4:27

the principles of product liability

4:29

to get these companies on

4:32

negligent design or essentially

4:35

just knowing that

4:37

something in their product could harm

4:40

consumers in some way and

4:42

continue to do it anyway. Section 230

4:44

protects them against being

4:46

sued for either moderating or not moderating

4:49

the content. They can't be held liable

4:51

for something other people post or

4:53

that they take down because it's bad. But so they

4:55

can't go through that. So

4:57

specifically what these suits are doing is

5:00

they're targeting the TikTok algorithm in different

5:02

parts of the TikTok platform, saying that,

5:04

you know, the way that age gating

5:07

was implemented, the process by which they

5:09

determined like, are you under 13 or

5:11

not, that it was designed

5:14

in a defective way that

5:17

could have harmful impacts or that, you

5:19

know, the algorithm could have harmful

5:21

impacts on mental health of young users. And

5:23

they knew that and continued to make those

5:26

choices anyway. Let me channel Ed

5:29

Zitron from last week's show. Social

5:32

media is killing children.

5:34

The algorithm is like

5:36

nicotine. That's

5:38

not a very good Ed Zitron, but you get

5:41

the idea that people very, very much feel that.

5:43

And it feels, and by the

5:45

way, that's what you're arguing against very much in

5:47

the web we weave, Jeff. Yeah, but trying to

5:49

actually find the Supreme Court up

5:51

until now has again and again thrown down

5:53

this argument. I can't find it in my

5:56

own book. It should have an index.

6:00

thrown down the idea that it does have an index.

6:02

I know it does, but I can't. Uh,

6:06

it's thrown down the idea that, um, censoring

6:12

for children is okay. Cause it's for

6:14

children. Right. It's always fundamentally insulting

6:16

for children. Interestingly this week, it just,

6:19

if I may do it, just a

6:21

little tiny detour, we'll come back

6:23

to it. No, no. Detour away, because we're,

6:25

this is going to be a fast moving

6:27

fast paced episode with many, many stories. Oh,

6:29

okay. Unlike last week when we did two.

6:32

Yes. Um, so, sorry, I'll get

6:34

rid of this. I got to laugh at your

6:36

kiosk. That is the funniest thing ever. By the

6:38

way, I hope that

6:40

Glenn Fleishman never sees that you're

6:43

using his books

6:45

to prop up The Web We Weave. That's

6:49

how you keep it. That's how you keep it

6:51

stable. That's the, uh,

6:53

type book, Shift Happens, that Glenn did. Notice

6:55

that it's right here because I'm using it for

6:57

research for the day. Okay. So

6:59

that's not really a, uh, a

7:01

show kiosk. That's my desk. I

7:04

love stacks. Book

7:07

cart that we have more books over

7:09

here. I am working. That's

7:12

a proper desk. It is. That

7:14

is, that is really impressive. Camera

7:17

is never going to be back in the right spot.

7:20

Fine. It's not. Wait

7:26

a minute. Just in sympathy. I'm

7:28

just going to lower my, uh, my shot.

7:31

Let's all sit a little lower. Now

7:35

he's high. Make up your mind. Mr. Jarvis. So,

7:44

uh, last week, I think

7:46

we, I've fallen and I can't

7:49

get up. It's really

7:51

hard to adjust from here. Oh

7:53

God. How do I do it? Okay.

7:59

The show's fallen apart already in the

8:01

first seven minutes. That's impressive. Where

8:04

was that when I was so rudely digressive?

8:06

We were talking about the fact that there

8:08

are bad people out there who think that

8:10

social media is addictive. And you talk about

8:12

that in the web we weave. And there

8:14

are constitutional issues. So, uh, one of the,

8:17

one of the AI bills that Gavin Newsom,

8:19

the governor of California signed in the

8:21

last month or so was a last

8:23

week enjoined, uh, because,

8:25

um, it was

8:27

against deep fakes being promoted in social

8:30

media. And a guy did a

8:32

deep fake of Kamala Harris doing something. And he

8:34

sued because he said it was a violation of

8:36

his first amendment rights to take it down. And

8:38

the court agreed with him so far. So

8:40

the first amendment comes, it's more than Section

8:43

230. The first amendment comes into play again and

8:45

again and again. Well,

8:47

I'll play devil's advocate here for

8:49

a minute. Please talk to a

8:51

lot. I know I can't do a word

8:53

of sex and otherwise I would, but for

8:56

instance, let's take the SAFE for Kids Act —

8:58

SAFE stands for something, I'm not remembering — which

9:00

is something that passed in New York recently.

9:02

One of the provisions in that is that

9:04

accounts for children should by default

9:08

have, uh, an

9:10

algorithmic recommendation engine turned off

9:12

and should instead display content

9:14

from people like on Instagram,

9:17

the kid follows chronologically. That's

9:19

ostensibly because they think, Oh, a

9:22

problem a lot of kids have is,

9:24

you know, regulating their time. One of

9:26

the things that makes it worse is

9:29

if you have a recommendation algorithm that

9:31

keeps serving them really interesting content, I

9:33

don't think that that's that bad. That's

9:36

just saying social media

9:38

is too much fun. It's too

9:40

interesting. So please make it boring.

9:43

And then we don't have to worry about it being addictive. The

9:46

algorithm often, I mean, we set up with

9:48

like alcohol and cigarette. I set up my

9:51

Facebook because somebody told me, Oh, you know,

9:53

you can't, cause I kept saying, Oh, I

9:55

want the, I want just the friend feed,

9:58

the chronological feed. I set it up. It was

10:00

the most horribly boring

10:04

list of stuff. One guy posted half of

10:06

it. It's not good. So what's

10:09

wrong with a company saying, look, we're going to, we

10:11

want to give people what they want to see. It's

10:14

not for everybody. It's just for, I'm

10:16

forgetting the exact age range, but let's

10:18

say like under 16 year

10:20

olds or under 13 year olds, if they're

10:22

using a product, I don't think

10:25

that that's that bad of a policy

10:27

to say by default, you have one

10:29

feed. If you have parental permission, you

10:31

can change to the normal one. That's

10:33

fine. But on very specific kid accounts

10:35

to have by default kind of increased

10:37

operation standards, I don't think there's

10:40

anything wrong with that. Children's television, Teletubbies is

10:42

far too entertaining. We need to

10:44

have it be droning

10:46

teachers telling them about

10:48

stuff they need to

10:50

know. I mean,

10:52

I don't mind them banning sugar

10:55

cereal ads in children's TV programming and it

10:57

would be okay to say no advertising. The

10:59

algorithm is not bad. It is a choice

11:01

of ranking and ranking is made for many

11:04

reasons. And I quote from a book just

11:06

out, um, the governor of

11:08

New York, Kathy Hochul, uh, who

11:10

not only killed congestion pricing — former governor — she's

11:12

still the governor. Oh, I was, I was

11:14

wishful thinking. Oh yeah. I'm thinking of Eric

11:16

Adams. Oh no, he's still maybe, he's still

11:18

here. He's former soon though. Come

11:20

on. I guess who may become

11:22

our next mayor, but anyway, she

11:25

said, do you understand how an algorithm

11:27

works? It follows you. It

11:29

preys on you. No, it says classic.

11:31

It doesn't give a crap. Yeah. No.

11:35

And I know you're doing a little bit

11:37

of devil's advocate, but that's my job. That's

11:39

Leo's job. But, uh, no, but, uh, and

11:41

I, and I have said that myself that

11:43

the algorithm is the problem you've, you've been

11:45

blasted in the past, but you're coming to

11:47

learn. Well, I just realized that

11:49

with that, all an algorithm is, is trying

11:51

to make the content more interesting. Well, that's

11:54

a bit of an extrapolation. Really what it's

11:56

doing is optimizing for the maximized

11:58

engagement. It's optimizing for how long will

12:01

it keep this user on this website.

12:03

And I do think that there's some

12:05

argument that engagement is a proxy

12:07

for satisfaction. Isn't there? Yeah. But

12:10

when you're talking about like a,

12:13

you know, minor child, I think that

12:15

there could be something to maybe having

12:17

a different type of account for

12:19

someone like under 16 or under 13

12:22

years old that perhaps doesn't

12:24

optimize for that. Okay. I won't disagree

12:26

with having a different account for a

12:28

young person, but, but the algorithm is

12:31

irrelevant to that. I

12:33

think you got taken in by a cupcake, Ms.

12:35

Martineau. No. What?

12:38

Right. I must have missed that discussion before the show.

12:41

Um, and,

12:43

uh, uh, so, so then, then what are the

12:45

criteria? What is it that you want to do

12:47

with the account? Okay. But the algorithm

12:50

is just ranking, making ranking decisions. That's all

12:52

it's doing is say, and, and, and as

12:54

Leo said, it's often getting rid of the

12:57

bad stuff, the boring stuff, the

12:59

competitive stuff. It depends on

13:01

how it's written. Algorithms — it's like saying, we

13:03

should make candy taste bad if children

13:05

are going to eat it. Well, you

13:07

were just talking about how you'd be

13:09

in support of no like high sugar

13:11

ads being shown to kids. I feel

13:13

like it's kind of a similar vein

13:15

of regulation. I mean, let's take, for

13:17

example, this New York law I just

13:19

mentioned, I believe the provisions

13:21

in it are, you know, some

13:23

level of like minor kid accounts should

13:26

by default have chronological feeds instead of

13:28

having Algorithmic as the default. You can

13:30

switch it if you want. And then

13:32

the other one is, uh, notifications

13:35

for, I believe like under 13

13:37

year olds should be by default

13:39

muted from 10 PM to

13:41

six AM. Yeah. I don't have a problem

13:43

with that. Those are those, I don't

13:45

think that's that big of a deal.
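(For concreteness, here is a rough sketch of the two defaults just described — chronological-by-default feeds and overnight notification muting for minor accounts. The age threshold, field names, and quiet-hours window are hypothetical illustrations, not the text of the New York law or any platform's actual implementation.)

```python
# Hypothetical sketch of the defaults discussed above: minors get a chronological
# feed unless a parent opts them back into ranked feeds, and their notifications
# are muted overnight. All names and thresholds are illustrative assumptions.
from dataclasses import dataclass
from datetime import time

MINOR_AGE = 16                                     # assumed cutoff for "minor" accounts
QUIET_START, QUIET_END = time(22, 0), time(6, 0)   # 10 PM to 6 AM window

@dataclass
class Account:
    age: int
    parental_override: bool = False  # parent explicitly re-enabled ranking

def feed_mode(account: Account) -> str:
    """Chronological by default for minors; algorithmically ranked otherwise."""
    if account.age < MINOR_AGE and not account.parental_override:
        return "chronological"
    return "ranked"

def notifications_muted(account: Account, now: time) -> bool:
    """Mute overnight push notifications by default for minor accounts."""
    overnight = now >= QUIET_START or now < QUIET_END  # window wraps past midnight
    return account.age < MINOR_AGE and overnight

if __name__ == "__main__":
    print(feed_mode(Account(age=14)))                          # chronological
    print(feed_mode(Account(age=14, parental_override=True)))  # ranked
    print(notifications_muted(Account(age=12), time(23, 30)))  # True
```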

13:47

Shouldn't that be the parent's job? Why

13:49

does the government have to decide when

13:51

your kids are getting notifications? I mean,

13:53

why does the government decide anything? Why

13:56

does the government regulate the airline industry?

13:58

Why does the government decide whether we

14:00

can advertise stuff to kids like,

14:02

I don't know, the government

14:04

does a lot of stuff. I don't know if, yeah, I'm gonna,

14:06

I don't know if I can codify it, but

14:08

there is definitely stuff that it's an appropriate

14:11

thing for the government to do and stuff that's appropriate thing

14:13

for the parent to do. I guess, Something

14:16

that I recently heard. What the government could tell companies is

14:18

you have to give better parental

14:20

controls to the parents. Like, there

14:22

should be a switch that says no notifications

14:24

at night and the parents should be made

14:26

aware of that. But the presumption

14:28

really is, oh, parents aren't gonna care. They're

14:31

not gonna do anything. So we have to,

14:33

and it bothers me a little bit. And

14:35

it's a red herring. Show me the research

14:37

that says that the algorithm does

14:40

this to children. That's what I couldn't

14:42

find. And the, you know,

14:44

it's, it's the algorithm has now been demonized,

14:47

which is like demonizing math. I

14:50

mean, I agree. I think like even that Hochul

14:52

quote you just did, it's ridiculous the way

14:55

that something as simple as an algorithm has been

14:57

demonized. But I do think that there is something

14:59

like to your question of like, oh, it should

15:01

be the parents responsibility. I agree. I was talking

15:04

to kind of a big

15:06

supporter of this recently, of these

15:08

sort of regulations recently. And

15:10

I posed that question to her asking like, well,

15:12

shouldn't all of this be parental responsibility? And

15:14

she did bring up an interesting point I hadn't

15:16

thought about, which is like, she

15:19

will one, she gave an example of,

15:21

she had three, has three kids. At

15:23

one point she downloaded

15:26

one of those, software services. I think bark is

15:28

the name of one of them, where essentially it's

15:30

like spyware for your phone, where it sits in

15:32

the back and monitors every social media every texting

15:34

thing, anything you have on there, and you can

15:36

set certain triggers and it'll alert you for all

15:38

of it. And she did like whatever the basic

15:40

bare minimum was. And she was like, I was

15:42

receiving 500 to a thousand alerts

15:47

a day. She was like, it

15:49

was incredible. She's like, I'm really privileged. I

15:51

was able to deal with that. But most

15:53

people who do not, who, you know, maybe

15:55

work two jobs probably can't. And right now

15:57

these social media companies don't seem to have

15:59

parental settings in place to make

16:01

it easy for parents. Facebook only recently, or

16:03

Meta only recently came out, this was Instagram.

16:07

The other thing I think is, I

16:10

agree, the other thing I think is notable that

16:13

she brought up that I hadn't thought about is

16:15

she's like, for a lot of parents she knows,

16:17

they don't allow their kids to have any access

16:19

to personal devices like phones or computers, but that

16:22

doesn't work because nowadays if you have a kid

16:25

in elementary school, they're gonna be sent home

16:27

with an iPad, school issued, or a computer

16:29

starting in middle school. A Chromebook, a Chromebook.

16:31

I mean, and you might say, and I

16:33

mean, I think the

16:36

gasp is somewhat warranted, but also it's

16:38

incredibly difficult to then police your child's

16:40

social media use if they have a

16:42

computer 24 seven. If

16:45

the kids are being given computers

16:47

from the schools, it's incumbent on the

16:49

schools to do the right thing and

16:51

make sure those are not introducing stuff

16:53

into the house. I

16:55

think whoever provides the object should be responsible

16:57

for making sure the object does no harm.

17:00

What about this example? Just

17:03

explain to me how this is different

17:05

from, let's take the sugar out of

17:07

candy if it's sold to people under 13, because

17:10

it was clearly bad for them. It causes tooth

17:12

decay. We're learning

17:15

more and more, the sugar is actually a poison.

17:18

We should take sugar out of candy,

17:21

but only for people who are underage.

17:24

I don't disagree that that could be an

17:26

argument. I don't think

17:28

we should for any, I don't have any strong

17:30

personal beliefs on either of these subjects. I think

17:33

the difference is that it has

17:35

become, the social media issue has become

17:37

such a big issue for parents in

17:39

recent years that they're mobilizing on it.

17:41

Meanwhile, sugar isn't something that's incited

17:43

like mobilization. And more and more

17:45

viewers are exploiting this and making

17:47

more fear and not using research.

17:50

The problem, and so much of this, for a lot

17:53

of kids who feel

17:55

very lonely and very depressed, there's a lot

17:57

of research, which I actually write about in

17:59

the book, that says it

18:01

makes them feel better. They know

18:03

that they're not alone in the world. It's a

18:05

way that they have friends. Ed last week was

18:07

talking about how he, it was very sad really,

18:09

about how he had no friends, none at all.

18:13

And the internet, time and time again, is a

18:15

place where kids can see that they're not alone,

18:18

that they find other people who sympathize with

18:20

them. That is an important human

18:22

connection that's being cut off because

18:24

of a fear of a screen or an algorithm.

18:28

It's pure moral panic. There's

18:30

also the argument that. Is

18:33

this gonna be a different one? Houston, we have a problem.

18:36

What are we doing? It's

18:39

also the case that focusing on

18:41

this distracts from real

18:44

solutions like make

18:46

sure you fund mental

18:48

health experts in every school, that you

18:51

fund mental health centers, that you

18:53

pay attention to what's going on for kids. There's

18:55

a lot of things we could do that

18:58

we don't do. It's very easy to blame. Just say,

19:00

well, it's big tech, we can fix it. Let's

19:02

just ban them and then we'll be

19:04

all fine. And I think that that's

19:07

really a shortcut that unfortunately leaves behind

19:09

real solutions. So there's also that.

19:13

Anyway, it's a good conversation. Yes.

19:17

I guess we'll let the courts decide

19:19

because the courts are so smart that

19:22

they should be able to figure this out. Obviously they're so

19:24

smart. And I think it's a

19:26

good conversation to have because I

19:29

fall on both sides of the spectrum depending on what

19:32

the day is. I think I can see

19:34

benefits in both sides. And I think it's gonna be kind

19:36

of interesting to see how this shakes out. I

19:38

was a laissez-faire parent. Even

19:41

when my kids were, I mean now they're 30 and 32. So

19:44

they're older than you, Paris. So they obviously

19:46

didn't have access to the same social media

19:49

you do or kids today have. But there

19:52

was even an end. Should they

19:54

play video games? Should you limit that to too much?

19:56

Should you limit screen time? Et

19:59

cetera, et cetera. I was laissez-faire. I figured, well,

20:02

we'll let them figure it out. And I, we

20:04

would have discussions like if you see something online

20:07

that upsets you, please

20:09

come to us and talk about it. We're never going to yell

20:11

at you for seeing it. We want to talk

20:14

about it and that kind of thing. They

20:16

survived. Um, so

20:19

my, my inclination is laissez-faire, but I'll

20:21

tell you the other thing is I

20:23

think parents should have whatever tools they

20:25

need to keep their kids safe. It's

20:27

up to the parent. And that's why

20:29

I'm not crazy about government getting involved.

20:31

I think government could mandate the tools.

20:33

That's fine. Yeah. I mean, I

20:35

do think that there is something to the argument though, that,

20:37

I mean,

20:40

let's, for, let's take, for example, these new

20:42

teen accounts that, uh, Meta just rolled out for

20:45

Instagram that happened a week or two ago,

20:47

but they, I don't know whether that's a

20:49

good solution, but they had features in there

20:51

that I, when I saw it was like,

20:54

what, you didn't have

20:56

this beforehand to where, you know,

20:58

if you're a child who's told

21:00

meta, I'm 13. Now

21:03

your account by default doesn't let any

21:06

stranger that wants to talk, like connect with

21:08

you and message you, you know, I

21:11

think, and now there are stronger supports

21:13

so if a parent wants to have parental

21:15

controls on that kid's account, they can.

21:18

And I think those are both

21:20

good things and they only have

21:22

come because meta has been bullied

21:24

for years, like months and months

21:26

and years by angry

21:28

parents and threatened with serious

21:31

legislation and regulation by state

21:33

and federal lawmakers. Yeah. Remember

21:36

when the Senate committee made

21:38

Mark Zuckerberg stand up and

21:40

apologize to the parents in

21:42

the, in the gallery? And

21:44

now he says he's done apologizing. That

21:47

was, that also probably backfires because rather

21:49

than, than, than having a cooperative discussion

21:51

about this, it became so combative. He

21:54

just says Oscar. Yeah.

21:56

Maybe that's really what we need is this sense that

21:58

we are all coming from a point of goodwill

22:01

and that we're trying to find a solution that's best

22:03

for our kids. I

22:06

agree, but I would also say, I mean, something,

22:10

something that struck me when I was reading the,

22:12

it was a couple weeks ago,

22:14

the Department of Justice sued TikTok

22:16

for various violations of

22:19

COPPA, the Children's Online Privacy

22:21

Protection Act, and a different agreement.

22:24

And it detailed in, it went

22:26

into great detail in the complaint

22:28

of the lawsuit, all the various

22:31

ways in which TikTok had been

22:33

deficient in stopping children

22:35

under the age of 13 from making

22:38

accounts as adults. And

22:40

one of the things that they mentioned was

22:42

that TikTok has, I guess, some

22:44

form on their website where if you're a

22:46

parent and you're like, my eight-year-old

22:49

has a TikTok account and I want

22:51

it to be deleted. TikTok's

22:54

system for responding to those requests

22:56

was so deficient that the vast, vast,

22:58

vast majority of requests from parents went

23:01

nowhere. Even if parents did everything correctly,

23:03

filled out all the forms, TikTok would

23:05

ignore it and in some ways built

23:08

a system specifically to keep those accounts

23:10

around and active for the kids when

23:12

parents wanted to get them taken down. That should

23:15

definitely be fixed. Although all that's

23:17

doing means is that TikTok is operating at

23:19

the same level as our government does. I

23:21

mean, yeah. So I think that that's probably

23:23

why things haven't gotten deleted. I have to

23:25

say, it

23:27

hurts me a little bit to think of

23:29

banning TikTok because of my son who,

23:32

so one

23:35

of the things we've been talking about, Benito said

23:37

this a couple of weeks ago and it really

23:39

sunk in with me. If you're going to start

23:41

a new project, a podcast or YouTube video or

23:43

whatever, you need to put it somewhere where there's

23:45

an algorithm, where there's a discovery engine because

23:49

discovery is impossible in a world

23:51

where there are millions of creators.

23:54

You need to put it somewhere where an algorithm can

23:56

promote you. My son started

23:58

making sandwiches on TikTok and

24:01

paid attention to what the algorithm promoted

24:03

and what it didn't promote and

24:06

worked hard to make sure that his videos

24:08

appealed to the algorithm, presuming

24:10

that the algorithm was doing what

24:12

it does because it appealed and

24:14

turned to people, right? It's

24:17

watching likes and so forth. And

24:19

he's been able to build a pretty darn

24:22

good career out of it. I'm holding up

24:24

his cookbook as I'm talking about that, which

24:26

is available in bookstores everywhere. I

24:28

saw that. Salt Hank:

24:31

A Five Napkin Situation,

24:33

by Henry Laporte. Buy

24:36

it at bookstores everywhere. But

24:39

that is, I think, and

24:41

a hundred percent owed to

24:43

the marriage of TikTok's algorithm

24:45

and his ability to create

24:48

stuff that fit the algorithm.

24:51

Not that he's just one of a thousand

24:53

or a hundred thousand TikTok

24:55

chefs or YouTube chefs. That's

25:00

why I'm kind of, I feel a

25:03

certain loyalty to TikTok. And if they're doing

25:05

stuff wrong, they need to work on it.

25:08

Obviously if they, you know, parents should be able to

25:10

say to TikTok, Hey, this is a kid, you

25:12

know, block his account, whatever that, that needs to

25:14

be fixed. Obviously I'm not against that. Let's

25:16

take a little break there. There's lots

25:18

of news. Let's take a little break. Because

25:21

we now know who Satoshi Nakamoto is

25:24

and it may amaze you.

25:29

I'm going to argue for this one, but

25:31

I'll tell you, we'll talk about it in

25:33

just a second. Also, if you've got a.

25:35

One of the arguments for being Satoshi is

25:37

that they share the same pizza, favorite pizza

25:39

topping. I'll just put that

25:42

out there. Be patient. We'll get to that.

25:44

Also, the .io domain is going to disappear

25:46

in all likelihood. What are you going to

25:48

do? Then

25:50

google.io all

25:52

of that and more coming up in just a little

25:54

bit on this week in Google Paris, Martino's back. We're

25:57

so glad to see her. And

25:59

by the way, coordinating the

26:02

Monstera sweater and glasses

26:08

and the green, the green lighting behind you. Very nicely

26:10

done. Very nicely done. Are

26:13

your parents in the, in the, uh, wake

26:17

of the, they are thankfully not in

26:19

the part of Florida. It's going

26:21

to be hit by the hurricane. Yeah. Our

26:23

deepest thoughts and,

26:25

uh, and, uh, good

26:28

wishes to the people in North Carolina and

26:30

Florida and the areas affected by

26:32

first Helene and now what's

26:34

his name? Milton. Milton. It

26:37

truly looks like it's going to be devastating. I mean,

26:39

the Tampa Bay area hasn't been directly

26:42

hit by a hurricane in a hundred

26:44

years. Yeah. It's been rough,

26:46

spared time and time and time again. And

26:48

it's time. It looks like

26:50

it's gone. Did you see the amazing animation

26:52

the weather channel did, uh, about

26:55

storm surges? Yeah, that was really good.

26:57

That was, that was a really good

26:59

example of, you know, normally I don't

27:02

like this kind of, um,

27:05

uh, show off show off the effects,

27:08

but let me just, I, I probably get taken down

27:10

for this. I don't care. It's so good. Uh, just

27:12

show a little bit of it. The,

27:14

uh, anchor, uh, is, I'm not, I'm

27:17

not playing the audio. Maybe that'll help is, is,

27:19

you know, standing in a street with a car

27:22

and the storm is coming and he's describing what a

27:24

storm surge would look like. And he says, well, here's

27:28

a three foot storm

27:31

surge. This is

27:33

very at six feet. This

27:35

is very effective in, in

27:37

really showing what

27:39

people who are deciding to so-called

27:41

ride it out are going

27:43

to face with a nine. That's

27:46

nine feet. Uh,

27:48

you have very little chance of survival and,

27:50

and we're talking some areas. Not just that

27:52

you're on the second floor, the house goes

27:54

down. Yeah. The whole thing's gone. And

27:56

they're even showing all the stuff that's floating in

27:58

that water. Oh, that's the other

28:01

problem, of course. So I really,

28:03

uh, I thought that was a really

28:05

good example of using CGI, uh, very

28:08

effectively for information. Good job,

28:10

Weather Channel. Please don't take us down. Uh,

28:13

all right, we'll come back with more in just a

28:15

bit. You're watching this week in Google brought to you

28:18

today by us cloud. I

28:20

had a great conversation with these guys. I had,

28:23

I had, I have to admit, I hadn't heard of them.

28:26

They are the number, but I should have, they're

28:28

the number one, number

28:30

one, Microsoft unified support replacement.

28:33

Now, a lot of people

28:36

use Microsoft, a lot of enterprises use Microsoft

28:38

for their support. It might be

28:40

coming with your license and so forth, but

28:42

it isn't really a pay as you go deal.

28:46

You buy an, uh, you know, you just kind

28:48

of buy it, whether you're going to use it

28:50

or not. And it is maybe not the most

28:52

effective for your business. That's why

28:54

so many people now turn to us cloud, the

28:57

global leader and third party Microsoft

28:59

enterprise support. They don't say

29:01

it, but I'm going to say it better support than

29:03

Microsoft for less supporting

29:06

50 of the fortune 500 switching

29:09

to us cloud can save your business 30

29:12

to 50% on a

29:14

true comparable replacement for Microsoft unified support.

29:16

But it's more than that. US

29:19

cloud supports the entire Microsoft stack 24 seven,

29:23

365. They respond faster. They

29:26

resolve tickets quicker for clients all around

29:28

the world. You're always going to talk

29:30

to real humans based in the US

29:33

and even more importantly, you're going to

29:35

talk to engineers who have the knowledge

29:37

and the skillset. They go the extra

29:39

mile to bring in the best brains

29:42

so that they're there for you when you

29:44

need help. Expert level engineers with an average

29:46

of 14.9 years in the business. And

29:51

that's for break fix or DSE, 100%

29:55

domestic teams, your data and your call and

29:57

all of your business never leaves the US.

30:00

And here's something Microsoft will not

30:02

do, financially backed SLAs on response

30:05

time. Initial

30:07

ticket response average under four

30:09

minutes. Fast help, good

30:11

help, at a cost that's lower than Microsoft.

30:13

In 2023, 94% of US Cloud's clients reported

30:16

saving one

30:19

third or more when switching from

30:22

Microsoft Unified support to US Cloud.

30:25

So you save, but it's also

30:28

better support from Fortune

30:30

500 companies and large health

30:32

systems to major financial institutions,

30:35

even federal agencies. US

30:37

Cloud ensures that vital Microsoft

30:39

systems are working for over

30:41

six million users globally every

30:44

day. I'm talking big brands,

30:47

that trust US Cloud: Caterpillar,

30:49

HP, Aflac, Dun &

30:52

Bradstreet, Under Armour, Keybank, even the

30:54

IT folks at Gartner have chosen

30:57

US Cloud for their Microsoft support

30:59

needs. It's kind of a

31:01

no brainer, cost less check,

31:04

better support check, faster

31:06

support check, financial SLAs

31:08

check. One director

31:10

of information technology says, and within an

31:12

hour, US Cloud

31:14

responded with, I wanna say four engineers.

31:16

So not only did they bring the

31:19

right guys to the call, they brought

31:21

the cavalry. I just

31:23

felt like, wow, that was amazing.

31:25

That was unlike anything I'd experienced

31:27

with Microsoft in my eight years

31:29

of being with Premier, we made

31:31

the right choice. This is the

31:33

company. And by the way, when

31:35

it comes to compliance, no one

31:37

gets it more than US Cloud,

31:39

ISO GDPR, ESG compliance, not just

31:41

regulatory requirements, but strategic imperatives for

31:43

US Cloud that drive operational efficiency,

31:45

legal compliance, risk management, and corporate

31:47

reputation. These standards foster trust and

31:49

loyalty with your customers,

31:51

with your stakeholders. They attract

31:54

investment. They ensure long-term sustainability

31:56

and success in a very

31:58

competitive global market. You

32:00

want US cloud, visit uscloud.com,

32:02

book a call today. These

32:05

guys are fantastic. I was so glad I called

32:07

them. Find out how much your team could save.

32:10

US cloud, call them

32:12

today. uscloud.com, book a

32:14

call, get faster Microsoft support

32:16

for less from the best.

32:18

US cloud, we're so glad to

32:21

welcome them to this week in Google and

32:23

to the whole Twit network. I think this is the first

32:25

time we've talked about them. Really,

32:27

really great company offering an

32:29

amazing service. All right,

32:31

back we come to Twig. And this was,

32:33

I forced myself to stay up

32:36

late last night watching

32:38

Electric Money, which is

32:40

a new HBO documentary from the same

32:42

guy who did the QAnon documentary. And

32:45

I think quite credibly unmasked Q. Did

32:48

you watch the? Yeah. Yeah.

32:52

I mean, I think he quite credibly

32:54

unmasked Q because the Q, the. The

32:57

Q kid. The important

33:00

part of unmasking Q was

33:03

going back through old 4chan

33:05

forum posts and 8chan forum posts. And

33:07

I think that he and his team

33:09

are an expert at that kind of

33:11

digital research. I'm not sure if the

33:13

same is true with Satoshi Nakamoto. So

33:16

that's the Electric Money is about, or

33:18

Money Electric, I guess is the name,

33:20

is about Bitcoin. It's,

33:24

you know, I'm watching it. Before

33:26

they even start going into, well, who

33:28

is Satoshi Nakamoto? And thinking this is

33:31

very, very positive about Bitcoin.

33:33

Doesn't really mention a lot

33:36

of the consequences of Bitcoin

33:39

instead kind of sells Bitcoin. Cullen Hoback,

33:42

the filmmaker though, I think

33:44

really intended it all along to be unmasked

33:47

the creator of Bitcoin who

33:49

has been mysterious since

33:51

2012. He's dropped off

33:53

the internet, disappeared. He

33:58

wrote the paper that

34:01

Bitcoin is based on. He created

34:04

what they call the Genesis block, the first block

34:06

in the blockchain, and is thought to

34:08

have a million, one

34:10

million, Bitcoins in his

34:13

wallet. Hang

34:15

on right there for a

34:17

second. Yeah. Like

34:22

Alexander Hamilton creates a US

34:24

Treasury and doesn't keep a

34:26

million dollars for himself for

34:29

doing so. There seems to

34:31

be, granted, whoever

34:33

this is hasn't cashed

34:35

any of it in, but

34:38

it really gives

34:41

me a bad taste to

34:43

think that someone creates a currency, but

34:46

does so to that tremendous personal

34:48

advantage. Well, yeah. Good

34:51

point. Like it might be a

34:53

pyramid scheme. Like maybe, yeah,

34:55

I mean, they don't debate that. I

34:58

mean, they don't go into it either,

35:00

but I agree with you. I mean,

35:02

there are definitely things about Bitcoin that

35:04

are kind of... I

35:06

thought when it started, I thought, oh, this is cool.

35:08

And blockchain is really cool technology. And I kind of

35:11

went along with it. And of course, the culture

35:15

of the crypto bros is enough to

35:17

turn you off on anything. But

35:19

it really is an interference with

35:23

economic systems in a way that

35:25

isn't regulated, really. And

35:27

it's disturbing. So on the one hand, why do I

35:29

care who Satoshi is? But on the other hand, I

35:31

guess maybe we should. Well,

35:34

if he has what would make him a

35:36

60 billionaire

35:38

now and potentially a trillionaire

35:40

down the road worth

35:42

of Bitcoin. Now, of course, if he were

35:44

to cash even a significant, any

35:47

significant part of that in, it would

35:49

probably undermine Bitcoin entirely. Right.

35:51

And the confidence in it, which might explain

35:53

why he hasn't. There's some people who've also

35:55

thought that he might be dead. Right.

35:58

Or maybe it's not one. person maybe it's

36:00

a couple of people. Leo I've got it.

36:02

I've got it. You've talked about your drive

36:04

with your Bitcoin on it. Yeah. And how

36:06

you can't open it up. Yeah. I think

36:08

that's just been a confession that you. There's

36:11

a million Bitcoin on

36:13

my drive. You keep doing this podcast

36:15

thing and you complain about the business

36:17

being down as a cover. It

36:20

is interesting they point out in the documentary

36:22

there is a way to burn Bitcoin publicly

36:24

in such a way that everybody would know

36:26

you no longer have access to it. And

36:29

that has not happened. The

36:32

person he actually I think and

36:34

this is maybe just me

36:36

I think he in effect said two

36:38

people are Satoshi Nakamoto

36:41

the two people that — but the person that

36:43

he... By the way, both of them

36:45

deny it, as by

36:47

the way one would if one

36:50

were Satoshi Nakamoto. Who knows what

36:52

the United States government or any other government

36:55

might do to you and kidnapping you. Yeah.

36:57

Or not to mention terrorists who would want

36:59

your Bitcoin and on and on

37:01

and on. So I don't

37:03

blame whoever it is for being

37:05

anonymous and for denying it. He

37:08

I think now you see you watched all the way to the

37:11

end because at the end of the

37:13

documentary there is this I think has become

37:15

the Cullen Hoback

37:19

trademark. He gets the

37:21

two guys together in like

37:23

a deserted factory in Slovenia.

37:30

Because Peter Todd who he thinks

37:32

is Satoshi Nakamoto is a caver.

37:34

He likes to cave. He's a Torontonian. He

37:37

was very young. If he is Satoshi Nakamoto,

37:39

he would have written Bitcoin's seminal

37:41

paper at the age of

37:43

22. The biggest piece

37:47

of evidence is very flimsy which

37:50

is that Satoshi, on the

37:52

forums before Nakamoto disappeared — Satoshi

37:55

posted on the Bitcoin forum, before

37:59

it was even... before Bitcoin was even... the

38:01

paper came out, kind of thinking

38:03

out loud. And all

38:06

of a sudden this guy, Peter Todd, who's

38:08

never been around, he's only made one other

38:10

post basically finishes

38:12

the sentence in a way that shows

38:14

a deep technical understanding of what the

38:17

issues are and

38:19

then both Satoshi

38:21

Nakamoto and Peter Todd disappear.

38:24

Cullen Hoback's theory is this:

38:27

that Peter Todd is Satoshi was

38:29

posting as Satoshi, accidentally logged back

38:31

in using another account,

38:34

posted and then went, oh shoot,

38:36

both disappeared. And then never

38:39

deleted it? Well, never

38:41

deleted it. And Todd says, well, why didn't I

38:43

delete it? And I

38:45

think you could make a credible case that deleting

38:47

it would just, nothing ever is

38:50

deleted from the internet, right? So

38:52

that deleting it would just call attention to it. I

38:54

don't know that the forum posts are that

38:59

indicative of it being

39:01

the same person, the

39:03

message, I'll read the Satoshi message here

39:05

and then I'll read the Peter Todd

39:07

reply. There would need to

39:10

be some changes in the Bitcoin

39:12

miner side also to make the possibility

39:14

to accept a double spend into the

39:16

transaction pool. But only strictly if the

39:18

inputs and outputs match and the transaction

39:20

fee is higher. And then he, you

39:22

know, has one other sentence about the

39:24

transaction fee and then Peter Todd responds,

39:26

of course, to be specific, the inputs

39:28

and outputs can't match exactly if the

39:30

second transaction has a transaction fee. I

39:33

don't think that that is some secret forbidden

39:35

knowledge. No, it's not. But a transaction fee

39:37

would make something not match. I will add

39:40

one more piece to this. What

39:42

they're talking about is

39:44

a pay for fee piece

39:47

that was not included in the original paper,

39:50

but Peter Todd later wrote and added

39:52

to the Bitcoin, the ability to pay

39:55

extra to get a faster transaction. So

39:59

Peter Todd did, in fact, implement that

40:01

piece of Bitcoin later.
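(A minimal sketch of the idea the two forum posts describe — accepting a conflicting spend of the same inputs only when it pays a higher fee. This is a toy mempool model for illustration, not Bitcoin Core's actual replace-by-fee policy, and the transaction format here is invented.)

```python
# Toy illustration of "accept a double spend into the transaction pool, but only
# if the inputs match and the transaction fee is higher." Not real Bitcoin code.

def accept(mempool, new_tx):
    """Add new_tx, replacing a conflicting transaction only on a higher fee."""
    conflicts = [tx for tx in mempool
                 if set(tx["inputs"]) & set(new_tx["inputs"])]
    if not conflicts:
        mempool.append(new_tx)        # no double spend: accept normally
        return True
    old_tx = conflicts[0]
    same_inputs = set(old_tx["inputs"]) == set(new_tx["inputs"])
    higher_fee = new_tx["fee"] > old_tx["fee"]
    # As the reply notes, the outputs can't match exactly: the extra fee has to
    # come out of what would otherwise have gone to the outputs.
    if same_inputs and higher_fee:
        mempool.remove(old_tx)
        mempool.append(new_tx)
        return True
    return False

pool = []
accept(pool, {"inputs": ["utxo1"], "outputs": {"addrA": 0.99}, "fee": 0.01})
accept(pool, {"inputs": ["utxo1"], "outputs": {"addrB": 0.98}, "fee": 0.02})
print(pool)   # only the second, higher-fee spend of utxo1 remains
```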

40:06

To me — the thing is, you're right, it's

40:09

flimsy, to say the least.

40:12

I'm watching it, though, and he's getting

40:14

real squirrely. And

40:16

I'm kind of thinking he

40:19

looks damn guilty. I

40:21

mean, he reacted exactly as you

40:23

would expect somebody who was

40:25

pretending not to be Satoshi

40:27

Nakamoto would act. Now, the other person

40:29

standing next to him was Adam Back, who

40:32

is the chief executive of a Bitcoin

40:34

development company called Blockstream. The

40:37

only other person mentioned by name in

40:39

the Bitcoin paper, the

40:41

only person who is thought to know who

40:43

Nakamoto is, in fact, he's been holding on

40:46

to an email exchange between

40:48

him and Satoshi Nakamoto for years, saying, well,

40:50

it's not my mind

40:52

to reveal. I

40:55

think personally, it's

40:57

not just Peter Todd. I think it's Peter

41:00

Todd plus Adam Back, who had written something

41:02

called Hashcash a few years earlier, which was

41:04

kind of an early Bitcoin, an early cryptocurrency.

41:07

I honestly think that

41:09

the most credible answer is that this kid, Peter

41:11

Todd, who was a genius, who

41:14

is also, by the way, nobody in

41:16

the Bitcoin community wants him to be

41:18

Satoshi Nakamoto. He's pretty widely disliked. I

41:22

think it's him and Adam Back. How crypto boyish is

41:24

he? Yeah. Well, but he's not, though,

41:26

a Bitcoin bro, which is interesting. He's

41:29

not the, you know, and he doesn't. I don't

41:31

think he ever cashed in a lot of Bitcoin

41:33

or anything. I think

41:35

he and Adam Back did it

41:37

together. They had the they

41:40

certainly had the means, the

41:42

technical ability to do it.

41:45

Hoback did catch Todd lying about

41:47

his skills in C++. Todd

41:50

says on camera, I don't know C++,

41:52

but in fact, wrote an entire system

41:54

in C++ some years earlier.

41:58

I think they got him. Now, does it

42:00

matter? Probably not at all. But

42:03

it's fascinating. And it's exactly the kind

42:05

of person the Bitcoin community

42:07

wouldn't, would not like. It's

42:09

not this magical, brilliant

42:12

dark knight who created

42:15

a world-changing cryptocurrency. He

42:17

was just some annoying kid who

42:20

was really skilled. I

42:22

don't know. I like it. I I'm going

42:24

with it. Peter Todd and I'm going with

42:26

Adam Beck was right there and did

42:28

part of it. And Paris, you're your best argument against

42:31

his pizza. Well,

42:33

I think that one of their one

42:37

of the arguments in the documentary

42:39

was based on the account,

42:42

the Satoshi account saying that its

42:44

favorite pizza topping is pineapple and

42:47

jalapeno and that that also being

42:49

Peter Todd's favorite pizza topping. That's

42:51

not a work like pineapple on

42:54

his pizza. As the other stuff

42:56

said, pineapple jalapeno is not completely

42:59

normal. That's not very common.

43:02

I mean, I I don't think that I think

43:04

that it's probably pretty likely that someone

43:07

who is maybe involved in the early

43:09

days of Bitcoin, then when confronted in

43:11

a warehouse with an entire camera crew

43:13

and a director accusing you in hours

43:15

long interviews of being Satoshi Nakamoto, that

43:17

you're going to get defensive and act

43:19

weird. I also think I mean, I

43:21

don't know any of the stuff about

43:23

the C++, but I could

43:25

think of conceivable explanations to why someone

43:28

would say they don't know a programming

43:31

language. Perhaps they were referring to, you know, not

43:33

knowing it that well or they don't really think

43:35

he did it as part of a denial saying

43:37

I couldn't have written. I don't know anything about

43:39

C++. I

43:41

mean, honestly, this is exactly what you

43:44

would say if you were Satoshi Nakamoto.

43:47

He started contributing to C Well,

43:58

look, we've seen a lot of specious announcements,

44:01

including Newsweek's appalling

44:03

announcement that they discovered Satoshi Nakamoto as

44:05

a cover story, which to my knowledge

44:07

they never retracted. Did they ever? It

44:11

was completely wrong. So

44:13

it could be just another one of these. Who

44:16

is Cullen Hoback? I mean is it enough to say

44:18

well I found out who Q was? I don't know

44:20

if that's enough. I,

44:24

it just rang quite true to

44:26

me. I mean I think you just

44:29

need more for it to be like a smoking

44:31

gun. Right. I

44:35

don't much care. And does

44:37

it matter is the bottom line? I mean

44:40

honestly he has Peter Todd, for

44:42

whatever you think of him, has perfectly

44:45

good reasons not to be outed.

44:48

And I respect that because. What does he do for a

44:50

living? He

44:53

is on X, he says he

44:55

is a crypto-chrome answer

44:57

and web-py developer. Hey

44:59

this is Benito. So

45:01

like why this does

45:03

matter is because that

45:06

if this person is Satoshi Nakamoto then they

45:08

essentially have control of the Bitcoin market. They

45:10

more than that they have control of a

45:12

world economy. Exactly. So like that's what makes

45:14

it important. He

45:17

says, Peter Todd said in a

45:19

post, and they talk about this

45:21

in the documentary, that he did a very hard

45:23

thing. He burned a bunch of Bitcoin. But

45:27

there's no way to prove that because he didn't do it

45:29

in a way that was provable. And so I

45:32

mean I've burned my Bitcoin by forgetting

45:34

my password. You can lose

45:36

your Bitcoin very easily. Steve Gibson did it

45:38

by throwing out, erasing his hard drive.

45:40

I mean that's easy to do. But

45:42

there's no provable, that's not provable. There is a

45:44

provable way to do it and he did not

45:47

choose to do that. That raises

45:49

some big issues. If he controls

45:51

a million Bitcoin, he

45:53

could collapse Bitcoin. And

45:56

there are a number of nations, there's very active... not

46:00

just El Salvador, but a number of other nations that

46:02

want to use Bitcoin as their currency. Uh,

46:05

yeah, I think you're right, Benito. There is, there is

46:07

some, but what are you going to do?

46:09

You can't, if you can't prove it and if you could, let's

46:11

say you could prove it and it really is him. Then

46:15

what? Then what? I

46:18

mean, I think also the question is like, if

46:22

he or someone else is Satoshi,

46:25

what are they going to do with that? Like

46:27

you can't start moving

46:30

Bitcoin out of your account. Otherwise the

46:32

whole market's going to catch on fire

46:34

and people are going to freak out

46:36

because Satoshi hasn't been active in years.

46:38

Right. Um, I don't

46:40

know. I guess

46:42

maybe it was a national security asset.

46:46

Okay. Now that you say that actually, I will bring

46:48

up, there was a time about

46:50

a year ago where for

46:52

weeks, every, every

46:54

reporter at the information was

46:56

getting these emails,

46:59

signal messages, calls to our personal phone,

47:01

LinkedIn DMS, Twitter DMS from some guy

47:03

being like, I figured out who

47:05

Satoshi Nakamoto is. You've got to listen

47:08

to us. And none of us really

47:10

replied. And then afterwards he would call

47:12

and email again and be like, I

47:15

figured it out. It's Elon Musk

47:17

and his first was very similar to

47:19

yours in the sense of like both

47:22

Elon and Satoshi used two

47:24

spaces after a period Satoshi

47:27

posted in a time zone that

47:29

was similar to where Elon was

47:31

during some of those months. Uh,

47:34

both Satoshi and Elon have used

47:36

the words bloody in messages

47:38

and things like that. And that's, that's what

47:40

this documentary sounds like to me is it

47:42

is more wishful thinking of a loon. Yeah.

47:47

I'm going with it's Peter Todd and Adam

47:50

Back. But again, I don't

47:52

know what you do with it. You can't prove it. So

47:54

I don't know what you do with it. It's a very

47:56

interesting topic. Peter, if

47:58

you're listening, come on the show

48:00

and tell us why it's

48:03

complete nonsense. His

48:05

denial was not that credible. Alright

48:09

let's see, yeah let's take another break

48:11

and then I do want to talk

48:13

about .io. This

48:15

is a little

48:18

bit of an issue and it

48:20

kind of it shows a little bit of a weakness

48:22

in the whole domain name system

48:25

thing. Every system has

48:27

weaknesses. Okay thank you. You're all

48:29

human. Professor Jeff Jarvis ladies and

48:32

gentlemen, we need him. He's

48:34

the resident intellectual here. Of

48:37

course Paris Martineau is the resident

48:39

young person genius here.

48:41

And nihilist. And nihilist. Self-described

48:45

nihilist. I never

48:47

knew that till Jeff told me you were a

48:49

nihilist. We've talked about that on this

48:52

show. Perhaps like three or four

48:54

different episodes. Maybe I didn't believe it. I

48:56

knew you liked Nietzsche. I didn't know that you

48:58

were a nihilist. I mean I

49:00

would say I'm a soft nihilist in the sense

49:02

that like everything is meaningless but that means we

49:04

have to derive our own meaning from it. And

49:07

optimistic nihilism is an actual philosophy. Yeah.

49:09

Okay good. I would say optimistic nihilist.

49:12

Yeah you kind of actually that's kind of my

49:14

philosophy which is everything is meaningless and

49:16

everything means something. So

49:20

a person yeah if you want it to mean

49:23

something. Yeah it's up

49:25

to you. Yeah. And

49:27

I'm just the dunderhead in the middle. I'm

49:29

the I'm the baloney in the genius

49:32

sandwich. Our show

49:34

today. Praty-Fi. And

49:37

Benito's the mayonnaise. Benito's the mustard. I

49:39

was gonna say mustard. Yeah the show

49:41

today. Well, mustard on one side, bananaise

49:43

on the other side. The... oh, I

49:46

just invented a new topping. Bananaise. I

49:49

might have to make that. Our

49:52

show today brought to you by One Password. I'm gonna make

49:54

that tonight and I'll let you know how it comes out.

49:57

One Password is a great by the way company.

50:00

that has a new product called

50:02

extended access management that solves a

50:04

really, I think, fundamental problem in

50:07

business. Let me ask this rhetorical question, because I

50:09

know the answer. Do your

50:11

end users always work on company-owned

50:13

devices? Of course, right. And IT

50:16

approved apps never bring their

50:18

own device into work, never use

50:20

their own apps, right? Wrong. So

50:23

how do you keep your company's data safe when it's

50:25

sitting on all those unmanaged apps

50:27

and devices? That's

50:30

why one password came up

50:32

with extended access management.

50:35

One password extended access management helps

50:38

you secure every sign-in for

50:40

every app on every device.

50:42

It solves the problems traditional

50:45

IAM and MDM cannot touch.

50:48

Imagine your company's security like the quad

50:50

of a college campus, right? You

50:52

got these nice brick paths,

50:54

winding little brick paths, leading

50:56

from IV covered building to

50:59

IV covered building. So

51:01

pretty. Those are the company-owned devices,

51:03

the buildings, and the IT approved

51:05

apps, the paths, and the managed

51:08

employee identities walking up and down

51:10

those paths. Then in the

51:12

real world, there are the paths people

51:15

actually use, the shortcuts worn through

51:17

the grass, you know, the actual straightest

51:19

line from building A to building

51:21

B. Those are unmanaged

51:23

devices, shadow IT apps.

51:26

Non-employee identities on your network like

51:28

contractors, right? So most

51:30

security tools just assume they only work on

51:33

those happy brick paths. That's

51:35

where they work. But a lot of security problems, I

51:37

say almost all of them, take place on

51:39

the shortcuts. One password

51:42

extended access management is the first

51:44

security solution that takes all

51:46

those unmanaged devices and apps and identities and

51:48

puts them under your control. It

51:51

ensures that every user credential is strong

51:53

and protected. Every device is known and

51:55

healthy and every app is

51:58

visible. It's security for the way

52:00

we... work today and it's now

52:02

generally available to companies with Okta

52:04

and Microsoft Entra. And

52:06

by the way, good on you that you're

52:08

using Okta and Entra. That's fantastic and

52:11

now this takes it to the next step. So not only are

52:13

the people authenticated, but the devices, the software

52:15

as well. It's in beta for

52:17

Google Workspace customers too. You'll be happy to

52:19

know that. Also, here's

52:21

the deal. Check it out. 1password.com/twig.

52:24

That's the number one. password.com/twig.

52:26

There's a lot of information

52:28

there about how it works.

52:31

I think it'll be very

52:33

clear instantly why you need

52:35

this. 1password.com slash twig.

52:37

We thank them so much for their support of

52:40

this week in Google. I

52:42

have to go because I've just

52:44

invented bananaise

52:47

and I have to go make some bananaise right now.

52:49

I don't think Hank is going to approve. I don't

52:51

think he's going to allow you. I think Hank would.

52:53

It'll be bad for his reputation now. If

52:56

his father... It could be

52:58

a vegan mayonnaise alternative. Well,

53:00

I guess you could. So

53:03

I don't ever buy mayonnaise anymore. I

53:05

make it. It's easy. With an immersion

53:07

blender, all you have to do is

53:10

you put two... I use the Serious Eats recipe. Two

53:12

eggs. You put the whole egg in. You don't need

53:14

water, just the egg. Two eggs. You

53:17

put a tablespoon of Dijon mustard in

53:20

there, some salt, some lemon, and then

53:22

oil. I use avocado oil. You want to use an oil that

53:25

doesn't have a lot of flavor. Now you whip

53:27

it together and you got mayonnaise. That's mayonnaise. That's basically

53:29

mayonnaise. Does it keep in the refrigerator? Yeah. And

53:32

it's fresh and delicious. Don't

53:34

make more than you're going to need in a week

53:36

or two. It's

53:39

not a giant jar of Hellmann's. It doesn't have

53:41

anything else in it. It could stay for years.

53:43

Yeah. Yeah, because there's other stuff in it. But

53:48

this is really good. It's delicious. And you can

53:50

put garlic in it. But you could also put

53:52

a mushed up banana in there and

53:55

you'd have bananaise. That's

53:57

true. Not a lot of people are going

53:59

to do that. But a

54:01

peanut butter and bananas sandwich

54:05

I will say sometimes people substitute

54:07

bananas for eggs in recipes So

54:09

you could leave the egg stuff

54:12

so you could replace the egg with

54:14

a banana and that could be bananaise

54:16

and somehow Inspire

54:19

those banana. What he says. Have

54:21

you had banana ketchup? No,

54:24

no, never heard of that. A Filipino thing.

54:26

Yeah, but I know it's delicious See?

54:29

Mmm. And I'm thinking

54:31

for the holidays you could put a little pumpkin

54:34

spice in and then you have pumpkin spice bananaise

54:37

Wow Now,

54:41

you know where salt Hank

54:43

gets his genius No,

54:46

he would never ever make anything like

54:49

that. I don't think You

54:52

should just text him bananaise, question

54:54

mark? One word. Right

54:57

now I would actually love his opinion on

54:59

banana ketchup Okay,

55:03

I like just suggesting you strange things

55:05

to text Philip

55:07

Manila banana ketchup. He says

55:11

Wednesday I get weird texts from you

55:19

It's not like a weird thing in the Philippines, it's

55:21

like normal. Banana ketchup is on the table at

55:23

restaurants. It makes perfect sense. It's a tropical ketchup

55:27

So I just texted him one word: bananaise. We'll

55:29

see what he says. Should I... do you put a

55:31

question mark? No. Oh. Exclamation

55:34

point. Yes There's

55:38

no question my friend, how do

55:40

you spell bananaise? I guessed

55:45

B-A-N-A-N-A-I-S-E All

55:48

right, right sure

55:52

so What's the dot

55:54

IO stand for? Do you know? trivia

55:57

question. Indian Ocean. Very good.

56:00

I know it's been in the news recently. Oh,

56:02

yeah. It's the

56:04

country code domain. Except for the show.

56:08

Yeah. I knew once, but I

56:10

didn't remember. So it's the country

56:13

code domain for the Chagos Islands, which

56:15

is the British Indian Ocean territory,

56:18

disputed territory for a long

56:20

time. Mauritius said,

56:22

that's ours. The British government said,

56:24

it's ours. And took it. In

56:28

1814, the French ceded control of the Chagos

56:30

Islands and the island country of Mauritius to

56:32

the British. When the

56:34

British took over, the islands remained a

56:36

dependency of Mauritius. In

56:39

1965, the UK gave sovereignty

56:41

to Mauritius, but said,

56:44

we're going to keep the Chagos Islands and make

56:46

it the British Indian Ocean territory. That's in 1965.

56:50

In fact, and this is horrible,

56:54

the UK forcibly removed the

56:56

indigenous people, the Chagosian people,

56:59

so the US could build a military,

57:01

the US, us, could build

57:03

a military base on the island. They displaced 1,500

57:05

people. That

57:08

might sound a little familiar to you, Benito. Colonialism.

57:12

Colonialism. Eventually,

57:14

the Chagos Islands were given the .io

57:17

country code. That

57:19

was in 1997 by who else? The,

57:22

you know, IANA. Even though they

57:24

were still, they were a British protectorate, but

57:26

they were independent enough to have their own

57:28

country code. Right. Well, that's

57:30

not unusual, right? Canada is

57:32

part of the Commonwealth, but it has its

57:35

own. Anyway, the British government granted rights to

57:37

sell .io domains to the

57:39

Internet Computer Bureau, the ICB.

57:43

The country's government receives revenue

57:45

for any sites that register with their country

57:47

code domain. For instance, Anguilla,

57:50

which has the country code AI. Nice one, right?

57:55

Expect to make 25 to 30 million

57:57

dollars from websites registered with the .io

57:59

domain. Yeah Tuvalu is

58:01

.tv. I have a .tv domain.

58:03

Twit.tv is our domain, right? Unfortunately

58:11

The British government collected some of

58:13

the revenue and didn't give it

58:15

to the Chagosian people. In

58:17

2020 they submitted a claim to gain ownership, which they

58:20

said of what they said was a 50 million dollar

58:22

property. This is from the Verge. But

58:26

the UK has now finalized an agreement giving

58:28

the Chagos Islands back

58:31

to the Mauritians, to Mauritius,

58:34

a move by the way that Chagosian said the

58:36

government didn't even consult them on, ending

58:39

the British Indian Ocean territory. It's all

58:41

Mauritius now and here's

58:44

the problem. It

58:46

also potentially ends

58:49

the .io domain. Why

58:51

does it end it? Why can't they just

58:53

agree to keep going? The Internet

58:55

Assigned Numbers Authority,

58:57

IANA, which is run by ICANN,

59:00

has a process for retiring old

59:03

country code domains within five years. This

59:06

was after .su, which was

59:08

the Soviet Union's domain, kind

59:11

of lingered even though the Soviet Union had

59:13

gone away. In

59:15

fact, it was used apparently according to the Verge

59:17

by cyber criminals. So... No,

59:20

I'm shocked. No, every domain

59:22

is used by cyber criminals. Especially one

59:24

associated with the Soviet Union. Soviet Union goods!

59:27

Since then IANA has also had to

59:30

recover the Yugoslavian domain .yu, although

59:33

it went on for a few years after

59:35

the country was broken up. But

59:39

if Chagos was part of the British Empire,

59:42

couldn't it just as easily be part of

59:44

Mauritius but independent enough to have a .io?

59:47

Yeah. Human

59:49

beings could decide these things. Yeah, can't we

59:51

just decide to make it not a country

59:53

thing and just make it a regular domain?

59:55

The fact that Google uses it for google.io.

59:57

Anyway, Emma Roth who did a very, very

59:59

good piece on this explaining it all

1:00:01

in the Verge, writes, for now it's still

1:00:04

too early to tell what would become of

1:00:06

the .io domain, whether it will

1:00:08

go through a similar transitional period like

1:00:10

.yu or if IANA will let

1:00:12

the Chagossians keep it because they

1:00:14

say we want it, we

1:00:17

don't want the Brits to have it, we want the money.

1:00:19

The Verge reached out to Identity

1:00:21

Digital, the domain registrar that previously

1:00:23

obtained rights to sell IO domains

1:00:26

and IANA for information about

1:00:28

IO's future but we haven't heard back. Okay,

1:00:30

so there it is, it's the money. It's

1:00:33

the money because the Chagossians say wait

1:00:36

a minute, we make just

1:00:38

like, by the way Tuvalu is sinking. Tuvalu

1:00:41

will in the next few decades disappear

1:00:43

under the ocean. There goes your domain

1:00:45

too. Well, but

1:00:48

the Tuvaluans would very much like to move

1:00:50

to another island and keep .tv

1:00:52

because it's a lot

1:00:55

of money. Hmm, I

1:00:57

don't know. There

1:00:59

are a lot of .io

1:01:02

sites, people who just use .io, out

1:01:04

there, including my bookmarking service,

1:01:06

raindrop.io. It's a big

1:01:08

deal to change your domain. Yeah, especially

1:01:11

a really popular one. Yeah. Okay,

1:01:15

well that's that story. What

1:01:17

are you guys going to do? Do you

1:01:19

have any .io domains, Leo? I do, yes.

1:01:22

I think I do. Which one? Yes, how

1:01:24

many domains do you have? I was about

1:01:26

to say, how much do you spend a

1:01:28

year on domains? Hundreds of dollars. Didn't you

1:01:30

clean them out somewhat? No.

1:01:32

Why would I do that? None of them

1:01:34

are worth anything? None of them are worth

1:01:36

a thing. I

1:01:38

still have tunictime.com and fancypants.com. We've

1:01:40

talked about this before. Yes, yes,

1:01:43

yes. I had S

1:01:45

widget spelled out. Swidget?

1:01:48

No. No, you don't have fancypants.com. We've talked

1:01:50

about this. Oh, that's right. I gave you.

1:01:52

So it's true. Some of them have lapsed.

1:01:56

Normally I have auto. No, that went to your wife, your

1:01:58

ex-wife. She got it

1:02:00

in the divorce settlement. That would be funny.

1:02:03

Fancy pants? I should

1:02:05

write to her, hey hun, do you

1:02:07

think you still want fancy pants and

1:02:09

tunic time? Let

1:02:15

me see, I'm just going to log in to

1:02:17

Hover here and see what's happening. I

1:02:20

feel like I have a .io. I

1:02:22

simply wish I could own par.is. P-A-R dot is.

1:02:28

It's never going to happen. Why not? .is

1:02:30

is Iceland. You should be able to get that.

1:02:32

I know. It

1:02:35

was registered by someone else in 2007. Some dope.

1:02:37

I can't find a way to purchase it and I'm

1:02:39

sure if I did it would be like a million

1:02:41

dollars. Hey, if you're out

1:02:44

there owner of par.is, please give it

1:02:46

to me for a reasonable price. I

1:02:48

was miffed because I

1:02:51

wanted leo.com, which would be

1:02:53

a fantastic domain, right? Or

1:02:56

a leo.anything. The Royal Bank of

1:02:58

Canada's mascot is a lion named

1:03:00

Leo, so they owned leo.com for

1:03:02

the longest darn time and

1:03:04

then they let it lapse and I didn't notice

1:03:06

in some domain squatter. Yes, I

1:03:09

have laporte.io. Oh. Yeah.

1:03:12

laporte.io. My son got

1:03:14

jarv.is. It's

1:03:17

registered to. That's really good.

1:03:19

Jarv.is. I got to

1:03:21

update the address. It's registered to the TWiT cottage.

1:03:27

I've had this since 2016 and it will

1:03:29

not expire till 2025. So

1:03:33

I should really think about it now. Should I keep

1:03:36

it? Do the Bagosians need

1:03:38

it? What was their name — the

1:03:40

Chagossians? Do they need it?

1:03:42

Do they need the money? The Brits got your money instead

1:03:45

of them. Yeah.

1:03:47

I have so many silly domain names. I probably should

1:03:49

get rid of some of these. You could get leo.lol

1:03:51

for $3,000. Yeah, no. See,

1:03:54

not worth it. I do have, I think

1:03:57

I have an lol domain. a

1:04:00

dot fund domain. So I don't

1:04:03

know if I have a dot, no, no, no

1:04:05

L O Ls. How

1:04:08

many domains do you have Paris? You have a lot

1:04:10

probably. Yeah. Not really. I just have

1:04:13

parismartineau.com and paris.nyc. Oh,

1:04:16

I like Paris. That's

1:04:18

kind of an interesting juxtaposition of

1:04:21

two big cities, one after the

1:04:23

other. I love Paris.

1:04:25

Absolutely. And your spirit. Yes. What

1:04:27

do you use it for anything? Um,

1:04:30

I just use it for my blue sky

1:04:32

handle and it also redirects my website. Oh,

1:04:35

so you do have a website. Yeah. Paris

1:04:38

dot NYC, man. Right. Pretty good.

1:04:40

Oh, that's the best. I finally

1:04:42

just this week started using

1:04:44

jeffjarvis.com. What were

1:04:47

you using before? Well, the

1:04:50

buzzmachine.com, gutenbergparenthesis.com. This

1:04:52

is... I forgot that I

1:04:54

owned it. My old employer's Advance. Somebody,

1:04:56

somebody sat on it and tried to

1:04:58

screw me in. So the lawyers for

1:05:00

advance went and got it and

1:05:02

I forgot it. They paid for it for years.

1:05:04

They finally said, uh, Jeff, can you take this

1:05:06

off our hands? We got it. We don't want,

1:05:09

you know, to be sitting on things. I

1:05:11

know we, last week we talked about our Owala

1:05:13

bottles, Paris. Just a little tip.

1:05:15

I sat on the lid. Oh,

1:05:19

the springs went. And,

1:05:22

but it turns out Amazon sells a

1:05:25

fake third party. Yeah,

1:05:28

that does look fake. Look at the colors

1:05:30

for only $8. Well,

1:05:33

I was going to buy a whole new one, but you

1:05:35

know, the can is still good. So I

1:05:37

just bought the new lid anyway, just a little

1:05:39

tip. It's not a... who says this

1:05:41

show isn't useful? Is it a Prime

1:05:45

deal or anything? Wirecutter. You can

1:05:47

do so. I feel so bad for people who

1:05:50

work at sites like the Wirecutter, uh, the

1:05:53

verge cause they're going crazy right now

1:05:55

cause prime days, right? The

1:05:58

second one of the year. Yeah,

1:06:00

I never find anything I want in prime days.

1:06:02

Never, never, never, never. Do

1:06:04

you Paris? Are you a prime shopper? No, I'm not.

1:06:07

It's barely a holiday. I had to

1:06:10

do the prime day gift

1:06:13

guides for New York magazine

1:06:15

one year. And it was a nightmare.

1:06:17

Right now what all those writers are

1:06:19

doing is combing through an endlessly long

1:06:22

Excel spreadsheet and

1:06:25

looking at meager deals and trying to find a

1:06:27

way to get readers excited about it because it

1:06:29

pays for their salary. Exactly. Yep.

1:06:32

I remember one day we thought we weren't

1:06:34

going to make our prime day targets and

1:06:36

then four people bought like a five grand

1:06:38

TV using our affiliate link and there

1:06:40

we were. Yep. Yes,

1:06:44

you got to keep doing it. Who

1:06:46

said... I was trying to remember, some site said it

1:06:49

might have — maybe it was Wirecutter —

1:06:51

Here are the 4,393 good deals in the 28,972 Amazon

1:06:57

prime deals. It

1:06:59

made me go, Oh my

1:07:01

God, that hurts. That hurts. Whoever

1:07:03

had to do that. I was just searching for

1:07:06

that story and I don't see it here anywhere.

1:07:09

Anyway. So I just put

1:07:11

up a little news about OpenAI from the

1:07:13

FT. Well, there's a

1:07:15

big story about OpenAI actually. From The

1:07:18

Information, from a few minutes ago.

1:07:21

Oh, okay. Let's

1:07:23

do that. Then I'll do mine. What's so what's the

1:07:25

story? OpenAI

1:07:27

burned 340 million in the first half of 2024. They

1:07:33

got 167 billion. Their

1:07:37

total losses between 2023 and 2028 are going to be

1:07:39

$44 billion. So

1:07:44

that means they have a four year runway. Yeah.

1:07:48

They don't know. I think how,

1:07:50

uh, like when they're going to be

1:07:52

profitable or they're not going to be profitable for a very

1:07:54

long time, I believe. Um, but

1:07:58

according to my story, it won't matter.

1:08:01

Why not? Because now the latest

1:08:03

gambit from Altman is

1:08:06

that he wants to restructure OpenAI — as we know,

1:08:08

he wants to restructure it as a for-profit company and

1:08:10

end the not-for-profit — but he

1:08:12

wants to structure it as a public benefit company.

1:08:15

Oh boy. Well that's better than a...

1:08:18

It's just a trick to fend off. Yeah,

1:08:20

it's not really real. I

1:08:23

know a lot of public benefit companies, but I thought that you

1:08:25

had to, I don't know,

1:08:27

be a public benefit or something. But yeah,

1:08:30

what does that mean? Yeah, you make some

1:08:32

fakie. We're really good. It's not quite non-profit,

1:08:35

but you're not allowed to get rich, right? No, you can

1:08:37

get as rich as you want, but the thing is

1:08:40

that what it does is it protects you from

1:08:43

raiders coming and saying

1:08:45

you're not making enough profit. Oh.

1:08:49

So what he's protecting here is his lack

1:08:52

of profitability going forward by doing this

1:08:54

to hide behind a public

1:08:56

benefit because Sam is... I wouldn't

1:09:02

trust him. So

1:09:04

my stories were actually about

1:09:06

AI also. Geoffrey

1:09:09

Hinton wins a Nobel Prize

1:09:11

in physics. Hinton

1:09:14

of course is the

1:09:16

godfather of AI. One

1:09:19

of many. But he was the guy who

1:09:21

came up with neural networks many, many years

1:09:23

ago. He's also one of the signatories

1:09:26

of the letter.

1:09:29

Which I think is why they gave it to

1:09:31

him. I think that the Nobels have become more

1:09:33

and more political and

1:09:35

I think it's an opportunity for them to give

1:09:40

him attention. Interesting. The

1:09:42

Wall Street Journal did immediately. The Wall Street Journal

1:09:44

said, and he's screaming about danger. The Wall

1:09:47

Street Journal loves to do that. Well,

1:09:49

you know who else won a Nobel Prize? Demis Hassabis.

1:09:54

Which really says that Google

1:09:56

has really become, or at least

1:09:58

was, a

1:10:00

key basic research lab, the

1:10:03

Nobel labs. He of

1:10:05

course was the founder of, of

1:10:07

DeepMind. I

1:10:10

wasn't, you lost your mind. I

1:10:14

knew that deep throat was wrong. It

1:10:17

was the founder, very long. Yeah. Very

1:10:19

longtime founder of DeepMind,

1:10:21

his Nobel Prize is in chemistry. So Hinton

1:10:23

won it for physics, cause there isn't a

1:10:26

Nobel prize for AI. Not

1:10:28

yet. Uh, but

1:10:30

Hassabis won, because they came up with —

1:10:32

uh, DeepMind came up with —

1:10:34

a protein folding that has

1:10:36

been used to... I

1:10:39

thought, I don't know, no effect,

1:10:41

but apparently some effect in

1:10:43

medicine. So,

1:10:46

uh, David Baker, University of Washington,

1:10:48

John Jumper, Google DeepMind, and Demis

1:10:50

Hassabis of DeepMind. Uh,

1:10:54

uh, Hassabis and Jumper developed a

1:10:56

powerful computational tool. This is from the

1:10:58

Washington post that gave researchers the long

1:11:00

sought ability to predict how proteins twist

1:11:02

and fold to create complex

1:11:05

3d structures that can block viruses,

1:11:07

build muscle or degrade

1:11:09

plastic. Um,

1:11:11

and protein folding is, you know,

1:11:13

this remember the, the,

1:11:15

uh, the whole thing

1:11:17

where you were, uh, getting your computers

1:11:20

idle time and you were donating it

1:11:22

to a project, and one of them was Folding

1:11:25

at Home. Remember that? Where you

1:11:27

would get all the computers in the

1:11:30

world with their idle cycles, trying to

1:11:32

fold proteins. Well, that's gone away because

1:11:34

AI does it easily

1:11:36

and quickly. You don't

1:11:38

need all those computers. Um,

1:11:43

so I'm curious,

1:11:46

they got the Nobel prize, but have,

1:11:48

has AlphaFold generated...

1:11:51

I mean, is there a medicine or something I

1:11:53

could point to that says it's,

1:11:56

it's working. I mean, I

1:11:58

know it's making these defined working. It

1:12:00

was a product that people undertook this project for

1:12:02

a reason so that they

1:12:04

could. And, and, and we got there thanks

1:12:06

to this. Okay. So

1:12:09

see, it's no, it's a good

1:12:11

thing. Yeah. It's

1:12:15

not all generative AI. That's

1:12:17

a good, okay. There you go. That's a good point.

1:12:20

It isn't, this stuff is not generative AI.

1:12:22

I mean, I don't

1:12:24

know — to me it is. It's

1:12:26

all... anything that generates something with AI

1:12:28

is generative AI, but... Well, so,

1:12:31

so I'm, I, I, I had a call

1:12:33

with the World Economic Forum at Davos. Oh.

1:12:35

And they, uh, cause I'm a member of

1:12:38

the, of some kind of AI thing, the

1:12:40

AI governance thing. So anyway,

1:12:42

they said that it's also diagnostic,

1:12:44

predictive, prescriptive, and adaptive AI. Oh,

1:12:47

those are the categories. Okay. But.

1:12:50

I think a lot of times we think

1:12:53

of a generative AI as chat bots basically.

1:12:55

Yeah. Right. Chat GPT. Making, um, swimming hippos.

1:12:58

When the Nobel committee called Geoffrey Hinton,

1:13:00

he said he is quote. Worried.

1:13:04

Oh Jesus. That the overall consequence

1:13:06

of this might be systems more

1:13:08

intelligent than us that might eventually

1:13:11

take control. But then they

1:13:13

said, but, but knowing that would you, would you do

1:13:15

it all over again? He said, Oh yeah. It's

1:13:21

a whiny Oppenheimer. Yeah.

1:13:24

I'd do it again. I

1:13:26

am, I am become chat

1:13:28

bot. Uh, uh,

1:13:31

Annoyer of worlds. Exactly.

1:13:35

Good. Very good. Hopfield

1:13:40

echoed his co-laureate's concerns in a

1:13:42

video call yesterday afternoon at Princeton

1:13:44

university. The worry I

1:13:46

have is not quite, not AI

1:13:48

quite directly, but AI

1:13:50

combined with information flow around

1:13:53

the globe. Oh,

1:13:56

so he's worried about, like, bad people with

1:13:58

AI. Like what he's

1:14:00

more worried about is like that, that,

1:14:03

that maybe deep fakes. I don't know.

1:14:07

Uh, Hopfield is 91, Hinton is 76. Uh,

1:14:12

that's usually the case. You don't win a Nobel

1:14:14

Prize till you're... Demis is very young, isn't he?

1:14:16

He's pretty young. Yeah. Yeah.

1:14:20

Well, I'm just saying, is there still hope,

1:14:22

Jeff? That's all. I'm just trying to get

1:14:24

your 4am phone call. I

1:14:27

just want a MacArthur. By the way, that's

1:14:30

what that play that I really liked that the

1:14:32

Washington post hated. I saw that. I saw you

1:14:34

sneaking in at a interview last week. So

1:14:39

we were talking about McNeal, which I thought

1:14:41

was a great movie. It's, uh, it's the

1:14:43

Robert Downey Jr. Broadway debut — play, not movie, play —

1:14:46

that I saw. I mean, it's a movie in some

1:14:48

ways he's moving. He's moving. And

1:14:50

there's a lot of special effects on the stage.

1:14:53

It's actually the staging and the post did like

1:14:55

the staging, but they hated the,

1:14:57

the, the, the play itself. I

1:14:59

don't know. I haven't... did the Times hate it? Did the New

1:15:01

York Times? I haven't seen a review yet.

1:15:04

We should talk about megalopolis, which I saw, and I

1:15:06

don't know if I hate it or not. It

1:15:09

was a movie,

1:15:11

an experience of

1:15:13

what a ride of a lifetime. I spent

1:15:15

the whole time thinking of it. Adam Driver

1:15:17

as Leo, because he famously voiced

1:15:20

an early version of either driver's character

1:15:22

or you said maybe

1:15:24

the narrator I thought might be the narrator. It was

1:15:26

one of those, I walked

1:15:28

down the rainy streets and

1:15:30

you know, it was, I thought he was a detective,

1:15:32

but maybe it was Adam. Maybe he was an architect.

1:15:34

I mean, that does seem like something that Adam drivers

1:15:37

would say in this, it was

1:15:39

a bizarre film. I

1:15:42

really enjoyed it. There were parts of it that

1:15:44

I was like, I

1:15:46

love the weird movie. This is fun. And

1:15:48

there are parts that I was like, this

1:15:51

is terrible. Like one of the first things

1:15:53

that Adam driver's character, the plot of this

1:15:55

is it's set in new Rome, which is

1:15:57

like New York city, but it's Rome.

1:16:00

in the future, but it's

1:16:02

not. Very confusing. And Adam

1:16:04

Driver plays a mad architect

1:16:06

named Caesar, who's got this

1:16:09

vision for a future city that kind

1:16:11

of looks like Hudson Yards mixed with

1:16:14

that meme of the utopia

1:16:16

city that you see, which is an insult. Is

1:16:18

it like, like, yeah, Atlas Shrugged a little

1:16:20

bit? I mean, when I say it was an

1:16:22

architect. It's like,

1:16:25

Ayn Rand mixed with Hudson

1:16:28

Yards mixed with Caesar.

1:16:31

I don't know. It's not

1:16:33

great. I want to

1:16:35

see it just because of the spectacle. By

1:16:37

the way, you're lucky you saw it because

1:16:39

there's some, some thought that they will not

1:16:41

be sent to streaming, that this is it.

1:16:43

You see it in the theater. You don't

1:16:45

see it at all. I also saw an

1:16:48

immersive screening where midway through, oh yeah, you

1:16:50

don't... I don't consider it a spoiler because this is

1:16:52

30 seconds of

1:16:54

the movie and it does not have

1:16:56

any consequence. A satellite

1:16:59

crashes into New York city. Then

1:17:01

archival footage of 9 11 is

1:17:03

played, which I've since

1:17:05

learned was shot when they were filming

1:17:07

because they've been filming this movie for

1:17:09

so goddamn long that they have never

1:17:11

before seen footage of 9 11 in

1:17:14

this. Then there is a scene

1:17:18

of Adam driver as the architect in

1:17:20

a very tiny box on screen at

1:17:22

a press conference. Then the house lights

1:17:25

go up in the theater and a

1:17:27

man walks out in front of the

1:17:29

movie theater with a mic, takes notes

1:17:32

and then asks Adam driver on screen

1:17:34

a question. Adam driver responds and then

1:17:36

he goes away and the lights. Oh,

1:17:39

that's so it was, oh my God.

1:17:41

What? Reality is bending. I know. I

1:17:44

thought it was kind of like going to the

1:17:46

Rocky Horror Picture Show and seeing some guy dressed

1:17:49

as Frank-N-Furter, but that, that, that person is now

1:17:51

going to say that he was in a couple

1:17:53

of production. Right. Well, like me, I was in

1:17:55

a car like you. So

1:17:57

you, this is good. Did you, we did this.

1:18:00

before the show. So

1:18:02

Megalopolis, Coppola has been

1:18:04

working on this for 40 years. Yes.

1:18:07

And I mentioned that 30 years ago,

1:18:10

I was called into Zoetrope.

1:18:13

I auditioned for a role in some

1:18:16

unknown radio thing that Coppola

1:18:18

was doing at Zoetrope and read a part

1:18:20

for a couple of nights with

1:18:22

other actors in Zoetrope.

1:18:26

I later figured out that we were recording

1:18:28

some sort of pre-visualization for a movie that

1:18:30

Francis was working on, but the movie never

1:18:32

came out. And the other

1:18:34

hint that says you're Adam Driver is

1:18:37

that you were supposed to sound like Bob Woodward. I

1:18:40

got fired for not knowing how he

1:18:42

sounds, and Bob Woodward sounds like Adam

1:18:44

Driver. Francis grew increasingly agitated. He kept

1:18:46

saying, do it more like Bob Woodward.

1:18:48

And I had no idea

1:18:50

what Bob Woodward sounded like, so

1:18:52

I didn't know what this direction meant. So

1:18:55

I just tried different accents. Some of the

1:18:57

strangest voice acting

1:18:59

and just general acting choices I've

1:19:01

ever seen in a movie in this film. That's what

1:19:04

he was trying to get me to do. There is

1:19:06

one where it's a normal scene between Adam Driver and

1:19:08

what will become his love interest. And he's talking in

1:19:10

a normal way. And then he goes, so

1:19:13

go back to the club. That

1:19:15

is fully how he delivers

1:19:17

the line. I can totally

1:19:20

see Adam Driver doing that. He does that little

1:19:22

vocal tick. Yeah. I have to

1:19:24

see it because you have to

1:19:26

see it. And I also heard from Coppola

1:19:29

when he was talking when someone was asking

1:19:31

him in an interview recently, what led you

1:19:33

to come back to megalopolis all these years

1:19:35

later? He said some years ago he had

1:19:38

filmed a food

1:19:40

tour TV show episode with Anthony Bourdain. I

1:19:42

assumed it. Where else would he go? But

1:19:44

he was watching himself on it and he

1:19:46

was like, oh, I look so fat. I

1:19:48

hate the way I look. So he signed

1:19:50

up for like an exercise camp. And during

1:19:52

the whole exercise thing, he was like, what

1:19:54

should I listen to? Might as well listen

1:19:56

to the old audio recordings I made for

1:19:58

the visualizations of Megalopolis. So I was

1:20:01

thinking, I was like, he's listening to Leo.

1:20:03

I wonder if he left my part in

1:20:05

and then had father Guido Sarducci do the

1:20:08

rest. Frankly, given the way that

1:20:10

this movie turned out, that would

1:20:12

make sense. It is the

1:20:15

most disjointed, confusing experience. There

1:20:17

was a whole subplot where Aubrey

1:20:19

Plaza is a character named Wow

1:20:21

Platinum that is kind of like

1:20:24

sexy Jim Cramer. She hosts a

1:20:26

financial morning talk news

1:20:29

show called The Money Bunny. Oh,

1:20:33

God. Which is the money honey reference

1:20:35

to that. Oh, I get it. So

1:20:37

he borrowed or raised

1:20:39

money like a hundred, what, 120 million to make this? He

1:20:41

sold one of his vineyards for it. He sold a vineyard

1:20:43

for 120 million dollars. Or

1:20:46

he leveraged, I think, his vineyard as collateral.

1:20:48

Right. This vineyard, by the way, is just north of

1:20:51

me. The couple of vineyards. It's going to

1:20:53

be sold by somebody else pretty soon. This

1:20:55

is a triggering question for me. So it

1:20:57

is opening weekend. It earned

1:20:59

four million dollars. Yeah,

1:21:02

no, it's not going to make money. It was not

1:21:04

a I will probably see it again, but I'm

1:21:07

also crazy. I would say I found

1:21:09

it a decent movie-watching

1:21:11

experience, but it wasn't pleasurable. I was listening

1:21:13

to pop culture, happy hour talk about this,

1:21:16

and they were trying to decide whether it

1:21:18

was a failure or a fiasco because a

1:21:20

failure is just a failure that is like

1:21:22

not fun at all. But a fiasco has

1:21:25

got some pizzazz. Like there's something about how it

1:21:27

fails. And this is definitely a fiasco in

1:21:29

kind of a fun way. One of

1:21:31

the funniest this American life episodes ever is

1:21:33

called fiasco. And if you haven't listened to

1:21:36

it, it is a series of

1:21:38

fiascos in one. It's,

1:21:40

it's, it's what made This American Life. It's... how long

1:21:42

is it Paris? Two hours

1:21:44

and 18 minutes. Oh, I thought it might have been a

1:21:46

four-hour one. So maybe at four hours

1:21:48

It might have made more sense. I

1:21:51

see that it is now up for

1:21:53

presale on Amazon and iTunes and the

1:21:55

various streaming sites. So I would

1:21:57

say it's definitely a movie I'd recommend seeing in

1:22:00

theaters, if you could, I guess I don't

1:22:02

know whether the theaters would be packed. I

1:22:04

saw it in a packed giant IMAX theater

1:22:06

and people were just uproarious. It was sold

1:22:08

out. It was opening weekend. It would be fun to

1:22:10

see that with a lot of other people. It was the immersive one

1:22:12

with the guy, so everybody was there to see the

1:22:15

guy. But people were laughing their

1:22:17

butts off during it because there are

1:22:19

some really strange choices in there. At

1:22:22

one point, a character is talking about a

1:22:24

baby she's about to have to someone and

1:22:26

she's like, if it's

1:22:28

a girl, we'll name her Sunny Hope.

1:22:31

If it's a boy, Francis. It's

1:22:36

like stuff like that one after another. Jeff,

1:22:39

you were asking a question before? It's

1:22:41

triggering for me to ask this, but now

1:22:44

I'm curious. What

1:22:47

was the 9-11 footage like? I

1:22:50

feel like it was just kind of a

1:22:52

short, like they had like three panel

1:22:54

shots that were kind of interspersed throughout the

1:22:56

film, one of which I believe was

1:22:58

like perhaps like the aftermath of 9-11.

1:23:00

Like it wasn't, you know, planes crashing into

1:23:03

the towers or anything like that. But

1:23:05

it was footage that I recognized as, huh,

1:23:07

that's a 9-11. And then afterwards, I

1:23:09

heard that it was never before seen

1:23:11

footage from the day because they were

1:23:14

filming in downtown Manhattan when that happened.

1:23:18

I am I am looking at the

1:23:20

six thirty show tonight at our local theater

1:23:22

and looks like I can have the theater

1:23:25

all to myself. Leo's

1:23:30

going to be rushing off to make sure he gets in. Yeah,

1:23:35

I think I'll go. Should I get tickets? I

1:23:38

see sad to see it. Like I'm the

1:23:40

only person in the freaking theater. I

1:23:43

would go if you could go. I mean, honestly,

1:23:45

if you're there by yourself, then just like

1:23:48

know that it's good to laugh. I think that it's

1:23:50

kind of a very funny movie to me. But

1:23:55

God, it's weird. It's such a weird

1:23:57

movie. There's a whole deep fake subplot.

1:24:02

I feel like I would like this movie to be honest

1:24:04

with you. That's what's just. Yeah, no, it's like a

1:24:06

mix between like, there are parts of the movie that I'm

1:24:08

like, oh, this could have been good, but it doesn't make

1:24:11

sense. There's it switches between at

1:24:13

one point, Adam Driver does the full

1:24:15

Shakespeare to be or not to be

1:24:17

monologue at another point. A guy does

1:24:19

a boner joke. It's, it's bizarre.

1:24:22

It's bizarre. Doctor trying to

1:24:24

get out of writing a few pages of script.

1:24:26

I think I have this old Shakespeare stuff I

1:24:28

could use. Because

1:24:30

like it's kind of set in modern times.

1:24:32

So then all the people watching were like,

1:24:34

what's this guy doing in like normal English?

1:24:37

And they're like, ah, just let him cook.

1:24:39

Not to be that's the question. Basically.

1:24:43

It just sounds like he was working on it for too

1:24:45

long. That like he left too much and he did not.

1:24:47

That's what he does. Yeah. You

1:24:50

know, I sat next to him once in San Francisco

1:24:52

and it was when apocalypse now

1:24:54

was coming out. I was one of the few

1:24:56

people who liked it. I loved it. It's the

1:24:58

greatest apocalypse. You know, in hindsight, you're right at

1:25:00

the time it was panned in hindsight.

1:25:03

Well, it's gotta be the greatest war movie. It's not

1:25:05

as good as apocalypse now, but it is

1:25:07

interesting. I think it's the most interesting movie

1:25:09

I've seen. He's made some

1:25:12

terrible movies. There's no doubt about that.

1:25:15

But I have huge respect for Francis. The

1:25:17

opening of it is a direct nod to Hudsucker

1:25:19

proxy, which is one of my favorite movies. So

1:25:22

I liked that. So you're a movie fan,

1:25:24

a nihilist. I am. An optimistic

1:25:26

nihilist and a movie fan. It's

1:25:28

true. An optimistic nihilist and a movie fan walk

1:25:30

into a bar. So the truth is

1:25:32

I really liked going to the movies for the

1:25:34

popcorn. True. And when the pandemic

1:25:38

seemed to die down — it's not died down, it's still

1:25:40

there folks. I know that, but I

1:25:42

thought, you know, I haven't had that popcorn in

1:25:45

four years. And I went to

1:25:47

the local theater where they have the popcorn.

1:25:49

It's, A, terrible, and B, a little tiny

1:25:51

one, $8. Well,

1:25:53

that's their profit. I know,

1:25:55

but geez. I'll make

1:25:57

you, when next time you're out here, I'll make you some

1:25:59

good. I make good popcorn.

1:26:02

Nice. I do. I have a whole, I

1:26:05

have actually official popcorn equipment.

1:26:07

Of course you do.

1:26:09

I have a pot that's exclusively for making

1:26:11

popcorn. Do you make it in your

1:26:13

pizza oven? You don't have a pizza oven anymore. No,

1:26:15

I don't. I don't make it in the pizza oven.

1:26:17

I make it on the stovetop, but it's a, it's

1:26:20

a whirly pop. Now I use the, uh, the official

1:26:22

whirly pops for a while, but they fall apart. I

1:26:24

got myself a really nice stainless

1:26:26

steel glass metal

1:26:29

gear whirly pop that does muah. And

1:26:32

then I use a Amish country popcorn.

1:26:34

It's very, very good. And

1:26:36

then I put the salt in the whirly pop

1:26:39

so that, and I use ghee, not

1:26:41

butter, not oil, but ghee to

1:26:43

pop the popcorn. And then of course melted butter on top.

1:26:45

If you really want to go crazy, but another thing you

1:26:48

could do is put the salt in with some sugar and

1:26:50

do the whirly pop. And then you've got yourself...

1:26:53

Or put bananaise. Bananaise on

1:26:56

top, with the mayonnaise. You've got

1:26:58

yourself bananaise. I think it's

1:27:00

a, it's a thing. You

1:27:04

know, there was a lot of Google news this week.

1:27:06

Yeah. Let's take a break and then come back and

1:27:08

actually do some stuff. I just want to, I just

1:27:10

want to say for the record, we did no news

1:27:12

last week. We're doing no, you know, we've done a

1:27:14

lot of news. It

1:27:17

was an oops all arguments. Oh yeah. So

1:27:19

it was, there was no news. We

1:27:21

hit like two headlines. We started

1:27:23

to play Scooter X's NotebookLM

1:27:26

change log and it was so

1:27:28

awful. We had to stop. We

1:27:30

got into a fight about Taylor Lorenz. By

1:27:34

the way, I still can't go to our

1:27:36

site. It says it's malware. I don't know

1:27:38

why, but I have my, my, my, you

1:27:41

know, our, our security software is blocking

1:27:43

it weird. Yeah. That's

1:27:46

kind of all we did. Anyway,

1:27:48

let's not think about the past.

1:27:51

Let's be optimistic. Shall we?

1:27:53

More to come with This Week in Google: Jeff

1:27:55

Jarvis, Paris Martineau and your, your,

1:27:58

your host, Adam Driver. Our

1:28:00

show today, brought to you by Veeam. V-E-E-A-M. Now

1:28:03

Veeam is a

1:28:05

really important

1:28:08

product as far as I'm concerned because

1:28:11

if everybody in the world would

1:28:14

just use Veeam, there would

1:28:17

be no ransomware. There

1:28:19

would be no data breaches. There would

1:28:21

be no problems. Your data

1:28:23

as a company is the most important thing

1:28:25

you've got. Without your data, your

1:28:28

customers' trust turns to digital dust.

1:28:31

That's why Veeam's data protection

1:28:33

and underline this ransomware

1:28:35

recovery ensures

1:28:38

that you can secure

1:28:40

and restore your enterprise data

1:28:42

wherever and whenever you need

1:28:45

it no matter what happens. Doesn't that sound

1:28:47

good? Veeam is the

1:28:49

number one global market leader in

1:28:51

data resilience. Trusted by, this

1:28:53

number blows me away, 77% of the Fortune 500.

1:28:55

You gotta wonder about

1:28:59

that other 23%. 77% of the Fortune 500 uses Veeam to

1:29:01

keep their businesses running when

1:29:06

digital disruptions like ransomware strike. That's

1:29:08

because Veeam lets you back up

1:29:10

and recover your data instantly across

1:29:13

the entire cloud ecosystem no matter

1:29:15

where your data lives. With

1:29:17

Veeam you can proactively detect malicious activity, stop

1:29:20

it cold before you need it. You

1:29:23

can also remove the guesswork and this

1:29:25

is so important by automating your recovery

1:29:27

plans and policies. You have recovery

1:29:29

plans and policies right? Well you gotta keep them up

1:29:31

to date. With Veeam

1:29:33

you'll get real-time support

1:29:35

from ransomware recovery experts

1:29:37

so you're never on

1:29:39

your own. Data — it's

1:29:41

the lifeblood of your

1:29:43

business. Get data resilient

1:29:45

with Veeam. v-e-e-a-m.com to

1:29:48

learn more. veeam.com. We

1:29:50

thank them so much for their support of This

1:29:52

Week in Google and of course you support

1:29:54

us by using that site

1:29:58

veeam.com. Thank

1:30:00

you, Veeam. I

1:30:04

do have some other stories, but they're not yet Google

1:30:06

stories, but I'll get to them. One

1:30:08

day. One

1:30:12

of the things we talked about last week, Ed Zitron would

1:30:14

not talk about it, is the

1:30:16

kerfuffle between Automattic

1:30:19

and WordPress and

1:30:21

WP Engine. And

1:30:23

of course, that was last week was

1:30:26

the day. And the reason he couldn't talk about

1:30:28

it is he represents Automattic. Oh,

1:30:30

I thought he was just like too boring. I don't

1:30:32

want to talk about that. No, maybe he also was

1:30:34

crazy glued like, same word. We

1:30:41

had somebody on the

1:30:43

show Sunday

1:30:47

who used

1:30:49

to work for Automattic, has a lot of respect,

1:30:51

I think, for Matt Mullenweg, as

1:30:53

I do. But it's a

1:30:55

complicated story. Great piece from Jeffrey

1:30:57

Zeldman. I have a lot, I'll tell you what, I have

1:31:00

a lot of respect for Zeldman. He is one

1:31:02

of the guys behind so

1:31:05

many web standards that we use. He's one of the old

1:31:07

school guys. He works at

1:31:09

Automattic. He decided

1:31:11

to stay there or

1:31:13

decided to take a job there after. Should

1:31:15

you give, if I may, the background here of

1:31:17

what he was, what everybody was offered? It

1:31:21

was like six months. Or

1:31:24

30,000, whichever is larger. To

1:31:26

do what? To leave? To leave because

1:31:28

people were disagreeing with what Matt had, this is

1:31:30

background, I think it

1:31:32

matters. Matt had come out with his jihad

1:31:34

against the other company. And

1:31:37

some disagreed with him and his tone and

1:31:39

Matt said, okay. And

1:31:42

it wasn't, it was, I think, I think it was a

1:31:44

uniquely Matt thing to do to say, I don't

1:31:47

want to have you feel like you're working in a

1:31:49

company where you're going to disagree. So I'm going to

1:31:51

make a very generous offer. Anybody who wants to leave

1:31:53

can leave and get six months salary. And

1:31:56

that six months health insurance. Yes.

1:31:59

In fact, that's... That's why Zeldman said,

1:32:01

I really thought about this because he

1:32:03

has creditors. He's already hired somewhere else.

1:32:06

Yeah. He said six months

1:32:08

salary in advance would have wiped the slate

1:32:10

clean on medical debts and financial obligations incurred

1:32:13

by the closing of my publishing businesses

1:32:16

and my conference. So 159 people took

1:32:18

the offer. A lot of

1:32:20

people. 8.4% of the company. The

1:32:24

other 91.6% gave up 126 million in the potential severance to stay. 63.5%

1:32:29

were male. 53%

1:32:31

were in the US. By

1:32:34

division, it affected our ecosystem WordPress

1:32:36

areas the most. 79.2%

1:32:40

of those who took it were in our ecosystem business

1:32:42

compared with 18.2% from Cosmos, our

1:32:45

apps like PocketCasts. Yeah. 18

1:32:47

people made over $200,000 a year. One

1:32:51

person started two days before the deadline.

1:32:53

Wow. Four people then took

1:32:55

it and changed their minds. Okay. So

1:32:58

that's the background

1:33:01

here of what Zeldman faces. And I get it. Again,

1:33:03

I say, I really respect Zeldman. He is a

1:33:05

legend and had been around forever.

1:33:07

He says, uh, even as I made myself think

1:33:09

about what six months' salary in a lump sum

1:33:12

could do to help my family and calm my

1:33:14

creditors, I knew in my soul, there was no

1:33:16

way I'd leave this company. I

1:33:18

respect the courage and conviction of my departed colleagues.

1:33:20

I already missed them. I feel that departure is

1:33:23

a personal loss. My grief is real.

1:33:26

The sadness is like a cold fog and

1:33:28

a dark wet night, but

1:33:32

I stayed because I believe in the

1:33:34

work automatic is doing. I believe in

1:33:36

the open web and owning your content.

1:33:39

I've devoted nearly three decades of work to this

1:33:41

cause. And when I choose to move in house

1:33:43

or when I chose to move in house, I

1:33:45

knew there was only one house that would suit

1:33:47

me: Automattic. Now he refers

1:33:50

to a post by the

1:33:52

guy who created Drupal, Dries Buytaert,

1:33:55

called "Solving the Maker-Taker Problem."

1:33:57

And, you know, Drupal,

1:33:59

which is what we use as our content management

1:34:01

system is definitely a competitor with

1:34:03

WordPress. But Dries says, I'm not going to

1:34:05

take a position on the WordPress thing, but

1:34:12

it's important to understand that every open

1:34:15

source project has people who contribute to

1:34:17

it and people who take

1:34:19

from it. And

1:34:22

it's as if, and

1:34:24

Zelman says it this way, he says, it's

1:34:27

as if you would, you'd

1:34:29

go to dinner with Paris and Jeff and I

1:34:31

go to dinner every week for

1:34:33

years and I always pay

1:34:35

and nobody else offers to pay. At some point

1:34:38

you got to say something about it. So

1:34:42

he can you summarize the core of the debate again

1:34:44

for me? What is, what

1:34:46

are people upset about? I

1:34:50

actually, I don't

1:34:52

know. I mean, I do kind of know.

1:34:55

So remember that Matt Mullenweg wrote

1:34:57

WordPress in the beginning

1:35:00

and it became very, very

1:35:03

widely used. He

1:35:05

gives it away at wordpress.org. You can download it,

1:35:07

you can install it yourself. He

1:35:09

also started a company, wordpress.com, it's

1:35:11

called Automattic. And

1:35:15

that is a managed version of WordPress. So

1:35:17

you pay somebody to run your website and keep

1:35:19

it up to date and all that stuff. 43%

1:35:24

of the web now uses WordPress. It's

1:35:29

a huge enterprise. And it

1:35:31

defeated Movable Type. That's

1:35:36

right. Cause it was open and

1:35:38

cause it's free as an open source project.

1:35:40

But there's something interesting there is that, oh,

1:35:42

hello, gizmo. Are you interested in this open

1:35:44

source maker-taker problem? Do you ever

1:35:46

pay for dinner? Gizmo, do you? Gizmo

1:35:49

never pays for dinner. She believes in

1:35:51

saving her capital. Yeah,

1:35:53

exactly. So

1:35:59

moveable type. this is relevant I think to where

1:36:01

you're going, movable type, uh,

1:36:03

tried to change the license on the

1:36:05

source of its software,

1:36:08

which meant that it put itself in

1:36:10

a conflict of interest. But WordPress didn't do

1:36:12

that and said, anybody can compete.

1:36:14

Anybody can — it's open source. We mean it. It's open

1:36:16

source. And thus it won. However,

1:36:20

WP engine was not acting

1:36:22

appropriately according to that, right?

1:36:24

Yeah. I mean, they, um,

1:36:27

they competed directly with

1:36:30

wordpress.com by offering a

1:36:32

simple managed WordPress installation,

1:36:35

but, uh, and I'm

1:36:37

not, I don't, they didn't want to contribute to

1:36:39

the open source was. Yeah. I don't want to

1:36:41

misstate Matt's objections or

1:36:44

even take a side in this. But

1:36:46

what Droy says, and I think is right,

1:36:48

is that the problem with open source software

1:36:50

is you have this, he calls it the

1:36:53

maker taker challenge, which you have people who

1:36:55

are very generously giving their

1:36:57

software away with

1:37:00

a kind of unwritten expectation that

1:37:02

people who use it, especially companies,

1:37:05

and this is a big problem who

1:37:07

use this software will contribute back, not

1:37:09

necessarily financially, could be financially, but also

1:37:11

maybe in kind with contributions

1:37:14

to the software, that kind of thing. And

1:37:16

there is this imbalance and we know

1:37:18

that there's an imbalance between people who

1:37:21

make and give away their software. People

1:37:23

like Linus Torvalds and then big

1:37:25

companies that really

1:37:28

just use it, uh, because they don't have

1:37:30

to pay for it and don't

1:37:32

contribute back. Uh, Dries

1:37:34

says addressing the maker taker challenge

1:37:37

is essential for the longterm sustainability

1:37:39

of open source projects. And I

1:37:42

agree. I think Matt maybe

1:37:45

lost his head a little bit on

1:37:47

this. You know, it really became personal for him. Um,

1:37:50

and so maybe what caused

1:37:53

the, like what flipped the switch

1:37:55

for him? Because this seems like a very

1:37:57

dramatic stance to be taking. I know. And I feel

1:37:59

like I don't. Oh, I owe you. I probably

1:38:01

shouldn't have brought this up without being more willing

1:38:03

to have an

1:38:07

opinion on it. I

1:38:10

feel a little bit like Ed. I feel

1:38:12

like a little like Ed because I really

1:38:14

know, I know and love Matt and

1:38:17

I've known him since the beginning and we've had him on the

1:38:19

show and he really is a

1:38:21

very strong advocate. Matt defends open source. He's

1:38:23

an advocate. That's why I tend to trust

1:38:25

him. Yeah. It's all

1:38:27

right to be biased on something and make

1:38:30

your bias. Yeah. This is

1:38:32

now, I decree. Perfectly neutral on everything. To

1:38:35

be known as a challenge in

1:38:37

the, in the discord. Have

1:38:39

you ever seen this? What challenge? The

1:38:41

Trenton challenge, the Trenton Makes bridge. The

1:38:44

stupidest slogan for a

1:38:47

town anywhere. Do

1:38:50

you see it on the bridge going into

1:38:52

Trenton? "Trenton Makes, the World Takes."

1:38:54

just so bitter, isn't it? Just like, okay,

1:38:56

be that way world. So

1:39:00

in mid-September, this is from tech

1:39:03

crunch. Mullenweg wrote a blog post

1:39:05

calling WP engine quote, a cancer

1:39:07

to WordPress. It's a little strong.

1:39:10

Yeah. He criticized WP engine

1:39:12

for disabling the ability for

1:39:14

users to see and track

1:39:17

the revision history for every post. This

1:39:19

is a feature of WordPress. They disabled

1:39:21

it. Mullenweg says that's the

1:39:23

core of the user promise of protecting

1:39:25

your data. WP engine turns

1:39:27

it off. He says to save money. He

1:39:31

also called out WP Engine's

1:39:33

investors, Silver Lake, a

1:39:35

private equity company. And

1:39:37

said they don't contribute sufficiently

1:39:39

to the open source project

1:39:42

and that WP Engine's use of the

1:39:45

WP brand has confused customers into thinking

1:39:47

it's part of WordPress, which I think

1:39:49

probably some people do. It's not, I

1:39:52

think to somebody who's not paying attention,

1:39:54

it's really not clear what wordpress.org, wordpress.com,

1:39:57

WP engine, automatic, what are

1:40:00

they, where are they. He said something like if

1:40:02

you gave ten dollars to WordPress you'd

1:40:04

be doing a hundred times more than

1:40:06

WP Engine is given to the open

1:40:08

source project yeah that's pretty

1:40:11

strong WP

1:40:14

Engine of course sent a cease

1:40:16

and desist letter and automatic

1:40:18

sent a cease and desist to

1:40:20

them and you know then the

1:40:22

battle goes on again

1:40:27

I... my sympathies are with Matt. I

1:40:29

think he's maybe this

1:40:32

is a hot button for him and he's

1:40:34

maybe overreacting a little bit but I also

1:40:36

don't blame him because it is a problem

1:40:38

and there you have it okay

1:40:42

because we wanted to kind of cover it on last week

1:40:45

but Ed's recusal made it difficult

1:40:47

to talk too much I bet he

1:40:49

left a big his him being

1:40:51

silent leaves a big void in the room yeah boy

1:40:53

when you got him throughout the show and then suddenly

1:40:58

it really you hear

1:41:01

it all

1:41:03

right you said there's a lot of

1:41:06

Google news of Google news

1:41:09

oh my god I missed all of this I've

1:41:11

just rolled down just to give it a quick

1:41:13

digest here a judge well wait a minute the

1:41:15

most important one is we're gonna break up Google

1:41:17

says the Department of Justice right

1:41:20

that's what the DOJ says well

1:41:22

they had no the DOJ says

1:41:24

we have a menu of punishments

1:41:26

for Google, your honor.

1:41:28

Take your pick. And

1:41:31

so it goes up to breaking up

1:41:33

but that's not the only remedy there's

1:41:35

other remedies. One is they can't pay anybody

1:41:37

for search which means that Mozilla

1:41:40

and Apple are badly hurt

1:41:43

another is that they

1:41:45

that saves Google forty

1:41:48

billion dollars fortune

1:41:51

it doesn't hurt Google. Makes no sense. Another

1:41:54

is that they have to advertise that you

1:41:57

have choices in search That's

1:42:01

dumb. That's dumb. Another

1:42:03

is to break up, though it doesn't say exactly how. And

1:42:07

so I have Google's response here at mine.

1:42:10

DOJ's radical and sweeping

1:42:12

proposals risk hurting consumers,

1:42:15

businesses and developers. Of

1:42:17

course Google's going to say that. Of course they

1:42:19

will, yes. But not Robert. And you know what,

1:42:21

with some merit, I mean, it's not like, it

1:42:23

isn't, you start knocking

1:42:25

at the supports

1:42:27

of this house of cards and who knows what's

1:42:29

going to happen. And do you end up with

1:42:32

five more valuable companies that, you know, because the

1:42:34

thing is consumers will probably still pick Google

1:42:36

search. Right. Remember the

1:42:38

DOJ was going to break up Microsoft and

1:42:41

we talked about this earlier on Windows Weekly

1:42:43

and really the upshot of that would have

1:42:45

been having breaking up Microsoft into two companies,

1:42:47

operating systems and soft and work

1:42:50

office would have been two

1:42:52

more valuable companies. Consumers would have gotten

1:42:54

shares in both. Everybody wins. So

1:42:58

yeah, it's, I would love to see

1:43:00

Google be forced to give up YouTube. I think

1:43:02

that's a little much. I think

1:43:04

Google, the fact that Google controls

1:43:06

all ends of the advertising transaction

1:43:08

is clearly problematic. It

1:43:10

has hurt us with an Elon

1:43:12

Musk. What

1:43:15

if it ends up with Satoshi saying, I know what I

1:43:17

want to spend my 60 billion on. I'm going to buy

1:43:19

YouTube and do crazy things. Right. Yeah.

1:43:22

It's, you know — be careful you don't upset the apple cart. That's a good

1:43:24

point. Good point. And

1:43:26

then the major papers, the Financial Times said,

1:43:29

this is a gift link. The

1:43:31

Google breakup reads like an antitrust fan

1:43:33

fiction. Right. And the New York

1:43:35

Times said, this is going to be really hard

1:43:38

to do. Um,

1:43:41

and at the same time, I love this, the story comes

1:43:43

in here. A wall street

1:43:45

journal story from just the other day said

1:43:47

Google's grip on search slips as

1:43:49

TikTok and AI startups mount

1:43:52

challenge. And so it's

1:43:54

just like Microsoft. The timing is. It's

1:43:57

hard because these things move glacially slowly in

1:43:59

the technology industry. And this is what's

1:44:01

clear is Google's gonna fight full-on. They're

1:44:03

gonna go through every possible court. This

1:44:06

is gonna take years upon years upon

1:44:08

years. Well I mean they have to.

1:44:10

It's like a for their shareholders. They

1:44:12

have a fiduciary responsibility too.

1:44:15

But they're gonna spend money now. One of the stories I think

1:44:18

was the time

1:44:20

story said they're gonna spend money now

1:44:22

in current dollars, and

1:44:25

it's worth it because in the future by the

1:44:27

time anything ever happens the value

1:44:29

of what they spent will have

1:44:31

gone down with inflation. So it's

1:44:34

also in their fiscal interest

1:44:36

to just fight as long as possible. So

1:44:40

it's funny because Paul Thurrott said I really want to

1:44:42

know what Jeff Jarvis thinks about this. Oh

1:44:45

Paul Therat. By the way Paul who

1:44:47

I'm bitter at he was in Berlin

1:44:49

for more than a week. I told

1:44:51

him like five times on Facebook you

1:44:54

have to go to the food floor

1:44:56

in KaDeWe. He never went and he

1:44:58

ate at the same bloody restaurant nearby

1:45:00

like five times having stupid currywurst

1:45:03

every single time. I think

1:45:05

he actually loved currywurst. He talked a lot

1:45:08

about it. Every single time he had currywurst.

1:45:10

Jeez, it's awful. I

1:45:13

love currywurst. — It is awful. — But anyway, Michael,

1:45:15

Lisa's son is going to Germany with

1:45:17

his German tutor and his dad in

1:45:20

a week. Wow. And he will they

1:45:22

will be going all over Germany including

1:45:24

Berlin. Would you please text me

1:45:26

or send me or email me the name of

1:45:28

that place and I will make sure that they

1:45:30

go there. Yeah, Kaufhaus des Westens. They should go

1:45:32

to the film museum. Yeah

1:45:35

see this is the problem. Anyway

1:45:38

Uli is a native. His German

1:45:40

tutor is native. By

1:45:42

the way Michael speaks incredibly fluent

1:45:44

German. Uli said he could teach German

1:45:46

if he wants. His kid's 21 and

1:45:49

for some reason decided he wanted to learn

1:45:51

German and quickly outpaced what the school

1:45:53

could offer so he has this native

1:45:55

speaker tutor who's fantastic so

1:45:59

I can't wait because he's he's going to go to Germany

1:46:01

and suddenly the language he's been learning is going to come

1:46:03

alive. Yeah. I went

1:46:05

finally — my Deutsch ist sehr schlecht —

1:46:07

I went finally when I was 24 and

1:46:10

I said, why didn't I pay attention? Yeah.

1:46:13

It's a fascinating culture and country. Anyway,

1:46:16

I forget where I did that to us, didn't I? No,

1:46:18

but do bring us the, send me

1:46:20

the Klaus Schaufelsstaffen. Kaufhaus

1:46:23

des Westens. KaDeWe. Okay.

1:46:26

The other story. Oh, no. Before

1:46:29

I do that. Paul Thurrott wanted to know what

1:46:32

does Jeff think? What should

1:46:34

happen? I think that

1:46:39

what this exposes is

1:46:42

the inadequacy of antitrust doctrine

1:46:44

today because, and we've talked

1:46:46

about this many times, if

1:46:49

you go under US doctrine, it's about consumer harm. Consumers

1:46:52

are not harmed. They are helped in each one

1:46:54

of these cases. I do

1:46:56

think the one place where Google is vulnerable,

1:46:59

which is not this case, because this is the

1:47:01

search case, is in the ad case. Yeah. And

1:47:04

I think I disclosed on this show, I got

1:47:06

a call from lawyers who, I don't know who they could

1:47:08

have been for, wanting to see if I wanted to be

1:47:10

an expert, if I was qualified to be an expert witness.

1:47:13

And as the talk went on, gee, it was

1:47:15

about advertising and antitrust and this and that. And I

1:47:17

said, well, you might want to know what I said

1:47:20

in my new book, The Web We Weave, on sale

1:47:22

now, that is the

1:47:24

one area where Google is most

1:47:26

vulnerable and antitrust. They said, okay,

1:47:28

thanks. Nevermind. We

1:47:31

don't, we don't really want you. I agree with

1:47:34

you because they own the entire chain. They buy

1:47:36

the cell, they make the market. But

1:47:38

even there, there's an argument that says that

1:47:40

the market is more efficient because they're there

1:47:42

doing all that, but it is the area

1:47:45

where they're vulnerable. Search, it's ridiculous. It's a

1:47:47

red herring. Their shopping, which

1:47:49

the Europeans go after, is ridiculous. Just

1:47:51

how do you feel? How do

1:47:54

you feel about the fact that Google

1:47:56

lawyers had you pegged as an

1:47:58

ally of Google? Because I wrote a book called What Would

1:48:00

Google Do? I mean

1:48:03

you think it's just as simple as that. I'm on a podcast. You

1:48:07

must love Google, right? But I assume they

1:48:09

probably put some other, you know, yeah, well,

1:48:11

I've said it on the show. I've said

1:48:13

on the show often, I think that the

1:48:15

case in other areas is

1:48:18

BS. But in that area, I actually agree with

1:48:20

the government going investigating. I think that's the right.

1:48:22

It's hard to know what to do though, isn't

1:48:24

it? Because you don't want to upset the apple

1:48:26

cart. Yeah. It's really

1:48:29

hard to know what to do. That's because it's

1:48:31

through the apple cart. Who need apples of cart?

1:48:33

You want to do it on the mayor's side?

1:48:35

They came in and tipped over everyone else's

1:48:37

apple cart. So why can't we tip their apple

1:48:39

cart? I would also say that your point earlier

1:48:41

about consumer harm, I don't know.

1:48:43

I think there should be an asterisk there.

1:48:45

It's like, yeah, perhaps consumers aren't being harmed

1:48:47

under the specific way the US justice system

1:48:51

defines consumer harm currently. Some people

1:48:53

argue the consumers are being harmed

1:48:56

because monopolistic forces deprive them of

1:48:58

choices that would otherwise be available

1:49:00

if it wasn't such a concentrated

1:49:02

market. Would you want a European

1:49:04

model where the antitrust is more

1:49:07

about too big is too big?

1:49:10

Yeah. Or you could say the

1:49:12

way Google has manipulated the ad market makes it

1:49:14

very difficult for blogs to

1:49:17

succeed and for podcasts like ours

1:49:19

to succeed. And they're

1:49:21

dom, you know, they are very dominant

1:49:23

in advertising between Google and Facebook, pretty

1:49:25

much all digital advertising, something like 80

1:49:28

or 90% of it goes through Google and Facebook.

1:49:32

Their dominance means, you know,

1:49:34

they, there's no

1:49:36

competitive market for advertising. That's

1:49:40

not good for us. Yeah. So I think advertising

1:49:42

is where they're vulnerable, whereas the search

1:49:44

case here, I think, is ridiculous. Yeah.

1:49:46

Because in the long run, something else is going to

1:49:48

come along like TikTok and it's not going to matter

1:49:51

anyway. So the

1:49:53

other, and they're actually, they're actually

1:49:55

ruining their own search. Frankly.

1:49:58

We've, I think that the web. AI

1:50:00

is ruining the web which in turn makes search

1:50:02

impossible to do and go

1:50:04

do a search though and the first half of the

1:50:06

page is nonsense — not

1:50:08

AI nonsense Google provided nonsense,

1:50:10

you know. Well, they're also gonna

1:50:13

add ads into their Yeah,

1:50:16

AI answer and that's gonna piss off the publishers

1:50:18

who say well now you're making money on the

1:50:20

stuff that you're training on from us There

1:50:23

they're not too clever in some cases. So isn't

1:50:25

this all part of it like you're saying Like

1:50:28

the the search is not a good

1:50:31

Vector of attack but like you're saying their search

1:50:33

is getting so bad now But there's no alternative

1:50:36

because no one has been able to

1:50:38

compete with Google Well, there is a disagreement about

1:50:40

whose fault that is, I think. Yeah. Yeah,

1:50:43

he thinks it's the Like the

1:50:45

bad search reflects the bad content. If you

1:50:47

want to go after Google pick your best

1:50:49

case and your best case is advertising Not

1:50:51

search. That's what I'm saying Now

1:50:56

whether you should go after Google is a different question

1:50:58

I'm just saying that if we if we buy that

1:51:00

however, I think the other important case is

1:51:02

in the app store Which I think is

1:51:04

gonna hurt consumers you saw that Yes,

1:51:08

so this is the other one. A judge ordered

1:51:11

Google to pry open — this is Shira

1:51:13

Ovide writing in the Washington Post — its

1:51:15

Android app store to competition on Monday.

1:51:19

Judge James Donato. This is a

1:51:21

victory for Epic Games remember Epic Games

1:51:24

the maker of Fortnite sued both Google

1:51:26

and Apple They didn't do very well in

1:51:28

the Apple case But it looks like they've

1:51:31

got pretty much a complete victory in the

1:51:33

Google case They won a

1:51:35

jury verdict last year that said the Play

1:51:38

Store was an illegal monopoly Epic

1:51:41

wanted to The problem

1:51:43

for Epic was this 30% vig

1:51:45

that Google and Apple take of

1:51:48

sales Epic wanted to

1:51:50

sell in-game goods and

1:51:52

products without giving money to Apple or

1:51:54

Google I'm not sure

1:51:57

why they lost in the Apple

1:51:59

case. I think because there wasn't a jury. It

1:52:01

was a judge. And

1:52:04

in the Google case, it was

1:52:06

a jury verdict. So

1:52:09

after the jury ruled last year that the

1:52:11

store was a monopoly, Donato, Judge

1:52:14

Donato was tasked with mandating changes to the

1:52:16

App Store to fix the behavior. He

1:52:18

says that Google has to allow

1:52:21

other apps, app

1:52:25

stores in

1:52:28

the Google Play Store. The

1:52:30

judge required Google to remove roadblocks that

1:52:33

largely discourage businesses from making Android apps

1:52:35

available to download from their websites. Or,

1:52:38

by the way, nobody wants to do that. Fortnite did

1:52:41

it for a while. Oh, you don't have, because Android

1:52:43

does not make you do it from the Play Store.

1:52:46

There is a security checkbox you can check and

1:52:49

then you can go to the Fortnite site

1:52:51

and download it from there, if you want

1:52:53

side loading, they call that. The

1:52:56

judge says you gotta take away

1:52:58

that checkbox, that roadblock, or allow

1:53:01

digital storefronts, not controlled by Google.
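A quick sketch of the roadblock being described, for the technically curious: on modern Android, an app that wants to install other apps — a third-party store, or a browser that downloaded an APK — needs the REQUEST_INSTALL_PACKAGES permission in its manifest, and the user has to flip a per-app "Install unknown apps" switch in Settings. The store activity below is hypothetical, not Epic's or Google's actual code; it only illustrates what that check looks like.

import android.content.Intent
import android.net.Uri
import android.os.Build
import android.provider.Settings
import androidx.appcompat.app.AppCompatActivity

// Hypothetical third-party app store screen; illustrates the "checkbox" the ruling targets.
class StoreActivity : AppCompatActivity() {

    fun ensureInstallPermission() {
        val canInstall = Build.VERSION.SDK_INT < Build.VERSION_CODES.O ||
            packageManager.canRequestPackageInstalls()
        if (!canInstall) {
            // Sends the user to Settings > "Install unknown apps" for this specific app.
            // Until they toggle it on, any APK this store downloads cannot be installed.
            startActivity(
                Intent(
                    Settings.ACTION_MANAGE_UNKNOWN_APP_SOURCES,
                    Uri.parse("package:$packageName")
                )
            )
        }
        // Once granted, the store can hand the downloaded APK to the system installer
        // (for example via the PackageInstaller API).
    }
}

That extra trip through Settings is the kind of friction the order is aimed at removing.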

1:53:03

So somebody could download an Epic,

1:53:05

this is what Epic really wants,

1:53:08

download on their Android phone from the

1:53:10

Google Play Store, an Epic store, where

1:53:12

you could buy stuff for Fortnite, buy

1:53:14

other games from Epic. Apple

1:53:17

absolutely does not allow this except they're being

1:53:19

required to now in

1:53:22

Europe and they're dragging their feet,

1:53:25

putting up a whole bunch of roadblocks, which the

1:53:27

EU's not too happy about. I

1:53:29

imagine Google will face the same thing. Donato

1:53:32

also said app makers can offer people

1:53:34

more options to pay for digital purchases,

1:53:37

like Disney Plus streaming, or extra lives in

1:53:39

Candy Crush. Google right now

1:53:41

requires in-app purchases to go through its own payment

1:53:43

system, and that's when they get their 30%. This

1:53:48

is a very big deal. This is what they

1:53:50

have. Because it's also about the security of what

1:53:52

you can put on your phone. Now

1:53:55

is that worth 30% vig or not? That's

1:53:57

a debate to have, but it is a... service

1:54:00

to users that Google and

1:54:02

Apple each. Uh,

1:54:05

well, that's what they say. I don't think

1:54:07

it is. I don't think it does provide

1:54:09

a lot of security. They would. That's what

1:54:11

the argument for them. It is for Apple.

1:54:13

It is a massive windfall. It is a

1:54:15

big part of their revenue. Uh,

1:54:17

I imagine it is for Google as well. They

1:54:20

would like to, it really, the real question

1:54:22

is, does the

1:54:24

maker of your smartphone have

1:54:27

the right to control everything on that

1:54:29

smartphone? Right. Which

1:54:31

is a very, um, absolutely

1:54:34

not. You own

1:54:36

it. It's like saying, Oh, if

1:54:38

you buy a car from Audi,

1:54:40

you have to use shell gasoline and

1:54:42

no other. Or like today,

1:54:44

if they implemented a 30% vig on

1:54:46

your Mac software, everyone would do it. Oh yeah.

1:54:48

That's — it would be an uproar. Well, it

1:54:51

is fascinating to me that we've

1:54:53

gotten to this position with phone

1:54:55

smartphones, yet no

1:54:58

other platform. I would

1:55:00

submit it's because these companies learned

1:55:02

from their desktops and

1:55:04

the smartphone came along, remember much later starting in 2007.

1:55:08

And they said, we're not going to make the same

1:55:10

mistake. There was, I think

1:55:12

there was very, you're right, Jeff, a legitimate security

1:55:15

argument. They said, we really got a smartphones

1:55:17

are going to be a security target.

1:55:20

And we really got to lock them down. Right.

1:55:22

And that, tied with the vig, makes

1:55:24

their argument less valid. Oh, we make some money

1:55:27

too on it. That's okay. But

1:55:29

I do think that, uh, I mean, there's a

1:55:31

security issue on desktops as well. I mean, that

1:55:33

doesn't, that doesn't go away. I can

1:55:35

download something that's been there and that we're all that's

1:55:37

right. We're just used to it, we all know it. Yeah.

1:55:39

Unless you have a Chromebook, which is controlled by Google

1:55:41

and means that I have none of these problems. That's

1:55:43

true. It's a good point. So

1:55:46

does, so that's the real question that's, uh,

1:55:48

and that's what the jury has decided, but

1:55:50

is again, the question is, does

1:55:53

the maker of a smartphone have the

1:55:55

right to control what's on that smartphone?

1:55:58

For your benefit. And

1:56:00

I agree with you Benito, absolutely not.

1:56:07

Anybody disagree? So there was a lot of Google news.

1:56:09

We got to the Google news. No, so what's

1:56:12

weird is that Google now has to do this, but Apple

1:56:14

doesn't. That is interesting. That's very

1:56:16

weird. Different

1:56:18

jurisdictions, different courts. And

1:56:21

most importantly, one

1:56:24

was a jury decision and one was

1:56:26

a judge's decision. And I think the

1:56:28

judge probably is more sophisticated than the

1:56:30

jury. I don't know. Well,

1:56:33

does that leave any, we need

1:56:35

Kathy here. Does that leave any

1:56:38

cause in itself for appeal to

1:56:41

say equal treatment? I

1:56:43

think Epic has decided not, no Epic and

1:56:46

Apple both went to the Supreme Court, but

1:56:48

with just smaller issues in the overall case.

1:56:51

So I think Epic's decided that they're going to

1:56:53

live with that one. They're very happy about the

1:56:55

Google. No, what about Google saying we shouldn't be

1:56:57

subject to this if Apple isn't. Oh, oh, the

1:56:59

other way around. Yeah. Is

1:57:02

that a basis for appeal? I don't think so. It's

1:57:06

like saying, hey, that cop was driving 80

1:57:08

miles an hour in a 60 mile

1:57:10

zone. Why can't I? Like,

1:57:14

tough luck. You're out of

1:57:16

luck. Cause you don't have qualified immunity. Oh yeah,

1:57:18

that I forgot about that. Neither does Google. Yeah.

1:57:23

Wow. Very, that's, yeah,

1:57:25

I don't know. More and more.

1:57:27

I used to on these shows, be

1:57:29

willing to express ill

1:57:33

formed opinions.

1:57:35

Hot devil's advocate. You saw that as your

1:57:37

role. And I kind of lost that. I

1:57:40

kind of now, I don't know, maybe cause

1:57:42

I'm getting older or cause

1:57:44

I'm losing my marbles. I don't know.

1:57:46

I don't know what the answer is anymore. Yeah. That's

1:57:50

where it's kind of crazy takes on AI.

1:57:53

It's yeah. I don't even that, even that I've kind

1:57:55

of given up on haven't I, I'm

1:57:58

just kind of, I should be more. more like

1:58:00

Ed, like certain about

1:58:02

everything. It's better programming.

1:58:04

I mean, it seems exhausting to do that. It

1:58:06

is. I'm exhausted anyway, so I think

1:58:09

it's me. I was thinking about this the other day,

1:58:11

is like, you know, I used to be so into

1:58:14

tweeting, posting, so always coming up with

1:58:16

a hot take on something, always out

1:58:18

there looking for content, and it really

1:58:20

was good for, you know, follower

1:58:23

counts and things like that, but that's too exhausting.

1:58:25

I don't have time for that. Yeah, do you

1:58:27

think maybe we burned out on that? I mean,

1:58:29

certainly I look at Twitter now and I go,

1:58:32

these people are just performing. They're performing. I mean,

1:58:34

yeah, it's absolutely all performing, and the performance is

1:58:36

exhausting. It's exhausting. It's a lot of work. And

1:58:40

to whose benefit? I

1:58:42

see it as the high school cafeteria of the

1:58:44

internet. We outgrew it. No, it's

1:58:46

definitely just, you know, a peacock

1:58:48

fluffing up its feathers. Which

1:58:51

can be a beautiful sight. I miss a peacock.

1:58:53

You know, you see someone make a really good

1:58:56

post, and you're like, that's a beautiful sight, but

1:58:58

also to what end? Yeah. Although

1:59:00

I have to say, I go to Blue Sky, and I see Drill trying

1:59:03

really hard, and it

1:59:05

just doesn't work on other places. No,

1:59:07

I think Drill works no matter where he goes.

1:59:10

Really? You like Drill's posts on Blue Sky? Yeah,

1:59:12

I think they're good. Okay, I just, maybe

1:59:14

I'm- I just like the fact that Blue

1:59:16

Sky, when they were coming into

1:59:18

existence, hiring their first engineers' employees,

1:59:21

every single person got a paperback

1:59:24

copy of Drill's tweets, and

1:59:26

they reserved Drill's handle on Blue Sky.

1:59:28

They cared. We're so surprised when he

1:59:30

tried to sign up, that

1:59:32

he was like, why can't I get my handle? And

1:59:35

they were like, oh no, we've been saving it for

1:59:37

you, Mr. Drill. Mr. Drill, welcome.

1:59:39

I'm happy whenever- So

1:59:42

I have started, tell me if I'm

1:59:44

wrong doing this. I got a new

1:59:46

program for iOS called Croissant, which

1:59:49

is a cross-posting app, and has a nice

1:59:51

little croissant icon. It

1:59:53

doesn't post to Twitter. Don't get your hopes

1:59:55

up. It's Blue Sky,

1:59:57

Threads, and Mastodon, but I've

1:59:59

been using it. because those are

2:00:01

the three places that, at least for

2:00:03

promotional posts, like buy my

2:00:06

son's book, Salt Hank, a five

2:00:08

napkin situation, available in bookstores everywhere,

2:00:10

and when I post that, I

2:00:13

want to post it to Mastodon too.
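Croissant's internals aren't public, but a cross-poster like this is mostly just calling each network's posting API with the same text. A minimal sketch for Mastodon, assuming an instance URL and an access token you've generated yourself — Bluesky and Threads would get analogous calls against their own APIs (e.g. atproto's createRecord):

import java.net.URI
import java.net.URLEncoder
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Placeholder values — substitute your own instance and token.
const val INSTANCE = "https://mastodon.social"
const val ACCESS_TOKEN = "your-access-token-here"

// Publishes one status via Mastodon's public statuses endpoint.
fun postToMastodon(text: String): Int {
    val body = "status=" + URLEncoder.encode(text, Charsets.UTF_8)
    val request = HttpRequest.newBuilder()
        .uri(URI.create("$INSTANCE/api/v1/statuses"))
        .header("Authorization", "Bearer $ACCESS_TOKEN")
        .header("Content-Type", "application/x-www-form-urlencoded")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    return response.statusCode() // 200 on success
}

fun main() {
    // A cross-poster fans the same text out to every network it supports.
    println("Mastodon returned " + postToMastodon("Hello from a cross-posting sketch"))
}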

2:00:15

Is it only a phone app, or is there also a web version?

2:00:18

No, no, you have to have it on an iPhone, I think. It's

2:00:21

nice, every time you refresh it,

2:00:24

it gives you a new prompt, like

2:00:27

right now it says put your words in me.

2:00:30

Let me see what it says next time. That's twee.

2:00:32

Hello world. That's just dumb. No,

2:00:35

I like it. So much is

2:00:37

happening. It's very 2010 internet. Yeah,

2:00:40

well I'm a 2010 internet guy.

2:00:42

How about this one? Drop some

2:00:45

truth. Oh jeez, okay,

2:00:47

you lost me. So it's playful.

2:00:49

You should say truth doesn't exist.

2:00:53

Impartiality is

2:00:55

a nonsense. Okay,

2:00:57

Friedrich. Let's

2:01:00

see. All

2:01:04

right, I am, as I said, I am jaded,

2:01:06

bored, and we're almost out of time, so let's

2:01:09

each of you- You gotta go see

2:01:11

the movie tonight. I got tickets for the-

2:01:13

Don't buy it all alone. We need

2:01:15

to take out one pause. Yeah,

2:01:18

I'm gonna do a pause before the picks

2:01:20

of the week. Don't worry,

2:01:22

Benito. I'm gonna take care

2:01:24

of your revenue generating opportunities. I

2:01:28

don't think I'm gonna go see Megalopolis in the

2:01:30

theater. I think that sounds unbearably sad. I

2:01:34

mean, yeah, it would be sad if you're the only person

2:01:36

there. Also, yeah, I mean, I

2:01:39

only really like seeing movies in the theater. If I

2:01:41

go to one of the theaters that are near me

2:01:43

that have good popcorn. They have tacos. And

2:01:46

alcohol. And alcohol, baby.

2:01:49

Actually, our theater tried that. I

2:01:51

would recommend being a little inebriated for Megalopolis.

2:01:53

It's definitely not a sober movie. Especially

2:01:56

in the suburbs. Maybe you New Yorkers

2:01:58

do, but- I was said

2:02:01

to Howard Stern about going to the,

2:02:03

to the dome in Vegas. No. What?

2:02:05

Oh, and so she, she took, um,

2:02:07

Doug, uh, there as a

2:02:10

surprise, uh, when they were in Vegas

2:02:12

and she said, uh, definitely

2:02:14

do not go in there in an altered

2:02:16

mind altered state. Why

2:02:19

don't you like high? And she said, yeah,

2:02:21

don't — which I love, because that's our

2:02:24

hoped-for future president saying, yeah, I've

2:02:26

been there. Wow. Not just that

2:02:29

I inhaled, uh, accidentally. I

2:02:32

feel like I want to hear less from

2:02:34

our candidates, not more. Yeah. I

2:02:37

think can we just get this damn thing

2:02:39

over with? I'm so sick of this. Uh,

2:02:43

pick a story, any story, Paris,

2:02:47

uh, Caroline Calloway,

2:02:49

uh, announced that

2:02:51

she's not evacuating. She lives in Sarasota

2:02:54

at a beach front property

2:02:56

and is not evacuating for the hurricane.

2:03:00

Wait, who is Caroline Calloway?

2:03:02

She is an influencer who became famous

2:03:04

over the

2:03:08

past seven or eight years. I

2:03:10

think there was originally some New York magazine. She's

2:03:12

not the hawk to a girl is she? No,

2:03:14

she's not the Hawk Tuah girl. She looks

2:03:17

similar. She was, I think well known for writing,

2:03:19

uh, these really, really long Instagram captions

2:03:21

and then became known as kind of a prolific

2:03:23

scammer. She like did a bunch of low key

2:03:26

and medium scams. One of the most famous of

2:03:28

which, I don't know if this really counts, the

2:03:30

scam is she had like an apartment in New

2:03:32

York city that she really poorly

2:03:35

painted. She painted around a bunch of piles

2:03:38

of clothing on her floor and like half

2:03:40

painted them, basically trashed the apartment, never paid

2:03:42

rent on it for like a year or

2:03:44

two and then got evicted and now lives

2:03:47

in Florida. Um, but she's apparently has a,

2:03:49

is a fan of wax lips. Look at

2:03:51

that picture. Yes is apparently a fan of

2:03:54

wax lips. Um, so

2:03:56

she announced on Twitter and Instagram yesterday

2:03:58

that she's staying put in Sarasota where

2:04:00

Hurricane Milton is heading. And everyone was

2:04:02

like, is this a bit? What's going

2:04:04

on? She had an interview with New York mag.

2:04:07

And she's like, no, I'm just staying. And

2:04:09

I think I'm going to be fine. I'm in on

2:04:11

the third floor of a building

2:04:13

that's rated as hurricane safe. So we'll be

2:04:15

all right. So somebody

2:04:18

tweeted her dying in a

2:04:20

hurricane would be the perfect ending to her

2:04:22

narrative. Oh, geez. A

2:04:24

little dark. A little dark. So

2:04:29

folks don't stick it out. Is

2:04:31

it too late though? It might be too late. Right now it's

2:04:33

too late. But

2:04:36

right now, maybe follow the advice of

2:04:39

local authorities in Tampa and nearby

2:04:41

areas, which is if you

2:04:44

decided to stay, if

2:04:46

you decided to stay, they say get

2:04:49

a permanent marker and write your name,

2:04:51

date of birth, and information for

2:04:53

your next of kin on your arms or legs so

2:04:56

that they can identify you. Yeah,

2:04:58

no, that's the actual guidance. That's

2:05:02

pretty grim. That's dark as hell.

2:05:05

Please use an indelible marker and write

2:05:07

your name on your arm so

2:05:10

we know who to call. I

2:05:12

mean, part of the issue in some of these areas

2:05:14

is when it is going

2:05:17

to be that devastating of

2:05:19

a forecast. Emergency service

2:05:21

providers are also evacuating.

2:05:23

Right. So there's no

2:05:25

one to come get you until things have

2:05:28

calmed down. Well, we

2:05:30

were talking about this yesterday. The other side of the

2:05:32

story is there's looting. And

2:05:35

people are afraid if they evacuate, looters

2:05:37

will come in. On the other

2:05:39

hand, wouldn't you rather lose your stuff than your life?

2:05:42

Yeah, and it's also, I mean, those looters,

2:05:45

let's say if there are looters in some place

2:05:47

like where the hurricane is going to have a

2:05:49

direct hit, those looters probably aren't going to get

2:05:52

very far with your things or they are hit

2:05:54

by a hurricane. They're not looting your things. They're

2:05:56

looting the supermarket for food because they can't leave.

2:05:58

That's what I would say. I think

2:06:00

really, yeah. Yeah, it

2:06:02

sounds like more of that racist nonsense.

2:06:04

Yeah. My family and

2:06:06

stuff are fine, but it's been really

2:06:08

dismaying to see the constant, uh,

2:06:11

I feel like refrain you hear from Floridians during

2:06:13

hurricanes. Oh, we were fine during Ian or something

2:06:16

else. So we're going to be fine now. This

2:06:18

hurricane is a big deal. It's going to be

2:06:20

worse than Ian. It only takes one to wipe you

2:06:22

out, man. And for a lot

2:06:24

of the places like Sarasota, where

2:06:26

Calloway is, and things like that,

2:06:28

the notable difference is it's going

2:06:31

to be hitting kind of the north of

2:06:33

them. And Floridians know the south and southeastern

2:06:35

part of the hurricane is the most dangerous

2:06:37

because that's going to result in

2:06:40

huge storm surge, which is really what a

2:06:42

lot of the real threat is —

2:06:44

the rising sea level. Yeah. Oh

2:06:47

yeah. Yeah. Anyway, so that's my uplifting story for the

2:06:49

day. And you know, like none of the news ever,

2:06:52

ever, ever brings up climate change

2:06:54

when they report on this. No one says

2:06:56

it. No one says it. It's just going to

2:06:58

become very obvious. And I mean,

2:07:00

for anybody out there who's poo pooing us for

2:07:02

bringing this up, it is

2:07:04

basic facts. I mean, the reason

2:07:07

why you end up getting intensified storms like this

2:07:09

and the reason why this storm in particular, I

2:07:12

believe they said it explosively

2:07:14

intensified is because hurricanes come

2:07:17

from a heated

2:07:19

water in the Gulf and the water has

2:07:21

been getting hotter. It's

2:07:23

unusually hot right now. And

2:07:26

that's what's been producing this mega storm so

2:07:28

quick on the heels of the last storm.

2:07:30

It's hot because of Jewish space lasers,

2:07:32

right? Oh, of course. Yeah. Okay.

2:07:36

Just checking. You know, this is one I don't

2:07:38

want to be right about. I just

2:07:41

wanted to do something about it. I mean, I'd

2:07:43

love to just be over cautious and have everybody

2:07:45

be fine. But seeing

2:07:47

what we saw with Helene recently

2:07:49

seems unlikely. I just, I feel for

2:07:51

everybody and stay safe folks,

2:07:53

really. We have a lot of

2:07:55

listeners who lost power, lost internet

2:07:58

in North Carolina. We

2:08:00

were hearing from them — you know, some of

2:08:02

them were able to watch over other

2:08:05

systems. And something that's worth noting

2:08:07

for listeners who could be impacted by

2:08:09

this, someone who could be impacted by this, if you

2:08:11

have an iPhone or they do, the newest

2:08:14

iOS update has a feature that

2:08:16

could be really useful in this,

2:08:18

which is if you've updated to

2:08:20

the latest iOS, if you're

2:08:22

in an area without cell service,

2:08:24

as people often are after this,

2:08:27

you can send texts via satellite.

2:08:30

And I believe in the wake of Hurricane

2:08:32

Helene, people in kind of the

2:08:35

North Carolina impacted areas were able to use

2:08:37

the feature without any charge. But

2:08:39

they were able to send texts to

2:08:41

loved ones saying, I'm all right, or

2:08:44

I need help. It's kind of amazing. Really? How

2:08:47

could they do without a current satellite? The

2:08:50

newer iPhones support satellite texting. Right,

2:08:52

but how can you do that with no electricity? You have

2:08:54

no screen? Well, if your phone's dead, it

2:08:57

won't work. But

2:08:59

a thing a lot of people do in hurricanes, you keep your phone

2:09:01

off until you need it. Right. Right.

2:09:04

Yeah, the Wall Street Journal story, I have the rundown there at

2:09:06

the top of the end, 114, has

2:09:10

a kind of animation of what Pixels will

2:09:12

do. Some Pixels

2:09:14

have it also — you have to stand there and

2:09:16

point it at the right satellite and

2:09:18

it guides you to do

2:09:20

so and then says, OK, you're in. You

2:09:23

have to be outside, not under trees. Clouds

2:09:25

are OK. Paris, have you ever been through

2:09:27

a big hurricane? So

2:09:31

you know a lot about them. No, I mean,

2:09:33

there were hurricanes a lot growing up, but we

2:09:35

never experienced much devastation in comparison

2:09:38

to this. I have a little

2:09:40

bit of typhoons where I'm from — typhoons in the

2:09:42

Philippines. Yeah. So yeah. So

2:09:44

you have been in them many times. Yeah. I

2:09:46

mean, it's terrifying. It's old hat to me. But

2:09:49

these these storms are a little stronger than what

2:09:51

we were going through. We went through seasonal monsoon

2:09:53

and things like that. Right. And there were typhoons.

2:09:56

But not like Cat 5. Yeah,

2:10:01

I feel like we had like cat

2:10:03

two, cat three, cat four, but

2:10:05

in most cases power came on after a

2:10:08

couple days. And it

2:10:10

was the only impact to me was like

2:10:12

having to spend days as a child picking

2:10:14

up downed trees and things like

2:10:16

that. But it's quite

2:10:18

sad what's about to happen. Our

2:10:23

thoughts, I want to say thoughts and prayers go

2:10:25

out to you. I

2:10:28

really stay safe if you can. All

2:10:31

right, let's take a final break and then we'll get

2:10:33

your picks of the week as we wrap things up

2:10:35

on this week in Google.

2:10:38

Hey, podcast listeners, tired of ads

2:10:41

barging into your favorite news podcast?

2:10:43

Good news with Amazon Music, you

2:10:45

have access to the largest catalog

2:10:47

of ad free top podcasts included

2:10:50

with your Prime membership. Stay up

2:10:52

to date on everything newsworthy by

2:10:54

downloading the Amazon Music app for

2:10:56

free or go to amazon.com/ad free

2:10:58

news. That's amazon.com/ad free news to

2:11:01

catch up on the latest episodes

2:11:03

without the ads.

2:11:38

I've

2:11:40

kind of another silly pick.

2:11:43

I was trying to think today of what to do for

2:11:45

my Halloween costume, and trying to remember all the

2:11:47

things that happened in 2024. I

2:11:50

remember that there's a woman

2:11:52

I follow on Twitter. I

2:11:55

think Paige Skinner is her name that has

2:11:57

been keeping this Google Doc where she just

2:11:59

almost every single day for the full year

2:12:01

has been just jotting down what's

2:12:03

happened. What's been happening in popular culture.

2:12:05

And it's a real delight to look

2:12:07

through. You know, it's not always big news

2:12:10

either. Right. It could be big news.

2:12:12

Selena Gomez says she told Taylor Swift

2:12:14

about her friends hooking up at golden

2:12:16

Globes — that was early January.

2:12:18

The layoffs, the Dune 2

2:12:20

popcorn bucket. Uh, you

2:12:22

know, there are some just ones in these that are

2:12:25

funny, like The Cut $50,000 scam article or, uh, is

2:12:29

this going to be the new tweeting —

2:12:32

like, don't tweet, create

2:12:34

Google Sheets that you then

2:12:36

tweet? February 25th: that's a whole —

2:12:39

oh, nothing happened. Yeah. No,

2:12:41

they, as the year goes on, it's

2:12:43

clear that this person is living their

2:12:45

life and kind of forgets to do

2:12:47

it sometimes. Um, a lot

2:12:50

of TikTok in here, a lot of

2:12:52

TikTok, really an interesting

2:12:54

mix of things. I'm

2:12:57

going to save this for, you know,

2:12:59

we do year ends shows, uh,

2:13:01

on this show and on, uh, on

2:13:04

TWiT and MacBreak Weekly. We — I got

2:13:06

to save this and we can just go, go through it.

2:13:09

May 8th, RFK's brain worm. May 9th, Hailey

2:13:13

Bieber pregnant. May 10th,

2:13:16

Northern Lights in the US. Then

2:13:19

you go July 25th, JD

2:13:22

Vance couch. Ha ha ha.

2:13:24

You know that, that, that story kind of went

2:13:27

away. Didn't it? Kind of drifted off into the

2:13:29

August 4th, RFK and the bear in

2:13:34

Central Park. September

2:13:34

19th, Tim Robinson nude Africa,

2:13:36

Olivia Nuzzi and RFK affair.

2:13:38

Diddy on suicide watch. Wow.

2:13:40

That was a dark day.

2:13:42

Wow. Really? It'll just, you know, well

2:13:45

when Yang parodies chapel Ronan

2:13:47

mood day, yeah, she's getting

2:13:49

busy, isn't she? It's like blank. Good to see, it's

2:13:51

good to see that you have. I mean, it's good.

2:13:53

It would be a little sad if it was still going

2:13:56

on in full.

2:13:58

Let me see. Uh, February 21st,

2:14:01

Biden's dog Commander has bitten 24,

2:14:03

says Secret Service. Yeah.

2:14:05

That was a big story that kind of

2:14:08

nobody talks about anymore. I may. I

2:14:11

think I have the help for you for,

2:14:13

um, Halloween. All right. Because

2:14:15

we have a story here from Tom's

2:14:17

guide about a woman who used the

2:14:19

RFK brain worm. Just, I like

2:14:22

that. Or I think you should be the bear. So

2:14:24

you still do a Halloween costume, even though you don't

2:14:26

work in an office. Wait, you do work in an

2:14:28

office? I work in an office. Okay. I don't

2:14:30

know if I wearing it to the office

2:14:32

would be the highlight for me. I have

2:14:34

a pretty popping neighborhood for Halloween. So I'd

2:14:36

wear it around there. You just got to walk around the bar.

2:14:40

I was the OceanGate submersible last year

2:14:43

and groups of children were oohing and ahhing

2:14:45

and stopping me for photos. So you made

2:14:47

a giant, like, bean

2:14:50

that you wore? Is there a photo of that? Did

2:14:52

it explode periodically? Did it periodically? You

2:14:54

just go down, flatten. I've

2:14:57

got the part that I had an OceanGate.

2:15:00

Oh, so

2:15:03

you were just flattened. This was the post. Oh, you

2:15:05

were flat. That was the front

2:15:07

of it. You can kind of see the guys in

2:15:09

there. Oh geez.

2:15:11

They're not the actual guys. I

2:15:13

wasn't that dark. And then I had an OceanGate

2:15:15

hat on. I don't know. Oh, that's good. I like

2:15:17

it. So you like to do topical costumes. I like

2:15:19

to do a topical, ideally something

2:15:22

that's not a person. I was

2:15:24

the Ever Given container ship in

2:15:28

the Suez Canal one year. That's

2:15:28

good. That's good.

2:15:30

I like that. The one that got

2:15:32

stuck. Yeah. Yeah. So Amanda

2:15:34

Caswell at Tom's Guide

2:15:36

went to meta.ai. Perhaps you

2:15:39

could do this for Paris, Leo — and

2:15:43

asked what she should be for Halloween. And she

2:15:45

says, why didn't I think of that sooner? So

2:15:48

what did she get? She, uh, Oh, I

2:15:52

hate this. I hate

2:15:54

this so much. This sucks.

2:15:56

Meta suggests: create a colorful costume

2:15:59

with a piñata-inspired— Human

2:16:01

piñata. You can use cardboard, felt,

2:16:03

or even balloons to create the

2:16:05

piñata shape. This

2:16:07

is terrible. Meta AI suggests dress up

2:16:09

a vintage suitcase with a fun twist.

2:16:11

You can use cardboard or foam to

2:16:13

create the suitcase shape and add some

2:16:15

travel stickers. To be

2:16:17

clear, the AI is suggesting I go as a

2:16:19

suitcase, not a time traveler with a suitcase.

2:16:23

Hey Meta, can you make some

2:16:25

Halloween costume suggestions for me? I

2:16:30

wish I could. Can you make that Halloween costume like

2:16:32

a witch? Vampire?

2:16:34

Witch? Vampire? Would

2:16:36

you like more ideas? Would

2:16:39

you like more ideas? That's terrible! You got

2:16:41

a life-cheater. It's awful. Cut holes in it.

2:16:44

At least give me For God's Library book.

2:16:46

Wait, wait. I asked

2:16:48

Meta.ai, my friend is

2:16:50

a nihilist. What should she do for Halloween?

2:16:53

Oh, that's good. Number one. The correct answer

2:16:55

should be nothing. The absurdity

2:16:58

of existence. Dress up as a random

2:17:00

meaningless object, e.g. a cardboard box, a

2:17:02

leaf, or a forgotten sock. Or a

2:17:04

plastic bag blowing in the wind. That's

2:17:06

entirely accurate. That's what I look for. Two. The

2:17:09

void. Wear all black with a black

2:17:11

hood or mask, symbolizing the emptiness of existence. That's

2:17:14

really good. Three. It is. A

2:17:17

forgotten soul dressed as a ghost with a sign that

2:17:19

says, no one remembers me. Oh!

2:17:22

Or the meaningless reaper. Dress up

2:17:24

as death, but with a twist.

2:17:27

Carry a scythe with a rubber chicken

2:17:29

or a whoopee cushion attached. Activity

2:17:34

ideas for you and your friends. Host

2:17:36

an existential crisis party decorate with

2:17:38

absurd, contradictory, or meaningless signs

2:17:41

and symbols. Two. Attend

2:17:43

a Halloween party, but only to highlight

2:17:45

the futility of social interactions. These are

2:17:49

all actually entirely accurate. That's right,

2:17:51

friend. My friend is a journalism professor.

2:17:53

What should he dress up as

2:17:55

for Halloween? Clark

2:17:58

Kent, Superman. Old

2:18:00

school reporter. Do it on meta.ai. That's

2:18:03

chat. This is ChatGPT. Citizen Kane, Edward

2:18:05

R. Murrow. I will be throwing an existential

2:18:07

crisis party, though. That's on the list. At

2:18:10

that party, Paris, you should go

2:18:12

on a search for meaning scavenger

2:18:15

hunt where clues lead to more

2:18:17

questions, not answers. This

2:18:21

is brilliant. This is actually really good.

2:18:23

Who is this meta? This is really

2:18:25

good. Organize a nothingness movie marathon featuring

2:18:27

films with existential or absurdist themes. All

2:18:31

right, let me try the optimistic nihilist on

2:18:34

ChatGPT. The

2:18:36

void, but make it sparkly. Existential

2:18:43

detective, smiling

2:18:45

grim reaper. Combine

2:18:48

a classic grim reaper robe and scythe

2:18:50

with a big goofy smile and upbeat

2:18:52

accessories like party hats or balloons. Meh,

2:18:56

doosa. A

2:18:58

twist of Medusa where a snake covered wig

2:19:00

and Greek inspired outfit, have a nonchalant or

2:19:03

board expression, make signs that say turn to

2:19:05

stone. Yeah. That's

2:19:08

this is that you know what? OK,

2:19:10

because you got to be a

2:19:12

nihilist. That's good. That was

2:19:15

a graveyard of lost dreams, with

2:19:17

tombstones bearing absurd epitaphs. They

2:19:21

really got me on the go as a

2:19:23

meaningless household object. That's that's kind of the

2:19:25

vibe I'm trying to bring. I

2:19:28

am a whisk. There's also the

2:19:30

Japanese tradition of going as like really

2:19:32

mundane situations like person waiting at a

2:19:35

gas station or stop and they have

2:19:37

to stop with them. Yeah,

2:19:40

I've always wanted to do just the. The

2:19:43

bag of plastic bags that's like under the

2:19:45

sink. You

2:19:50

guys are so much more creative than me. I

2:19:52

was just going to go as a baseball player.

2:19:54

So how should we describe you, Leo? What should

2:19:56

you go as? Oh, no, no, no, no. Let's

2:19:58

do Benito. Benito. What?

2:20:00

How would we? What's a short description of Benito?

2:20:02

That's what I wanted. I wanted a description of

2:20:05

Benito's psyche. Oh, what do you mean like?

2:20:07

What's your philosophy? I'm very,

2:20:09

very much aligned with Paris. Like I'm kind of

2:20:12

the same way. You can't be an optimistic nihilist

2:20:14

that's taken. No, it's already taken. Yeah. Yeah.

2:20:17

But I have the, I have the pin and everything. I

2:20:20

have the pin and everything. Are you in the club? Oh,

2:20:22

he's more of an optimistic nihilist than I am. He does. He

2:20:25

has a card that says optimistic nihilism. And

2:20:27

then the pin. What the hell? Oh, wait a

2:20:29

minute. Maybe we have to change. Maybe

2:20:32

I have to not be the one. Yeah. He's got

2:20:34

the official license. You're the official. Right?

2:20:37

Wow. That's really good. That was really good. All

2:20:39

right. Let's see. That's the first good chat

2:20:42

experience I've had. That's really amazing. Happy

2:20:45

voiced former radio DJ now

2:20:47

podcaster. Oh my God. This

2:20:49

is going to be bad.

2:20:52

What should he go as

2:20:54

for Halloween? Question

2:20:56

mark. Uh,

2:21:01

pretty obvious, no? Dress up as a

2:21:03

fifties DJ. Podcasting Phantom: wear

2:21:07

a ghostly outfit with a headset and microphone.

2:21:09

That's terrible. Okay.

2:21:12

Stop your turn for Jeff's

2:21:14

pick of the week. Oh, uh,

2:21:17

that was pretty good actually. But you can stop now. No, no,

2:21:19

no, no, no, no, no, no. No, no, no, no. My sushi

2:21:21

is not ready for another 40 minutes. So, oh dear. Fine. We're

2:21:26

ending too early here. Yeah, which is unusual.

2:21:29

I just put the order in because they take forever, but it's

2:21:31

good. All right. Two things. Real quick. One,

2:21:33

uh, a, a, a,

2:21:35

a, at the, uh, a Dutch

2:21:38

museum, the,

2:21:40

um, maintenance guy threw away

2:21:42

a whole exhibit. Of

2:21:46

course. I love that. Was it,

2:21:48

was it a banana? No, the exhibit was two

2:21:51

beer cans,

2:21:53

titled All the good times we spent together. They

2:21:55

thought it was rubbish. Of

2:22:00

course they did. Perfectly reasonable. I'm such a

2:22:02

freak. I would see that and be like deeply moved.

2:22:05

Yes. Yes. So

2:22:07

they, uh, the curator came along

2:22:09

and discovered it was missing, realized

2:22:12

what had happened, managed to, uh, rescue

2:22:15

the beer cans out of the garbage,

2:22:18

um, makes clear that we don't blame the custodian

2:22:20

because it would only be logical and makes sense

2:22:22

and was only trying to do his job. He

2:22:24

was new on the job. And so

2:22:26

then they took the exhibit and put it elsewhere on a

2:22:28

pedestal. So you knew it was art. I

2:22:32

think it's a real statement, frankly. I

2:22:34

commend the custodian. I think he did the right

2:22:37

thing. I think he did too. What else? In

2:22:39

middle school, I was, uh, went

2:22:42

to a, some exhibit in

2:22:44

DC at the national art museum,

2:22:47

but there was a girl

2:22:49

on our trip who had like

2:22:51

long jean pants that were dragging

2:22:53

on the floor, so her pants were

2:22:55

always kind of tattered. And

2:22:57

at one point we get in this big

2:23:00

room and everyone's kind of gathered around looking

2:23:02

at something on the floor and it's a

2:23:04

long piece of denim that's like tattered and,

2:23:06

you know, really artfully arranged. And then I

2:23:09

look around and realize it came

2:23:11

off of that girl's jeans and everyone just thought

2:23:13

it was art. Oh God.

2:23:19

All right. My other one is this. So Mark Zuckerberg is

2:23:21

trying to become the ideal husband on earth. He made, he

2:23:23

commissioned a whole statue for his wife, Priscilla Chan. Yeah, by

2:23:25

the way, ugly ass statue.

2:23:28

Well, yes, but hey, it's a statue. My

2:23:30

wife doesn't have a statue. Does Lisa have

2:23:32

a statue? Have

2:23:37

you made it? Have you got a statue for Lisa and Leo? Uh,

2:23:39

no, but I think that's a good idea. See, right. So at least

2:23:41

she has a statue. So now he decided she wanted a mini statue.

2:23:45

She wanted a mini van. So on

2:23:48

his Facebook, of course, he

2:23:50

has pictures of having designed

2:23:52

for her a

2:23:55

custom Porsche

2:23:57

Cayenne Turbo GT minivan.

2:24:00

Oh my God. Threw

2:24:03

in a 911 GT3 Touring to make it

2:24:05

his and hers. So his

2:24:07

is the, is the midlife crisis

2:24:09

car and hers is the minivan,

2:24:11

but they're both Porsches. What

2:24:15

an insane place this is photographed. And

2:24:18

I guess it's a car show. Are they? No,

2:24:20

I think it's a, it's a dealer's, um, it's,

2:24:23

it's a, it's a re rebuilding place. What

2:24:25

are those? Oh, custom, custom, custom. Yeah.

2:24:29

So he designed it. Well,

2:24:34

Priscilla you married well. That's all I

2:24:36

can say. I

2:24:38

mean, really, if you are infinitely wealthy, think

2:24:42

of the challenge of gift giving because

2:24:45

you can't just, you know, buy a box of

2:24:47

chocolates. It would

2:24:49

just really be challenging. What

2:24:51

a hard life he lives. Yeah, seriously. Ooh, boo.

2:24:54

Yeah. Yeah. Boo. He

2:24:56

has to build his wife

2:24:58

a statue. It looks like

2:25:00

a palace in Bavaria. What

2:25:04

about that? Yeah, that would be, I think

2:25:06

real estate is always welcome. Well,

2:25:08

he's got the Hawaii. Um, that's true. He

2:25:10

doesn't need it. I

2:25:13

also want to mention cause cause Paris introduced me to

2:25:15

this and I, and I love If Books Could Kill

2:25:17

and I hope and pray none of my books ever

2:25:19

ends up on If Books Could Kill. Uh, and they

2:25:21

have a teaser, which I guess they do Paris. The

2:25:25

teasers are for their Patreon

2:25:27

episodes, which are like a membership.

2:25:29

It's kind of like club twit where it's like, you

2:25:31

can pay, what is it like $5 a month? And

2:25:35

then you get access to their special episodes.

2:25:37

Cause this is great. So they, they,

2:25:39

they did a Glenn Kessler retire bitch.

2:25:42

Oh, this is such a good, he's

2:25:45

the fact checker at the Washington Post who

2:25:47

has reached new levels of pedantry. Oh

2:25:50

gosh. And it's, I always listen to the teaser. Now I have to

2:25:52

pay money to listen to the rest of the, I will because

2:25:54

it is brilliant. It is, it is. So

2:25:57

what is the podcast? If books could kill

2:25:59

remind me. you describe it first basically

2:26:01

is the uh... they

2:26:04

started with them taking on the airport

2:26:06

books, and incredibly popular books — books that are popular

2:26:09

in popular culture like uh... and

2:26:11

deeply analyzing and kind of ripping

2:26:14

them apart, uh... like The Secret,

2:26:16

Rich Dad Poor Dad, uh...

2:26:18

and really, really, really researching them. They're,

2:26:18

you know, incredibly well-read. Really,

2:26:20

they probably read like five —

2:26:25

five other books for every episode

2:26:27

they talk to people they read

2:26:29

studies — like, it is a

2:26:31

very well-researched endeavor uh...

2:26:34

and it kind of goes into both

2:26:36

the meaning of the book what things

2:26:38

have been misinterpreted what things the authors

2:26:40

misinterpreted. I'd highly recommend it. They

2:26:43

did — they did Jonathan Haidt and it

2:26:45

made me so happy uh... whole room

2:26:47

tube bits and

2:26:50

and The Coddling of the

2:26:52

American Mind, which is Jonathan Haidt's other,

2:26:55

earlier books yes i could uh...

2:26:57

arms is one of them and was the other one uh...

2:27:00

Peter Shamshiri, who's from 5-4,

2:27:02

a podcast about how much the Supreme

2:27:04

Court sucks. But I'd really recommend it

2:27:07

to anybody if they're all right with

2:27:11

the imagine the tenor of ed plus

2:27:13

for people with law degrees

2:27:15

uh... uh...

2:27:18

yeah. And the other host —

2:27:20

Michael Hobbes — he

2:27:23

i'd say is like one

2:27:26

of the most prolific and successful

2:27:28

podcasters of the modern time and

2:27:30

i mean podcasters in like podcast

2:27:33

specifically, not the live-show sort

2:27:35

of thing we're doing right

2:27:37

now uh... he uh...

2:27:40

I think he was a reporter at Huffington Post and some

2:27:42

other places but really took off

2:27:44

with uh... all i'm forgetting the name

2:27:46

of, uh... You're Wrong About, a podcast with — what's her

2:27:49

name — Sarah Marshall. It was phenomenal and went

2:27:51

on for a couple of years then

2:27:54

he started hosting a podcast with

2:27:56

uh... Aubrey Gordon, that I'd really recommend,

2:27:58

called Maintenance Phase, which is about

2:28:00

kind of demystifying like wellness

2:28:04

trends and they

2:28:07

had brought up the concept of

2:28:09

the BMI like being a totally

2:28:11

bunk statistic

2:28:14

made up originally by insurance companies

2:28:16

long before that kind of

2:28:19

became known in the popular sphere.

2:28:21

A really another phenomenally well researched

2:28:23

podcast and very funny and

2:28:26

recently he launched if books could kill

2:28:28

with Peter Shamshiri. Well

2:28:32

there you go and they have a Patreon page

2:28:34

which will let you listen to the entire episode

2:28:36

of Glenn Kessler retile

2:28:39

retire biatch. Yeah

2:28:42

it's all right if you pronounce it like that. Is

2:28:44

it? Oh okay. No. Oh.

2:28:48

Hey I am about to retire this

2:28:50

show has put me right out. Thank

2:28:54

you for being here we we appreciate

2:28:56

it. MsParis.nyc gosh man

2:28:58

you gotta do something

2:29:00

with that. Yeah I

2:29:03

don't know what it is but something. It's just great

2:29:05

right. It's so good. It's perfect.

2:29:07

Paris writes for the information you'll see

2:29:10

her in the weekend and covers issues

2:29:12

of young people in the internet. You

2:29:14

can send her a tip she's and

2:29:17

she covers youth soccer games too. You

2:29:19

Youth flag football. I've got a request

2:29:21

this week if you are a listener

2:29:23

that has a child that uses AI

2:29:26

chat bots in any way either for

2:29:28

search companionship or otherwise reach out to

2:29:30

me. I'd love to chat. Oh

2:29:33

I love that. Where would they reach out Paris? Reach

2:29:36

out via signal at

2:29:39

martino.01 or if

2:29:42

you go to my Twitter at @parismartineau there's a bunch of other

2:29:44

ways to reach out to me there too.

2:29:46

It's also on my website paris.nyc

2:29:49

my you know work phone numbers there

2:29:51

my email some other things reach out

2:29:53

I'd love to chat. And if you do use

2:29:55

signal don't use your work phone. Yes.

2:29:58

Well this is not terrible. I mean, this

2:30:00

is not terribly. Oh, we don't know, you

2:30:03

know, by the way, that

2:30:05

is one wild website. You got there,

2:30:08

young lady, right? It's going on. We

2:30:10

go here. We go to your mouse. Expect

2:30:12

any less. Are we? Are we?

2:30:14

Yeah. Are we? Are

2:30:16

we going down to the Titanic? What's going

2:30:19

on here? There you are. Yeah, you know, I think

2:30:21

it's kind of fun. And

2:30:24

this is so cool. Look — click photos of

2:30:28

my cat, up at the top. The more I know

2:30:28

about Paris, the more

2:30:30

I like her. She there. Oh,

2:30:32

hello. Gizmo. Gizmo. She's a sweetie

2:30:34

hiding behind a plant. How? Yeah,

2:30:36

no one could see you. Gizmo.

2:30:39

Oh, Paris. Thank

2:30:42

you for being here. The information.com.

2:30:44

Everybody should subscribe. Jeff

2:30:47

Jarvis is the emeritus professor

2:30:49

of journalistic innovation at the Craig

2:30:52

Newmark Graduate School of Journalism at the

2:30:54

City University of New York. Next

2:30:56

week. I can say one of

2:30:59

the things that I'm doing next.

2:31:02

Well, what he really is — and I think

2:31:04

everybody should remember this — is a great author

2:31:07

who is I think you're going to be you're

2:31:09

going to go down as like

2:31:12

one of these, you know, historians who writes

2:31:14

about modern times. The web we

2:31:16

weave is the newest. Why we

2:31:18

must reclaim the Internet. The

2:31:21

fantastic read. It

2:31:23

is really good. The Gutenberg Parenthesis,

2:31:25

Magazine, and he's writing

2:31:27

a new one. I think

2:31:30

forget the hot teaching. Forget the teaching.

2:31:32

Just I know I wonder why I'm

2:31:35

buried right now and need to write a

2:31:37

book. But do you enjoy it? Is the

2:31:40

process fun? No, no

2:31:43

writing is pain. It's fun. Yeah, but

2:31:45

having written is the best. Yes, having

2:31:48

written rules. Writing said what

2:31:50

I want to say. It's like you hated it so

2:31:52

much. It's so painful. And

2:31:54

I mean, I'm a competent writer, but it's just hard

2:31:57

work. And

2:31:59

I never had. You're wresting it from yourself.

2:32:01

Yeah. Yeah. And you have so much

2:32:03

research and so much. I wanted, I have a

2:32:05

lot of research on this book. This book is

2:32:07

about the Linotype, and I wanted something

2:32:07

that was narrative. I'd never really done that apart

2:32:12

from a news story. So I finally had

2:32:14

to figure it out for a while. So I'll use

2:32:16

somebody's narrative and I'll get the damn chapter written. And

2:32:18

then I go back through every source and

2:32:21

put in the better quotes and better this and

2:32:23

better that. And then I go back through and

2:32:25

sand and sand and sand. All right. Here's a

2:32:27

$64,000 question. Use

2:32:30

any AI in this stuff? Nope. Nope.

2:32:33

I wish I could figure out how to, but no. Yeah.

2:32:36

Paris, you don't use AI either in your writing? No,

2:32:40

just for custom. She likes the pain. I

2:32:43

did like the pain. I think it's more painful. I

2:32:46

hand wrote out after I had gone

2:32:49

through all my different interviews highlighted in,

2:32:52

you know, the text, uh,

2:32:54

transcriptions, the parts I wanted, I

2:32:56

went through and hand wrote out

2:32:58

on paper, uh, all

2:33:01

the relevant parts I wanted to include

2:33:03

from each interview, color coded, and then

2:33:05

hand wrote an outline that incorporated all

2:33:08

of those. Um, yeah. Sometimes

2:33:10

I write once through. When you write it, do you write

2:33:12

it, or do you write it and then rewrite, rewrite,

2:33:14

rewrite? Uh,

2:33:17

I write it once through,

2:33:19

but as I'm doing that, I write and

2:33:21

rewrite various sentences, but I do it. I

2:33:23

write and rewrite the structure a lot. I'm

2:33:25

one of those people who, especially in a

2:33:28

feature, I can't really get going on

2:33:30

it unless I get the top, write

2:33:34

the lead, as they say. Um, so

2:33:36

I will often write and rewrite the lead

2:33:39

for a day. And then the next

2:33:42

part comes a lot easier. As

2:33:44

a newspaperman, sorry for the

2:33:46

sexist nature, what I learned was that I would have to write

2:33:49

something on deadline and it would go down to

2:33:51

the composing room as I was writing, which I

2:33:53

think it was in the class. It was. So,

2:33:55

um, so what I learned to do was to write

2:33:57

really fast to get a structure and then spend every minute

2:33:59

editing. Wow. Wow. And

2:34:02

then the computer changed all that because then I

2:34:05

didn't have to wave goodbye to it and I could go

2:34:07

change things around. Thank

2:34:11

you so much guys. It's wonderful to meet with

2:34:13

you every Wednesday. We do the show about 2

2:34:15

p.m. Pacific, 5 p.m. Eastern, 2100 UTC, at

2:34:17

least for the rest of the

2:34:20

month. Thanks to the

2:34:22

Candy Makers of America, we don't change from daylight

2:34:25

saving time to

2:34:27

standard time until

2:34:30

we're sneaking up on

2:34:32

you, Jeff. We don't

2:34:34

change to standard time until the

2:34:36

next month. So bear with us. But for now, anyway,

2:34:38

2100 UTC. The

2:34:43

person who abused it herself.

2:34:47

She dropped an eye immediately, followed

2:34:50

soon. Thank you. But Benito has

2:34:52

a, what do you call it, a card for

2:34:55

the show? Yeah, a little thumbnail. Yeah, a thumbnail.

2:34:57

Yeah. Should we do anything

2:34:59

else? We should go lower. What should we do for

2:35:01

YouTube? Oh yeah. I

2:35:05

don't know if I can go lower. Kilroy. We

2:35:07

should all do Kilroy. Oh,

2:35:10

this is so sad. All of this just to

2:35:12

get a click on YouTube. You're

2:35:15

out of focus, Leo. Because

2:35:18

it can't see my eyes. Gizmo

2:35:21

wants to know what's going on. By

2:35:23

the way, I just want to reassure everybody. Gizmo

2:35:26

has not eaten any Haitians during this

2:35:29

episode. None. That's a

2:35:31

lie being spread by irresponsible

2:35:35

politicians. We

2:35:37

do this show, as I said, every Wednesday,

2:35:39

2pm Pacific. You can watch us live on

2:35:41

seven different streams, YouTube, Twitch, Facebook,

2:35:44

LinkedIn, x.com, Kick.

2:35:47

And of course, if you're a member of the club, and I hope you

2:35:49

are, I mean, isn't this worth it? Right

2:35:53

here. This is the content you pay for.

2:35:55

This is what you're paying for. You can

2:35:57

watch us in the Club TWiT Discord. Do

2:36:00

join the club. Seven bucks a month, ad-free versions

2:36:02

of all the shows, access to

2:36:05

the discord, lots of special stuff.

2:36:07

Stacy's book club is coming up at the end of the

2:36:10

month. We're gonna do a coffee episode next week. Micah's

2:36:13

Creative Corner, a lot of great things in

2:36:15

the club. So join it because it's

2:36:17

a great club that you want to be a

2:36:19

member of and it helps us out. Twit.tv slash

2:36:21

club Twit. Thank you everybody. After

2:36:23

the fact, get on-demand versions of

2:36:25

the show at the website. Twit.tv slash twig.

2:36:28

Oh wait, let's do the mime thing. This

2:36:41

is so sad. For people listening to

2:36:43

audio, there's literally nothing but me grunting.

2:36:46

Anyway, anyway, I'm sorry. I'm sorry.

2:36:51

What was I saying? Oh, yes. On

2:36:53

demand. You

2:36:55

can watch on YouTube. You can subscribe. I don't know

2:36:57

whether you'd want audio or video, but get the

2:36:59

video because that's when all the sight gags work. Thanks

2:37:02

for joining us. We'll see you next time on this week in Google.

2:37:15

Today's show is brought to you by

2:37:18

Progressive Insurance. Do you

2:37:20

ever think about switching insurance companies to see

2:37:22

if you could save some cash? Progressive

2:37:25

makes it easy to see if you

2:37:27

could save when you bundle your home

2:37:29

and auto policies. Try it at progressive.com

2:37:33

Progressive Casualty Insurance Company and affiliates. Potential

2:37:36

savings will vary, not available in

2:37:38

all states.