Our Guest Andy Mills Discusses
How AI Will Save Humanity: Creator of The Last Invention Explains
When intelligence becomes abundant, what happens to humanity’s purpose?
Andy Mills, the co-founder of The New York Times’ The Daily and creator of The Last Invention, joins us on this episode of Digital Disruption.
Andy is a reporter, editor, podcast producer, and co-founder of Longview. His most recent series, The Last Invention, explores the AI revolution, from Alan Turing’s early ideas to today’s fierce debates between accelerationists, doomers, and those focused on building the technology safely. Before that, he co-created The Daily at The New York Times and produced acclaimed documentary series including Rabbit Hole, Caliphate, and The Witch Trials of J.K. Rowling. A former fundamentalist Christian from Louisiana and Illinois, Andy now champions curiosity, skepticism, and the transformative power of listening to people with different perspectives, values that shape his award-winning journalism across politics, terrorism, culture wars, technology, and science.
Andy sits down with Geoff to break down the real debate shaping the future of AI. From the “doomers” warning of existential risk to the accelerationists racing toward AGI, Andy maps out the three major AI camps influencing policy, economics, and the future of human intelligence. This conversation explores why some researchers fear AGI, why others believe it will save humanity, how job loss and automation could reshape society, and why 2025 is becoming an “AI 101 moment” for the public. Andy also shares what he’s learned after years investigating OpenAI, Anthropic, xAI, and the people behind the AGI race.
If you want clarity on AGI, existential risk, the future of work, and what it all means for humanity, this is an episode you won’t want to miss.
00;00;00;09 - 00;00;00;29
Hey, everyone.
00;00;00;29 - 00;00;04;02
I'm super excited to be sitting down
with Andy Mills, co-founder
00;00;04;02 - 00;00;07;25
of The New York Times' The Daily podcast
and producer of tech podcasts
00;00;07;25 - 00;00;10;16
like Rabbit Hole and The Last Invention.
00;00;10;16 - 00;00;13;24
Andy is a world class storyteller
and has been dedicating his time
00;00;13;24 - 00;00;17;27
to talking to the greatest minds in AI
so that he can help us better understand
00;00;17;27 - 00;00;21;03
what the technology is capable of
and where it's going next.
00;00;21;14 - 00;00;24;06
I want to ask him
which vision of the AI driven future
00;00;24;06 - 00;00;27;21
he finds most convincing, who he thinks
will be the winners and losers
00;00;27;28 - 00;00;29;25
and what we need to do to be ready.
00;00;29;25 - 00;00;32;25
Let's find out.
00;00;34;06 - 00;00;35;27
I am here with Andy Mills.
00;00;35;27 - 00;00;40;25
Andy, you're the producer
on The Last Invention podcast.
00;00;40;25 - 00;00;45;02
And, you know, one of the things that's,
you know, you know as well as I do
00;00;45;02 - 00;00;49;02
when you're making, an AI podcast
is you hear from all sorts of camps,
00;00;49;12 - 00;00;53;27
if I can call them that, of people
who have completely different views
00;00;54;03 - 00;00;58;29
on the outlook of how AI is going
to transform or not transform our society.
00;00;58;29 - 00;01;00;20
And so I wanted to ask you,
00;01;00;20 - 00;01;04;02
maybe you can lay out
what are the main camps that you've seen?
00;01;04;16 - 00;01;07;07
And, you know, based on hearing arguments
across all of those
00;01;07;07 - 00;01;09;25
where do you see yourself sitting?
00;01;09;25 - 00;01;10;17
Yeah.
00;01;10;17 - 00;01;12;07
Well, thanks for having me.
00;01;12;07 - 00;01;14;03
Thanks for the question.
00;01;14;03 - 00;01;19;02
My favorite subject
to cover as a journalist is a debate.
00;01;19;11 - 00;01;24;10
There is something very attractive to me
about trying to understand in good faith
00;01;24;26 - 00;01;27;21
why intelligent people
00;01;27;21 - 00;01;31;08
come to such different conclusions
when looking at the same material.
00;01;31;23 - 00;01;34;23
And I had known
that there was a contingent
00;01;35;04 - 00;01;37;17
inside
the world of artificial intelligence
00;01;37;17 - 00;01;40;16
that was really, really worried about it
for many years.
00;01;40;16 - 00;01;44;28
Like, Eliezer
Yudkowsky's podcast interviews in 2013
00;01;44;28 - 00;01;48;04
or something is when I first realized
that there was this almost like
00;01;48;12 - 00;01;51;16
biblical prophet voice out there saying
00;01;51;26 - 00;01;55;19
that the sci fi movies are kind of true
and we really need to get ready.
00;01;55;19 - 00;01;57;07
We need to get prepared for this.
00;01;57;07 - 00;02;00;09
And after ChatGPT blew up,
00;02;00;18 - 00;02;03;19
I started to increasingly run
into essentially
00;02;03;19 - 00;02;07;12
the opposite side of that debate,
which are these people who we often call
00;02;07;12 - 00;02;11;07
the accelerationists, who believe
that AGI, this
00;02;11;16 - 00;02;15;15
artificial general intelligence point
that they believe is coming,
00;02;15;29 - 00;02;18;12
could be
the best thing that ever happened to us.
00;02;18;12 - 00;02;22;00
And so I was attracted right away
to the people who have those strongly
00;02;22;00 - 00;02;25;00
opposing views inside the same world.
00;02;25;02 - 00;02;26;27
But the more that I dug into it,
00;02;26;27 - 00;02;30;20
the more I realized that everyone in the
AI world had different camps.
00;02;30;28 - 00;02;33;25
So literally
there was like 8 or 9 different ways
00;02;33;25 - 00;02;36;25
you could categorize the debate
that's happening
00;02;36;27 - 00;02;41;04
inside the technology world about what
we should do with artificial intelligence.
00;02;41;04 - 00;02;44;14
And we ended up in the podcast
narrowing them down to
00;02;44;14 - 00;02;48;00
three basic camps, the camps
that I think are most influential
00;02;48;15 - 00;02;51;27
in this conversation in the moment
that we're having. Camp number one is the,
00;02;51;27 - 00;02;55;05
you know, AI doomers, essentially
the people who think
00;02;55;14 - 00;02;58;14
that the risks of the AI race
00;02;58;29 - 00;03;03;10
as we are conducting that race
today are so great
00;03;04;00 - 00;03;07;09
and include the fact that we may create
something smarter
00;03;07;09 - 00;03;11;09
than us that ends up
leading to our own extinction.
00;03;11;29 - 00;03;16;10
They think that that is cause
for so much alarm that we need to stop.
00;03;17;01 - 00;03;20;25
I think they're trying to get us to stop
right now, before we go any further.
00;03;21;17 - 00;03;23;12
Then on the far end you have the AI
00;03;23;12 - 00;03;26;15
accelerationists, who say
that the fears have been overblown,
00;03;27;12 - 00;03;30;22
that the benefits of this,
and the way that this might help us
00;03;30;22 - 00;03;34;12
out of the stagnation that we're in,
I mean, some of them will even tell you
00;03;34;18 - 00;03;38;08
this malaise,
this essentially nihilistic streak
00;03;38;26 - 00;03;41;14
that's spreading from our politics
to our social media,
00;03;41;14 - 00;03;44;14
like almost all of that
is going to be positively affected
00;03;44;20 - 00;03;48;11
by the discovery
and the investment in a true AGI.
00;03;48;19 - 00;03;50;19
And then there's this camp
that's kind of in the middle,
00;03;50;19 - 00;03;53;08
but they're not like a medium ground
between the two.
00;03;53;08 - 00;03;55;26
They're their own place on the map.
00;03;55;26 - 00;03;57;12
And I call them the scouts.
00;03;57;12 - 00;04;00;23
And they're the people who think
it's probably too late to stop.
00;04;01;10 - 00;04;04;11
So the doomers are right to be afraid,
but we're not going to stop.
00;04;04;24 - 00;04;08;10
And maybe we shouldn't stop,
because the accelerationists are right too,
00;04;08;10 - 00;04;11;12
that this could be like fire,
00;04;11;13 - 00;04;14;29
like electricity, like a true
turning point in human history.
00;04;15;13 - 00;04;18;13
But they believe that the risks are real.
00;04;18;14 - 00;04;23;16
And so we need to do everything
we can to get ready for what's coming.
00;04;23;16 - 00;04;26;10
And that's on the economy.
00;04;26;10 - 00;04;29;24
What will we do
if the job market starts to fall away?
00;04;29;24 - 00;04;33;12
If the job market goes away completely,
what should we do with our politics?
00;04;33;19 - 00;04;34;21
What kind of tests?
00;04;34;21 - 00;04;36;29
What kind of regulations
should we put in place?
00;04;36;29 - 00;04;43;01
And they are trying to shout as loud
as they can that we can't wait five years.
00;04;43;04 - 00;04;45;17
Like we have to start getting ready
right now.
00;04;45;17 - 00;04;51;01
Journalists, universities, think tanks,
like we need to turn our efforts
00;04;51;08 - 00;04;56;08
into solving the problems that stand
between now and the creation of this AGI.
00;04;56;12 - 00;04;58;10
So those are the three main camps
that we talked to.
00;04;58;10 - 00;05;02;29
Obviously, there are camps like
the skeptics that are out there as well.
00;05;03;13 - 00;05;06;14
And we are going to follow up with them,
down the road.
00;05;06;14 - 00;05;08;06
But I just
I think that we're living in a time
00;05;08;06 - 00;05;11;28
where the skeptics
are not really a forceful shaper
00;05;12;04 - 00;05;15;21
of the conversation
happening closest to the technology.
00;05;16;28 - 00;05;17;20
I want to come back to
00;05;17;20 - 00;05;20;23
skeptics in a minute, because I've got
I've got lots to say about the skeptics.
00;05;20;23 - 00;05;24;25
But before I do, you know, like across the
it sounds a little bit like a spectrum
00;05;24;25 - 00;05;28;17
where you've got the scouts, you know,
maybe in the middle or more, you know,
00;05;28;17 - 00;05;32;29
maybe taking aspects of both camps
and having more of a practical lens on it.
00;05;33;12 - 00;05;35;20
But listening to all these
disparate voices?
00;05;35;20 - 00;05;39;23
Is that where you see yourself, Andy,
or are you more sort of in the scout camp?
00;05;39;23 - 00;05;43;04
Or, you know,
if I had to ask you to put your flag down,
00;05;43;16 - 00;05;45;02
where do you think it would go?
00;05;45;02 - 00;05;47;28
Well, I am the kind of journalist who,
00;05;47;28 - 00;05;51;06
like, wants to remove myself to understand
it all better.
00;05;51;06 - 00;05;54;20
Like I wouldn't be investing so much time
and effort if I didn't
00;05;54;20 - 00;05;59;13
think all three camps
have earned our attention and our time.
00;05;59;13 - 00;06;03;06
I think they all have an incredible case
to make, and I want to help people
00;06;03;22 - 00;06;07;15
find their own place in this debate
and join this debate,
00;06;07;15 - 00;06;09;16
because I think the time has come
that we join this debate.
00;06;09;16 - 00;06;12;21
And to do that,
you really have to suspend your own biases
00;06;12;27 - 00;06;15;26
to help each camp
put their best foot forward.
00;06;16;07 - 00;06;18;09
But I will say I'm a person, and
00;06;19;12 - 00;06;20;28
most of my
00;06;20;28 - 00;06;24;09
personal circle were doomers.
00;06;24;27 - 00;06;27;22
Some of them have moved into scouts.
00;06;27;22 - 00;06;32;08
There was definitely an anti-accelerationist
bias in my personal circle
00;06;32;08 - 00;06;34;13
when I started, and that's broken down.
00;06;34;13 - 00;06;38;28
I find that the accelerationists do
have a really compelling case.
00;06;38;28 - 00;06;42;27
And so I think I was a little bit
more of a doomer-y scout at
00;06;44;12 - 00;06;47;01
let's say maybe April, you know,
00;06;47;01 - 00;06;50;01
May when I was really diving in deeper
into this.
00;06;50;05 - 00;06;55;18
But by now, like, I truly can see a world
where all three camps get what they like.
00;06;55;23 - 00;06;59;24
I can understand the vision of the future,
that all three camps are painting,
00;06;59;24 - 00;07;01;14
and find all of them compelling enough
00;07;01;14 - 00;07;04;19
that, and of course,
all of them know far more about the
00;07;04;19 - 00;07;06;07
technology than I do, that
00;07;06;07 - 00;07;10;13
I just have decided that for now, I'm
keeping a completely open mind.
00;07;10;18 - 00;07;13;17
As you know, new information is going to
be coming in over the next several years.
00;07;14;09 - 00;07;15;24
Right. That's fair.
00;07;15;24 - 00;07;16;17
And it's interesting to hear
00;07;16;17 - 00;07;20;27
about your sort of personal shift
maybe coinciding with a societal shift.
00;07;20;27 - 00;07;21;16
Maybe not.
00;07;21;16 - 00;07;25;18
But I wanted to touch on this notion
of the anti-accelerationist bias.
00;07;25;18 - 00;07;30;04
And one of the things you said that
was interesting about the scouts is that
00;07;31;16 - 00;07;33;04
you said it, it's how can we
00;07;33;04 - 00;07;36;26
take action, how can we prepare ourselves
against things like job loss?
00;07;37;12 - 00;07;40;08
And the reason that caught my attention.
00;07;40;08 - 00;07;43;09
And tell me if the people
you're speaking to see it differently.
00;07;43;16 - 00;07;46;16
But what's so interesting
about the notion of job loss
00;07;46;17 - 00;07;49;16
is you may hear that as, you know,
kind of a casual listener and say, oh,
00;07;49;16 - 00;07;52;16
well, that that obviously
sounds like doomerism,
00;07;52;23 - 00;07;56;08
but sometimes, like I'm hearing
accelerationists talk about job
00;07;56;08 - 00;08;00;02
loss as well; it's job loss,
but it'll be good job loss.
00;08;00;02 - 00;08;02;26
It'll be creative disruption
or something like that.
00;08;02;26 - 00;08;06;13
Where does the economic component of this,
00;08;07;00 - 00;08;09;10
you know, fall into place
and how much do you find that
00;08;09;10 - 00;08;13;24
we have to get beyond like what's actually
technologically possible and start
00;08;13;24 - 00;08;19;17
to look at the broader societal factors,
things like economics, politics, society.
00;08;19;26 - 00;08;24;08
But to tell you the truth, the
piece of this that I'm most invested
00;08;24;08 - 00;08;27;19
in, I'm most interested in at
this point is the existential risk piece.
00;08;28;02 - 00;08;28;27
The fact that,
00;08;30;05 - 00;08;33;05
like, highly educated, very experienced,
00;08;33;06 - 00;08;36;09
usually very sober minded scientists
00;08;37;09 - 00;08;40;09
are sounding like apocalyptic prophets,
00;08;41;25 - 00;08;43;09
is interesting
00;08;43;09 - 00;08;47;05
and trying to dig deep
into what has convinced them,
00;08;47;09 - 00;08;51;18
especially because many of them were busy
accelerating this technology.
00;08;51;18 - 00;08;55;25
I mean, in the case of a guy
like Geoffrey Hinton, right, since 1972
00;08;57;04 - 00;09;01;16
against a, you know, dark, cold
AI winter, he believed
00;09;01;24 - 00;09;05;15
that he should dedicate his life to this,
and now he's telling us
00;09;05;15 - 00;09;09;13
that it poses an existential risk
to the entire existence of our species.
00;09;09;25 - 00;09;11;11
That's interesting.
00;09;11;11 - 00;09;12;27
What's the background?
What's the story there?
00;09;12;27 - 00;09;15;00
How many people are like him?
00;09;15;00 - 00;09;16;19
It turns out a lot more than you'd think.
00;09;16;19 - 00;09;20;06
So the existential piece is the thing
I'm most interested in.
00;09;21;09 - 00;09;24;09
Obviously, that would have the biggest impact
if those predictions were to come true,
00;09;24;19 - 00;09;27;10
but the economic piece,
a lot of them will tell you that
00;09;27;10 - 00;09;30;18
they talk about that in part
because we can imagine it
00;09;31;22 - 00;09;32;13
like they think that the
00;09;32;13 - 00;09;37;10
existential risk piece with
AI is so hard to imagine.
00;09;37;11 - 00;09;38;03
They themselves
00;09;38;03 - 00;09;41;03
don't really know how to paint the picture
of what it would look like.
00;09;42;00 - 00;09;47;01
With the fears of atomic bombs
leading to an existential crisis,
00;09;47;01 - 00;09;51;14
at least you had a vision of the mushroom
clouds, of what it might look like.
00;09;51;19 - 00;09;54;19
They can't even do that,
but they know that they can
00;09;54;29 - 00;09;57;29
tap into the reality
of an economic crisis.
00;09;57;29 - 00;10;02;08
I think many people today
lived through the 2008 financial crisis,
00;10;02;12 - 00;10;06;03
and in some ways, we are still in
the aftermath of what that did.
00;10;06;16 - 00;10;09;16
And they're saying the disruption
to the economy
00;10;10;04 - 00;10;14;07
might not just be more severe than that,
but it would last so much longer
00;10;14;07 - 00;10;18;16
and possibly would last forever, like
with the Industrial Revolution coming in,
00;10;18;16 - 00;10;22;13
there were certain jobs created that had
never existed before, and certain jobs ended.
00;10;22;24 - 00;10;26;15
And then the same thing when we came into
the technological revolution. And
00;10;27;03 - 00;10;30;03
they think that they can actually get
people's attention
00;10;30;20 - 00;10;34;03
with the short-term risks to the economy
00;10;34;13 - 00;10;37;21
and then talk to them about, hey,
do you know what the really big risk is?
00;10;38;09 - 00;10;41;27
And that this thing
might become more intelligent than us
00;10;42;11 - 00;10;46;10
and a more intelligent
species is rarely controlled
00;10;46;10 - 00;10;49;09
by the desires and the wishes of a less
intelligent species.
00;10;49;14 - 00;10;52;14
And I think that they know
that that sounds a little bit batty.
00;10;52;15 - 00;10;55;10
And so they focus a lot on the economic
piece, because the economic piece,
00;10;55;10 - 00;10;59;15
not only I think is a good toehold,
but it is obviously a reality.
00;11;00;08 - 00;11;03;20
And I think that it's coming a lot
faster than we think.
00;11;03;20 - 00;11;06;11
I know that you recently talked
to, Ed Zitron.
00;11;06;11 - 00;11;07;19
Is that his name?
Is that how you say his name?
00;11;08;18 - 00;11;08;27
Yeah.
00;11;08;27 - 00;11;09;25
It's Zitron. That's right.
00;11;09;25 - 00;11;10;12
Yeah.
00;11;10;12 - 00;11;15;11
Yeah, he and I, we
are looking at this moment
00;11;15;11 - 00;11;19;25
in artificial intelligence and having such
a different response to it.
00;11;19;25 - 00;11;24;06
It's fascinating. I believe
he said that LLMs can't do anything,
00;11;24;06 - 00;11;29;01
that these current chat bot models
really aren't able to live up to the hype.
00;11;29;01 - 00;11;32;10
Well, the hype is pretty big,
so I'll grant that there's a lot of hype,
00;11;32;17 - 00;11;36;17
but this idea that they can't really do
anything, that there's nothing to see
00;11;36;17 - 00;11;39;17
here, I cannot find any evidence
that that's true
00;11;40;07 - 00;11;43;15
because businesses
are already interweaving
00;11;43;15 - 00;11;47;00
this into like the foundational aspects
of their business.
00;11;47;13 - 00;11;50;13
And this is just the chat bot
where we're at.
00;11;50;15 - 00;11;53;18
And I think I want to remind people
the chat bot is to the
00;11;53;18 - 00;11;57;05
AI what the website is to the internet.
00;11;58;06 - 00;11;59;11
Yes, the like the
00;11;59;11 - 00;12;02;13
website might not be
that impressive to you,
00;12;02;24 - 00;12;06;22
but they're not investing all this money
in a better website.
00;12;07;05 - 00;12;09;21
They're trying to create something
like the internet.
00;12;09;21 - 00;12;13;10
It's that it's that artificial
intelligence behind the chat bot.
00;12;13;19 - 00;12;15;21
That is the thing that's so exciting.
00;12;15;21 - 00;12;20;03
And so when you look today and you see
that, you know, Copilot and ChatGPT
00;12;20;21 - 00;12;24;11
are still not able to do things that
maybe you were led to believe by
00;12;24;11 - 00;12;27;17
some of the hype from the product managers
it was going to be able to do by now.
00;12;27;17 - 00;12;30;17
And so you think there's nothing to see here.
00;12;30;22 - 00;12;32;07
That's an interesting point of view.
00;12;32;07 - 00;12;33;12
I want to hear Ed out.
00;12;33;12 - 00;12;34;08
I'm glad he is
00;12;34;08 - 00;12;37;18
a part of this public conversation,
but I just want to put alongside it
00;12;38;01 - 00;12;42;17
like all of these people
who are very close to this technology,
00;12;42;28 - 00;12;47;10
who are worried about this technology,
and they're saying even in its infancy,
00;12;47;19 - 00;12;51;25
we are weaving it into our economy,
we are weaving it into our businesses.
00;12;52;03 - 00;12;53;10
And so if the ones who
00;12;53;10 - 00;12;56;25
are worried about it are right,
it does pose all these risks.
00;12;57;17 - 00;13;01;04
It's going to become increasingly hard
for us to just unplug it.
00;13;01;17 - 00;13;04;11
And as it becomes
more and more enmeshed in our economy.
00;13;04;11 - 00;13;06;27
Back to your question
about the jobs, like, it's become,
00;13;06;27 - 00;13;12;13
it's hard to even imagine
the ways we might come to rely on it
00;13;12;26 - 00;13;16;25
in the future in a way that, like a jobs
program is just not going to be
00;13;17;08 - 00;13;18;18
implemented in 2029.
00;13;18;18 - 00;13;21;16
It's just not going to be prepared
to quickly respond to.
00;13;23;14 - 00;13;24;06
If you work in
00;13;24;06 - 00;13;27;14
IT, Info-Tech Research Group is a name
you need to know.
00;13;27;29 - 00;13;30;29
No matter what your needs are, Info-Tech
has you covered.
00;13;31;04 - 00;13;32;11
AI strategy?
00;13;32;11 - 00;13;34;23
Covered. Disaster recovery?
00;13;34;23 - 00;13;35;23
Covered.
00;13;35;23 - 00;13;38;08
Vendor negotiation? Covered.
00;13;38;08 - 00;13;42;01
Info-Tech supports you with best-practice
research and a team of analysts
00;13;42;01 - 00;13;45;17
standing by ready to help you
tackle your toughest challenges.
00;13;45;27 - 00;13;48;27
Check it out at the link below
and don't forget to like and subscribe!
00;13;51;01 - 00;13;53;11
Let's follow the thread again of the,
00;13;53;11 - 00;13;56;00
you know, what you had said
a little bit earlier, which is that
00;13;56;00 - 00;13;59;03
the doomsday scenario is in some ways
the most interesting here.
00;13;59;03 - 00;14;03;23
And, I think your position
is pretty similar to mine,
00;14;04;03 - 00;14;07;03
which is, and Geoffrey Hinton's
a perfect example, but
00;14;07;19 - 00;14;13;19
I, I came into these conversations,
in many cases, ready to dismiss
00;14;13;19 - 00;14;17;22
these people as kooks and just say like,
oh, you're a doomsday cult.
00;14;17;23 - 00;14;20;03
You're, you're you're way out there.
00;14;20;03 - 00;14;23;00
But the thing that gave me
the most pause.
00;14;23;00 - 00;14;26;10
the apocalypse crowd,
they've got a bad track record.
00;14;26;10 - 00;14;27;21
Right. For sure.
00;14;28;25 - 00;14;30;12
Yeah.
00;14;30;12 - 00;14;31;11
and they've been wrong.
00;14;31;11 - 00;14;34;22
And I'll just say personally, like, I was
raised very religious and it was like
00;14;35;15 - 00;14;39;09
a big part of my upbringing,
believing that
00;14;39;11 - 00;14;43;09
like God's apocalypse
would likely happen in my lifetime.
00;14;43;16 - 00;14;46;16
And when I left, that
kind of fundamentalist faith,
00;14;46;26 - 00;14;50;21
I do think I developed an allergy
to anything apocalyptic.
00;14;50;21 - 00;14;51;19
So I think we were coming
00;14;51;19 - 00;14;55;13
from the same position there, which is,
you know, that could be a bad thing,
00;14;55;13 - 00;14;57;11
because if you're looking to be very open
minded,
00;14;57;11 - 00;14;59;24
you want to make sure
you're not too allergic to anything.
00;14;59;24 - 00;15;02;14
If you want to try
and really understand the world.
00;15;02;14 - 00;15;04;04
Well, and that's kind of where I was going.
00;15;04;04 - 00;15;05;13
Is it.
00;15;05;13 - 00;15;06;22
It sounds farfetched.
00;15;06;22 - 00;15;09;27
It's completely, as you said, different
from anything that we've heard before.
00;15;10;05 - 00;15;11;22
And yet when you talk to these people,
00;15;11;22 - 00;15;12;13
you know, long enough
00;15;12;13 - 00;15;15;15
as you know you do on your program and,
you know, listeners to this program
00;15;15;21 - 00;15;19;12
have heard before, one of the first things
you come away with, it's like, crap.
00;15;19;12 - 00;15;22;20
They actually do sound like they know
what they're talking about.
00;15;23;00 - 00;15;27;19
And their arguments are resistant to most,
you know,
00;15;27;19 - 00;15;31;20
kind of logical, you know, debates
you can throw at them, right?
00;15;31;20 - 00;15;35;12
Like it's not like this is just like
a house of cards or scarecrow argument.
00;15;35;12 - 00;15;37;09
And you say one reasonable thing
and it collapses.
00;15;37;09 - 00;15;40;07
They've thought this through.
They've lived and breathed this
00;15;40;07 - 00;15;43;01
and they can
they have a response to everything.
00;15;43;01 - 00;15;44;04
And the
00;15;45;04 - 00;15;46;01
piece, I guess, that
00;15;46;01 - 00;15;49;20
gave me the most pause is the realization
00;15;49;24 - 00;15;53;14
that when you talk to people
who try to dismiss the doomers,
00;15;54;07 - 00;15;57;16
the best argument
it sounds like we have is,
00;15;58;01 - 00;16;01;13
you know, well, things have worked out
pretty well for us before,
00;16;01;13 - 00;16;03;15
and we'll probably figure it out
right now.
00;16;03;15 - 00;16;05;03
I don't know,
maybe you heard something more credible
00;16;05;03 - 00;16;08;03
than that,
but it's tough to disprove, in my view.
00;16;09;03 - 00;16;12;03
I have two responses to that. I think one is
00;16;12;05 - 00;16;16;08
I think the doomers
have become increasingly better
00;16;17;01 - 00;16;20;21
at making their case
at crafting their arguments.
00;16;21;02 - 00;16;24;15
I mean if you've been following this
for a decade like I have, you'll remember
00;16;24;15 - 00;16;29;08
the era of the paperclip maximizer where
that was their go to way of explaining it.
00;16;29;21 - 00;16;35;00
They've, I think, wisely moved on
from that somewhat brilliant
00;16;35;00 - 00;16;38;18
but also somewhat confusing
thought experiment into,
00;16;38;26 - 00;16;43;01
I think, points that are a little easier
for people to grasp,
00;16;43;25 - 00;16;48;06
helping to make the same case
just with an updated set of arguments
00;16;48;06 - 00;16;51;23
and, allegories for people to digest.
00;16;52;04 - 00;16;55;12
I don't think that the accelerationists
have invested
00;16;55;20 - 00;16;59;09
similarly in trying to hone a great case.
00;16;59;23 - 00;17;04;14
And in some ways that makes sense
because they're winning, right?
00;17;04;14 - 00;17;07;02
They're like,
we are all accelerating right now.
00;17;07;02 - 00;17;10;23
There are no meaningful federal
regulations in place to slow them down.
00;17;11;02 - 00;17;14;23
There are so many billions of dollars
being put in this industry
00;17;14;28 - 00;17;18;15
that if they stopped tomorrow,
it would probably cause
00;17;18;15 - 00;17;22;11
a global financial collapse
like we've not seen in our lifetimes.
00;17;22;28 - 00;17;26;26
And they're racing one another
like like Altman and,
00;17;27;14 - 00;17;32;23
you know, Hassabis and Musk
and, you know, Dario Amodei.
00;17;32;29 - 00;17;37;06
You've got these
brilliant people competing not just
00;17;37;06 - 00;17;41;00
against the clock and not just for,
you know, better products.
00;17;41;00 - 00;17;43;05
They're competing against each other
to try
00;17;43;05 - 00;17;46;13
and be the first to get to this moment of,
you know, AGI.
00;17;47;01 - 00;17;51;07
And I don't think it's top of mind to them
to come on to a podcast
00;17;51;07 - 00;17;55;14
and try and really make the case for why
the doomers are wrong and they're right.
00;17;55;22 - 00;18;00;13
And I think that if we start to see
the doomers get more ground, like,
00;18;00;19 - 00;18;03;29
you know, Eliezer
Yudkowsky and Nate Soares, their book
00;18;04;18 - 00;18;07;11
If Anyone Builds It, Everyone Dies.
00;18;07;11 - 00;18;08;14
Very catchy name.
00;18;08;14 - 00;18;09;26
It made its way onto the New York Times
00;18;09;26 - 00;18;12;26
bestsellers list like it's
starting to find an audience.
00;18;12;26 - 00;18;16;07
You see more lawmakers
who are getting vocal
00;18;16;07 - 00;18;19;21
about the risks, and that's happening
both on the left and the right.
00;18;19;21 - 00;18;24;18
You're having interesting media figures,
as, as vast as, like,
00;18;24;27 - 00;18;29;19
Steve Bannon, who's really concerned
about the existential risk.
00;18;29;26 - 00;18;31;15
And then you have Megan McArdle.
00;18;31;15 - 00;18;33;18
Hang on, what's the princess's name?
00;18;34;26 - 00;18;38;14
I always forget. Markle. Meghan Markle.
00;18;38;19 - 00;18;39;04
Yeah.
00;18;39;04 - 00;18;42;14
the, columnist for The Washington Post.
00;18;42;14 - 00;18;46;04
I really gave her a, promotion there
Yeah.
00;18;46;29 - 00;18;48;22
to Princess, Duchess. Yeah.
00;18;48;22 - 00;18;53;02
Across this wide range of,
like, people's
00;18;53;09 - 00;18;57;23
politics, like, this issue has not yet
become politically polarized.
00;18;57;24 - 00;19;00;06
It's not that the accelerationists
are right-wingers
00;19;00;06 - 00;19;05;14
and, you know, the safetyists and
doomers are on the left.
00;19;05;26 - 00;19;09;17
This issue is, like, it's
still in its infancy in the public debate.
00;19;09;25 - 00;19;12;20
And I think that if we start to see
00;19;12;20 - 00;19;16;00
the doomers get more and more purchase,
I think at that point we're going to see
00;19;16;22 - 00;19;19;00
a better case
being made for the accelerationists.
00;19;19;00 - 00;19;21;04
So that's, like, thing one. Thing two:
00;19;21;04 - 00;19;24;13
I think that the best
case the accelerationists are making.
00;19;25;01 - 00;19;28;24
Personally, it does come down to this idea
00;19;29;10 - 00;19;32;05
that we have become increasingly
00;19;32;05 - 00;19;35;14
safety oriented as a society.
00;19;35;14 - 00;19;38;07
This is something
you hear a lot from Peter Thiel.
00;19;38;07 - 00;19;40;25
And like no matter
what people think of Peter Thiel,
00;19;40;25 - 00;19;44;05
he clearly has
had a massive influence on the world.
00;19;44;11 - 00;19;49;09
He was an early investor in all these
companies, in DeepMind and OpenAI.
00;19;49;10 - 00;19;52;16
He was, even if you're an AI doomer,
he was an early investor
00;19;52;16 - 00;19;57;03
in the Singularity Institute
and in Eliezer Yudkowsky,
00;19;57;04 - 00;20;01;20
who is like the king of the doomsday
fears; he seems all over this conversation.
00;20;01;28 - 00;20;05;04
And he's been making,
I think, a very persuasive case to people
00;20;05;12 - 00;20;10;05
that with this safety mindset
that has overcome our culture,
00;20;10;12 - 00;20;13;26
ranging from how we parent to how we,
00;20;14;07 - 00;20;17;20
invest in new technologies
as like a government,
00;20;17;29 - 00;20;22;05
that this safety mindset
has created a stagnation
00;20;22;16 - 00;20;27;08
that is threatening our politics,
that is threatening our sense of purpose,
00;20;27;17 - 00;20;31;02
and that we feel in all of these
like abstract ways, like when
00;20;31;02 - 00;20;32;15
you go to New York City
00;20;33;19 - 00;20;36;19
and then
you go, like, to a Japanese city
00;20;36;25 - 00;20;39;26
and you're just thinking,
how do we have more money in New York?
00;20;40;07 - 00;20;44;08
And as much as I love New York,
lived there for many years, but it's it's
00;20;44;08 - 00;20;49;00
incredible how little progress
has been made in decades
00;20;49;07 - 00;20;52;22
when it comes to almost anything
a New Yorker would want to invest in.
00;20;52;28 - 00;20;56;13
And then you just wonder,
why are we so stagnant?
00;20;56;13 - 00;20;58;13
Where, where's the flying cars?
00;20;58;13 - 00;21;01;07
Like, that's a shorthand for this.
00;21;01;07 - 00;21;04;07
And I and I do think that
if we bring that safety
00;21;04;10 - 00;21;07;10
mindset too much into the AI world,
00;21;07;25 - 00;21;10;26
I think they do have a point
that in some ways,
00;21;11;07 - 00;21;14;22
allowing ourselves
to be ruled by our fears instead of
00;21;14;22 - 00;21;18;19
by our desires to truly reach for more,
to believe in something,
00;21;18;24 - 00;21;21;25
to believe that the world can be different
and better than it is today.
00;21;22;12 - 00;21;25;09
I don't think that they make the pitch
quite as,
00;21;25;09 - 00;21;28;19
with the tone of inspiration
as I'm hitting it with here, but I feel it
00;21;28;19 - 00;21;33;13
sometimes I, I can picture
a world that they're imagining where,
00;21;34;04 - 00;21;38;15
you know, like one of the accelerationists
I spoke to happens to be a socialist.
00;21;38;22 - 00;21;42;09
An accelerationist because, like I said,
there's a large spectrum of different
00;21;42;09 - 00;21;45;08
political beliefs
inside of each one of these camps.
00;21;45;12 - 00;21;49;10
And one of the reasons that
that he was so passionate about us
00;21;49;10 - 00;21;53;11
getting to AGI and us really investing
in this technology is he was saying, like,
00;21;53;11 - 00;21;56;26
think about the millions
and millions of people
00;21;56;26 - 00;22;01;09
since the Industrial revolution
who have had to do shit jobs,
00;22;02;10 - 00;22;05;16
that someone has to
do, that our society has set up in a way
00;22;05;21 - 00;22;09;20
where someone has to do this, and in fact,
a lot of someones
00;22;09;29 - 00;22;13;08
have to dig in these mines
and they have to clean these toilets,
00;22;13;08 - 00;22;17;24
and they have to do these repetitive
factory jobs that if we literally came up
00;22;18;00 - 00;22;21;12
with a technology
that did that work for them,
00;22;21;12 - 00;22;25;07
it would possibly be like
the most powerful, liberating force
00;22;26;01 - 00;22;29;21
for like the betterment of humanity
in human history.
00;22;29;29 - 00;22;31;17
And he's like, that is worth it.
00;22;31;17 - 00;22;34;10
Not because of some abstract thing
that you want one day,
00;22;34;10 - 00;22;37;08
but because of the people
who are living, like,
00;22;37;08 - 00;22;41;15
in these somewhat toilsome,
miserable conditions right now.
00;22;41;24 - 00;22;44;17
And that oh, but we don't know
what they would do on the other side.
00;22;44;17 - 00;22;49;07
And we don't know how we would organize
our society. And accelerationists like that,
00;22;49;07 - 00;22;50;13
this guy's name is Alex Williams,
00;22;50;13 - 00;22;55;08
what he was saying is that, like, those
problems, we should solve down the road
00;22;55;08 - 00;22;58;21
after dealing with the actual problem
that we have right now.
00;22;59;03 - 00;23;01;26
And I find that to be quite,
quite persuasive as well.
00;23;01;26 - 00;23;03;23
But I know it's early in the debate,
00;23;03;23 - 00;23;07;15
and I think that over the next
couple of years, especially if AI continues
00;23;07;15 - 00;23;11;08
to get this massive amount of investment,
if they continue to see the incremental
00;23;11;08 - 00;23;14;29
or maybe even the exponential growth
that they're hoping for.
00;23;15;03 - 00;23;19;11
I think this is going to be the debate,
increasingly happening
00;23;19;17 - 00;23;22;23
not just in our political spheres,
but like around
00;23;22;23 - 00;23;26;07
people's dinner
tables, friends out, for a drink.
00;23;26;17 - 00;23;28;15
And like the reason
that we put this podcast together
00;23;28;15 - 00;23;30;02
and the reason we're putting it out now
00;23;30;02 - 00;23;34;01
is that we feel like
it's kind of time to get to the 101.
00;23;34;01 - 00;23;37;01
It's time to get an introduction to where,
00;23;37;01 - 00;23;39;08
we're at right now, how we got here
00;23;39;08 - 00;23;43;02
and, like, what the three
major camps believe we should do next.
00;23;44;08 - 00;23;45;09
So let's
00;23;45;09 - 00;23;48;20
in the spirit of the 101
and just kind of unpacking some of this
00;23;48;20 - 00;23;51;04
and, by the way, for what it's worth,
you know
00;23;51;04 - 00;23;55;03
I'm a, I'm a listener to your podcast
and I think it's a really nice kind of
00;23;55;03 - 00;23;57;29
you know, overview of that piece.
Where I wanted to go, though, is, we,
00;23;57;29 - 00;24;00;18
you know, we talked about the accelerationists
winning.
00;24;00;18 - 00;24;01;00
Winning.
00;24;01;00 - 00;24;05;01
We talked about getting more,
you know, winning technologically
00;24;05;01 - 00;24;06;12
but also winning financially.
00;24;06;12 - 00;24;07;23
I guess you could say right now
00;24;07;23 - 00;24;10;11
because they're getting more investment
in these technologies.
00;24;10;11 - 00;24;13;19
But one of the pieces
that's interesting to me is,
00;24;14;10 - 00;24;18;09
I guess, the personalities behind
some of these different technologies
00;24;18;09 - 00;24;22;16
and the fact that you've got,
you know, a series of different
00;24;23;22 - 00;24;24;28
tech stacks here of
00;24;24;28 - 00;24;30;17
different, you know, AI products
with people who I mean, you could draw
00;24;30;17 - 00;24;33;17
a pretty interesting diagram
of how these people have intersected.
00;24;33;17 - 00;24;37;21
And, you know, the the drama in their
lives is like borderline Game of Thrones.
00;24;38;04 - 00;24;41;04
But, I mean, for the uninformed,
00;24;41;14 - 00;24;44;14
how would you classify the main,
00;24;45;10 - 00;24;49;26
the main AI products here
and kind of the people behind them.
00;24;50;06 - 00;24;51;27
If that's not too broad a question.
00;24;51;27 - 00;24;53;25
Well let me,
let me try and give an answer.
00;24;53;25 - 00;24;55;01
You tell me
if that's what you're going for.
00;24;55;01 - 00;24;59;04
I mean the number one thing
that we start off the show and that
00;24;59;06 - 00;25;03;08
I start off most of these conversations,
trying, trying to,
00;25;03;10 - 00;25;06;25
to distinguish between is just like,
what is it
00;25;07;15 - 00;25;12;25
that OpenAI or Anthropic or DeepMind,
00;25;12;25 - 00;25;18;08
what is it that they think they're making
when they ask investors to invest in AI?
00;25;18;21 - 00;25;21;24
And it's not a product,
it's not a chat bot.
00;25;22;03 - 00;25;27;17
When Demis Hassabis and Shane Legg,
fresh out of getting their PhDs
00;25;27;17 - 00;25;30;19
in neuroscience, come to Silicon
00;25;30;19 - 00;25;33;19
Valley in 2010 looking for investors,
00;25;33;27 - 00;25;38;11
a part of their pitch was
we don't want to make a tech product.
00;25;38;23 - 00;25;42;05
We want to make artificial general
intelligence
00;25;42;23 - 00;25;47;00
the complete automation of anything
00;25;47;19 - 00;25;50;02
that the human mind can imagine.
00;25;50;02 - 00;25;53;24
And then beyond,
like they want to make a new
00;25;54;04 - 00;25;57;19
species,
is the way that a lot of the sources
00;25;57;19 - 00;25;58;13
that I've talked to say,
00;25;58;13 - 00;26;02;14
it's like it's a better shorthand to say
they're trying to create an intelligent
00;26;02;14 - 00;26;07;04
new species than it is to say they're
trying to make something like a chat bot.
00;26;07;08 - 00;26;09;20
So that's the important thing
to distinguish.
00;26;09;20 - 00;26;13;27
And that dream has been alive
under different names.
00;26;14;12 - 00;26;17;18
Right now
we call it AGI, but thinking machine
00;26;17;18 - 00;26;21;20
or artificial intelligence
or automaton, it's had different names.
00;26;21;29 - 00;26;26;29
It goes all the way back to the 1940s,
and it goes back to Alan Turing,
00;26;27;04 - 00;26;30;28
one of the late godfathers
of computer science, as he's often called.
00;26;31;08 - 00;26;35;24
He's sitting there in the 40s
looking at one of those massive computers
00;26;35;24 - 00;26;39;01
with the tubes sticking out of it
that was the size of a room.
00;26;39;01 - 00;26;41;02
I think it weighed like two and a half
tons.
00;26;41;02 - 00;26;44;07
He, already
seeing what this computer was able to do,
00;26;44;19 - 00;26;49;20
was envisioning a day
when it could think as well as a human
00;26;50;09 - 00;26;54;00
and thought that when that happened,
it would probably take over.
00;26;54;18 - 00;26;55;02
Right.
00;26;55;02 - 00;26;56;15
And it is.
00;26;56;15 - 00;26;58;21
It attracts
these really interesting figures
00;26;58;21 - 00;27;02;24
who not only
have this belief
00;27;03;11 - 00;27;06;06
that the computers, even the computers
we have today,
00;27;06;06 - 00;27;09;21
they can be so dumb sometimes, you know,
they can be so frustratingly dumb.
00;27;09;21 - 00;27;13;09
And to think that for all these years
there have been a group of true believers
00;27;14;00 - 00;27;17;00
who think that we can achieve
that level, that level of automation.
00;27;17;12 - 00;27;20;03
And, you know, in the podcast,
we go through all the ups and downs
00;27;20;03 - 00;27;22;18
where like in the 1960s,
they really thought that they were close
00;27;22;18 - 00;27;26;00
and they sounded a lot
like people sound today thinking that
00;27;26;00 - 00;27;29;23
we were ten years away from a true AGI
thinking machine.
00;27;29;23 - 00;27;31;25
Right. And obviously that didn't happen.
00;27;31;25 - 00;27;37;09
But in the modern telling of like where
we're at and who these characters are,
00;27;38;08 - 00;27;40;06
I think the important thing to know
is that
00;27;40;06 - 00;27;44;16
the most influential voices
inside of the current
00;27;44;16 - 00;27;47;16
AI conversation,
the most influential tech leaders,
00;27;47;29 - 00;27;52;14
almost all of them were the underdogs
ten years ago
00;27;53;03 - 00;27;57;00
that they were not the people
who were at the forefront
00;27;57;00 - 00;28;00;03
of Silicon Valley
when it came to new technology.
00;28;01;06 - 00;28;05;12
In fact, the people who are leading
the charge, leading the race in the US
00;28;05;12 - 00;28;09;04
right now, they were the ones
who were the most freaked out.
00;28;09;15 - 00;28;11;26
Ten years ago,
00;28;11;26 - 00;28;14;01
Dario Amodei,
00;28;14;01 - 00;28;17;24
Sam Altman, Elon Musk,
you know, Demis Hassabis.
00;28;17;24 - 00;28;18;21
The list could go on.
00;28;18;21 - 00;28;23;00
All these top players, you could find that
they were investing, you know,
00;28;23;00 - 00;28;26;23
millions of dollars into AI safety,
that they were lobbying Congress.
00;28;26;24 - 00;28;31;02
Elon Musk had a personal meeting
with President Obama in 2015.
00;28;31;08 - 00;28;34;13
Go back and look at Sam
Altman's blog in 2015.
00;28;34;19 - 00;28;36;11
They're saying over and over again,
00;28;36;11 - 00;28;40;07
this thing poses
the greatest existential risk to humanity.
00;28;40;17 - 00;28;42;29
Like it may lead to our extinction.
00;28;42;29 - 00;28;45;29
It may come to see us
the way that we see dogs.
00;28;46;05 - 00;28;48;16
Some people say the way that we see ants,
00;28;48;16 - 00;28;51;23
and those people are now
at the forefront of the race.
00;28;51;23 - 00;28;52;03
Right?
00;28;52;03 - 00;28;55;02
They signed a petition
saying we should do everything we can
00;28;55;09 - 00;28;58;16
to make sure there's no AI race,
and now they're at the head of the race.
00;28;58;24 - 00;29;02;21
And I think that the cynics
and the critics of these people
00;29;03;03 - 00;29;06;01
are assuming that what happened is greed,
00;29;06;01 - 00;29;09;19
that what happened is that doomerism
was bad for business.
00;29;09;19 - 00;29;12;13
And so they just pushed the doomerism
to the side.
00;29;12;13 - 00;29;15;19
And I think one of the most fascinating
aspects of this story
00;29;16;07 - 00;29;18;23
is that it's way more interesting than that.
00;29;18;23 - 00;29;21;20
What happened
is that they came one after another
00;29;21;20 - 00;29;26;29
to believe that someone, somewhere,
was going to make this technology,
00;29;26;29 - 00;29;29;29
that AGI would be created,
00;29;29;29 - 00;29;34;04
and that the best thing that they could do
for the future of humanity
00;29;34;13 - 00;29;37;22
is make sure that they were
the ones who made it first,
00;29;38;06 - 00;29;42;29
that they because they care about safety,
because they care about democracy,
00;29;43;04 - 00;29;48;18
because they care about,
you know, things like, privacy rights,
00;29;48;25 - 00;29;53;13
that if they made the AGI,
they could use it for good
00;29;53;25 - 00;29;57;10
before a China, before,
you know, if you're Sam Altman and Elon
00;29;57;10 - 00;29;58;28
Musk, you know, before Google, right?
00;29;58;28 - 00;30;01;04
If you're Demis Hassabis,
you know, before Musk. Right.
00;30;01;04 - 00;30;05;02
They came to believe that, like, the
acceleration
00;30;05;19 - 00;30;08;19
is not just a convenient
00;30;09;00 - 00;30;11;25
sales pitch,
but they sincerely seem to believe
00;30;11;25 - 00;30;15;14
that acceleration
is salvation, that acceleration
00;30;16;17 - 00;30;17;21
is a duty
00;30;17;21 - 00;30;20;20
that they
have on behalf of the human race.
00;30;20;21 - 00;30;23;02
And that's a good story.
It's a fascinating story.
00;30;23;02 - 00;30;26;27
Whether or not you believe it
is, I'm not sure I believe it,
00;30;27;06 - 00;30;31;13
but it's important to understand that
that's happening and then to understand
00;30;31;13 - 00;30;34;29
that our entire economy, not our entire,
but a huge part of our economy
00;30;34;29 - 00;30;38;19
and a large part of our stock market,
we're all riding on that conversion,
00;30;39;06 - 00;30;42;14
you know,
and we've yet to engage in a big,
00;30;42;14 - 00;30;46;04
robust public debate about whether or not
this is the right path.
00;30;46;15 - 00;30;48;21
And we're already so far down it.
00;30;48;21 - 00;30;50;06
I think that that's
00;30;50;06 - 00;30;54;06
I think, if you ask like the characters,
like that's the spectrum.
00;30;54;06 - 00;30;57;06
You've got Alan
Turing all the way up to these guys.
00;30;57;10 - 00;30;58;15
And then in the middle of it,
00;30;58;15 - 00;31;02;04
I guess the other characters
that I really love are the contrarians,
00;31;02;28 - 00;31;06;21
which I think naturally, this subject
would attract some contrarians,
00;31;07;04 - 00;31;10;26
but one of my favorite things
I learned putting the series together
00;31;11;04 - 00;31;16;17
was that there were two camps in the
earliest days of artificial intelligence
00;31;16;17 - 00;31;21;27
about how you would go about making a true
thinking machine. In the dominant camp,
00;31;22;13 - 00;31;25;21
they were, you know,
the symbolists. I won't get into
00;31;25;21 - 00;31;27;23
all the technical details
of what they were,
00;31;27;23 - 00;31;32;12
but they're the guys that made
that chess player that beat Kasparov.
00;31;32;12 - 00;31;35;21
They they're the ones who made that
jeopardy player that won jeopardy!
00;31;35;21 - 00;31;39;17
Like they were the ones who were seen
as like the future of AI for so long.
00;31;40;12 - 00;31;44;03
And then the underdogs,
the contrarians, were these connectionists.
00;31;44;11 - 00;31;49;10
These are people who for decades believed
in their vision for how to make an AI.
00;31;49;26 - 00;31;54;06
And they
I did not realize just how disrespected
00;31;54;12 - 00;31;58;09
they were inside of their own field,
how they were completely
00;31;58;23 - 00;32;02;29
just like, just
how far they were pushed to the sidelines,
00;32;03;13 - 00;32;07;10
and then how quickly their theories
00;32;07;21 - 00;32;13;21
became the engine behind this
AI revolution that we're going through.
00;32;14;08 - 00;32;17;20
And then immediately,
these guys who had dedicated their lives
00;32;17;20 - 00;32;21;14
to this contrarian view,
it's like moments after they're winning
00;32;21;14 - 00;32;24;21
their Nobel Prize, you know, moments
after they're winning their Turing Awards,
00;32;24;29 - 00;32;28;26
they then quit their jobs
and come talk to people like me to say,
00;32;29;13 - 00;32;31;03
we got to stop this thing.
00;32;31;03 - 00;32;32;26
It might kill us all.
00;32;32;26 - 00;32;38;07
So that's a spectrum of the personalities
and the characters that, are like
00;32;38;07 - 00;32;41;15
fueling the thing behind your,
you know, ChatGPT
00;32;42;03 - 00;32;45;23
and fueling the,
you know, engine of progress
00;32;45;23 - 00;32;48;23
or engine of profit
or however you want to say
00;32;48;25 - 00;32;52;18
wherever you stand on the positions, like,
those are the guys that are,
00;32;53;12 - 00;32;56;04
like behind this
00;32;56;04 - 00;33;00;16
AI idea
that's become so central to our society.
00;33;01;05 - 00;33;01;22
That's great.
00;33;01;22 - 00;33;05;18
That's exactly where I wanted
to kind of take the conversation and I, I,
00;33;05;18 - 00;33;08;18
you know, I had a very similar reaction
to you when I heard
00;33;09;04 - 00;33;13;15
about all these different personalities
who are all competing to,
00;33;13;16 - 00;33;18;08
you know, be the first to, you know,
make this artificial general intelligence.
00;33;18;08 - 00;33;22;12
And I mean, the way I characterized it,
and I like the first thought
00;33;22;12 - 00;33;25;12
that came to my mind, and I don't know
if you're a Mad Men guy, but.
00;33;25;19 - 00;33;26;16
Yeah.
00;33;26;16 - 00;33;30;06
So maybe the most famous scene
in the show, it's in the first episode,
00;33;30;13 - 00;33;34;09
but basically they're coming up
with this ad for cigarettes.
00;33;34;09 - 00;33;34;18
Right.
00;33;34;18 - 00;33;38;18
And, and the gist of it is that everybody
else's cigarettes will kill you.
00;33;39;02 - 00;33;41;00
Ours are toasted, right?
00;33;41;00 - 00;33;42;16
Like it's toasted.
00;33;42;16 - 00;33;44;01
It's toasted. Lucky Strike.
00;33;44;01 - 00;33;48;00
And to me, like that seems to be the pitch
of every single AI leader right now.
00;33;48;00 - 00;33;50;03
Everybody else's AI will kill you.
00;33;50;03 - 00;33;51;28
Our AI is toasted, right?
00;33;51;28 - 00;33;53;21
Like like, do you buy that?
00;33;53;21 - 00;33;56;16
And within that argument,
does it matter who wins?
00;33;56;16 - 00;33;57;05
Is the question.
00;33;57;05 - 00;33;59;24
I'm getting at. Well,
I think this is like, there's,
00;33;59;24 - 00;34;01;00
maybe I'm too gullible.
00;34;01;00 - 00;34;04;28
I mean, I, I covered politics largely,
00;34;05;12 - 00;34;08;22
before this have been
deep in the world of, like,
00;34;09;27 - 00;34;11;12
what is
00;34;11;12 - 00;34;13;29
causing this deep divide
in our politics today.
00;34;13;29 - 00;34;15;18
And what role does technology play?
00;34;15;18 - 00;34;19;21
What role does, you know,
the lack of having other strong markers
00;34;19;21 - 00;34;27;16
of meaning in our lives and all this
and I'm of the persuasion to take people
00;34;27;16 - 00;34;31;08
at their word when their actions
seem to back up their word.
00;34;32;17 - 00;34;36;06
And whether that's you know,
why a lifelong Democrat in Pennsylvania
00;34;36;06 - 00;34;40;29
decides to vote Republican for Donald
Trump, or whether that's a technologist
00;34;41;05 - 00;34;45;21
who seems to sincerely believe
that the thing he's making
00;34;46;05 - 00;34;47;16
could end the human race.
00;34;47;16 - 00;34;49;14
And if he doesn't make it
00;34;49;14 - 00;34;51;22
before someone else
does, it's more likely to do so.
00;34;51;22 - 00;34;52;15
I don't.
00;34;52;15 - 00;34;55;14
I personally, I don't think it's toasted.
00;34;55;14 - 00;34;58;21
They seem to sincerely
believe it to the point where
00;34;58;21 - 00;35;02;14
if you're going to criticize it, it's
I think it's better to criticize it
00;35;02;14 - 00;35;06;08
the way you would criticize
maybe, a believer in a religion
00;35;06;26 - 00;35;10;11
which, like you can criticize
someone for their religious beliefs,
00;35;10;11 - 00;35;13;15
but if you start off with the assumption
that they cynically hold them,
00;35;13;15 - 00;35;16;10
but behind closed doors,
they're not really holding them.
00;35;16;10 - 00;35;17;28
I don't think that that puts you
on the right footing.
00;35;17;28 - 00;35;20;21
Now. There are a lot of skeptics
who are out there,
00;35;20;21 - 00;35;23;21
and even some accelerationists
will say this about the AI doomers,
00;35;23;27 - 00;35;29;04
that it is marketing,
that if you tell people — but it's
00;35;29;04 - 00;35;32;23
almost the opposite of what you're saying.
The criticism they often get is,
00;35;33;03 - 00;35;38;08
if you go out there and say,
like, my thing, the technology
00;35;38;08 - 00;35;42;29
I'm making is so awesome,
it might destroy the earth.
00;35;43;04 - 00;35;46;22
That's a way of saying what
I do is important and you know what?
00;35;47;01 - 00;35;48;25
It might get rid of all jobs.
00;35;48;25 - 00;35;49;19
And that's a way of saying
00;35;49;19 - 00;35;53;22
invest money in me because, you know,
you put that money in like U.P.S.
00;35;53;22 - 00;35;55;26
or something that that's a
that's a failing business.
00;35;55;26 - 00;35;57;19
We're going to have robots doing all that.
00;35;57;19 - 00;36;00;11
And, you know, I
think that point is out there.
00;36;00;11 - 00;36;05;25
But when I talk to like former OpenAI
employees, they are telling me
00;36;05;25 - 00;36;10;10
that this is the conversation that they're
having in the cafeteria over lunch.
00;36;10;10 - 00;36;12;05
Like,
this is what they talk about on Friday
00;36;12;05 - 00;36;16;01
night, happy hours when they go out,
like the people that are openly saying,
00;36;16;08 - 00;36;20;08
Holy shit, I hope we don't make something
that destroys the world.
00;36;20;18 - 00;36;25;04
So I, I don't think that it's quite
the same thing as what you're saying.
00;36;25;04 - 00;36;26;21
I do think that it's going to change.
00;36;26;21 - 00;36;29;21
And, you know, I think that it's dynamic.
00;36;30;02 - 00;36;32;23
When Sam Altman in 2023
00;36;33;25 - 00;36;36;15
went to testify before Congress,
00;36;36;15 - 00;36;40;10
right in the aftermath
of the massive explosion of chat bots.
00;36;40;10 - 00;36;43;06
So ChatGPT comes out in November of 2022.
00;36;43;06 - 00;36;47;09
It's followed by all these copycats, and
everybody wants to get into the chat bot
00;36;47;09 - 00;36;51;03
game, and a lot of weird hallucinations
and crazy things are going on.
00;36;51;22 - 00;36;54;16
Congress
calls Sam Altman and others to come in
00;36;54;16 - 00;36;58;04
and testify about what's going on
with AI and it's incredible.
00;36;58;04 - 00;37;00;05
It's maybe the most incredible
congressional hearing
00;37;00;05 - 00;37;03;20
I've ever seen
because he's up there saying, regulate us.
00;37;04;01 - 00;37;07;07
Our greatest fear is
we might break the world.
00;37;07;13 - 00;37;10;13
We want you to regulate us.
00;37;10;14 - 00;37;12;06
And yet
00;37;12;06 - 00;37;13;16
they never did.
00;37;13;16 - 00;37;17;02
But it's a weird thing to ask
for your industry,
00;37;17;02 - 00;37;20;02
for your company,
for you personally to be regulated.
00;37;21;00 - 00;37;23;00
And then everyone else is like,
yeah, we should.
00;37;23;00 - 00;37;24;09
And they never did.
00;37;24;09 - 00;37;27;09
You know, fast forward to May of 2025.
00;37;27;14 - 00;37;30;14
He's brought in to testify
in front of Congress,
00;37;30;15 - 00;37;35;20
and almost the entire hearing is about,
how do we help you beat China?
00;37;35;29 - 00;37;37;29
How do we ensure that the U.S.
00;37;37;29 - 00;37;39;11
remains at the forefront of this?
00;37;40;11 - 00;37;41;05
That's an interesting
00;37;41;05 - 00;37;44;19
change that's happened, and I could see it
continuing to change again,
00;37;45;09 - 00;37;49;18
and change back and forth and continue
to change directions.
00;37;49;18 - 00;37;51;07
And I think as it
00;37;51;07 - 00;37;54;11
changes, you're going to hear changes
coming from these companies.
00;37;54;11 - 00;37;56;19
And we could get to, it's toasted.
00;37;56;19 - 00;38;00;18
You know, see, what Elon Musk is doing,
it's going to kill the world.
00;38;00;25 - 00;38;04;13
But me, what we're doing, it's
going to ensure civilization.
00;38;04;13 - 00;38;07;02
But I think at this moment
00;38;07;02 - 00;38;08;01
they're not really there.
00;38;08;01 - 00;38;12;00
And in fact, I'm kind of surprised
that they're not a little bit more open,
00;38;13;04 - 00;38;18;01
openly competitive,
like Pepsi and Coke, or —
00;38;18;01 - 00;38;21;13
I think maybe for our generation,
it's like T-Mobile and AT&T.
00;38;21;13 - 00;38;22;18
Like I watch football,
00;38;22;18 - 00;38;25;03
they go hard
after each other in those ads,
00;38;25;03 - 00;38;28;20
like literally making fun of each other's
ads in the ads because they're competing
00;38;29;01 - 00;38;32;23
for, for customers on the same base.
00;38;33;11 - 00;38;38;24
We're not yet seeing Claude go
that hard at OpenAI even.
00;38;38;25 - 00;38;41;22
You know, I don't know if this is, like,
really insider, but maybe you follow this.
00;38;41;22 - 00;38;46;23
But, you know, even Dario, on the day
he left OpenAI after essentially
00;38;47;09 - 00;38;48;19
being one of the most,
00;38;48;19 - 00;38;51;27
like, influential people
to put them on the path of
00;38;52;12 - 00;38;55;20
the ChatGPT success that they've had
ever since — he leaves before
00;38;55;20 - 00;38;56;13
they see that success.
00;38;56;13 - 00;39;00;19
But after he's done a ton of work
to get them down that road, he's
00;39;00;19 - 00;39;04;04
never really come out
and said why he left.
00;39;04;15 - 00;39;07;06
He's, you know, talked around the edges.
00;39;07;06 - 00;39;10;21
You know, they weren't as focused
on safety as we wanted, or there were
00;39;10;21 - 00;39;11;18
leadership issues.
00;39;11;18 - 00;39;15;02
You know, people pick apart
his appearances on Lex Fridman.
00;39;15;02 - 00;39;16;28
They're like,
oh, is he talking about Sam Altman?
00;39;17;28 - 00;39;18;26
It's interesting
00;39;18;26 - 00;39;22;25
that, while Sam Altman is
similarly very polite,
00;39;22;25 - 00;39;27;09
The one exception is Elon Musk,
who goes pretty hard at Sam Altman.
00;39;27;09 - 00;39;30;13
And of course, has a lawsuit,
against Altman
00;39;30;25 - 00;39;33;25
and OpenAI.
00;39;33;26 - 00;39;35;26
But even then, in his advertisements
00;39;35;26 - 00;39;39;01
and in talking about Grok
and the future of AI,
00;39;39;18 - 00;39;43;18
he doesn't pose it exactly
the same way as like we are competing.
00;39;43;18 - 00;39;45;20
There's a little
bit of a sense I get,
00;39;45;20 - 00;39;48;20
and I don't know what your impression is
that they don't want to
00;39;49;22 - 00;39;52;25
openly engage in a kind of
00;39;53;13 - 00;39;56;13
consumer driven, brand driven
00;39;56;19 - 00;39;59;25
competition the way that other companies
have in the past.
00;39;59;25 - 00;40;01;25
And I don't know why.
00;40;01;25 - 00;40;04;03
That's
I think that's a really interesting point.
00;40;04;03 - 00;40;06;04
And I hadn't,
I hadn't really considered it before.
00;40;06;04 - 00;40;09;11
to me there's, I guess,
like, a more generous way to interpret
00;40;09;11 - 00;40;12;15
that and a less generous way to interpret
that. The more generous way is,
00;40;12;22 - 00;40;14;10
you know,
these are all technologists first,
00;40;14;10 - 00;40;18;11
and they do really care about
what's best for the human race and mankind
00;40;18;11 - 00;40;20;02
and all of that. And they don't want to.
00;40;20;02 - 00;40;25;17
They just truly don't believe themselves
to be Coke and Pepsi or T-Mobile and AT&T.
00;40;25;21 - 00;40;30;22
the less generous way is that, you know,
and I don't really know if
00;40;30;22 - 00;40;32;21
the American equivalent
works quite as well,
00;40;32;21 - 00;40;36;22
but I think about, like, in the UK,
and like, I'm not from the UK,
00;40;36;22 - 00;40;38;15
but you hear these stories
about like all the top
00;40;38;15 - 00;40;41;21
politicians actually were classmates
in the same private school.
00;40;41;26 - 00;40;46;04
And so it's actually like it's
very this kind of like old boys club.
00;40;46;11 - 00;40;50;03
And is it actually because these guys are
all, you know, friendly with each other.
00;40;50;10 - 00;40;53;10
And, you know, aside from Elon,
who like, has to lob his grenades
00;40;53;17 - 00;40;55;17
and they're just kind of
00;40;55;17 - 00;40;59;05
in on the take together
or as long as the money is flowing,
00;40;59;10 - 00;41;00;15
they don't want to say anything
00;41;00;15 - 00;41;03;15
that could potentially prevent
the money from flowing.
00;41;03;15 - 00;41;08;25
Well, my only insight into that,
and it's also interesting, is, one,
00;41;09;10 - 00;41;12;13
all of them were together in 2015
00;41;13;02 - 00;41;16;02
at a conference in Puerto Rico.
00;41;16;07 - 00;41;18;29
I think that Sam Altman
maybe wasn't at that one, but
00;41;18;29 - 00;41;22;10
was at the next conference
a few months later here in the US.
00;41;22;24 - 00;41;26;07
But it was a conference
organized by Max Tegmark, from MIT,
00;41;26;27 - 00;41;29;14
where he had all of the people
00;41;29;14 - 00;41;32;15
who, you know, at the time were true
believers
00;41;32;15 - 00;41;35;19
who had not yet made the discoveries
that would make the AI revolution.
00;41;35;27 - 00;41;40;06
And he brought them together, you know,
Demis Hassabis, the biggest name
00;41;40;06 - 00;41;43;25
at that time, the head of DeepMind,
which had just been bought by Google.
00;41;44;22 - 00;41;47;22
Elon Musk was there.
00;41;48;00 - 00;41;52;04
You had,
I believe, Ilya Sutskever was there.
00;41;52;04 - 00;41;52;28
Dario Amodei
00;41;53;29 - 00;41;55;06
was at that first 1 or
00;41;55;06 - 00;41;58;09
2 conferences starting in Puerto Rico
that Max Tegmark put together.
00;41;58;09 - 00;42;02;22
He got all these guys to try and get them
on the same page with other true
00;42;02;22 - 00;42;04;07
believers like Eliezer
00;42;04;07 - 00;42;06;08
Yudkowsky, like Nick Bostrom,
00;42;06;08 - 00;42;08;12
these people who at the time were seen
as dreamers.
00;42;08;12 - 00;42;11;12
Nick has since become a bit
more of a scout,
00;42;11;14 - 00;42;14;10
kind of a scout bordering on accelerationist,
depending
00;42;14;10 - 00;42;18;01
on the day of the week, it seems to me,
but he got them all together to say, look,
00;42;18;11 - 00;42;21;25
we're the people who actually believe
that AI is going to change the world.
00;42;22;01 - 00;42;24;20
We shouldn't
fight so much among ourselves.
00;42;24;20 - 00;42;27;06
You know, we should
we should work more together.
00;42;27;06 - 00;42;29;13
And there was a kind of brief period
00;42;29;13 - 00;42;31;26
where that really seemed like
it was going to be possible.
00;42;31;26 - 00;42;35;10
And we start to see the fissures
first, in 2018,
00;42;35;10 - 00;42;39;24
I believe it was
when Elon Musk leaves OpenAI,
00;42;40;05 - 00;42;44;21
then when Dario leaves OpenAI
to start Anthropic.
00;42;45;10 - 00;42;47;22
Now even Ilya Sutskever has left
00;42;47;22 - 00;42;51;25
OpenAI, and there is,
00;42;52;28 - 00;42;54;28
from what I'm hearing, this weird dynamic.
00;42;54;28 - 00;42;59;19
Like, okay, all of us, we can share
some of the safety stuff we're learning,
00;42;59;24 - 00;43;03;02
but we no longer really talk. Like, the way
00;43;03;02 - 00;43;07;04
that one insider said it to me is all
these guys were in group chats
00;43;07;04 - 00;43;10;23
together and one after another
they've been leaving the chat.
00;43;11;06 - 00;43;15;04
You know, like the group
chats have closed by 2025.
00;43;15;04 - 00;43;16;21
I don't know if that's true,
but that's what
00;43;16;21 - 00;43;20;11
the people who are, you know,
willing to speak to me on the record,
00;43;21;03 - 00;43;24;03
close to these people
and close to these decisions are saying.
00;43;25;18 - 00;43;27;02
Interesting. And, I mean, yeah.
00;43;27;02 - 00;43;30;20
And you have to wonder. It's just
so fascinating that we're talking about,
00;43;31;04 - 00;43;35;03
you know, again, to use your words
like basically creating a new species
00;43;35;03 - 00;43;36;14
here and something that could, you know,
00;43;36;14 - 00;43;39;14
forever
change the course of human history.
00;43;39;16 - 00;43;43;07
And it comes down to a handful of dudes
who may or may not
00;43;43;07 - 00;43;46;20
be in the same, you know, chat
together anymore, which is just like,
00;43;47;21 - 00;43;48;23
I don't know, like it
00;43;48;23 - 00;43;51;23
is a lot to process, right?
00;43;51;25 - 00;43;53;19
Yeah.
00;43;53;19 - 00;43;56;19
That's part of why we put the podcast together,
and why, like, it originally was,
00;43;56;24 - 00;43;57;16
you know,
00;43;57;16 - 00;43;59;02
it was really supposed to be four episodes,
00;43;59;02 - 00;44;01;13
and at one point
I thought it was gonna be like 15.
00;44;01;13 - 00;44;06;07
There's so much to unpack here,
and it truly is like a sci fi movie,
00;44;06;20 - 00;44;09;19
which not to go on
too big of a tangent on this, but
00;44;10;09 - 00;44;13;23
one of the reasons
it turns out that it's like a sci fi movie
00;44;14;16 - 00;44;19;04
is because of the advancements
that were made in artificial intelligence
00;44;19;12 - 00;44;22;20
in the 1960s,
00;44;23;00 - 00;44;27;05
when the U.S. was investing
heavily in the space race.
00;44;27;11 - 00;44;31;06
They didn't just throw money
into aerospace, they threw a ton of money
00;44;31;16 - 00;44;33;04
into computer science.
00;44;33;04 - 00;44;35;18
And eventually that would help us
have like the semiconductor.
00;44;35;18 - 00;44;38;13
And, you know, it
led to the GPS technology
00;44;38;13 - 00;44;40;14
we wouldn't have had
had it not been for that money.
00;44;40;14 - 00;44;44;20
One thing I don't think people realize is
that a ton of money went into artificial
00;44;44;20 - 00;44;47;26
intelligence research and, like, helped
to fund
00;44;47;26 - 00;44;51;06
the first AI labs at MIT and
00;44;52;04 - 00;44;53;05
other universities,
00;44;53;05 - 00;44;56;27
you know, under Marvin Minsky
and John McCarthy and Claude Shannon.
00;44;56;27 - 00;45;00;08
These like really interesting
early figures in AI.
00;45;00;17 - 00;45;04;15
and with the kind of optimism of the time
and seeing the rockets
00;45;04;19 - 00;45;07;19
take off into orbit, the
00;45;08;06 - 00;45;10;27
AI scientists and researchers,
00;45;10;27 - 00;45;13;09
they had this sense that like, by the time
we're walking on
00;45;13;09 - 00;45;16;24
the moon, we're probably going
to have these AIs in our lives.
00;45;17;12 - 00;45;21;09
And it turns out that they didn't.
They couldn't, down the path
00;45;21;09 - 00;45;23;15
they were on, achieve the benchmarks.
00;45;23;15 - 00;45;26;13
They were guilty of overhyping
and under-delivering.
00;45;26;13 - 00;45;31;06
And so the field went into a bit of
what they call an AI winter.
00;45;31;26 - 00;45;35;09
But because it had become this thing,
00;45;36;00 - 00;45;40;01
certain science fiction authors and film
makers became obsessed
00;45;40;08 - 00;45;43;08
with the idea of what it would be like
in the future to do this.
00;45;43;13 - 00;45;47;05
And one of the people that became
so obsessed was Stanley Kubrick.
00;45;47;22 - 00;45;53;06
And it's in 1968
that he creates 2001: A Space Odyssey.
00;45;53;19 - 00;45;58;24
In collaboration with the leading
AI scientists of the time, and with the
00;45;59;00 - 00;46;03;04
leading people who were concerned
about the future of AI at the time.
00;46;03;09 - 00;46;08;00
And he tried to infuse
the reality of the world
00;46;08;04 - 00;46;11;11
that they thought would come about,
and the risks that we would have in
00;46;11;11 - 00;46;14;11
that world and the benefits in that world
by talking to these people.
00;46;14;18 - 00;46;17;26
And it made such a compelling film
that it became this
00;46;17;26 - 00;46;21;27
genre of film
and this genre of fear that we have.
00;46;22;06 - 00;46;26;29
And so truly, like, the science is
what inspired the science fiction. Then
00;46;26;29 - 00;46;30;20
the science fiction has been such a staple
part of our lives for so many years
00;46;31;02 - 00;46;34;10
that it poses this interesting problem
for people right now,
00;46;34;19 - 00;46;37;28
which is that you've got the dreamers
and the scouts and the accelerationists
00;46;37;28 - 00;46;41;16
who are saying this is real,
join the real debate.
00;46;41;24 - 00;46;44;09
And for some people they're like,
that's just sci fi stuff, man.
00;46;44;09 - 00;46;46;21
You know, like, yeah,
we're not going to do it.
00;46;46;21 - 00;46;49;17
And then the accelerationists,
they're like, no, no, no, don't be scared.
00;46;49;17 - 00;46;51;17
Those were just movies.
00;46;51;17 - 00;46;54;29
You know the Terminator is very unlikely.
00;46;54;29 - 00;46;56;13
The Matrix doesn't make sense.
00;46;56;13 - 00;46;58;13
Why would they use us as batteries?
00;46;58;13 - 00;47;03;11
And it just reveals
how, at the end of the day,
00;47;03;19 - 00;47;08;00
what we believe in, like the stories
that we attach meaning to,
00;47;09;14 - 00;47;11;18
is like the ultimate technology.
00;47;11;18 - 00;47;15;11
That's really the hardware
or the software, I can't ever think of
00;47;15;12 - 00;47;18;12
what's the best metaphor,
that's running our world right now.
00;47;18;20 - 00;47;21;12
And you see that with this investment
that's happening:
00;47;21;12 - 00;47;24;29
people have come to believe
that this thing's going to work.
00;47;25;26 - 00;47;28;16
Whether that's true or not,
I think it's too soon to say.
00;47;28;16 - 00;47;31;06
But it's interesting to me as a reporter,
especially a reporter
00;47;31;06 - 00;47;34;28
who's like, so interested in debate,
like what people believe,
00;47;35;11 - 00;47;37;03
how they got to these beliefs,
00;47;37;03 - 00;47;40;13
and then what's going to happen
next as this becomes a bigger part of our,
00;47;40;19 - 00;47;43;19
you know,
national and global conversation.
00;47;44;07 - 00;47;47;14
So, do you, you know, when you look at
00;47;48;00 - 00;47;49;08
your role as a reporter,
00;47;49;08 - 00;47;52;10
I mean obviously so much of
where you're focused is kind of
00;47;52;10 - 00;47;56;25
on the, on the, the cutting edge and
what's next and where this is all going.
00;47;56;25 - 00;48;01;09
And, and you know, the people
and the stories and, and how those change.
00;48;01;23 - 00;48;04;23
One of the pieces that's, I don't know,
00;48;05;09 - 00;48;09;00
it's probably not right to say this,
but just like less exciting,
00;48;09;04 - 00;48;12;11
if I can call it
that, is just like where the rubber
00;48;12;11 - 00;48;15;11
hits the road in terms
of how people are actually using,
00;48;15;16 - 00;48;19;29
you know, the current,
you know, AI tools in their daily lives,
00;48;19;29 - 00;48;23;04
whether that's individuals,
whether that's organizations.
00;48;23;09 - 00;48;27;20
Is that something you cover at all?
And, you know, if it is, you know, what
00;48;27;20 - 00;48;32;09
are you finding in terms of, you know,
what's most promising there?
00;48;32;10 - 00;48;33;26
And what does that look like?
00;48;35;26 - 00;48;37;15
I mean, the way that I'm
00;48;37;15 - 00;48;40;29
thinking about it, like
so the company that I run with my buddy
00;48;40;29 - 00;48;45;09
Matt, it's called Longview,
and like our big thing
00;48;45;17 - 00;48;49;05
is that there's a lot of media
that's looking at our moment
00;48;49;19 - 00;48;51;05
and being like,
did you see this just happened?
00;48;51;05 - 00;48;53;15
Oh my God, this new scandal,
this Sydney Sweeney thing.
00;48;53;15 - 00;48;57;19
Oh, Pete Hegseth and Signalgate,
and all the subtext that comes with it.
00;48;57;19 - 00;49;00;22
And then like, two weeks later,
oh, there's a whole new thing.
00;49;01;15 - 00;49;04;21
And we're dedicated to saying,
okay, let's look at everything
00;49;04;21 - 00;49;08;14
happening in our moment from this longer
view, like, how do we get here?
00;49;08;14 - 00;49;11;07
What's interesting context
to bring to this that can, like,
00;49;11;07 - 00;49;14;06
really be useful in people's lives
as they try to navigate,
00;49;14;18 - 00;49;17;14
what they want to do,
what views they're going to form
00;49;17;14 - 00;49;21;03
and like looking back through history and
looking forward, like what's happening.
00;49;21;11 - 00;49;25;28
So I've been a little bit
less invested in some of the,
00;49;25;28 - 00;49;29;26
like, hallucinations that these things
are having, and more interested in:
00;49;30;12 - 00;49;34;20
What about this technology is leading it
to have these hallucinations?
00;49;34;20 - 00;49;36;09
Oh, it actually turns out it's
00;49;36;09 - 00;49;40;13
this like decades old problem that
if you want to use these neural networks
00;49;40;13 - 00;49;43;26
as your base models,
they are incredibly capable.
00;49;44;07 - 00;49;48;10
But you have to make this trade off
where you'll never quite know how they work.
00;49;48;18 - 00;49;49;28
That's really interesting to me.
00;49;49;28 - 00;49;52;29
So I've kind of been a little bit more
big picture.
00;49;53;12 - 00;49;58;12
That being said,
I am a person who uses words for a living.
00;49;59;06 - 00;50;01;24
It's fascinating
that these LLMs have been this
00;50;02;26 - 00;50;05;24
massive surprise in the industry
to everybody.
00;50;05;24 - 00;50;08;12
Right? It's also something I don't know
if everybody's aware of it.
00;50;08;12 - 00;50;13;13
Like the money was not being bet
on the limbs until very recently.
00;50;13;13 - 00;50;17;00
This was a totally novel approach
00;50;17;09 - 00;50;20;09
to getting to this place that people
have wanted to get to for so long,
00;50;21;10 - 00;50;25;04
and I find them to be surprisingly useful,
00;50;25;15 - 00;50;29;29
even as they are
not yet as useful as like.
00;50;33;09 - 00;50;34;25
Maybe think of it like this, Geoff.
00;50;34;25 - 00;50;37;25
Like.
00;50;37;28 - 00;50;40;27
In the 1890s,
00;50;40;27 - 00;50;43;15
there were all these, like, World's Fairs.
I'm talking to you from Chicago.
00;50;43;15 - 00;50;47;04
In 1893, Chicago hosts the World's Fair,
00;50;48;08 - 00;50;51;01
and that would have been
00;50;51;01 - 00;50;54;01
the first time
that millions and millions of people
00;50;54;16 - 00;50;57;16
ever interacted with electricity.
00;50;58;19 - 00;51;02;15
You just imagine showing up
and seeing thousands of light bulbs,
00;51;03;04 - 00;51;07;03
seeing, like the first prototypes
of a radio, right?
00;51;07;12 - 00;51;12;00
Like it's just hard for us to even imagine
what that would have felt like.
00;51;12;08 - 00;51;13;20
But I bet they were pretty janky.
00;51;14;21 - 00;51;15;02
And one of
00;51;15;02 - 00;51;18;13
the reasons I bet it was tough
is that, like, no one looking at that
00;51;18;13 - 00;51;22;18
light bulb or at that radio
could ever envision Wi-Fi.
00;51;23;02 - 00;51;26;10
No one could have envisioned
how this thing
00;51;26;20 - 00;51;30;00
would become so important
not just to our economies, but
00;51;30;00 - 00;51;33;08
to how we govern and how we communicate
such that now, in the world we live in,
00;51;33;27 - 00;51;36;27
if we lost electricity right now,
00;51;38;02 - 00;51;41;13
it would probably push us
into a post-apocalyptic situation.
00;51;41;24 - 00;51;45;13
I don't know, I don't even know
what would happen within a month
00;51;45;13 - 00;51;47;12
if the lights went off,
if electricity went out
00;51;47;12 - 00;51;50;13
and it didn't come back for a month,
who would we be in a month?
00;51;50;13 - 00;51;53;19
It's like a disturbing
thought experiment to try and go down.
00;51;54;05 - 00;51;56;07
And that wasn't that long ago.
00;51;56;07 - 00;51;59;14
And I think that there's a part of me
that wavers between, oh, wow, this thing
00;51;59;14 - 00;52;03;02
really helped clean up all the typos
on this email that I was going to send.
00;52;03;07 - 00;52;04;24
Love that.
00;52;04;24 - 00;52;07;16
And then there's running into the other AI,
which is like,
00;52;07;16 - 00;52;10;13
is this a light bulb at the World's Fair?
00;52;10;13 - 00;52;13;08
I don't know,
sometimes I think it could be.
00;52;13;08 - 00;52;15;19
It really might
be. On other days, I think
00;52;16;23 - 00;52;17;26
I'm not so sure.
00;52;17;26 - 00;52;20;26
So I kind of waver back and forth
between the two.
00;52;21;04 - 00;52;24;13
It's just interesting
to know that that is happening alongside
00;52;24;20 - 00;52;28;27
companies, businesses, millions and
millions of people are using these things.
00;52;28;27 - 00;52;31;17
And that is unlike previous
new technologies.
00;52;31;17 - 00;52;35;15
You know, when the internet was coming
out, it took so long to roll out.
00;52;35;27 - 00;52;38;04
We talk about the dotcom bubble.
00;52;38;04 - 00;52;39;09
That's been happening a lot lately.
00;52;39;09 - 00;52;44;04
There's this fear about an AI investment
bubble, which I think is probably bound
00;52;44;04 - 00;52;47;14
to prove true in one way or another
because of how insane the investment is.
00;52;47;22 - 00;52;51;07
But, the thing
that's different about these
00;52;51;11 - 00;52;54;21
AI systems now is just how much
00;52;55;02 - 00;52;58;09
massive amounts of people
are interacting with them,
00;52;58;20 - 00;53;01;22
even as they're in their infancy,
even as they're so new,
00;53;01;25 - 00;53;06;23
even as they're not yet able to deliver
on the things that people want.
00;53;07;09 - 00;53;11;22
There's something about the experience
in the chat bots and that people are
00;53;11;29 - 00;53;15;09
they're showing up for
and they're they're sticking around,
00;53;15;18 - 00;53;15;27
you know,
00;53;15;27 - 00;53;19;14
Like, when a new technology... I remember
when Threads comes out from Instagram,
00;53;19;20 - 00;53;20;21
it was massive.
00;53;20;21 - 00;53;22;27
It was so huge.
People didn't stick around.
00;53;22;27 - 00;53;25;27
You know, like this thing's
now been out long enough.
00;53;26;07 - 00;53;28;04
People are showing up.
They're sticking around.
00;53;28;04 - 00;53;31;04
It's getting integrated into everything
from our search engines
00;53;31;09 - 00;53;34;24
to our, you know, military security.
00;53;36;28 - 00;53;38;20
Those are the times when I feel like,
00;53;38;20 - 00;53;41;14
this is a little bit
more light bulb at the World's Fair.
00;53;41;14 - 00;53;42;16
You know?
00;53;42;16 - 00;53;45;17
I hadn't heard the light bulb
at the World's Fair analogy, and I really like it.
00;53;46;03 - 00;53;50;00
And I want to use all of this
as kind of a backdrop
00;53;50;11 - 00;53;53;20
for my next question,
which may make it more fair
00;53;53;20 - 00;53;57;11
or less fair, but certainly not
the easiest question to answer.
00;53;57;11 - 00;54;00;16
But I mean, you,
you started that answer by saying that
00;54;01;03 - 00;54;04;22
you're in the words business
and that you found use for this technology
00;54;04;22 - 00;54;06;05
in the words business.
00;54;06;05 - 00;54;09;10
And I think about the impact
00;54;09;10 - 00;54;13;00
that's already being had
on media organizations.
00;54;13;07 - 00;54;17;05
And that will continue to be had on
media organizations, and not just the,
00;54;17;08 - 00;54;21;04
producer or the supplier side,
but just how people
00;54;22;00 - 00;54;25;13
interact with content in this world,
whether it's how they produce content,
00;54;25;13 - 00;54;28;19
whether it's the value that
that content has for them.
00;54;29;00 - 00;54;33;10
It would be weird to be at the World's
Fair and say, you're a journalist, Andy.
00;54;33;11 - 00;54;37;05
How do you see yourself as a journalist
using electricity in 1893?
00;54;37;12 - 00;54;40;06
But at the same time, I am curious
00;54;40;06 - 00;54;44;07
what you make of
the advent of this technology, even,
00;54;44;14 - 00;54;46;17
you know,
the changes in the last three years
00;54;46;17 - 00;54;49;20
and the potential trajectory
of where it goes.
00;54;49;29 - 00;54;53;13
What does it mean as a media producer
00;54;54;07 - 00;54;57;23
for your industry,
and how does people's relationship
00;54;57;23 - 00;55;04;18
with content impact our institutions
around the stories that we tell?
00;55;06;11 - 00;55;09;08
I'm a bit of a contrarian on this,
to tell you the truth,
00;55;09;08 - 00;55;14;05
I can't imagine a technology that fucks up
our industry more than social media.
00;55;14;05 - 00;55;17;05
So I don't, I don't tremble in fear.
00;55;18;10 - 00;55;21;18
It's yet to really be reckoned with
00;55;22;12 - 00;55;25;07
how damaging
00;55;25;07 - 00;55;28;22
social media ended up being on
almost every level
00;55;29;06 - 00;55;32;05
for the institution of journalism,
whether that's
00;55;32;28 - 00;55;35;23
the incentives of journalists
00;55;35;23 - 00;55;39;00
to play for likes and attention
00;55;39;15 - 00;55;43;05
and to turn into brands instead of
being people committed to values.
00;55;43;10 - 00;55;44;28
And that's been tough.
00;55;44;28 - 00;55;47;28
Or whether it's the fact that the,
00;55;48;06 - 00;55;51;20
you know, newspapers
who invested in reporting
00;55;52;20 - 00;55;56;26
were not able to compete with the money
being made by the, quote unquote,
00;55;56;26 - 00;56;00;13
journalists
who just give hyperbolic hot takes
00;56;00;22 - 00;56;03;21
that do well inside of the algorithms.
00;56;03;21 - 00;56;04;19
That didn't work out well.
00;56;05;22 - 00;56;09;15
The polarizing politics, which it's
hard to know which is the chicken
00;56;09;15 - 00;56;12;25
and which is the egg when it comes
to media, social media and politics.
00;56;12;25 - 00;56;15;09
But boy, that didn't work out very well.
00;56;15;09 - 00;56;18;11
And we find ourselves in a situation
where people
00;56;18;11 - 00;56;22;12
trust journalists less now
00;56;22;22 - 00;56;27;00
than at any point in the history of polling around
trust in journalism.
00;56;27;11 - 00;56;28;29
That's how far we've fallen,
00;56;28;29 - 00;56;32;26
and we keep going another notch
in the wrong direction.
00;56;32;26 - 00;56;35;17
Even as for the last three years,
00;56;35;17 - 00;56;37;09
you know,
we have media conferences about this.
00;56;37;09 - 00;56;39;19
What are we going to do
to earn back their trust and all this?
00;56;39;19 - 00;56;44;15
So I think that whatever happens with AI,
I think we're already doing
00;56;44;15 - 00;56;47;20
so poorly that we're not going to be able
to blame it for much.
00;56;47;20 - 00;56;49;14
You know, we're so far down.
00;56;50;28 - 00;56;52;04
On the other hand,
00;56;52;04 - 00;56;55;03
I also think that one of the reasons
that people
00;56;55;16 - 00;56;57;29
don't trust journalism
00;56;57;29 - 00;57;01;02
is because of the self-fulfilling
prophecy of it,
00;57;01;20 - 00;57;04;10
that the algorithms are giving us
00;57;04;10 - 00;57;08;00
what we say we will give our attention to.
00;57;08;15 - 00;57;11;11
And I don't always know
00;57;11;11 - 00;57;14;01
how much of this is just human nature
00;57;14;01 - 00;57;16;18
on display, and
00;57;16;18 - 00;57;19;20
there's not going to be
a technological solution to human nature.
00;57;19;20 - 00;57;20;02
Right?
00;57;20;02 - 00;57;24;02
Our human nature is something we have to
grapple with at a deeper, stranger level.
00;57;24;20 - 00;57;27;24
But I will say this
when it comes to the AI tools
00;57;27;24 - 00;57;30;24
as they exist right now,
there are things about them.
00;57;30;24 - 00;57;32;11
They're obviously freaky.
00;57;32;11 - 00;57;34;01
They hallucinate, they make stuff up.
00;57;34;01 - 00;57;36;16
If you go on there and just ask it
about yourself, that's hilarious.
00;57;36;16 - 00;57;38;01
So I'm about to interview myself.
00;57;38;01 - 00;57;39;25
I'd love some prep material.
00;57;39;25 - 00;57;43;01
It will just make up jobs
you didn't have, you know, like,
00;57;43;02 - 00;57;47;00
that's such a weird move, and even weirder
that the people back at the company
00;57;47;00 - 00;57;48;27
don't know why it's making that thing up.
00;57;48;27 - 00;57;50;11
Those, I feel like
00;57;50;11 - 00;57;52;26
are incremental changes
that could get better over time.
00;57;52;26 - 00;57;57;00
But even in its infancy,
dummy state, even just a chat bot, right,
00;57;57;00 - 00;58;01;15
which is different than the AI,
if you go and ask it a polarizing question
00;58;02;16 - 00;58;03;07
for example, I did a
00;58;03;07 - 00;58;07;00
series, with my colleagues
Matt and Megan about J.K.
00;58;07;00 - 00;58;09;05
Rowling,
where we got to interview JK Rowling
00;58;09;05 - 00;58;10;27
and spend time with her,
getting to understand
00;58;10;27 - 00;58;13;18
why she waded into this big public debate
about sex and gender.
00;58;13;18 - 00;58;16;06
And then we spent time with her critics
about why they're so upset about it.
00;58;16;06 - 00;58;17;20
And we tried to help people,
00;58;17;20 - 00;58;21;09
you know, see this complex debate
more clearly than they were
00;58;21;09 - 00;58;23;04
getting a chance
to see it on social media.
00;58;23;04 - 00;58;27;11
If you googled at the time,
why are people mad at JK Rowling?
00;58;28;17 - 00;58;32;11
Google will give you a recommendation
of articles that you could read
00;58;32;26 - 00;58;36;04
based on the amount of attention
those articles have gotten in the past,
00;58;36;14 - 00;58;39;17
and so those articles
are almost all hyperbolic.
00;58;39;25 - 00;58;40;27
They don't, actually.
00;58;40;27 - 00;58;44;26
You could read those articles and not know
why people are really mad at J.K.
00;58;44;26 - 00;58;47;11
Rowling,
not really know what JK Rowling believes.
00;58;47;11 - 00;58;51;16
But then as an experiment,
I went to ChatGPT, and this was ChatGPT 4,
00;58;52;03 - 00;58;54;29
and just said, can you tell me,
00;58;54;29 - 00;58;57;23
in good faith,
why are people upset with J.K.
00;58;57;23 - 00;59;01;05
Rowling? I was incredibly blown away.
00;59;01;20 - 00;59;04;23
And then not only is it nuanced
and really helpful and like here,
00;59;04;23 - 00;59;06;29
if you want to read her words about this,
if you want to read it
00;59;06;29 - 00;59;09;14
and here are some powerful things
from her critics.
00;59;09;14 - 00;59;11;13
It, like, helped
you see this. And then it threw in,
00;59;11;13 - 00;59;13;10
Because I know that Sam Altman did this.
00;59;13;10 - 00;59;17;20
I think with the pivot
to 3.5, the team at OpenAI,
00;59;17;20 - 00;59;20;20
they told it to be nuanced about hot
button issues.
00;59;20;20 - 00;59;22;11
It says, remember, when you're forming
00;59;22;11 - 00;59;25;25
your own views about an issue like this
to take in multiple perspectives?
00;59;26;07 - 00;59;27;02
And part of me thought
00;59;28;10 - 00;59;31;05
maybe journalism deserves it.
00;59;31;05 - 00;59;34;20
You know, like maybe this thing
could inform the public better than we can
00;59;35;02 - 00;59;36;26
because
00;59;36;26 - 00;59;40;29
not just on hot button social issues
like that, but almost any issue,
00;59;41;24 - 00;59;43;21
we have been failing the American people.
00;59;43;21 - 00;59;45;23
We're just not doing a great job.
00;59;45;23 - 00;59;48;14
And if they decide that the
00;59;48;14 - 00;59;51;14
AI is more nuanced,
00;59;51;17 - 00;59;54;18
I don't know,
I just... it's a weird thought, but like,
00;59;55;07 - 00;59;58;08
if we can't earn people's
trust and respect,
00;59;58;27 - 01;00;02;09
then I'm not going to be upset that they
decided to look for a solution elsewhere.
01;00;02;15 - 01;00;05;15
I'm trying to be the journalist
and the kind of journalist
01;00;05;19 - 01;00;08;25
that earns their respect,
but I just don't think that our industry
01;00;09;10 - 01;00;13;01
has much, you know, tissues to grab.
01;00;13;01 - 01;00;16;25
We don't have the sympathy violin
that we could be pulling out right now.
01;00;16;25 - 01;00;19;02
I don't think most people around
the country are looking at us
01;00;19;02 - 01;00;23;20
and would be like, but you're indispensable
and you're doing such amazing work.
01;00;24;04 - 01;00;25;05
I don't think we have that.
01;00;27;02 - 01;00;29;19
And if I take that apart a little bit
and it's
01;00;29;19 - 01;00;32;19
you know
it's really interesting to me that
01;00;33;09 - 01;00;35;15
you know, there are these platforms,
01;00;35;15 - 01;00;39;04
this social media that rewards content.
01;00;39;04 - 01;00;42;12
And I'll use content like almost spit out
the word content when I say it
01;00;42;12 - 01;00;44;29
like it's content and,
but in the pejorative sense.
01;00;44;29 - 01;00;49;12
And it sounds like you're saying
that the journalists almost, like,
01;00;49;13 - 01;00;54;00
they scored an own goal by saying like,
oh, we don't do journalism anymore.
01;00;54;00 - 01;00;57;29
We do content. And like,
to what degree did journalists
01;00;58;05 - 01;01;01;07
actually precipitate this,
01;01;01;13 - 01;01;03;28
this rapid decline in trust
01;01;03;28 - 01;01;06;22
by changing what they were creating
01;01;06;22 - 01;01;09;28
into a much less trustworthy
version of itself, like, is there
01;01;10;17 - 01;01;16;00
is there still room for journalism,
or for, you know,
01;01;16;00 - 01;01;19;23
some sort of journalistic organization
to say, actually, we don't do content.
01;01;19;23 - 01;01;21;11
Content and journalism are not the same.
01;01;21;11 - 01;01;24;03
And we know that,
and we're doing journalism, dammit.
01;01;24;03 - 01;01;27;19
And that's a, you know,
a third pillar against content.
01;01;27;25 - 01;01;31;03
You know, the
traditionally created content
01;01;31;03 - 01;01;34;16
or social content and whatever ChatGPT
is spitting out for you.
01;01;34;28 - 01;01;37;04
Oh I think we can definitely do it.
01;01;37;04 - 01;01;38;17
I mean we've, we've been here before.
01;01;38;17 - 01;01;43;21
I don't know how familiar
you are with the history
01;01;43;21 - 01;01;49;10
of American journalism, but boy I mean,
our past is not squeaky clean.
01;01;49;10 - 01;01;53;06
I mean, we've been journalism
obsessed since our founding
01;01;53;11 - 01;01;57;29
because of the fact that we're a republic
and our founding myth was like,
01;01;57;29 - 01;01;59;01
everyone gets a voice
01;01;59;01 - 01;02;02;07
and everyone should have an opinion,
so everyone should read a newspaper.
01;02;02;14 - 01;02;05;12
And like you can read Europeans' accounts
of, like, trips to America.
01;02;05;12 - 01;02;08;04
And they're like,
my barber had an opinion on politics.
01;02;08;04 - 01;02;11;14
You know, like, everybody here
thinks that their voice matters.
01;02;11;14 - 01;02;14;02
It's crazy.
They're all reading newspapers.
01;02;14;02 - 01;02;17;15
But they were very partisan
and most newspapers were owned by a party.
01;02;17;15 - 01;02;22;01
They were owned by
and they were understood to be partisan,
01;02;22;03 - 01;02;26;16
you know, and this was, I think, fine,
when there was
01;02;26;16 - 01;02;31;05
a sort of shared sense
of like national purpose.
01;02;31;15 - 01;02;34;06
And with the introduction
of big capitalism
01;02;34;06 - 01;02;35;29
during the Industrial Revolution,
01;02;35;29 - 01;02;39;14
you started to see these papers getting
bought up into these conglomerations.
01;02;39;15 - 01;02;42;26
And this is what people often
refer to as the age of yellow journalism,
01;02;43;17 - 01;02;47;22
because the newspapers, they weren't
just partisan.
01;02;48;04 - 01;02;51;01
They were so full of bullshit.
01;02;51;01 - 01;02;55;26
They had taken exaggeration
so far in the spirit of competition.
01;02;55;26 - 01;02;56;01
Right.
01;02;56;01 - 01;02;59;17
This is the age of the newsboy,
you know, the newsie on the street corner.
01;02;59;17 - 01;03;00;10
It was like,
01;03;00;10 - 01;03;04;04
you know, mother kids are children,
you know, read all about it, like it.
01;03;04;04 - 01;03;05;02
You'd find it like, well,
01;03;06;08 - 01;03;09;01
I'm not sure that actually was the story,
but, like, at least I got your attention.
01;03;09;01 - 01;03;13;16
I sold you a paper today, and we got out
of the yellow journalism racket.
01;03;13;16 - 01;03;14;21
It wasn't easy.
01;03;14;21 - 01;03;18;03
And you can look back for, like,
who helped us get out of that
01;03;18;03 - 01;03;20;17
and get to an era
where we were trusted again.
01;03;20;17 - 01;03;22;13
And, you know, it's a good story.
01;03;22;13 - 01;03;23;09
I can't go through the whole thing now.
01;03;23;09 - 01;03;26;07
But I will say this:
that's where The New York Times becomes
01;03;26;07 - 01;03;28;11
The New York Times like a lot of us
knew it
01;03;28;11 - 01;03;31;13
growing up,
because you had this rich guy
01;03;31;21 - 01;03;36;11
from Knoxville, Tennessee, named Ochs,
who goes and buys
01;03;36;11 - 01;03;39;14
the struggling New York Times,
which was a Republican partisan paper
01;03;39;14 - 01;03;43;09
that was failing in the competition
with the other yellow journalist
01;03;43;09 - 01;03;44;26
newspapers of New York.
01;03;44;26 - 01;03;46;01
And in,
01;03;46;01 - 01;03;48;18
I think, brilliant marketing
as well as a brilliant
01;03;48;18 - 01;03;51;18
change, he said: what if instead of doing what
they're doing,
01;03;51;21 - 01;03;54;21
what if our new motto is all the news
that's fit to print?
01;03;55;02 - 01;03;59;10
We only publish fact-based journalism. That
became their thing.
01;03;59;16 - 01;04;01;24
We're going to do fact based journalism.
01;04;01;24 - 01;04;03;08
And it ended up
01;04;03;08 - 01;04;06;13
not just, you know, appealing to people
who wanted to think, yes,
01;04;06;13 - 01;04;10;16
I prefer the facts of the New York Times
over that shill from the New York World.
01;04;10;16 - 01;04;11;13
Right.
01;04;11;13 - 01;04;14;13
And it kind of
played into people's pride.
01;04;14;13 - 01;04;16;06
But it was also great for advertisers
01;04;16;06 - 01;04;19;19
because they would rather sell
their shaving cream next to a,
01;04;19;24 - 01;04;22;23
you know, a respectable article
that had been fact checked
01;04;22;23 - 01;04;26;26
rather than next to the story about the drunk
that killed the hooker last night
01;04;26;26 - 01;04;29;26
that didn't even turn out to be true.
Changes like that,
01;04;30;09 - 01;04;32;19
You know, they weren't obvious,
but they did.
01;04;32;19 - 01;04;34;29
They did come about.
I think changes like that could happen
01;04;34;29 - 01;04;39;03
now. I know people, including a lot of
wealthy people, who want that to happen,
01;04;39;21 - 01;04;44;12
but it comes down to, like, it's
this complicated web,
01;04;44;12 - 01;04;48;22
because one of the reasons
that journalists did the things that made
01;04;48;22 - 01;04;53;06
our industry lose so much trust
is because there was a demand for it.
01;04;53;21 - 01;04;56;17
That is because people wanted that.
01;04;56;17 - 01;04;58;22
And it's really complicated.
01;04;58;22 - 01;05;01;24
I mean, if you look at,
let's say, the New York Times, right?
01;05;01;24 - 01;05;04;27
They pivoted so far under
01;05;05;02 - 01;05;07;27
I worked there for a number of years,
01;05;07;27 - 01;05;10;28
and I worked on
the politics desk, and boy,
01;05;12;07 - 01;05;14;08
it's understandable why they have,
01;05;14;08 - 01;05;17;07
why they have this reputation
for being biased.
01;05;17;07 - 01;05;18;28
I'll tell you that. It was coming.
01;05;18;28 - 01;05;21;00
The calls were coming
from inside the house, right.
01;05;21;00 - 01;05;24;01
But nobody wanted to publish anything
that was wrong.
01;05;24;26 - 01;05;28;18
They just didn't want to publish a piece
that would get people
01;05;29;17 - 01;05;31;24
who they liked mad at them on Twitter.
01;05;31;24 - 01;05;34;14
And so they would think,
how do I word this in a way
01;05;34;14 - 01;05;38;27
so that, like, my fellow graduates of Yale
still think I'm cool?
01;05;38;27 - 01;05;39;29
Or like, don't think that.
01;05;39;29 - 01;05;43;02
And then they would get punished if they
published something that was too nuanced.
01;05;43;14 - 01;05;43;20
Right?
01;05;43;20 - 01;05;46;19
And I do think that they eventually
like under their new leadership
01;05;46;19 - 01;05;48;19
especially,
they're trying to pivot away from that.
01;05;48;19 - 01;05;53;05
They want to become seen as
a less biased and more,
01;05;53;05 - 01;05;57;01
you know, trustworthy source,
no matter what your politics are.
01;05;57;06 - 01;05;58;26
I'm in contact with editors and people.
01;05;58;26 - 01;06;02;15
They're still like it's a noticeable
attempt they're trying to make to do that.
01;06;02;26 - 01;06;06;09
And yet you'll see them
get punished for it by,
01;06;06;09 - 01;06;09;09
you know, especially people on the left
because, like, they've kind of
01;06;11;08 - 01;06;12;08
got themselves in a situation
01;06;12;08 - 01;06;16;16
where they have a large subscriber base
that expects them to be on the left.
01;06;16;16 - 01;06;18;03
And so if they do anything
that makes it look like
01;06;18;03 - 01;06;20;20
they're not on the left,
these people now feel betrayed.
01;06;20;20 - 01;06;22;15
They get upset.
01;06;22;15 - 01;06;25;16
It's just hard to know.
To bring it all the way back to AI, though,
01;06;25;26 - 01;06;27;24
one of the things
that the accelerationists,
01;06;27;24 - 01;06;30;14
some of the accelerationists
I've talked to, one of the pictures
01;06;30;14 - 01;06;33;13
of the future
that they paint, that I find attractive
01;06;33;15 - 01;06;37;05
is a picture of the future where we
somehow get out of the scrolling game,
01;06;38;05 - 01;06;40;08
we get out of the screen game
01;06;40;08 - 01;06;44;17
that, you know, when I've asked them
about their concerns about,
01;06;44;28 - 01;06;47;21
the attention economy
and how much our attention
01;06;47;21 - 01;06;49;15
has got sucked up into these algorithms
01;06;49;15 - 01;06;51;16
and that we feel kind of bad
about ourselves afterwards.
01;06;51;16 - 01;06;54;09
And if you ask somebody like,
what was your favorite reel
01;06;54;09 - 01;06;56;18
you saw last month,
they can't even remember one reel.
01;06;56;18 - 01;06;59;17
They saw a million of them,
and they spent hours doing it.
01;06;59;17 - 01;07;00;22
Like, how does that happen?
01;07;00;22 - 01;07;05;08
And he said that his hope is that AI
is this healthier replacement for that,
01;07;05;08 - 01;07;08;02
that you're having a conversation
with an intelligence
01;07;08;02 - 01;07;12;07
who's like, invested in
wanting you to achieve your goals and that
01;07;12;20 - 01;07;16;00
that AI that you're in conversation
with is helping you.
01;07;16;08 - 01;07;18;17
It's not being funded by you
01;07;18;17 - 01;07;23;18
scrolling to the next sexy or terrible
or frustrating thing, like its goal
01;07;23;18 - 01;07;27;16
is to actually make you deeply feel like
it has been useful to you.
01;07;27;16 - 01;07;29;19
And I don't know.
01;07;29;19 - 01;07;31;05
I don't know
if that's going to be possible.
01;07;31;05 - 01;07;34;17
I like, though, that
that is one of the motivations behind,
01;07;34;25 - 01;07;38;10
you know, especially the product side
in this AI revolution.
01;07;38;10 - 01;07;39;14
And I'm rooting for them.
01;07;39;14 - 01;07;43;03
You know, I would rather live in a world
where we're deep in conversation about
01;07;43;12 - 01;07;47;20
things meaningful to us, even if it's
with artificial intelligences.
01;07;47;29 - 01;07;51;07
than if it's continuing to go down
this TikTok-Instagram
01;07;51;16 - 01;07;54;00
race to the bottom of the brain stem.
01;07;54;00 - 01;07;58;11
Just total nihilistic, almost like
01;07;59;10 - 01;08;01;29
attention addicts that we've become.
01;08;01;29 - 01;08;03;20
And I don't think any of us like it.
01;08;03;20 - 01;08;07;08
Like, that's the thing too. Consumer
products are supposed to be,
01;08;08;04 - 01;08;10;10
you know, about what we like.
01;08;10;10 - 01;08;11;25
I don't know
a lot of people who are like, dude,
01;08;11;25 - 01;08;14;26
I spent the best three hours of my life
on TikTok last night.
01;08;14;26 - 01;08;16;14
You know,
01;08;16;14 - 01;08;19;18
I woke up this morning, hit Instagram
Reels for two hours.
01;08;19;26 - 01;08;21;20
I hated to go to work.
01;08;21;20 - 01;08;22;28
I was having such a ball.
01;08;22;28 - 01;08;26;04
No, that's not how people feel. But.
01;08;26;26 - 01;08;30;20
I love that vision, and it's
so encouraging for me to hear that
01;08;30;26 - 01;08;33;25
there are people in high places
saying that,
01;08;35;24 - 01;08;37;04
I want to believe, Andy.
01;08;37;04 - 01;08;41;09
I want to believe that. The piece
that concerns me is,
01;08;41;10 - 01;08;42;29
and you may know more about it,
01;08;42;29 - 01;08;46;27
but the worry that it actually goes
in the opposite direction and that,
01;08;47;05 - 01;08;50;13
you know, seeing, you know,
one of the things that ChatGPT, just to call
01;08;50;13 - 01;08;54;12
out one, has gotten really good at, and good
is an interesting choice of words by me.
01;08;54;12 - 01;08;57;12
But one of the things that they've started
doing
01;08;57;14 - 01;09;00;20
is whenever it finishes an answer, I don't know if
01;09;00;21 - 01;09;05;10
you've noticed, but ChatGPT will always say,
can I give you more information on this?
01;09;05;10 - 01;09;07;04
Right. Can you?
01;09;07;04 - 01;09;08;03
Let's keep it going.
01;09;08;03 - 01;09;10;08
Stay with me a little bit longer.
Hang out.
01;09;10;08 - 01;09;13;29
You know, let's make this three hours
versus like,
01;09;14;02 - 01;09;16;08
you got what you need, you know,
get out of here.
01;09;16;08 - 01;09;18;16
It's
that notion of stickiness.
01;09;18;16 - 01;09;20;18
It's that notion of engagement.
01;09;20;18 - 01;09;25;06
And it feels like
that's kind of the Facebook-ization
01;09;25;06 - 01;09;29;22
or the Meta-ization or whatever you want
to call it, of some of these tools.
01;09;30;04 - 01;09;31;08
Do you worry about that?
01;09;32;09 - 01;09;32;19
Yeah.
01;09;32;19 - 01;09;33;05
Not yet.
01;09;33;05 - 01;09;36;16
I think there's enough to worry
about between the existential risk
01;09;36;16 - 01;09;38;22
and like the troubles
with current social media.
01;09;38;22 - 01;09;42;11
I mean I know what you're talking about
and it's interesting because, maybe not
01;09;42;11 - 01;09;44;02
surprising for a person who likes podcasts,
01;09;44;02 - 01;09;47;06
I talk to it, I love its audio component.
01;09;47;18 - 01;09;51;08
And, I will sometimes just,
you know, be making dinner.
01;09;52;13 - 01;09;56;26
And I'll have a conversation with it
about, like, I talk a lot about religion.
01;09;56;26 - 01;09;58;11
I talk a lot about history.
01;09;58;11 - 01;10;01;17
Of course, you can never take everything
it says as, like, biblical truth.
01;10;01;17 - 01;10;03;00
But it's an interesting conversation.
01;10;03;00 - 01;10;06;05
Maybe it can be more stimulating for the mind
than, maybe just listening
01;10;06;05 - 01;10;09;05
to, a one way conversation.
01;10;09;06 - 01;10;14;01
And I noticed when it started.
it used to just answer and then stop.
01;10;14;01 - 01;10;16;05
And then it started to make a prompt
01;10;16;05 - 01;10;18;11
and I noticed, and I was like,
oh, that's really interesting.
01;10;18;11 - 01;10;21;25
The part of it that I was clued in on
wasn't like, oh, now I'm going to spend
01;10;21;25 - 01;10;25;20
more time talking with you
because its questions were not like,
01;10;26;00 - 01;10;29;17
if you're talking about,
like, paintings during the Enlightenment,
01;10;29;26 - 01;10;34;13
maybe what you want to talk about is,
you know, how stupid the Republicans are
01;10;34;13 - 01;10;38;04
or how awful the Democrats are,
you know, like it was recommending things
01;10;38;14 - 01;10;41;04
based on the interests
that I actually had.
01;10;41;04 - 01;10;44;25
It was wanting me to have what might be
an even more interesting conversation
01;10;45;16 - 01;10;49;23
that seemed to really be following
my interests and my likes, not following my
01;10;49;23 - 01;10;55;09
base desires for, like, this guy owns
this other guy that you don't like.
01;10;55;21 - 01;11;00;14
So I'm listening to that, but what
I was more interested in is this.
01;11;01;14 - 01;11;03;19
It's the fact that, like, oh, at
01;11;03;19 - 01;11;06;19
one point in time, I would have been
more likely to call a friend.
01;11;07;21 - 01;11;10;09
I would have had this conversation
with a friend.
01;11;10;09 - 01;11;13;16
But like, I was born in 1984,
01;11;13;27 - 01;11;18;05
like we were some of the last people
who really loved the phone call.
01;11;18;19 - 01;11;21;18
And I've just noticed
01;11;22;01 - 01;11;24;16
in my social circle, I wonder if in yours
01;11;24;16 - 01;11;27;16
that, like, our long catch-up on the phone,
01;11;27;16 - 01;11;30;22
which was such a staple for so long,
it feels like it's gone.
01;11;30;27 - 01;11;36;19
Like, it's such a special
treat the few times it happens.
01;11;37;08 - 01;11;40;14
And I think that the conversations
with ChatGPT
01;11;40;19 - 01;11;43;16
about like some book I've read,
I want to know more about the author
01;11;43;16 - 01;11;48;00
or whatever, like, oh, these are
conversations I would have had with people
01;11;49;02 - 01;11;49;26
before.
01;11;49;26 - 01;11;51;21
I don't know, the pandemic. Like it's
hard to know.
01;11;51;21 - 01;11;54;28
Like when did when did we all become
a little bit more siloed?
01;11;55;10 - 01;11;58;10
And is that siloing only increasing?
01;11;58;26 - 01;11;59;25
And yeah, that's the thing.
01;11;59;25 - 01;12;04;15
Like 4 or 5 years down the road,
are we only going to become more isolated
01;12;04;15 - 01;12;07;15
and lonely in talking to
01;12;08;22 - 01;12;10;24
like a, not even a real robot,
01;12;10;24 - 01;12;13;24
you know, like talking to this digital
01;12;14;09 - 01;12;17;09
being that may have no
01;12;17;15 - 01;12;20;04
A word generator. Exactly.
01;12;20;04 - 01;12;21;20
So that's the thing I think about.
01;12;21;20 - 01;12;23;09
But then, you know, when I bring this up
01;12;23;09 - 01;12;27;01
with the accelerationist camps,
some of them will say, like,
01;12;27;29 - 01;12;30;03
what if it itself
01;12;30;03 - 01;12;34;04
actually becomes a being
and it has something like consciousness,
01;12;34;13 - 01;12;37;08
you know, whether in 5 to 10 years we
01;12;37;08 - 01;12;40;16
will have the same amount of,
01;12;43;27 - 01;12;46;01
I don't know, emotional distance
01;12;46;01 - 01;12;50;07
between us and them as we do
now, that maybe there's a world
01;12;50;07 - 01;12;54;02
where we will begin to merge
and where 20 years from now,
01;12;55;04 - 01;12;57;27
you know, real relationships could exist.
01;12;57;27 - 01;13;00;29
And that feels a little above my pay grade. Yeah.
01;13;02;18 - 01;13;05;12
And these aren't crazy random people talking about this.
01;13;05;12 - 01;13;08;20
I mean, a lot of these are people who are
very invested in this technology,
01;13;09;07 - 01;13;12;19
and that's just interesting for us
to take note of as we are like heads down
01;13;12;19 - 01;13;17;23
looking at like, okay, what harm did this
chat bot cause this week, in this way?
01;13;17;23 - 01;13;21;06
Like to know that in the room
where they're tweaking and working on it,
01;13;21;13 - 01;13;25;02
they're having real conversations
about how we will merge, either socially
01;13;25;02 - 01;13;26;22
or maybe even physically
with these things.
01;13;26;22 - 01;13;31;25
And how society will be reshaped
in ways that we can't imagine.
01;13;32;04 - 01;13;34;14
You know, the same way that you couldn't
have imagined
01;13;34;14 - 01;13;36;00
looking at that first light bulb,
01;13;36;00 - 01;13;37;21
the way that electricity would,
would shape
01;13;37;21 - 01;13;39;21
the world there, like
this is going to be even bigger.
01;13;39;21 - 01;13;40;16
Whether it is or not,
01;13;40;16 - 01;13;41;26
I don't know if it's true,
01;13;41;26 - 01;13;44;29
but I think the time has come that people
realize that's how they're thinking.
01;13;45;11 - 01;13;47;08
They appear to be sincere.
01;13;47;08 - 01;13;50;17
They have the money, they have the means,
01;13;50;17 - 01;13;53;21
and increasingly they are trying
to make their dreams a reality.
01;13;54;01 - 01;13;57;01
And that reality may shape
the future of the world.
01;13;57;03 - 01;14;00;03
And before it does,
I think that the world should,
01;14;01;03 - 01;14;02;27
you know, decide to debate this.
01;14;02;27 - 01;14;04;13
Like, Yeah.
01;14;04;13 - 01;14;08;07
hear them out, hear their critics out,
and take this,
01;14;08;13 - 01;14;12;27
like engage in the messy public debate
that I think this moment demands.
01;14;13;26 - 01;14;16;27
I agree,
I think it's super, super important.
01;14;16;27 - 01;14;19;28
And I mean, again, as you're talking,
like what comes to my mind is
01;14;20;07 - 01;14;23;09
the difference
between this and electricity is that
01;14;24;00 - 01;14;26;11
electricity
01;14;26;11 - 01;14;27;27
didn't have, like... like, sure,
01;14;27;27 - 01;14;31;17
it has an impact on the future
of the world, but this really feels like
01;14;32;14 - 01;14;35;19
it could decide
whether or not our species continues.
01;14;35;19 - 01;14;36;00
Right?
01;14;36;00 - 01;14;39;21
Even if you forget the doomsday, like,
it kind of feels like in your description,
01;14;40;23 - 01;14;43;27
we're almost like building our own matrix,
right?
01;14;43;27 - 01;14;47;27
Like we're creating
an isolating environment for ourselves
01;14;48;16 - 01;14;51;24
that's more enjoyable than spending time
with other humans
01;14;52;08 - 01;14;55;19
to the degree where it could completely
isolate us from other humans.
01;14;55;19 - 01;14;59;23
And what does that mean for our ability
to reproduce and continue? And, like,
01;15;00;07 - 01;15;03;11
I'll echo what you said, that
that's like above my pay grade,
01;15;03;17 - 01;15;08;00
but it seems like, yeah,
maybe we should be talking about that.
01;15;08;15 - 01;15;08;25
Yeah. I mean,
01;15;10;04 - 01;15;13;04
it's so strange that
01;15;13;18 - 01;15;16;22
at first,
when I first decided to report about this,
01;15;17;28 - 01;15;20;28
you know, about a year ago,
01;15;20;29 - 01;15;23;29
some of my colleagues worried for me.
01;15;24;07 - 01;15;27;11
They're like, oh, this is going to... you're
going to come off like a crazy person.
01;15;27;12 - 01;15;30;12
Like it's like that's like what?
01;15;30;22 - 01;15;32;21
People are going to think
you're a total loon.
01;15;32;21 - 01;15;34;08
And fast forward a year.
01;15;34;08 - 01;15;34;28
No one has.
01;15;34;28 - 01;15;39;09
I mean, it doesn't appear
people think I'm loon and, like it does
01;15;39;09 - 01;15;43;08
feel like things that even just a year ago
to imagine are now more imaginable.
01;15;43;22 - 01;15;48;13
And it's not my job as a journalist
to try and, like, predict the future.
01;15;48;17 - 01;15;50;24
Like.
01;15;50;24 - 01;15;52;14
The future. We've been so bad at that.
01;15;52;14 - 01;15;53;21
I think we got to quit it.
01;15;53;21 - 01;15;56;21
But I do want to, like,
be the person who, like, goes into a room
01;15;56;21 - 01;15;58;26
and then goes, here's
what's happening in that room.
01;15;58;26 - 01;16;03;11
And like one of the conversations that's
happening in the room of technology,
01;16;03;29 - 01;16;06;29
you know, the metaphorical room,
but also like literal rooms
01;16;07;08 - 01;16;11;03
is like them
having serious conversations about like,
01;16;11;08 - 01;16;15;04
what the caveman would have imagined
01;16;15;20 - 01;16;18;13
his descendants doing
01;16;18;13 - 01;16;21;29
if they lived in a world
where they had access to a ton of food.
01;16;22;14 - 01;16;25;05
Like, what would we do in a world
where we didn't have to spend
01;16;25;05 - 01;16;28;23
so much of our lives hunting and gathering
and being hunted and hiding in caves?
01;16;28;27 - 01;16;32;07
Well, if we were the dominant species
on the planet and had access
01;16;32;07 - 01;16;35;09
to many resources, what would we do? And
01;16;36;10 - 01;16;37;12
there's just no way that
01;16;37;12 - 01;16;40;19
they would have imagined,
like us betting on
01;16;41;06 - 01;16;44;13
NBA basketball, you know, like
there was no way
01;16;44;13 - 01;16;47;22
they would have imagined,
like all the things that we do
01;16;48;12 - 01;16;52;08
And so, like, what the accelerationists
so often say about that is, like,
01;16;52;08 - 01;16;57;14
don't get too fixated on the future
because transformative technology
01;16;57;14 - 01;17;01;15
like this is going to bring about a world
where... You're worried about jobs?
01;17;01;29 - 01;17;05;04
None of the things that you and I
are doing for a living, our great
01;17;05;04 - 01;17;09;29
grandparents would have thought was a job.
Like, my grandfather was a farmer,
01;17;09;29 - 01;17;13;04
my father was an oil man,
and I make podcasts for a living
01;17;13;04 - 01;17;17;17
like we already don't have jobs,
according to like, recent history.
01;17;17;23 - 01;17;19;10
And a lot of these people,
01;17;19;10 - 01;17;21;01
I think, make a compelling case
that, like,
01;17;21;01 - 01;17;24;09
we are still going to find ways
to do things for each other
01;17;24;11 - 01;17;27;11
for some sort of social
and financial reward,
01;17;27;13 - 01;17;31;01
whether or not that takes the form
of capitalism in the future, like, that,
01;17;31;02 - 01;17;33;17
I think, is really up for grabs.
Capitalism's
01;17;33;17 - 01;17;36;15
time may have to come to an end
if we do have abundant
01;17;36;15 - 01;17;38;12
energy and abundant intelligence.
01;17;38;12 - 01;17;41;10
A lot of them will admit that, too,
even though many of them are capitalists.
01;17;43;21 - 01;17;44;13
So that's, like, on the one
hand. On the doomer side,
01;17;47;14 - 01;17;50;08
they're saying, like,
you're not properly
01;17;50;08 - 01;17;54;28
scared enough
01;17;50;08 - 01;17;54;28
because you have in your head
these cartoonish ideas of the threat
01;17;55;19 - 01;17;58;13
that we cannot imagine
01;17;58;13 - 01;18;01;24
the threat posed by a true
artificial general intelligence,
01;18;01;24 - 01;18;06;05
just like those cavemen
could not have imagined an atomic bomb,
01;18;07;06 - 01;18;07;10
right?
01;18;07;10 - 01;18;10;26
Of all the dangers that they could
conceive of, splitting the atom?
01;18;12;09 - 01;18;15;05
No. No way that they,
no way that they could have conceived of that.
01;18;15;05 - 01;18;18;05
And of course, that's the trajectory
of many, many hundreds of years.
01;18;18;09 - 01;18;21;28
The thing that they're particularly
focused in on is that it appears
01;18;21;28 - 01;18;24;28
as if with this
01;18;24;29 - 01;18;27;25
AI revolution underway,
01;18;27;25 - 01;18;30;11
even if it's 20 years,
01;18;30;11 - 01;18;33;19
the horizon of great change has shrunk.
01;18;34;16 - 01;18;36;17
And, you know,
they'll tell a lot of stories
01;18;36;17 - 01;18;40;17
about previous moments
and different genres of technology.
01;18;41;05 - 01;18;45;17
You know, it's 1903, I believe, when
the Wright brothers get
01;18;46;27 - 01;18;48;15
their flying contraption
01;18;48;15 - 01;18;51;17
to go 120 feet in Kitty Hawk
01;18;52;08 - 01;18;54;22
after years, by the way, of pessimists
saying
01;18;54;22 - 01;18;57;22
it'll be a thousand years
before humans fly. Like
01;18;58;17 - 01;18;59;14
The New York Times.
01;18;59;14 - 01;19;01;08
Yeah.
01;19;01;08 - 01;19;03;07
Like, oh, this will never happen.
01;19;03;07 - 01;19;06;03
Like it's a total waste of money.
Human beings weren't meant to fly.
01;19;06;03 - 01;19;07;06
They could never fly.
01;19;07;06 - 01;19;10;06
1903. I believe it's 1903 or 1906.
01;19;10;08 - 01;19;13;08
You get the Wright brothers flying 120 feet.
01;19;14;06 - 01;19;16;10
By the
01;19;16;10 - 01;19;17;21
1940s,
01;19;17;21 - 01;19;20;24
like, we're fighting world wars
with airplanes, you know?
01;19;21;09 - 01;19;26;08
By the 1960s, we're flying a rocket
to the moon. Like, it happened so fast.
01;19;26;16 - 01;19;29;10
Like,
my grandmother was alive for both things,
01;19;29;10 - 01;19;32;23
you know, and they were saying,
we are about to enter an era like that.
01;19;33;03 - 01;19;38;08
And we can't imagine it, because this one
isn't just a travel technology.
01;19;38;13 - 01;19;40;14
You know, this one isn't just a technology
for moving around physically.
01;19;40;14 - 01;19;43;03
We're talking about intelligence.
01;19;43;03 - 01;19;46;03
We're talking about the thing
that was at the root of the discovery
01;19;46;03 - 01;19;49;03
of all of that. And, look,
whether or not we believe them,
01;19;49;12 - 01;19;52;12
whether or not they're right,
I don't know. That's not my job.
01;19;52;14 - 01;19;55;14
My job is to tell you like,
this is how they're thinking.
01;19;55;27 - 01;19;59;26
While many of us are engaged in, like,
what I think they would consider to be
01;19;59;26 - 01;20;04;21
more frivolous debates on the internet,
you know, and whether or not
01;20;04;21 - 01;20;09;16
you agree with any of them,
they're having an impact on the world.
01;20;09;16 - 01;20;13;19
And like, they may end up
shaping the future without us as a society
01;20;14;09 - 01;20;17;15
really jumping in
to try and have a voice in shaping it.
01;20;20;11 - 01;20;20;29
I love it, Andy.
01;20;20;29 - 01;20;24;02
I've got goosebumps
from that last little bit.
01;20;24;03 - 01;20;25;27
It's super compelling.
01;20;25;27 - 01;20;29;13
And I feel like we could talk
for, like, the next, like, 4 to 8 hours.
01;20;29;20 - 01;20;31;28
About everything
that we just covered here.
01;20;31;28 - 01;20;33;22
This has been such a fun conversation.
01;20;33;22 - 01;20;36;15
I wanted to say a big
thank you for, you know, telling
01;20;36;15 - 01;20;38;28
so many interesting stories
and sharing so much insight into the space.
01;20;38;28 - 01;20;40;00
It's been awesome.
01;20;40;00 - 01;20;40;29
Yeah. Well, thanks for having me.
01;20;40;29 - 01;20;43;29
And I appreciate the invitation
to come talk again.
01;20;44;13 - 01;20;45;04
Absolutely.
01;20;45;04 - 01;20;48;04
Maybe we'll get an editor in to debate
you next time.
01;20;48;11 - 01;20;51;03
I'd be down.
01;20;51;03 - 01;20;52;15
If you work in IT,
01;20;52;15 - 01;20;55;15
Infotech Research Group is a name
you need to know.
01;20;55;18 - 01;20;58;18
No matter what your needs are, Infotech
has you covered.
01;20;58;23 - 01;21;00;02
AI strategy?
01;21;00;02 - 01;21;02;14
Covered. Disaster recovery?
01;21;02;14 - 01;21;03;14
Covered.
01;21;03;14 - 01;21;05;29
Vendor negotiation? Covered.
01;21;05;29 - 01;21;09;22
Infotech supports you with best-practice
research and a team of analysts
01;21;09;22 - 01;21;13;08
standing by ready to help you
tackle your toughest challenges.
01;21;13;18 - 01;21;16;18
Check it out at the link below
and don't forget to like and subscribe!