Our Guest Jeremy Roberts Discusses
What AI Bubble? Top Trends in Tech and Jobs in 2026
Are companies preparing for an AI-powered future or reacting out of fear of being left behind?
Looking ahead to 2026, Geoff Nielson and Jeremy Roberts sit down for an unfiltered conversation about artificial intelligence, the economy, and the future of work. As AI hype accelerates across markets, boardrooms, and headlines, they ask the hard questions many leaders and workers are quietly worrying about: Are we in an AI bubble? If so, what happens when expectations collide with reality?
This episode explores whether today’s massive investment in AI, GPUs, infrastructure, copilots, and generative tools is laying the foundation for long-term value or repeating the familiar patterns of past tech bubbles like the dot-com boom and the subprime mortgage crisis. Geoff and Jeremy break down why traditional metrics like price-to-earnings ratios matter, why Nvidia and big tech dominate the narrative, and why the real risk may not be collapse but widespread underperformance.
The conversation goes far beyond markets. They dig into the impact of AI on jobs, layoffs, and corporate restructuring, challenging the idea that AI is “taking jobs” versus being used as convenient cover for economic tightening. From IT, HR, and operations to customer-facing roles, they examine how AI could reshape workforce composition, accelerate automation, and create a new and potentially unsettling employment equilibrium. You’ll also hear a candid critique of how organizations are actually using AI today and what is to come next in 2026.
00;00;00;16 - 00;00;00;23
that.
00;00;00;23 - 00;00;02;11
The idea this year and in the past
00;00;02;11 - 00;00;06;06
18 months has been
I'll just add AI into the tech.
00;00;06;06 - 00;00;09;06
And it's like,
Why is there AI in my Instagram?
00;00;09;09 - 00;00;11;07
I, I don't know.
00;00;11;07 - 00;00;13;03
Where they're going to put them in
Reese's Cups too soon.
00;00;13;03 - 00;00;16;06
It's going to be peanut butter
chocolate and I I will buy those.
00;00;16;13 - 00;00;21;22
There'll be
a subscription will be $1,000 a month.
00;00;21;25 - 00;00;22;14
Hey everyone.
00;00;22;14 - 00;00;25;28
We are looking ahead today at 2026.
00;00;26;01 - 00;00;28;29
Still, nobody knows what the hell is
going on or what's going to happen.
00;00;28;29 - 00;00;33;00
And so I'm here with Jeremy Roberts
to take a lens at AI,
00;00;33;05 - 00;00;36;23
how it's impacting the economy,
our work, our life.
00;00;36;26 - 00;00;38;15
And, yeah.
00;00;38;15 - 00;00;39;29
Jeremy, what should we talk about today?
00;00;39;29 - 00;00;41;20
Well, I mean, I think you hit it
right there, Geoff.
00;00;41;20 - 00;00;45;25
It's really, I think, all about the impact
that AI is going to have on the world
00;00;45;25 - 00;00;47;21
around us, specifically the economy.
00;00;47;21 - 00;00;51;24
So the big question I've been asking this,
they've been asking this
00;00;51;27 - 00;00;55;26
is, are we in a bubble,
do you think, fueled by AI.
00;00;55;29 - 00;00;58;17
So, you know, that's
that's an interesting question to me.
00;00;58;17 - 00;01;03;25
And it's one that I feel like you have
to tease apart to get a real answer at.
00;01;04;02 - 00;01;08;04
Do I think that there's too much hype
around AI right now and the capabilities
00;01;08;04 - 00;01;12;06
that AI has and what it's going to be able
to deliver in 2026?
00;01;12;11 - 00;01;16;10
100%, I think that,
00;01;16;13 - 00;01;17;18
no, I disagree.
00;01;17;18 - 00;01;19;20
I disagree super hard about that.
00;01;19;20 - 00;01;21;08
But keep going
and I'll tell you why you're wrong.
00;01;21;08 - 00;01;23;22
And it's like, all right. Sounds great.
00;01;23;22 - 00;01;28;07
Look, I think that we're
in a really unique situation right now
00;01;28;14 - 00;01;32;01
where we've got a whole ecosystem
of tech companies,
00;01;32;04 - 00;01;36;12
and the megaphones around them
that are deliberately
00;01;36;15 - 00;01;40;12
hawking this and want people to believe
that it can do these incredible things.
00;01;40;19 - 00;01;42;27
And don't get me wrong,
it can do incredible things.
00;01;42;27 - 00;01;47;17
But the degree to
which we're seeing executives
00;01;47;20 - 00;01;51;09
banging the drum of AI and the degree
to which the market
00;01;51;09 - 00;01;54;16
and investors are rewarding them for it
and eating it up,
00;01;54;19 - 00;01;58;27
I just at some point this has to have
a crash course with reality.
00;01;59;00 - 00;02;02;26
And, you know, I don't exactly know
what's going to happen to the market.
00;02;02;26 - 00;02;05;26
And to be honest, I think it's a fool's
game to try and time any of this.
00;02;06;02 - 00;02;11;22
But I think the market, day by day is
struggling to stay aligned with reality.
00;02;11;24 - 00;02;14;16
Well, I mean, like, I, I wouldn't
give any market advice to anyone.
00;02;14;16 - 00;02;16;10
I'm still holding Nortel and BlackBerry.
00;02;16;10 - 00;02;20;12
But when you think about sort of this
compared to maybe other historical
00;02;20;12 - 00;02;22;14
bubbles, right?
I mean, if we go back in time,
00;02;22;14 - 00;02;25;14
we could look at, say, the subprime
mortgage crisis.
00;02;25;15 - 00;02;28;04
That was 2008, 2009, probably
00;02;28;04 - 00;02;31;13
the most impactful recession
in most people's lives thus far.
00;02;31;20 - 00;02;33;05
We can look at the dot-com boom.
00;02;33;05 - 00;02;36;28
And in both of those cases,
the market was way more overvalued
00;02;36;28 - 00;02;39;28
from a P/E
ratio perspective than it is now.
00;02;40;00 - 00;02;40;10
Right.
00;02;40;10 - 00;02;44;07
So this idea that even in the scheme
of things, we're in some sort of a bubble,
00;02;44;07 - 00;02;45;20
we're approaching that territory.
00;02;45;20 - 00;02;48;07
I mean, it doesn't
actually stand up to deeper scrutiny.
00;02;48;07 - 00;02;49;18
And you said earlier
00;02;49;18 - 00;02;53;06
that, you know, the hype is significant
for a lot of these tools.
00;02;53;12 - 00;02;55;21
I mean, what if I were to just say that,
you know, we're early,
00;02;55;21 - 00;02;58;11
you know, copilot
might not be the best thing ever, but
00;02;58;11 - 00;03;01;13
like, was the PC the best thing ever
00;03;01;13 - 00;03;04;13
when,
you know, the Apple II came out in 1978?
00;03;04;15 - 00;03;06;03
I think that's broadly right.
00;03;06;03 - 00;03;10;00
But I want to dig in a little bit
more into the price-earnings
00;03;10;00 - 00;03;12;12
ratio piece that you're talking about,
because I think you're right.
00;03;12;12 - 00;03;12;23
And I think
00;03;12;23 - 00;03;16;08
that is the single best argument right now
for us not being in a bubble
00;03;16;11 - 00;03;20;11
is that if you look at these ratios,
even for a lot of the top companies,
00;03;20;14 - 00;03;22;25
they're they're not astronomical.
00;03;22;25 - 00;03;23;18
They're pretty sane.
00;03;23;18 - 00;03;26;08
There are some,
you know, Palantir and Tesla excluded.
00;03;26;08 - 00;03;30;05
And like I'm not holding money
with them right now for for those reasons.
00;03;30;05 - 00;03;32;11
But if you hold any ETF stuff,
you probably are.
00;03;32;11 - 00;03;33;15
fair. Fair enough.
00;03;33;15 - 00;03;37;00
But to me
00;03;37;03 - 00;03;41;02
that like the secret sauce behind
that is that
00;03;41;05 - 00;03;45;14
the earnings that we're seeing right now
are still a result of this gold rush
00;03;45;14 - 00;03;48;25
that people are dumping investment
into these areas
00;03;49;00 - 00;03;52;26
with the promise of some nebulous return
at some future point.
00;03;52;26 - 00;03;53;03
Right.
00;03;53;03 - 00;03;56;07
So so you've got big tech companies
and and Microsoft is a great example.
00;03;56;07 - 00;03;58;02
By the way, you mentioned copilot.
00;03;58;02 - 00;04;02;19
And if people stop
seeing the return on this,
00;04;02;22 - 00;04;07;15
then I think there's really an opportunity
for panic in the markets.
00;04;07;15 - 00;04;10;28
And let me let me like
just put a little bit more nuance on that.
00;04;11;01 - 00;04;14;23
I think the risk of people
not seeing the returns they expect.
00;04;14;24 - 00;04;15;04
Yeah.
00;04;15;04 - 00;04;18;08
And a shock downwards is a lot more
00;04;18;08 - 00;04;22;17
realistic in 2026 than, oh, it's even
better than we thought.
00;04;22;17 - 00;04;24;02
And this stock shoots up. Right.
00;04;24;02 - 00;04;27;01
Like I feel like that's
already being priced in right now.
00;04;27;01 - 00;04;28;25
So so that's my concern.
00;04;28;25 - 00;04;31;05
So if Nvidia is at like
you know 5 trillion
00;04;31;05 - 00;04;34;11
I mean who knows what it'll be at
when people are watching this.
00;04;34;14 - 00;04;36;23
But, you know,
it's an incredibly valuable company.
00;04;36;23 - 00;04;38;18
But that's because people
are actually paying money for it.
00;04;38;18 - 00;04;39;18
So, you know, for their products.
00;04;39;18 - 00;04;42;22
I mean, so
your argument is essentially that once
00;04;42;22 - 00;04;46;04
this gold rush mentality fades away,
I'm going to need fewer GPUs.
00;04;46;04 - 00;04;47;14
And the Nvidias might retreat
00;04;47;14 - 00;04;50;15
and that it's really hard to argue
that they're under hyped.
00;04;50;20 - 00;04;54;08
Maybe it's either
an appropriate level of hype or overhyped.
00;04;54;08 - 00;04;55;00
Sort of Yeah.
00;04;55;00 - 00;05;00;09
I mean it's like you'll hear
a lot of people compare this, like the Nvidia
00;05;00;09 - 00;05;06;05
infrastructure boom to laying the railroad
infrastructure, you know, 150 years ago.
00;05;06;08 - 00;05;09;08
And I think that's right.
00;05;09;08 - 00;05;14;01
Except that like a railroad is a physical
asset that you can drive trains on.
00;05;14;01 - 00;05;14;10
Yeah.
00;05;14;10 - 00;05;18;05
And the nature of this is we're building
an awful lot of infrastructure
00;05;18;08 - 00;05;19;29
with the promise
that we're going to be able
00;05;19;29 - 00;05;21;22
to do something really useful with it.
00;05;21;22 - 00;05;22;02
Yeah,
00;05;22;02 - 00;05;23;28
I think it's likely
that we're going to be able
00;05;23;28 - 00;05;25;24
to do something really useful with it.
00;05;25;24 - 00;05;30;08
But a lot of the use
cases are still semi proven at this point.
00;05;30;08 - 00;05;34;05
I'll say, yeah, you know, certainly LLMs,
I'm worried that they're starting
00;05;34;05 - 00;05;36;00
to plateau in terms of capabilities.
00;05;36;00 - 00;05;36;10
Yeah.
00;05;36;10 - 00;05;37;12
And, you know,
00;05;37;12 - 00;05;41;09
agentic to me is a great hype word,
but it's still very much unproven.
00;05;41;12 - 00;05;45;23
so there's just an awful lot
riding on those panning out.
00;05;45;23 - 00;05;45;28
Right.
00;05;45;28 - 00;05;49;17
Like it's like building the railroads
before we've invented,
00;05;49;24 - 00;05;51;29
you know, the steam locomotive.
00;05;51;29 - 00;05;55;12
So. So how much more railroad
are we going to keep building?
00;05;55;14 - 00;05;55;23
Yeah.
00;05;55;23 - 00;05;58;06
No, the railroad metaphor is a good one,
because if you go back in time
00;05;58;06 - 00;06;01;14
and you look at historical railroad
maps like there were a lot
00;06;01;14 - 00;06;05;00
more railroads in North America
historically than there are now.
00;06;05;03 - 00;06;06;29
And we tore them all out
because they were ineffective.
00;06;06;29 - 00;06;07;26
They were overbuilt.
00;06;07;26 - 00;06;09;12
Having a bunch of different
private companies
00;06;09;12 - 00;06;10;24
compete to build railroads, it turned out,
00;06;10;24 - 00;06;14;00
was actually not the most efficient way
to deliver that service to a population.
00;06;14;00 - 00;06;15;25
So you might be on to something.
00;06;15;25 - 00;06;19;14
And the other thing that I'll suggest,
sort of from a bubble perspective, is that
00;06;19;19 - 00;06;22;11
it's actually quite hard
to be inside a bubble
00;06;22;11 - 00;06;24;14
and sort of call it out effectively.
00;06;24;14 - 00;06;25;22
And if everybody inside
00;06;25;22 - 00;06;29;13
the bubble is saying we're in a bubble,
but a bubble is a speculative thing,
00;06;29;20 - 00;06;32;12
it's actually kind of hard for it
to turn out to be a bubble
00;06;32;12 - 00;06;34;07
because nobody is really being misled.
00;06;34;07 - 00;06;35;17
And so that's the other thing that gets me
00;06;35;17 - 00;06;38;26
is that everybody's out there being like,
yeah, we might be overvalued.
00;06;39;02 - 00;06;41;04
I don't know that this has historically
been the same.
00;06;41;04 - 00;06;45;17
Like people weren't yelling about
the subprime housing crisis in 2006, 2007.
00;06;45;17 - 00;06;47;29
It was so rare that they made a movie
about it called The Big Short.
00;06;47;29 - 00;06;49;07
Like the one guy who actually did
00;06;49;07 - 00;06;53;02
do that, who has called 27
of the last three recessions, by the way.
00;06;53;08 - 00;06;55;05
So that's sort of where
I'm coming from on that.
00;06;55;05 - 00;06;58;12
Well, and it's really interesting
because the
00;06;58;15 - 00;07;02;04
if we were having this conversation
six months ago and I was saying
00;07;02;04 - 00;07;05;13
there's a bubble, there's a bubble,
people would say, hey, you're crazy.
00;07;05;13 - 00;07;08;01
Like you're this voice of dissent, right?
00;07;08;01 - 00;07;11;10
And I feel like somewhere in the last
handful of months, we've turned a corner
00;07;11;14 - 00;07;15;24
where it's become very in vogue to say
there is a bubble.
00;07;15;24 - 00;07;16;01
Right?
00;07;16;01 - 00;07;20;23
And like to say there's no bubble
has actually become counterculture,
00;07;20;26 - 00;07;23;26
which is interesting
from like a narrative perspective.
00;07;24;02 - 00;07;28;12
But we're still not seeing people
pull their money out in droves.
00;07;28;18 - 00;07;29;10
Right.
00;07;29;10 - 00;07;32;09
And so that's that's fascinating to me.
00;07;32;09 - 00;07;34;15
But as I said, like that, to me,
this comes down to
00;07;34;15 - 00;07;38;23
there's a lot more risk
of underperformance than over performance.
00;07;39;00 - 00;07;40;13
And that's that.
00;07;40;13 - 00;07;44;27
That's the lens I'm looking at
the market through next year.
00;07;45;00 - 00;07;45;22
If you work in
00;07;45;22 - 00;07;49;12
IT, Infotech research Group is a name
you need to know.
00;07;49;15 - 00;07;52;15
No matter what your needs are, Infotech
has you covered.
00;07;52;20 - 00;07;53;27
AI strategy?
00;07;53;27 - 00;07;56;09
Covered. Disaster recovery?
00;07;56;09 - 00;07;57;09
Covered.
00;07;57;09 - 00;07;59;24
Vendor negotiation? Covered.
00;07;59;24 - 00;08;03;19
Infotech supports you with the best
practice research and a team of analysts
00;08;03;19 - 00;08;07;12
standing by ready to help you
tackle your toughest challenges.
00;08;07;15 - 00;08;12;11
Check it out at the link below
and don't forget to like and subscribe!
00;08;12;14 - 00;08;15;25
I guess, you know, as a, as an admonition
to all of our viewers, right.
00;08;16;02 - 00;08;18;00
This idea that the market is inherently
00;08;18;00 - 00;08;21;00
tethered to reality,
I think we can dispense with that.
00;08;21;01 - 00;08;21;13
Oh, yeah.
00;08;21;13 - 00;08;24;21
Now the market, the market,
you know, is crazier than ever.
00;08;24;21 - 00;08;27;04
And as we see more sort of like,
00;08;27;04 - 00;08;30;19
I don't know if you want to call them
like retail investors or like Reddit
00;08;30;19 - 00;08;34;07
and that like the, the meme investors,
the game stoppers of the world.
00;08;34;09 - 00;08;37;11
It's fundamentally
changed the dynamics of the market in
00;08;37;11 - 00;08;41;26
a really big way that I think
00;08;41;29 - 00;08;45;10
untethers it more,
if that's even an appropriate metaphor.
00;08;45;10 - 00;08;47;17
you know, From reality.
00;08;47;17 - 00;08;51;26
And what's a bit concerning to me
is that a lot of the areas
00;08;51;26 - 00;08;55;24
where we've seen big gains
this year seem to follow that pattern.
00;08;55;24 - 00;08;56;01
Right?
00;08;56;01 - 00;08;57;12
So, whether you're looking at
00;08;57;12 - 00;09;00;14
some specific like meme stocks
or some of the stocks like
00;09;00;14 - 00;09;04;06
Tesla and Palantir, which, you know,
are good examples.
00;09;04;08 - 00;09;07;26
The other thing that gave me pause is
I was reading an investigation
00;09;07;26 - 00;09;09;08
about gold, okay.
00;09;09;08 - 00;09;11;00
And it was saying that if you look at
00;09;11;00 - 00;09;14;18
why gold has increased so much this year
and what's different from before,
00;09;14;21 - 00;09;17;23
a lot of it is speculative
kind of meme stock investing,
00;09;17;23 - 00;09;20;23
which is terrifying because it's
meant to be, you know, kind of
00;09;20;23 - 00;09;24;02
a, a hedge against everything else
going on in the market.
00;09;24;08 - 00;09;27;04
And yet we've broken it
because it's the same, you know, meme
00;09;27;04 - 00;09;28;10
investors investing in it.
00;09;28;10 - 00;09;30;27
So I, I agree with you on that point.
00;09;30;27 - 00;09;32;23
So I think that
we all got to put our money in
00;09;32;23 - 00;09;35;24
whatever meme coin is going to come out,
we can call it hedge coin
00;09;35;24 - 00;09;37;25
or something like that.
We should have our own meme coin.
00;09;37;25 - 00;09;40;23
Maybe that maybe that's our 2026
digital disruption token.
00;09;40;23 - 00;09;42;18
And then there's a
there's definitely an option there.
00;09;42;18 - 00;09;43;10
Okay.
00;09;43;10 - 00;09;44;19
So let's talk about
00;09;44;19 - 00;09;48;01
maybe the broader implications of
AI just beyond market pricing.
00;09;48;01 - 00;09;49;04
Because to me, another big one.
00;09;49;04 - 00;09;50;01
I know you've talked about this
00;09;50;01 - 00;09;53;08
on the podcast
a ton in the past year is jobs, right?
00;09;53;08 - 00;09;55;06
So is AI going to take my job?
00;09;55;06 - 00;09;56;22
Will copilot replace me?
00;09;56;22 - 00;09;59;14
Will ChatGPT
take the job of every translator,
00;09;59;14 - 00;10;03;08
historian, and accountant
out there in the world today?
00;10;03;08 - 00;10;05;09
So at a high level,
what's your take on that?
00;10;05;09 - 00;10;07;10
Based on all the conversations
you've had? Yeah.
00;10;07;10 - 00;10;11;23
So I mean, I'm I'm very cynical by nature
00;10;11;26 - 00;10;16;00
and I think we're seeing a few different
things happen at the same time right now.
00;10;16;03 - 00;10;19;22
I think there's a lot of talk
of AI taking jobs,
00;10;19;22 - 00;10;24;10
and I think that's extremely convenient
for the companies out there
00;10;24;10 - 00;10;28;05
that may have over hired
in kind of the post-pandemic, immediate
00;10;28;05 - 00;10;30;20
post-pandemic era
where inflation rates were low.
00;10;30;20 - 00;10;34;29
There was a lot of kind of fast, almost
surprising growth, quantitative easing.
00;10;34;29 - 00;10;35;15
Yeah.
00;10;35;15 - 00;10;39;08
And they're looking at okay, well,
how do I, you know, how do I undo
00;10;39;08 - 00;10;43;12
some of that hiring that, you know,
was a little bit cowboy ish at the time.
00;10;43;15 - 00;10;48;26
And how do I do it in a way
that still is good for PR, right.
00;10;48;26 - 00;10;51;06
Or makes me
look ahead of the curve? Right?
00;10;51;06 - 00;10;53;09
I think there's a lot of self-interest.
00;10;53;09 - 00;10;54;04
And it's funny.
00;10;54;04 - 00;10;57;11
And I was talking to an editor
a handful of weeks ago about this, but,
00;10;57;14 - 00;11;01;26
you know, the idea that if you actually
read a lot of these press releases,
00;11;01;29 - 00;11;02;09
if you
00;11;02;09 - 00;11;06;04
dig past the headline, it's not we're,
00;11;06;07 - 00;11;10;14
you know, downsizing this many people
and replacing them with AI, it's
00;11;10;20 - 00;11;14;04
we're downsizing this many people,
and we're investigating
00;11;14;04 - 00;11;17;08
what capabilities AI has
that can help.
00;11;17;08 - 00;11;19;13
We're very excited
about the prospect of AI. Right.
00;11;19;13 - 00;11;21;23
And so a bit of a tough pill to swallow.
00;11;21;23 - 00;11;23;22
And for a lot of people,
I don't know that it matters that much.
00;11;23;22 - 00;11;26;12
But if you're asking,
I don't know that you should be asking,
00;11;26;12 - 00;11;27;19
Will AI take my job?
00;11;27;19 - 00;11;32;05
I think you should be asking, Will
I still have a job at the end of 2026?
00;11;32;05 - 00;11;34;19
Because of where we are
in the economic cycle?
00;11;34;19 - 00;11;36;25
That's a real concern for a lot of people.
00;11;36;25 - 00;11;41;05
And, you know, they may be pissed off more
so that it's being
00;11;41;05 - 00;11;45;11
AI-washed, that it really has nothing
to do with AI, and it's
00;11;45;11 - 00;11;48;28
completely untethered from whether this
AI technology pans out or not.
00;11;49;02 - 00;11;52;04
Now, the implication there, though,
of course, is
00;11;52;04 - 00;11;55;04
once we get past this part of the cycle,
00;11;55;06 - 00;11;58;21
do we go back to business as usual
and reach a point where, okay,
00;11;58;21 - 00;12;02;18
we're back on an upswing and crap, we,
you know, over fired,
00;12;02;18 - 00;12;04;29
and now we need to hire a bunch
more people?
00;12;04;29 - 00;12;07;01
Or is there a new market equilibrium?
00;12;07;01 - 00;12;07;10
Yeah.
00;12;07;10 - 00;12;08;07
Because if there's a new market
00;12;08;07 - 00;12;12;08
equilibrium because of AI, then
a lot of these jobs are not coming back.
00;12;12;08 - 00;12;15;19
And that's a really scary proposition.
00;12;15;24 - 00;12;20;01
I think, frankly, as a society,
like whether it's for individuals,
00;12;20;01 - 00;12;23;17
whether it's for, you know,
clusters of people or households,
00;12;23;17 - 00;12;29;15
whether it's for, us as a society,
it's a really real proposition.
00;12;29;15 - 00;12;32;15
And we don't know what all those people
are going to do about it.
00;12;32;15 - 00;12;33;24
Now, a couple of things
00;12;33;24 - 00;12;37;16
I want to say, number one is
you can probably tell by my outlook,
00;12;37;19 - 00;12;42;00
I don't think we're going to see
a cyclical upswing in hiring in 2026.
00;12;42;00 - 00;12;43;25
that would really surprise me.
I think we're going to see
00;12;43;25 - 00;12;46;14
the number continue to go down
and probably stay down.
00;12;46;14 - 00;12;50;07
Probably stabilize at some point
or come close to stabilizing, unless,
00;12;50;13 - 00;12;53;19
you know, the market goes completely
to hell, in which case we may see.
00;12;53;22 - 00;12;57;14
But but to be honest,
that's one of the best defenses
00;12;57;17 - 00;13;00;20
I think we have right now
against a true crash.
00;13;00;27 - 00;13;04;13
Which is, to your point about
earnings, that companies
00;13;04;13 - 00;13;07;22
seem to be concerned
about the bubble bursting Yes.
00;13;07;22 - 00;13;11;14
and they're preemptively,
you know, kind of leaning out
00;13;11;14 - 00;13;15;09
their operations,
to use a euphemism for firing people.
00;13;15;12 - 00;13;17;20
And so they're somewhat
protected from that.
00;13;17;20 - 00;13;21;07
So if we look beyond that, though,
00;13;21;10 - 00;13;26;10
I think we will start to see, regardless
of whether AI, you know,
00;13;26;13 - 00;13;30;08
is completely transformational
or even incrementally
00;13;30;08 - 00;13;33;15
better than it is today,
I think we are going to start
00;13;33;15 - 00;13;37;10
to see a change
in the composition of our workforce.
00;13;37;10 - 00;13;40;18
And a lot of organizations,
I think, you know,
00;13;40;18 - 00;13;43;21
and maybe it's a hot take and maybe
it'll rub some people the wrong way.
00;13;43;21 - 00;13;52;02
I think we have seen a huge explosion
of corporate roles in the last
00;13;52;05 - 00;13;54;08
two years, five years, ten years, 15
00;13;54;08 - 00;13;57;26
years, like functions like IT, HR,
00;13;58;02 - 00;14;01;10
Ops. These roles
that are not customer facing
00;14;01;13 - 00;14;07;04
and in many cases are not directly
increasing the capabilities
00;14;07;04 - 00;14;10;27
of the organization and like driving
their competitiveness forward.
00;14;11;00 - 00;14;15;09
We've just seen a huge proliferation
of that.
00;14;15;12 - 00;14;17;18
And it's tough,
00;14;17;18 - 00;14;21;13
I think, for any investor or any business
leader to be really excited about that.
00;14;21;13 - 00;14;25;02
Like, hooray, our company has gotten
so big and cumbersome.
00;14;25;02 - 00;14;26;28
Yeah.
00;14;26;28 - 00;14;30;27
And so if I think about the future
shape of organizations,
00;14;31;00 - 00;14;36;00
I think looking at roles
that are more customer facing,
00;14;36;05 - 00;14;40;02
more looking at capability building
and more just kind of nimble around
00;14;40;02 - 00;14;44;18
what needs to be done
versus I have a job and it's task based,
00;14;44;18 - 00;14;46;26
and I do this
task on Monday, in this task on Tuesday
00;14;46;26 - 00;14;49;20
and this task on Wednesday
and like rinse and repeat.
00;14;49;20 - 00;14;52;13
I think that's the way we're going
to start to see some of this evolve.
00;14;52;13 - 00;14;55;01
So that does sound a lot
like a talking point though, right?
00;14;55;01 - 00;14;57;04
Like, my organization is so bloated,
00;14;57;04 - 00;14;59;02
and there's all these people
who don't do anything but, like,
00;14;59;02 - 00;15;01;16
those people were hired
for a particular reason.
00;15;01;16 - 00;15;03;13
Sometimes it's to grow
the fiefdom of their boss.
00;15;03;13 - 00;15;06;03
Fair enough. We've all read the David
Graeber book Bullshit Jobs.
00;15;06;03 - 00;15;06;25
If you haven't, you should.
00;15;06;25 - 00;15;07;28
It's very interesting,
00;15;07;28 - 00;15;10;24
but in a lot of cases,
those people are performing functions
00;15;10;24 - 00;15;14;06
that when they are removed,
they do dramatically reduce the efficiency
00;15;14;09 - 00;15;15;13
of the organization.
00;15;15;13 - 00;15;18;28
So I think it's easy to say
that there's a lot of bloat,
00;15;19;01 - 00;15;22;01
but I do think that those roles are
maybe more complicated than they seem,
00;15;22;01 - 00;15;25;08
which is part of the reason
that AI implementations have been so tough.
00;15;25;12 - 00;15;28;21
That said,
I agree in principle with the idea
00;15;28;28 - 00;15;32;23
that artificial intelligence
is likely to impact those back end roles.
00;15;32;26 - 00;15;37;10
But you remember that MIT study
that said that 95% of AI projects
00;15;37;10 - 00;15;39;16
don't produce any real value,
00;15;39;16 - 00;15;42;19
I think that's because they're picking
the wrong things to focus on.
00;15;42;19 - 00;15;46;18
So, like, if I take, you know, a person
who's sort of acting as human middleware
00;15;46;23 - 00;15;49;14
swap them out
for an artificial intelligence solution
00;15;49;14 - 00;15;52;19
that does the same thing
pretty effectively.
00;15;52;22 - 00;15;54;26
I mean, I'm not really moving the needle.
00;15;54;26 - 00;15;58;11
So I think that is broadly in agreement
with the point that you're making.
00;15;58;11 - 00;15;59;22
But my question for you is
00;15;59;22 - 00;16;00;28
if that's not valuable,
00;16;00;28 - 00;16;03;12
why do you think companies are doing it
to the degree that they are?
00;16;03;12 - 00;16;04;26
That seems to be the primary use case.
00;16;04;26 - 00;16;06;12
And like, do you see a way forward
00;16;06;12 - 00;16;10;16
where they can really derive value
from those solutions in a jobs context?
00;16;10;19 - 00;16;11;12
That's sort of you.
00;16;11;12 - 00;16;14;08
You mean, you're referring to the
AI adoption?
00;16;14;08 - 00;16;15;14
Yes. Yeah.
00;16;15;14 - 00;16;18;14
I want to like I want to park
00;16;18;14 - 00;16;25;03
the AI productivity gains
and like the value piece for a second,
00;16;25;03 - 00;16;30;01
because I want to come back to something
you said earlier, about all these jobs
00;16;30;01 - 00;16;31;08
necessarily
00;16;31;08 - 00;16;33;12
increasing
the efficiency of the organization
00;16;33;12 - 00;16;36;00
because I just
so strongly disagree with that.
00;16;36;00 - 00;16;39;03
And I think that that rests on a belief
00;16;39;10 - 00;16;42;16
that an organization works
as this kind of organism
00;16;42;16 - 00;16;45;22
where everyone understands
what's best for the organization,
00;16;45;26 - 00;16;50;25
and you can transmit with near
perfect efficiency what the goals are.
00;16;50;26 - 00;16;53;26
And there's
some sort of global optimization there.
00;16;53;28 - 00;16;58;22
And I think the bigger the company gets,
the more difficult it is to do that.
00;16;58;27 - 00;17;01;08
And I think there's just like
00;17;01;11 - 00;17;02;02
with with
00;17;02;02 - 00;17;05;10
humans and anything they do, there's
this broken telephone effect
00;17;05;13 - 00;17;09;08
and you end up with a lot of roles
that are opportunistic.
00;17;09;08 - 00;17;12;10
And I mean, you, you, you know, you
kind of glossed over it as like fiefdoms.
00;17;12;10 - 00;17;15;10
But I think there's a lot
of what makes sense
00;17;15;10 - 00;17;18;10
at this price point
or with this level of growth.
00;17;18;13 - 00;17;21;16
And there's a lot of excitement of,
oh, I can finally do this.
00;17;21;19 - 00;17;24;02
But this is all, you know,
you know, to me.
00;17;24;02 - 00;17;26;25
And it's a silly analogy, but
00;17;26;25 - 00;17;29;29
employees are an operating expense,
not a capital expense, right?
00;17;29;29 - 00;17;30;09
It's not.
00;17;30;09 - 00;17;32;03
Oh, I bought myself a new jacket.
00;17;32;03 - 00;17;34;21
It's I signed myself up
for a new streaming service
00;17;34;21 - 00;17;36;09
that I'm paying for every year.
00;17;36;09 - 00;17;40;20
And if it turns out there's not a lot of
content on there, at some point
00;17;40;21 - 00;17;43;20
you know you're going to want to you're
going to want to cut the cord on that.
00;17;43;20 - 00;17;45;19
And like, I like want to address that.
00;17;45;19 - 00;17;48;11
I'm being extremely facetious
about people's livelihoods.
00;17;48;11 - 00;17;53;25
But I think that organizations
do have a responsibility for making sure
00;17;53;25 - 00;17;57;27
that the people in the organization
are working on things
00;17;57;27 - 00;18;01;27
that are actually valuable to the broader
mission of the organization.
00;18;02;00 - 00;18;04;16
And, you know, there's
00;18;04;19 - 00;18;04;28
there's a
00;18;04;28 - 00;18;08;01
philosophical question
that makes me really, really uneasy in
00;18;08;01 - 00;18;12;21
all of this, which is how many employees
should an organization have?
00;18;12;24 - 00;18;16;01
And I don't really know
how to answer that question.
00;18;16;01 - 00;18;19;15
And I'm worried
the answer is as few as humanly possible.
00;18;19;18 - 00;18;23;13
And that is a problem for society,
00;18;23;13 - 00;18;27;10
especially in an age
of increased automation.
00;18;27;13 - 00;18;31;14
But but I'll, you know,
kind of jump back to your, your question,
00;18;31;17 - 00;18;34;29
which is about what's going on with
00;18;35;02 - 00;18;39;17
AI and
I not creating value in organizations.
00;18;39;20 - 00;18;43;19
And I mean, I think the reality here is
twofold.
00;18;43;19 - 00;18;46;24
First of all,
I think organizations have just
00;18;46;27 - 00;18;50;09
broadly done a fairly,
00;18;50;09 - 00;18;56;02
a fairly poor job of implementing AI
in a thoughtful way.
00;18;56;05 - 00;18;58;24
Over the past 18 months, Sure.
00;18;58;24 - 00;19;03;28
I think there's been this belief
at an executive level that AI is easy.
00;19;04;01 - 00;19;04;29
Oh, wow.
00;19;04;29 - 00;19;06;05
It can do all these things now.
00;19;06;05 - 00;19;09;10
So let's just plug it in wherever we can.
00;19;09;13 - 00;19;12;02
You know, yeehaw, AI for everybody.
00;19;12;02 - 00;19;13;28
Right. good. Yosemite Sam. yeah.
Thank you.
00;19;13;28 - 00;19;15;27
I like my cowboy ish finger guns.
00;19;15;27 - 00;19;18;22
And I think that's been
00;19;18;22 - 00;19;21;26
radically disproven
over the last handful of months
00;19;21;26 - 00;19;26;03
because we've seen the rate
at which these initiatives are failing.
00;19;26;06 - 00;19;27;26
And so I think the era
00;19;27;26 - 00;19;31;23
of let's just bolt AI onto
everything is coming to an end.
00;19;31;23 - 00;19;34;02
I think that's a really good thing.
00;19;34;02 - 00;19;38;04
And I think it's forcing organizations
to say, oh shit, this is harder
00;19;38;04 - 00;19;39;27
and more expensive than we thought.
00;19;39;27 - 00;19;43;29
And by the way, probably involves
those pesky IT people.
00;19;44;02 - 00;19;47;01
And, you know,
I have to laugh because, like every IT
00;19;47;03 - 00;19;51;12
project in history has gotten late
and over budget and is extremely difficult
00;19;51;12 - 00;19;52;20
and more complex than people think.
00;19;52;20 - 00;19;57;00
And suddenly AI is like easy, like,
oh, who saw who saw that coming?
00;19;57;00 - 00;19;59;06
Like, yeah,
obviously it's harder than people think.
00;19;59;06 - 00;20;03;02
And so I think we're going to see sort of
AI take two
00;20;03;05 - 00;20;06;07
if I can call it that, in 2026,
where people now
00;20;06;07 - 00;20;09;18
need to decide, okay,
what do I want to be serious about here?
00;20;09;18 - 00;20;12;22
And what am I really willing to
properly invest in?
00;20;12;27 - 00;20;16;12
But that's still happening at a time where
00;20;16;15 - 00;20;19;21
AI is at a very early, experimental state,
right?
00;20;19;21 - 00;20;22;05
Like I think we can all count
00;20;22;05 - 00;20;22;29
on one hand
00;20;22;29 - 00;20;27;23
the proven use cases for AI, in terms of
oh yeah, it's good for customer service.
00;20;27;23 - 00;20;31;01
Developers can get more
productivity,
00;20;31;01 - 00;20;34;09
translation,
you know, things like that.
00;20;34;12 - 00;20;37;22
And for everything
else, it's still sort of question mark.
00;20;37;25 - 00;20;43;16
And, you know, the the reality of question
mark is you have to suss it out.
00;20;43;19 - 00;20;45;21
And that means that you're going to fail.
00;20;45;21 - 00;20;46;12
A lot of the time.
00;20;46;12 - 00;20;49;29
I used to I spent years
running an innovation function.
00;20;50;02 - 00;20;53;11
And reason
you know, one of the things I feel good
00;20;53;11 - 00;20;56;16
about is we failed fairly frequently.
00;20;56;22 - 00;20;59;16
Right? And,
you know, we would try to fail fast.
00;20;59;16 - 00;21;01;15
And we had lots of successes, too.
00;21;01;15 - 00;21;04;16
But that's the nature of innovation is
you can't get it right all the time.
00;21;04;16 - 00;21;05;08
It's like research.
00;21;05;08 - 00;21;08;15
If somebody is telling you
everything they research is 100% known.
00;21;08;15 - 00;21;09;25
That's not research.
00;21;09;25 - 00;21;11;03
And so I think organizations
00;21;11;03 - 00;21;14;03
need to have an appetite for that
if they're moving beyond
00;21;14;03 - 00;21;18;08
any of the bread
and butter use cases of AI.
00;21;18;11 - 00;21;19;00
So those bread and.
00;21;19;00 - 00;21;22;29
butter use cases, though, usually
run the gamut of, well, I shouldn't say
00;21;22;29 - 00;21;27;25
run the gamut, they sit on a spectrum
somewhere, and it's usually pretty predictable.
00;21;27;25 - 00;21;28;04
Right?
00;21;28;04 - 00;21;31;07
So the company that's buying sort
of understands the problem to be solved.
00;21;31;07 - 00;21;33;06
So it's not really boundary pushing.
00;21;33;06 - 00;21;37;11
It's possible for the buyer,
usually a chief information officer
00;21;37;11 - 00;21;41;19
or chief technology officer to
justify it from like a cost perspective.
00;21;41;19 - 00;21;44;01
So, you know,
we're going to reduce the amount of time
00;21;44;01 - 00;21;47;25
it takes to do expense reporting
by introducing efficiency into the expense
00;21;47;25 - 00;21;49;19
reporting process,
because AI is going to read
00;21;49;19 - 00;21;53;10
all the receipts. That's a real thing,
by the way, and it's very handy.
00;21;53;13 - 00;21;54;21
And once it
00;21;54;21 - 00;21;58;11
goes through that sausage making machine,
so to speak, what comes out
00;21;58;11 - 00;22;02;10
the other end is usually not fundamentally
transformative or innovative.
00;22;02;10 - 00;22;04;25
To use the not real Henry Ford quote.
00;22;04;25 - 00;22;07;25
We're getting a lot of faster horses here,
I think.
00;22;08;02 - 00;22;12;21
So, like, how do we go from the faster
horse to inventing the automobile?
00;22;12;21 - 00;22;16;12
And like, is AI part of that story,
or is AI just a way to solve the problem
00;22;16;12 - 00;22;19;25
of wages in these jobs that maybe
shouldn't exist in the first place?
00;22;19;25 - 00;22;23;11
Well, I think I think there's one
more element that we need to, you know,
00;22;23;14 - 00;22;26;21
I want to get serious about, which is that
there's an awful lot of vendors.
00;22;26;27 - 00;22;29;22
And, you know, I'll dunk on Microsoft
for a minute
00;22;29;22 - 00;22;33;01
that are using AI as an excuse
to just raise
00;22;33;01 - 00;22;36;18
their prices and collect more revenue
right where they say, hey, Mr.
00;22;36;18 - 00;22;37;01
and Mrs.
00;22;37;01 - 00;22;40;01
Buyer,
we have all this new AI functionality.
00;22;40;08 - 00;22;42;04
And by the way, that's justifying
00;22;42;04 - 00;22;45;00
why we're charging you 20%
more next year, right?
00;22;45;00 - 00;22;45;29
copilot uplift.
00;22;45;29 - 00;22;46;18
Yeah. Yeah.
00;22;46;18 - 00;22;49;04
Which
I mean has been going really well so far.
00;22;49;04 - 00;22;53;19
We'll see how well that goes. Like, it's
been going really well so far because
00;22;53;22 - 00;22;57;04
buyers don't know any better
or they're still, you know, a little bit
00;22;57;04 - 00;22;59;15
naive about this. Has it been going
as well as they want it to?
00;22;59;15 - 00;23;00;14
No it hasn't.
00;23;00;14 - 00;23;02;09
And I think it's about to go a lot worse
by the way,
00;23;02;09 - 00;23;05;08
because I think people are starting
to look under the hood and saying,
00;23;05;08 - 00;23;08;08
hey, this actually isn't
doing something I want or it's
00;23;08;08 - 00;23;09;25
not doing what you promised.
00;23;09;25 - 00;23;13;14
So that I think is a huge motivation
for what we're seeing here.
00;23;13;20 - 00;23;18;14
But the question you have about where is AI
a faster horse versus where is it
00;23;18;17 - 00;23;23;29
a car in an era of horses
is like the fundamental question
00;23;24;05 - 00;23;28;19
of our time in AI,
because everyone is advertising cars
00;23;28;22 - 00;23;32;05
and everyone is selling,
you know, faster horses
00;23;32;05 - 00;23;34;14
And everyone is buying faster horses.
00;23;34;14 - 00;23;36;17
People show up
and you might pitch them on the car
00;23;36;17 - 00;23;39;17
and they'll be like, ooh, I don't know
if I can park that in my stable.
00;23;39;21 - 00;23;42;05
You know, maybe I should get one of them
faster horses like that.
00;23;42;05 - 00;23;44;02
And it's not just the vendors.
00;23;44;02 - 00;23;44;17
No, no.
00;23;44;17 - 00;23;49;02
And it's it's by the way, it's happening
internal to organizations where investors
00;23;49;02 - 00;23;54;14
are demanding cars, and CEOs
and boards are saying, we build cars here.
00;23;54;14 - 00;23;59;15
You should know that the value of our company
is that we build cars, and then they go back
00;23;59;15 - 00;24;02;29
to, you know, their IT team and say,
like we're building cars, right?
00;24;02;29 - 00;24;04;29
And they're saying, it's just horses back here,
yeah.
00;24;04;29 - 00;24;08;01
and like that, that reality
I think car money.
00;24;08;04 - 00;24;11;11
Yeah. Yeah. So so that's a conflict.
00;24;11;11 - 00;24;13;28
And there's some really,
really perverse incentives at play.
00;24;13;28 - 00;24;14;28
Right now.
00;24;14;28 - 00;24;20;05
And everybody is kind of looking
to everybody else to say like, but
00;24;20;05 - 00;24;25;06
but surely you have a car over there or
like do our, do our competitors have cars?
00;24;25;06 - 00;24;26;29
Do our vendors have cars.
00;24;26;29 - 00;24;31;02
And there's just not a lot of cars
right now.
00;24;31;05 - 00;24;35;01
And, to mix metaphors
from earlier,
00;24;35;01 - 00;24;38;01
We're sure
building a lot of like asphalt roads
00;24;38;03 - 00;24;40;26
right now
as though there's going to be cars.
00;24;40;26 - 00;24;42;26
And so that's going
to be really interesting.
00;24;42;26 - 00;24;46;17
And by the way, I do think, you know,
to come back to something I said earlier,
00;24;46;20 - 00;24;51;01
I really think that LLMs
are not going to be the road.
00;24;51;01 - 00;24;54;04
I think LLMs are the faster horse
right now.
00;24;54;05 - 00;24;54;13
Yeah.
00;24;54;13 - 00;24;56;11
And they can do some really cool stuff
00;24;56;11 - 00;25;00;11
and they can augment
individual productivity, but they are not
00;25;00;14 - 00;25;03;23
the sea change
that they're being pitched as.
00;25;03;23 - 00;25;05;09
we may see some cars.
00;25;05;09 - 00;25;07;07
This, like, metaphor is, like, totally... we've got to get
away from it, it's
away from it's
like totally busted at this point.
00;25;09;05 - 00;25;13;12
The horse has left
the barn. Oh my God. Like, I
00;25;13;15 - 00;25;16;02
don't think I'm allowed to slap you,
but I sure do want to.
00;25;16;02 - 00;25;18;21
Get an AI to do it.
It'll be digital. Yeah.
00;25;18;21 - 00;25;22;22
So I think we're going to see specific
areas outside of LLMs. Yeah.
00;25;22;28 - 00;25;24;17
that will make us say, wow.
00;25;24;17 - 00;25;29;11
Yep. In in 2026, like, oh, wow,
I didn't know that I could do that.
00;25;29;14 - 00;25;34;18
But AI, like, we've got to be
honest, is a marketing buzzword, right?
00;25;34;18 - 00;25;36;23
Like, what the hell does AI mean?
00;25;36;23 - 00;25;39;29
And by the way, like we're like,
we're complicit in this.
00;25;39;29 - 00;25;42;07
Like, I am part of the problem here.
00;25;42;07 - 00;25;46;00
But but, like, you know,
to get to a bit of meatier conversation.
00;25;46;00 - 00;25;50;19
Yeah, LLMs, I think we're starting
to see a plateau there.
00;25;50;22 - 00;25;55;28
Agentic seems to be the wagon
that everyone is hitching their horse to.
00;25;55;28 - 00;25;58;07
Yeah. God, I've broken this analogy.
00;25;58;07 - 00;26;04;02
And I just don't see that panning out
in the next 18 months, to be honest.
00;26;04;02 - 00;26;06;28
It's just too complex
to get beyond the basics.
00;26;06;28 - 00;26;10;06
Will there be more automation
happening in organizations? Yes.
00;26;10;12 - 00;26;13;17
Will we see some exciting new use
cases? Yes.
00;26;13;23 - 00;26;15;15
But the fact that,
00;26;15;15 - 00;26;18;26
you know, I've spent this year
talking to some of the brightest minds
00;26;18;29 - 00;26;23;11
in academia, in journalism, on the front
lines, actually developing this stuff.
00;26;23;16 - 00;26;27;19
And any time you're like, tell me about
the use case you're most excited about.
00;26;27;22 - 00;26;30;26
And they kind of, you know, politic
their way out of the question
00;26;30;26 - 00;26;32;26
is extremely telling to me. Yeah.
00;26;32;26 - 00;26;36;20
Well you mentioned earlier
right about sort of AI 2.0.
00;26;36;23 - 00;26;38;24
And that feels a lot
like web 2.0 to me. Right.
00;26;38;24 - 00;26;41;18
So like web 1.0 was very non-interactive.
00;26;41;18 - 00;26;42;10
Like here's a page.
00;26;42;10 - 00;26;46;12
Here's a directory where we used to browse
the internet with an alphabetical list
00;26;46;15 - 00;26;49;14
of everything that was out there, like,
are you interested in aardvarks?
00;26;49;14 - 00;26;51;13
Yeah, that was the top left.
00;26;51;13 - 00;26;54;03
And then web 2.0,
of course, became far more interactive.
00;26;54;03 - 00;26;55;28
There were buttons
and things that you could press.
00;26;55;28 - 00;26;59;11
So if AI 1.0 is LLMs
00;26;59;11 - 00;27;02;17
that do really gimmicky stuff,
like write a poem for your birthday.
00;27;02;17 - 00;27;05;20
My grandma's cards got much more
interesting after ChatGPT came out.
00;27;05;20 - 00;27;09;19
By the way,
what do you think AI 2.0 actually is?
00;27;09;19 - 00;27;13;12
So is that is that analytics
is that uses in like health care.
00;27;13;12 - 00;27;14;17
You talked about agentic a lot.
00;27;14;17 - 00;27;16;08
So agentic AI is of course an AI
00;27;16;08 - 00;27;19;01
that does something for you
based on things that it knows about you.
00;27;19;01 - 00;27;21;12
So it actually takes sort of independent
actions. It's your agent.
00;27;21;12 - 00;27;22;17
You're the principal.
00;27;22;17 - 00;27;24;12
What do you see 2.0 as looking like?
00;27;24;12 - 00;27;26;24
So let's let's talk for a minute
about 1.0.
00;27;26;24 - 00;27;29;28
And I like I love the web comparison
for AI by the way.
00;27;30;03 - 00;27;33;13
And I don't know if if you're old enough
to remember, but like when the web
00;27;33;13 - 00;27;37;29
first came out, it was very much like,
just add the web to everything.
00;27;37;29 - 00;27;41;04
Like the example that comes to mind
is that people used to make fun of
00;27;41;05 - 00;27;44;10
is like in the mid 90s, Pizza
Hut had a website
00;27;44;17 - 00;27;47;24
and it was basically just like,
what's your local Pizza Hut?
00;27;47;24 - 00;27;49;25
Okay, here's the phone number
for how to call them.
00;27;49;25 - 00;27;50;21
And you're like, call. Yeah, yeah.
00;27;50;21 - 00;27;52;29
And you're like wow, chef wow.
00;27;52;29 - 00;27;57;02
Like the, you know, the information
superhighway in action.
00;27;57;05 - 00;28;00;19
And I think that's very much
where we're at with AI, where it's like,
00;28;00;19 - 00;28;04;02
oh, everybody like
just get your AI up and running.
00;28;04;05 - 00;28;08;03
And by the way, I think one of
the challenges we're seeing right now
00;28;08;06 - 00;28;12;11
is there's a lot of investor excitement
about AI.
00;28;12;18 - 00;28;16;18
And I think there's a lot of consumer
00;28;16;21 - 00;28;19;21
backlash
is maybe slightly too strong a word.
00;28;19;26 - 00;28;21;28
But I think when a lot of consumers
00;28;21;28 - 00;28;24;29
hear the word AI,
they have a neutral to negative reaction.
00;28;25;04 - 00;28;27;15
And it's just like it's
been shocking to me
00;28;27;15 - 00;28;32;10
how much there's been like,
you can have an AI, PC or like Google
00;28;32;10 - 00;28;36;10
Now with AI,
like your phone is full of AI and like,
00;28;36;13 - 00;28;41;02
I don't know if it's nobody gives a shit
or they actively don't like it, but
00;28;41;05 - 00;28;45;25
it's not having the impact on consumers
that it's having on investors. Neutral at best.
00;28;45;28 - 00;28;47;16
At very best.
00;28;47;16 - 00;28;49;03
I think it's actually negative.
00;28;49;03 - 00;28;52;22
And people are like,
what is the use case here?
00;28;52;26 - 00;28;53;29
How do you how do you.
00;28;53;29 - 00;28;58;03
Square that with the proliferation of LLMs
in everyday life?
00;28;58;03 - 00;29;00;06
Like everybody ChatGPTs everything.
00;29;00;06 - 00;29;03;11
Like, yeah, I got an email from somebody
that's very obviously ChatGPT.
00;29;03;14 - 00;29;04;12
Oh yeah. No, no.
00;29;04;12 - 00;29;05;03
Like I get
00;29;05;03 - 00;29;09;05
I get emails that end with like, would you
like me to like change the above.
00;29;09;08 - 00;29;11;02
Like it's just it's just brutal.
00;29;11;02 - 00;29;14;15
Like, seriously, I've had people like,
literally trying to sell me something
00;29;14;15 - 00;29;17;15
with that with like, that text in it.
00;29;17;16 - 00;29;20;26
The way I square
that is the difference between
00;29;21;00 - 00;29;26;14
somebody having a productivity tool
in their back pocket that they can use
00;29;26;18 - 00;29;29;18
as a way to get themselves ahead
00;29;29;21 - 00;29;32;28
versus
having something else pushed on them.
00;29;33;01 - 00;29;36;06
And one of the things that happened
with web
00;29;36;09 - 00;29;41;18
is people
stopped talking about web as though, like, web,
00;29;41;18 - 00;29;45;24
web, web. If someone came to you then and said,
I've got to be on the internet,
00;29;45;25 - 00;29;48;20
you'd look at them like, oh, you know,
where have you been? No, no.
00;29;48;20 - 00;29;49;07
Exactly.
00;29;49;07 - 00;29;54;06
And so I think what's going to happen
with AI is people are going to realize,
00;29;54;09 - 00;29;56;26
you know, companies, or builders
of AI, are going to realize
00;29;56;26 - 00;30;00;20
very quickly
that these words are getting polluted.
00;30;00;23 - 00;30;03;28
And because they have
a negative connotation with consumers,
00;30;04;01 - 00;30;08;29
you can't just be like,
hey everybody, AI! Like, why?
00;30;09;03 - 00;30;10;25
Why should I care about that?
00;30;10;25 - 00;30;13;02
Why should I be excited about that?
00;30;13;02 - 00;30;17;18
And that is the question of
what can AI do for you?
00;30;17;21 - 00;30;22;01
And I think we need to get back to that
in a big way.
00;30;22;01 - 00;30;26;02
And like if you can tell me, like,
why should I buy an AI PC, why should I have
00;30;26;02 - 00;30;31;15
like, your browser is AI now, like,
why do I want an AI browser like that?
00;30;31;15 - 00;30;35;19
Like, I don't know, like I,
I cringe at just hearing it, and like
00;30;35;19 - 00;30;36;15
these are real examples.
00;30;36;15 - 00;30;39;12
By the way, this is not me making that up.
00;30;39;12 - 00;30;42;06
So how do you market this in ways
00;30;42;06 - 00;30;45;06
where there's
actually some value to people?
00;30;45;10 - 00;30;48;27
And I think, you know,
and this is a cop out,
00;30;49;00 - 00;30;51;22
you know, in the predictions game, but,
you know,
00;30;51;22 - 00;30;55;03
asking somebody in 1997
00;30;55;06 - 00;31;00;10
to try to predict
Facebook or, you know, predict Instagram.
00;31;00;13 - 00;31;03;24
I think we're we're a few chess moves
away from that.
00;31;03;27 - 00;31;09;04
But what it's going to look like
is a series of very specific use cases
00;31;09;04 - 00;31;14;08
that people are excited
to use, like ChatGPT. LLMs,
00;31;14;08 - 00;31;17;28
in the chatbot capacity,
are an absolute killer app, right?
00;31;17;28 - 00;31;22;09
Like you just use it and you're like,
this is awesome.
00;31;22;12 - 00;31;25;01
I can get value out of this right away.
00;31;25;01 - 00;31;27;22
Nothing else in the AI space is like that.
00;31;27;22 - 00;31;30;27
And it's this weird
extrapolation of like, well,
00;31;31;00 - 00;31;34;16
it works over there,
so why is it not working for my business?
00;31;34;19 - 00;31;36;25
And so that's got to die.
00;31;36;25 - 00;31;39;18
And I think we're going to start to see
00;31;39;18 - 00;31;42;15
people become a lot smarter with that.
00;31;42;15 - 00;31;46;09
And I think a wave of hype is going to get
washed away as part of that.
00;31;46;09 - 00;31;48;04
So now this is incidental to your point.
00;31;48;04 - 00;31;49;15
But I want to I want to jump on it
00;31;49;15 - 00;31;51;07
because you gave me
a couple of openings here.
00;31;51;07 - 00;31;53;19
So you mentioned how the consumer
sentiment around
00;31;53;19 - 00;31;54;25
AI is quite negative, right?
00;31;54;25 - 00;31;57;17
When I think ChatGPT,
if we did a word association game,
00;31;57;17 - 00;32;00;10
my response would be brain rot,
which is a real thing, right?
00;32;00;10 - 00;32;02;13
People
that don't flex their brain muscles,
00;32;02;13 - 00;32;04;12
the brain isn't actually a muscle,
but you know,
00;32;04;12 - 00;32;06;09
they don't flex their brain
when Thank you doctor.
00;32;06;09 - 00;32;08;26
Yeah,
there you go to solve their problems.
00;32;08;26 - 00;32;09;05
Right.
00;32;09;05 - 00;32;10;10
And so they end up being less
00;32;10;10 - 00;32;12;20
effective at problem
solving sort of in the aggregate.
00;32;12;20 - 00;32;14;20
But then you talk about social media.
00;32;14;20 - 00;32;17;25
And if I were to go to people in, say,
social media, positive or negative,
00;32;17;25 - 00;32;20;29
I bet social
media has a more negative connotation
00;32;20;29 - 00;32;26;01
these days than AI does,
but it's still tremendously lucrative.
00;32;26;07 - 00;32;30;09
They sell the crap out of ads
on Facebook and Instagram and TikTok
00;32;30;09 - 00;32;31;14
and all of these different sites.
00;32;31;14 - 00;32;36;08
Despite being assailed by regulators
and by moralists, you know, by pundits,
00;32;36;08 - 00;32;41;20
by podcast hosts, it's still a
just a phenomenally profitable business.
00;32;41;20 - 00;32;44;26
Meta is raking in money and
and they're shoveling it
00;32;44;26 - 00;32;48;17
into the metaverse furnace
and the AI furnace and they're burning it.
00;32;48;17 - 00;32;49;28
But like,
00;32;49;28 - 00;32;53;05
do we actually need
to have positive sentiment around AI,
00;32;53;05 - 00;32;55;00
or can it be like a social media thing
where it becomes
00;32;55;00 - 00;32;58;00
so useful that it just breaks through,
even though we all hate it?
00;32;58;02 - 00;33;00;04
So I think that's a
that's a really interesting question.
00;33;00;04 - 00;33;04;11
And I want to like, just sit for a moment
with like social media and the fact
00;33;04;11 - 00;33;06;14
that, like,
everybody hates it, myself included.
00;33;06;14 - 00;33;07;20
I imagine you hate it too.
00;33;07;20 - 00;33;09;10
This is video, by the way.
00;33;09;10 - 00;33;12;15
yeah,
like and subscribe like and subscribe.
00;33;12;18 - 00;33;15;25
Social media is
00;33;15;28 - 00;33;19;00
like very, very oligarchical right now.
00;33;19;00 - 00;33;19;25
Right. Like consolidate.
00;33;19;25 - 00;33;22;25
It's consolidated
within a handful of big tech firms.
00;33;22;25 - 00;33;26;03
And it's extremely lucrative
for those firms.
00;33;26;06 - 00;33;29;29
And the the way that they've made it
so lucrative
00;33;30;02 - 00;33;34;09
is that they've built it completely around
engagement and attention capture.
00;33;34;09 - 00;33;34;22
Right. Yeah.
00;33;34;22 - 00;33;37;22
How do I keep you on my platform
as much as possible.
00;33;37;29 - 00;33;41;24
And, you know, there's all sorts
of negative societal outcomes around that.
00;33;41;24 - 00;33;45;00
And we could spend a very long time
talking about that. Understatement of the show.
00;33;45;00 - 00;33;45;08
What?
00;33;45;08 - 00;33;49;10
Well, and I don't think anyone's watching
this going, like, what,
00;33;49;13 - 00;33;53;12
social media having a negative impact?
Controversial, Facebook.
00;33;53;12 - 00;33;54;22
No, no.
00;33;54;22 - 00;33;58;12
And so I think we're seeing
AI go down that road.
00;33;58;15 - 00;34;00;11
And even with some of the new releases
00;34;00;11 - 00;34;04;01
of ChatGPT and Gemini, you can see
in the way that they're structured,
00;34;04;04 - 00;34;06;28
they're more structured around
engagement, right?
00;34;06;28 - 00;34;10;14
how can I be effusive
in terms of like, Jeremy?
00;34;10;17 - 00;34;12;03
That was a brilliant question.
00;34;12;03 - 00;34;13;26
And good on you for asking that.
00;34;13;26 - 00;34;14;23
Maybe it was a good question.
00;34;14;23 - 00;34;17;23
Well, if you ask it, I'm sure
they're always amazing,
00;34;17;29 - 00;34;21;17
but and now that you've asked it like,
can I help you with the next thing?
00;34;21;17 - 00;34;24;07
Right. Here's something that, like else
I can do for you.
00;34;24;07 - 00;34;25;19
How do I keep you locked in?
00;34;25;19 - 00;34;27;20
Want me to make it a table for you,
you know.
00;34;27;20 - 00;34;29;00
Would you like it in poetry form?
00;34;29;00 - 00;34;30;15
Yeah, exactly, exactly.
00;34;30;15 - 00;34;33;13
anything to keep you locked in.
00;34;33;13 - 00;34;38;25
And one of the themes that's been,
you know, depending on exactly the day,
00;34;38;28 - 00;34;42;06
more or less, you know, in the news cycle
00;34;42;13 - 00;34;47;08
is the, the monetization
of these platforms
00;34;47;14 - 00;34;51;03
and like, come on, we all know
that we're going to get to a day
00;34;51;06 - 00;34;54;16
where ChatGPT is going to say, like,
you know, you seem tired,
00;34;54;16 - 00;34;58;14
but like, wouldn't you be revitalized
by drinking a delicious Red bull?
00;34;58;17 - 00;35;02;05
Like, it's just it's I don't drink
Red bull, by the way, but I do.
00;35;02;05 - 00;35;05;04
I drink enough for both of us. But, that said,
it's coming.
00;35;05;04 - 00;35;09;17
And it's going to be the social media
ification of,
00;35;09;20 - 00;35;13;07
of AI, of these,
you know, these LLM platforms.
00;35;13;07 - 00;35;16;12
And once again, like, that's
why Big Tech is so invested in this
00;35;16;12 - 00;35;19;20
because they know this
and they want to own the platforms.
00;35;19;25 - 00;35;25;04
Let's talk about everybody else
in the context of social media,
00;35;25;07 - 00;35;26;29
because if you're a business,
00;35;26;29 - 00;35;30;20
you may advertise on Instagram or LinkedIn
or whatever, and you may get revenue
00;35;30;20 - 00;35;34;11
from that,
but you're certainly creating an economy
00;35;34;17 - 00;35;37;17
where you're paying these,
00;35;37;18 - 00;35;41;26
these rent seekers in big tech,
big dollars to be able to do that.
00;35;42;02 - 00;35;44;17
And so I think that's exactly
what's happening.
00;35;44;17 - 00;35;48;13
And like if you extrapolate
that these businesses are like
00;35;48;19 - 00;35;52;24
sort of using social media,
but not in a way where they're developing
00;35;52;24 - 00;35;57;08
new functionality, it's marketing, it's
an ad platform for what they're doing.
00;35;57;12 - 00;36;00;10
It's a new way to reach their customers.
00;36;00;10 - 00;36;04;08
But, you know, new types
of content get created.
00;36;04;11 - 00;36;06;23
But when people are
00;36;06;26 - 00;36;08;02
adding
00;36;08;02 - 00;36;11;29
AI into their business right now,
or valuing businesses
00;36;11;29 - 00;36;16;13
based on what AI could be added,
it doesn't strike me that they're saying
00;36;16;13 - 00;36;20;10
like and it'll be just like how
this business uses social media, right?
00;36;20;10 - 00;36;22;17
Like that's not the framing device.
00;36;22;17 - 00;36;24;06
And I don't know
if this is where you're going,
00;36;24;06 - 00;36;28;02
but I kind of think maybe it should be
more like that framing device.
00;36;28;08 - 00;36;30;26
Yeah. That's it's
such an interesting concept.
00;36;30;26 - 00;36;32;02
I think you're right, by the way.
00;36;32;02 - 00;36;35;03
You know, for anybody watching at home,
like a social media platform
00;36;35;03 - 00;36;38;25
is just a really,
really sophisticated ad engine.
00;36;38;28 - 00;36;42;14
We take the best,
the brightest, the smartest mathematicians
00;36;42;14 - 00;36;43;28
and computer scientists in the world.
00;36;43;28 - 00;36;45;27
And what problem do we have them solve?
00;36;45;27 - 00;36;49;04
How? You watch a video
for just ten more seconds
00;36;49;04 - 00;36;51;23
so that the advertiser can cram
in some additional content?
00;36;51;23 - 00;36;53;06
It's a huge business.
00;36;53;06 - 00;36;56;17
I it's going to be weird
when we're in that dystopian future when,
00;36;56;17 - 00;36;57;22
you know, ChatGPT is like,
00;36;57;22 - 00;36;58;07
you know,
00;36;58;07 - 00;37;00;07
the delicious taste of Pepsi
will surely get you
00;37;00;07 - 00;37;02;10
through this difficult time
that you're talking to me about.
00;37;02;10 - 00;37;06;03
And there will 100% be stories,
that's a prediction, in the media. Oh, yeah.
00;37;06;03 - 00;37;07;16
Of a totally inappropriate
00;37;07;16 - 00;37;10;16
ad that was, like, deeply
personalized and targeted at somebody.
00;37;10;17 - 00;37;13;14
yeah, I think the social media comparison
definitely.
00;37;13;14 - 00;37;17;08
It's got me thinking,
and I have to wonder, you know,
00;37;17;08 - 00;37;20;13
if consolidation in the AI space
is going to be inevitable, like it was on
00;37;20;13 - 00;37;23;23
the social media side, Facebook
buying Instagram and WhatsApp, right?
00;37;23;23 - 00;37;26;20
You know, TikTok, growing pretty rapidly.
00;37;26;20 - 00;37;28;21
And I agree with you
on the innovation front.
00;37;28;21 - 00;37;30;23
I have to wonder,
though, is the network effect,
00;37;30;23 - 00;37;32;14
which is really what drives
social media, right?
00;37;32;14 - 00;37;34;21
I go there because other people are there.
00;37;34;21 - 00;37;37;13
Like, does
AI have a similar moat, like LLMs?
00;37;37;13 - 00;37;38;09
I think they certainly don't.
00;37;38;09 - 00;37;39;01
I mean, what's the difference between
00;37;39;01 - 00;37;43;13
Claude and Gemini and ChatGPT,
and, you know, any of these options?
00;37;43;15 - 00;37;45;12
Could
somebody with enough computing power
00;37;45;12 - 00;37;50;21
just come in and, and dominate the game
or is it a marketing play? And,
00;37;50;24 - 00;37;52;06
Again, I think it's interesting.
00;37;52;06 - 00;37;54;21
But, you know,
the example that comes to mind is Google,
00;37;54;21 - 00;37;57;21
not as an AI player,
but actually as a search player, Sure.
00;37;57;22 - 00;38;01;16
because again, and, you know,
you may recall from your youth
00;38;01;16 - 00;38;06;17
the days of like Lycos and AltaVista
and Yahoo and all of those,
00;38;06;17 - 00;38;12;16
and Google came in and it was better
and it had a reputation for being better,
00;38;12;19 - 00;38;16;27
and it just consolidated the market
under it because it was the winner.
00;38;16;27 - 00;38;18;26
And so I think it was really effective
00;38;18;26 - 00;38;21;00
because they actually did things
differently.
00;38;21;00 - 00;38;21;07
Right.
00;38;21;07 - 00;38;23;06
Like, not to belabor the point
00;38;23;06 - 00;38;26;28
too much, but like, AltaVista, and Yahoo
especially, was a portal, right,
00;38;27;01 - 00;38;28;17
where you saw
everything laid out in front of you.
00;38;28;17 - 00;38;29;18
Google was a search engine.
00;38;29;18 - 00;38;32;02
They had
they had abstracted all of that away.
00;38;32;02 - 00;38;34;01
Then Yahoo eventually had a search engine
and everything.
00;38;34;01 - 00;38;36;16
But yeah,
sorry to to to cut you off there.
00;38;36;16 - 00;38;41;05
Carry on.
Well so you know my point is that
00;38;41;08 - 00;38;43;04
even like
00;38;43;04 - 00;38;47;20
to me, it's still an incremental moat
in that story, because they all had
00;38;47;20 - 00;38;51;29
search engines in some capacities by like,
you know, call it the year 2000.
00;38;52;02 - 00;38;56;13
But Google was still able to run away
with a victory there.
00;38;56;16 - 00;39;01;01
And, you know, I think we've already seen
you can look at graphs of,
00;39;01;04 - 00;39;07;05
you know, the usage across the main
LLMs now, and it's extremely consolidated.
00;39;07;12 - 00;39;09;14
And I think it'll stay
extremely consolidated.
00;39;09;14 - 00;39;13;00
And I talked to a lot of people
who are huge advocates
00;39;13;00 - 00;39;17;14
for decentralization of AI and LLMs.
00;39;17;14 - 00;39;20;14
And we're seeing, you know, DeepSeek
is a big story this year
00;39;20;14 - 00;39;24;21
about how you can use substantially less
computing power and kind of get ahead.
00;39;24;24 - 00;39;28;07
But people still tend to gravitate
00;39;28;07 - 00;39;31;10
toward the known names, right?
00;39;31;10 - 00;39;34;17
Like, with AI, being an incumbent,
00;39;34;17 - 00;39;37;23
I think is extremely powerful there.
00;39;37;26 - 00;39;39;27
that's one of the big narratives
that a lot of these
00;39;39;27 - 00;39;43;23
players have right now,
which is it's winner take all.
00;39;43;23 - 00;39;46;22
And so, you know,
whether it's our LLMs or technology
00;39;46;22 - 00;39;50;27
beyond that, you've got to invest in us
and we're going to get it
00;39;50;27 - 00;39;56;03
like it's a it's a it's an arms race
functionally for, you know, AI tech.
00;39;56;03 - 00;39;57;23
And we're going to take it all.
00;39;57;23 - 00;40;02;25
And I'm, I'm very skeptical of that as
well, which I'm sure you'll find shocking.
00;40;02;25 - 00;40;06;10
Like to me,
that is, a tactic for raising capital.
00;40;06;10 - 00;40;08;01
And it's extremely effective.
00;40;08;01 - 00;40;11;15
But I think it's
I think we're going to see a space similar
00;40;11;15 - 00;40;13;09
to what we see in social media
with big tech, where
00;40;13;09 - 00;40;16;01
there's a few different niches
that are kind of carved out,
00;40;16;01 - 00;40;19;00
and people will stick
mostly with, with the big players there.
00;40;19;00 - 00;40;21;23
And, And there's two,
two points on that. Right.
00;40;21;23 - 00;40;24;16
So the first is that ChatGPT was an OpenAI
product.
00;40;24;16 - 00;40;27;05
And before 2022,
nobody had heard of OpenAI.
00;40;27;05 - 00;40;28;19
So like they actually became
00;40;28;19 - 00;40;33;01
the first mover in this giant,
valuable market out of almost nowhere.
00;40;33;04 - 00;40;35;14
And I say almost
because they did have some big investment.
00;40;35;14 - 00;40;39;13
And, you know, Sam Altman used to run Y
Combinator, which was a startup incubator
00;40;39;13 - 00;40;41;25
in the Bay Area.
00;40;41;25 - 00;40;45;00
So they did actually sort of leapfrog
a lot of their competitors.
00;40;45;00 - 00;40;46;05
And it caused a panic at Google.
00;40;46;05 - 00;40;48;06
And now it's
sort of become a resource competition.
00;40;48;06 - 00;40;52;07
So the winners of the AI race so far
are the people who can burn the most money
00;40;52;07 - 00;40;54;02
in pursuit of the goal. Right.
00;40;54;02 - 00;40;54;26
It's very hard.
00;40;54;26 - 00;40;56;14
You know,
if we wanted to make an AI startup,
00;40;56;14 - 00;40;59;27
we would have to raise a lot of money
to to add to the capacity.
00;41;00;02 - 00;41;01;03
So that's one. Thing.
00;41;01;03 - 00;41;06;00
But but on that thing, this is,
you know, one of my predictions for 2026
00;41;06;03 - 00;41;09;17
that I may be proven wrong on,
which is that there's so much talk about,
00;41;09;17 - 00;41;12;19
you know, compute
and just building more capacity
00;41;12;19 - 00;41;17;10
because if we can only train
these models better, they'll,
00;41;17;13 - 00;41;19;22
you know, we're going to hit
some sort of inflection point
00;41;19;22 - 00;41;22;19
and we're going to race ahead
and it's going to just,
00;41;22;19 - 00;41;25;18
you know, take our stock to the moon
singularity.
00;41;25;19 - 00;41;27;23
I just think it's just total bullshit.
00;41;27;23 - 00;41;31;21
I think we're reaching a point
where, as I said, with LLMs, we're
00;41;31;21 - 00;41;37;24
reaching a plateau where it costs
exponentially more to get marginal gains,
00;41;37;27 - 00;41;42;04
which is the financial equivalent
of just shoveling money into a furnace.
00;41;42;07 - 00;41;47;23
And so I think the floor for entry level
participants is getting to a point where,
00;41;47;26 - 00;41;52;16
despite all the talk about that,
like you're not going to build
00;41;52;18 - 00;41;57;12
a super intelligence just by shoveling
more money into training data.
00;41;57;12 - 00;41;57;22
Right?
00;41;57;22 - 00;42;02;16
I think we're we're going back to an era
where creativity
00;42;02;16 - 00;42;06;22
and asking the right questions
and, and structuring,
00;42;06;25 - 00;42;09;25
you know, having new ideas
like backpropagation and,
00;42;10;00 - 00;42;13;10
you know, with Hinton
years ago, those new ideas
00;42;13;10 - 00;42;17;21
for how we come up with these things
is going to lead to the next innovation.
00;42;17;21 - 00;42;24;06
It's not just a raw horsepower race,
which is a very inconvenient message.
00;42;24;13 - 00;42;29;00
If you're funding is contingent on
just building more and more horsepower.
00;42;29;03 - 00;42;31;03
A lot of horses
in this conversation. The horse
00;42;31;03 - 00;42;35;01
turns out to be a wonderful metaphor, a
wonderful way to structure a conversation.
00;42;35;04 - 00;42;37;26
The other point that I wanted to make,
I just before we jump ahead
00;42;37;26 - 00;42;39;12
was around consolidation.
00;42;39;12 - 00;42;41;09
There's a regulatory angle to this, right?
00;42;41;09 - 00;42;43;22
Everybody wants to be big,
but they don't want to be so big
00;42;43;22 - 00;42;45;23
that they attract scrutiny. Right.
00;42;45;23 - 00;42;46;26
And that's something that,
00;42;46;26 - 00;42;50;18
you know, Apple has faced,
with the Epic Games lawsuit.
00;42;50;18 - 00;42;54;04
That's something that Google faced as a,
you know, a monopoly in search.
00;42;54;04 - 00;42;57;01
And I believe they actually
lost that case, if I'm not mistaken.
00;42;57;01 - 00;42;58;00
And there was a concern
00;42;58;00 - 00;43;01;04
that they were going to have to sell
Chrome and Perplexity wanted to buy it
00;43;01;04 - 00;43;04;11
and make an AI browser, which from what
I understand is your favorite thing.
00;43;04;14 - 00;43;06;18
And so that didn't go through,
thankfully.
00;43;06;18 - 00;43;08;03
But AI
will probably be coming to my Chrome browser anyway.
00;43;08;03 - 00;43;08;23
So I think that
00;43;08;23 - 00;43;11;29
if you are a big tech company,
if you're watching this Sundar Pichai,
00;43;11;29 - 00;43;14;29
if you're watching this Satya Nadella,
you know,
00;43;15;05 - 00;43;17;28
be mindful of the regulators
because we don't typically like
00;43;17;28 - 00;43;20;28
one company sort of dominating the space
so significantly.
00;43;21;02 - 00;43;21;20
I do think.
00;43;21;20 - 00;43;23;06
So maybe a prediction of my own for
the future is, I think you're right
the future is I think you're right
about diminishing returns on models.
00;43;27;12 - 00;43;29;10
I think the DeepSeek
example is a really good one.
00;43;29;10 - 00;43;32;10
Facebook, Meta, had,
it had a product called Llama,
00;43;32;12 - 00;43;35;02
which is a locally runnable,
00;43;35;02 - 00;43;38;25
is that the correct way of phrasing that,
a model that you can run locally. It has that,
00;43;38;28 - 00;43;41;28
and for most people it's
00;43;42;05 - 00;43;45;10
very effective for most cases. Right?
00;43;45;10 - 00;43;48;16
So the, you know,
you get 70% of the effectiveness
00;43;48;16 - 00;43;52;02
at 0.5% of the necessary compute.
00;43;52;02 - 00;43;55;07
Right.
And I'm speculating on those numbers.
00;43;55;10 - 00;43;58;23
So I think that there's a scenario where
we have these smaller specialized models
00;43;58;23 - 00;44;01;25
that don't reinvent the wheel
every time somebody asks them a question.
00;44;01;26 - 00;44;05;05
I could definitely see a world
where that type of LLM becomes cost
00;44;05;05 - 00;44;05;29
effective, as
00;44;05;29 - 00;44;10;23
the barrier to entry is relatively low,
and the value is real and relatively high.
00;44;10;26 - 00;44;13;03
So definitely something
I want to talk about.
00;44;13;03 - 00;44;16;03
But speaking of things that you think are
bullshit.
00;44;16;05 - 00;44;20;04
We've talked a lot about AI,
but that wasn't like the biggest bet
00;44;20;04 - 00;44;22;13
for some of these companies
over the past couple of years.
00;44;22;13 - 00;44;26;19
And the thing that really comes to mind
for me is the metaverse.
00;44;26;21 - 00;44;28;23
And that's such a big bet.
00;44;28;23 - 00;44;31;28
You know, Facebook didn't rename itself
AI-book, okay?
00;44;32;01 - 00;44;33;19
They renamed themselves meta.
00;44;33;19 - 00;44;37;17
And they were going to inaugurate
this new world of spatial computing.
00;44;37;20 - 00;44;40;19
And this was pre 2022
when ChatGPT dropped.
00;44;40;19 - 00;44;44;18
And we just don't hear a lot
about that anymore.
00;44;44;21 - 00;44;46;06
So let's talk about the metaverse.
00;44;46;06 - 00;44;50;00
I know you had a podcast episode
this year on the metaverse.
00;44;50;03 - 00;44;52;07
How do you feel
about the metaverse in 2026?
00;44;52;07 - 00;44;53;03
I think I know the answer.
00;44;53;03 - 00;44;55;07
I'm legally obligated to ask the question.
00;44;55;07 - 00;44;55;16
Yeah.
00;44;55;16 - 00;44;58;22
So I mean, one of the things we've found
is that and I talked about this
00;44;58;22 - 00;45;02;12
with like a backlash against AI,
a consumer backlash is
00;45;02;12 - 00;45;06;11
metaverse has had such a strong backlash
that people won't even say it anymore.
00;45;06;11 - 00;45;08;13
They'll say like XR, right?
00;45;08;13 - 00;45;10;12
Which is like augmented reality
00;45;10;12 - 00;45;15;13
and virtual reality because metaverse
is just like so tarnished.
00;45;15;16 - 00;45;15;24
I've had
00;45;15;24 - 00;45;18;24
a few conversations with a few kind
of leading thinkers,
00;45;18;24 - 00;45;22;09
in mixed reality this year.
00;45;22;12 - 00;45;25;17
And, you know, a couple of things
I should say.
00;45;25;17 - 00;45;26;05
One of them
00;45;26;05 - 00;45;29;27
is that those episodes don't perform
very well on social media,
00;45;30;00 - 00;45;33;05
which tells me that people don't give
a shit about mixed reality right now.
00;45;33;07 - 00;45;37;04
maybe meta wants you to talk about
AI and is prioritizing.
00;45;37;05 - 00;45;38;10
maybe. Yeah.
00;45;38;10 - 00;45;43;01
I mean, AI certainly has taken
a lot of the air out of the room
00;45;43;04 - 00;45;46;03
for like every other emerging technology
out there,
00;45;46;03 - 00;45;48;25
which no air in the room,
how will the horse survive?
00;45;48;25 - 00;45;50;20
Sorry,
I think I have lost the plot on this one.
00;45;50;20 - 00;45;52;01
Yeah. Which which horse is that?
00;45;52;01 - 00;45;55;22
And so there's a very,
you know, good chance that a horse comes
00;45;55;22 - 00;45;57;01
from behind in this race.
00;45;57;01 - 00;45;58;20
Dark horse. Yeah. Dark horse.
00;45;58;20 - 00;45;58;27
Okay.
00;45;58;27 - 00;46;01;05
That's, that's a different emerging
technology and captures
00;46;01;05 - 00;46;03;14
the popular imagination
because nobody's paying attention,
00;46;03;14 - 00;46;07;16
because they're all focused on, AI,
which is the name of the winning horse
00;46;07;16 - 00;46;08;11
of course.
00;46;08;11 - 00;46;11;18
but I don't think that dark horse
is going to be mixed reality.
00;46;11;24 - 00;46;13;24
I don't think it's going to be
the metaverse.
00;46;13;24 - 00;46;16;20
And coming back to a point
I've made a few times already.
00;46;16;20 - 00;46;19;17
Look, I don't think anybody has answered
00;46;19;17 - 00;46;23;29
the question of why
the hell should I use mixed reality?
00;46;24;02 - 00;46;26;04
Certainly not virtual reality.
00;46;26;04 - 00;46;26;14
Yeah.
00;46;26;14 - 00;46;28;15
Getting people to strap on a headset No.
00;46;28;15 - 00;46;30;06
is like, it's just a no go.
00;46;30;06 - 00;46;33;11
And there's like,
I don't know, there's this weird narrative
00;46;33;11 - 00;46;36;26
of, well, you know, the problem is
the headsets are just too heavy.
00;46;36;29 - 00;46;38;00
Or that they're ugly.
00;46;38;00 - 00;46;41;26
If only they were less expensive
and less heavy, people would use them.
00;46;41;28 - 00;46;43;13
Bullshit.
00;46;43;13 - 00;46;44;28
Use them for what?
00;46;44;28 - 00;46;47;07
Yeah. Use them for what?
00;46;47;07 - 00;46;49;25
Like there's just nothing to do in there.
00;46;49;25 - 00;46;52;13
That's not better on the outside. Like.
00;46;52;13 - 00;46;55;09
Oh, it's like being inside your phone.
00;46;55;09 - 00;46;57;25
Who wants to be inside their phone
like you want?
00;46;57;25 - 00;47;01;14
I want to be farther from my phone,
not deeper into it.
00;47;01;21 - 00;47;02;07
And so.
00;47;02;07 - 00;47;05;23
And yeah, again, you know,
I talked to somebody this year,
00;47;05;26 - 00;47;09;05
you know, who was talking about the,
like, wearables,
00;47;09;05 - 00;47;13;24
like the augmented reality glasses,
and that my notifications are right in my,
00;47;13;24 - 00;47;15;09
Glasses. Eyes. Right there.
00;47;15;09 - 00;47;19;05
You know, where I don't want
my notifications any closer to me.
00;47;19;06 - 00;47;19;13
Yeah.
00;47;19;13 - 00;47;22;24
You know, I think and by the way,
like I'm, I'm a sample of one,
00;47;23;01 - 00;47;24;07
but I think
00;47;24;07 - 00;47;28;10
certainly there's all sorts of research
right now that's saying that Gen Z, younger
00;47;28;10 - 00;47;30;11
people have recognized
that we've become
00;47;30;11 - 00;47;33;02
a bit too dependent on our phones
and on social media,
00;47;33;02 - 00;47;34;25
and they're trying
to create some distance.
00;47;34;25 - 00;47;37;11
I don't know how they're succeeding,
but they're certainly trying.
00;47;37;11 - 00;47;41;02
And I think we're going to start to see
that spread upward through generations.
00;47;41;05 - 00;47;43;26
And I don't think I don't think you see
00;47;43;26 - 00;47;48;02
anyone clamoring for,
"I wish I was more plugged in," right now.
00;47;48;02 - 00;47;51;23
And that is the entire value proposition
00;47;51;26 - 00;47;56;21
of mixed reality of like,
imagine you're more plugged in.
00;47;56;24 - 00;47;59;17
Nobody wants that. So. So what's left?
00;47;59;17 - 00;48;00;03
I don't know.
00;48;00;03 - 00;48;03;28
That's that's my take, Well,
I bet phones with e-ink displays
00;48;03;28 - 00;48;07;14
to deliberately make them more difficult
to use will probably outsell,
00;48;07;14 - 00;48;10;17
like the new Apple
Well and they've started right.
00;48;10;17 - 00;48;13;19
Like the deliberate dumb phones
or your phones in black and white.
00;48;13;19 - 00;48;17;03
Whether it's an app for that or people
just getting dumber phones.
00;48;17;06 - 00;48;18;22
Or pieces of physical hardware that you can append.
00;48;18;22 - 00;48;20;05
They've been advertised. oh, there's.
00;48;20;05 - 00;48;24;24
There's tons of those because people
don't want more of that right now.
00;48;25;01 - 00;48;25;18
And it's interesting.
00;48;25;18 - 00;48;28;03
I think it speaks to the development cycle
on a lot of these things.
00;48;28;03 - 00;48;31;19
So like the Vision Pro is Apple's entry
into this market.
00;48;31;19 - 00;48;32;28
And Apple's actually usually pretty good.
00;48;32;28 - 00;48;37;05
They created two new wearable categories.
In 2015,
00;48;37;05 - 00;48;39;05
the Apple Watch
and then the AirPods a year later.
00;48;39;05 - 00;48;40;15
I mean, those two things are ubiquitous.
00;48;40;15 - 00;48;43;27
I think the Apple Watch might be the best
selling watch in the world, AirPods.
00;48;43;27 - 00;48;46;09
People laughed at them.
Oh, they've got stalks coming out of them.
00;48;46;09 - 00;48;47;22
Best selling headphones in the world.
00;48;47;22 - 00;48;51;07
If AirPods were an independent business,
they'd be a Fortune 500 business
00;48;51;07 - 00;48;52;09
and not a small one, right?
00;48;52;09 - 00;48;54;03
Like they're massively popular,
00;48;54;03 - 00;48;56;28
but they couldn't get it to work
with virtual or mixed reality.
00;48;56;28 - 00;48;58;04
Well, but but but hold on.
00;48;58;04 - 00;49;02;07
The question is,
why did those both become so popular?
00;49;02;10 - 00;49;05;17
And I have a very simple answer,
which is they're status symbols.
00;49;05;20 - 00;49;08;26
They're they're luxury goods
that demonstrate status.
00;49;08;26 - 00;49;09;28
Yeah. That's what they do.
00;49;09;28 - 00;49;12;15
Like, yes, I have,
and that's why they bought beats
00;49;12;15 - 00;49;16;07
because beats is just a
it's a status symbol.
00;49;16;08 - 00;49;18;11
Look at my big headphones.
They're expensive.
00;49;18;11 - 00;49;20;12
Everybody can see them because I wear them
everywhere.
00;49;20;12 - 00;49;21;15
Bass is turned up.
00;49;21;15 - 00;49;24;00
And the bass is turned up and the sound
is shit.
00;49;24;00 - 00;49;27;00
And, like,
I'm not going to go off on my strong
00;49;27;02 - 00;49;27;29
opinions on Beats.
00;49;27;29 - 00;49;29;00
I did not anticipate this.
00;49;29;00 - 00;49;31;14
I have, I have strong,
look, I have a music background,
00;49;31;14 - 00;49;33;17
I have,
I have strong opinions on Beats.
00;49;33;17 - 00;49;35;13
That's what Apple does. Apple isn't.
00;49;35;13 - 00;49;37;22
It's not a tech company.
It's a luxury good company.
00;49;37;22 - 00;49;39;26
And that's why
they keep hiring people from other luxury
00;49;39;26 - 00;49;41;24
good companies into their,
00;49;41;24 - 00;49;43;22
you know, kind of marketing
and product functions,
00;49;43;22 - 00;49;46;02
because that's the code
that they've cracked.
00;49;46;02 - 00;49;50;17
You cannot
just apply that logic to a headset.
00;49;50;20 - 00;49;52;03
Right. What's the luxury good there.
00;49;52;03 - 00;49;55;12
It doesn't signal
any status to be putting this on.
00;49;55;15 - 00;49;59;04
And the worst thing they're facing
and meta is dealing with this.
00;49;59;04 - 00;50;00;26
And everybody who's thrown
their hat in the ring is dealing with
00;50;00;26 - 00;50;05;25
this is the concern that if you put on
these glasses it's creepy, right?
00;50;05;28 - 00;50;08;18
This is someone who could be filming me
right now.
00;50;08;18 - 00;50;09;07
Got a camera?
00;50;09;07 - 00;50;13;09
How in the world does that
signal status, right?
00;50;13;09 - 00;50;16;11
Like, that's that's anti-social behavior.
00;50;16;17 - 00;50;19;17
And so this this is the moat
that's unprofitable.
00;50;19;18 - 00;50;25;00
And this, by the way,
is like a microcosm of this entire force
00;50;25;00 - 00;50;29;02
that's at play in tech right now,
which is on the one hand,
00;50;29;02 - 00;50;31;15
you have listening
for what people actually want.
00;50;31;15 - 00;50;35;01
And on the other hand,
you have the question, how do I make more,
00;50;35;04 - 00;50;37;16
more money and push more stuff?
00;50;37;16 - 00;50;42;14
And the pushing more stuff
is not going super well right now.
00;50;42;18 - 00;50;45;22
It's going well overall in the sense of
we just keep charging more money
00;50;45;22 - 00;50;47;07
for the same thing.
00;50;47;07 - 00;50;50;06
And trying to juice engagement.
00;50;50;06 - 00;50;54;01
But a lot of them are kind of starved
for new ideas and like that.
00;50;54;05 - 00;50;55;07
The idea this year
00;50;55;07 - 00;50;59;18
and in the past 18 months has been
I'll just add AI into the tech.
00;50;59;18 - 00;51;02;20
And it's like,
Why is there AI in my Instagram?
00;51;02;23 - 00;51;04;21
I, I don't know.
00;51;04;21 - 00;51;06;17
Where they're going to put them in
Reese's Cups too soon.
00;51;06;17 - 00;51;10;11
It's going to be peanut butter chocolate,
and I will buy those.
00;51;10;18 - 00;51;12;27
There'll be
a subscription will be $1,000 a month.
00;51;12;27 - 00;51;13;05
Yeah.
00;51;13;05 - 00;51;18;15
I love this Apple example because we can
come back to AI and talk about that.
00;51;18;15 - 00;51;21;21
Like Apple has one of the most widely
used artificial intelligence
00;51;21;21 - 00;51;24;23
tools in the world from like 2010 or 2011,
Siri.
00;51;24;26 - 00;51;25;07
Yeah.
00;51;25;07 - 00;51;27;11
And I believe
you had the Siri founder on. Yeah.
00;51;27;11 - 00;51;32;13
If I'm not mistaken, Siri does not have
a great reputation, generally speaking.
00;51;32;19 - 00;51;32;29
Right.
00;51;32;29 - 00;51;37;10
And so you would think that the emergence
of LMS would be a great opportunity
00;51;37;10 - 00;51;41;00
for Apple to say, hey, this product
that we have that, you know, has maybe
00;51;41;03 - 00;51;44;20
had some issues, you know, they'll even
say a keynotes, we fix Siri this year.
00;51;44;23 - 00;51;46;07
It's sort of acknowledging
that it was broken,
00;51;46;07 - 00;51;48;12
which is a very unnatural thing to do,
00;51;48;12 - 00;51;51;04
by the way, because you're right,
they're very focused on their image.
00;51;51;04 - 00;51;53;05
Why hasn't this been a big win for them?
00;51;53;05 - 00;51;53;17
Like, is it
00;51;53;17 - 00;51;58;09
because the LLM, in your opinion, isn't
sort of at that high quality level?
00;51;58;09 - 00;51;59;23
Is it because the voice of the customer
00;51;59;23 - 00;52;03;29
is not wanting more LLMs in products,
or is it an Apple related problem?
00;52;03;29 - 00;52;07;25
I know this is kind of a wide
ranging question, but like, surely
00;52;07;28 - 00;52;10;18
a good use case for an LLM
would be a functional assistant.
00;52;10;18 - 00;52;11;11
Surely.
00;52;11;11 - 00;52;14;25
just to make sure I understand
the question you're you're asking like,
00;52;14;25 - 00;52;15;23
why does it seem,
00;52;15;23 - 00;52;18;18
Does it seem like
Apple is falling behind in the AI race?
00;52;18;18 - 00;52;23;07
Yeah, yeah, it's it's really simple
because they're a luxury good company
00;52;23;07 - 00;52;26;24
and AI is not critical
to what they're doing.
00;52;26;24 - 00;52;29;01
And it shouldn't be critical
to what they're doing.
00;52;29;01 - 00;52;32;06
And I honestly think that like
we've talked about this before.
00;52;32;06 - 00;52;32;15
Right.
00;52;32;15 - 00;52;37;19
Like the fact that there's this like
investor bonanza of like, oh, you have AI
00;52;37;19 - 00;52;43;20
in your, in your, you know, prospectus
like, like let's skyrocket the stock.
00;52;43;24 - 00;52;46;21
Dot-com '99. It's, it's stupid.
00;52;46;21 - 00;52;49;10
And I think that we're going to see
00;52;49;10 - 00;52;52;25
a bit of a reckoning,
though reckoning is probably too aggressive a word.
00;52;52;28 - 00;52;56;08
But I think people are going to say,
oh yeah, actually Apple
00;52;56;08 - 00;52;58;10
doesn't need to be an AI company.
00;52;58;10 - 00;53;00;22
Not every company needs to be
an AI company,
00;53;00;22 - 00;53;02;28
and there isn't that much of a premium
on it.
00;53;02;28 - 00;53;03;28
And all of this, I think, by
00;53;03;28 - 00;53;08;10
the way, is like pretty bullish for
companies like Google because actually,
00;53;08;13 - 00;53;11;24
truly being an AI company is very,
very helpful.
00;53;12;00 - 00;53;15;00
But for everybody else,
I don't think it's going to see the gains
00;53;15;00 - 00;53;16;23
that investors are expecting.
00;53;16;23 - 00;53;17;25
And if the average company
00;53;17;25 - 00;53;20;28
is going to get these gains, most of them,
I think are going to get it by
00;53;20;28 - 00;53;24;28
bringing in a Microsoft or a Google
into their shop anyway and get it through.
00;53;25;01 - 00;53;26;23
You know, we talked about this earlier,
00;53;26;23 - 00;53;30;16
a technology vendor, probably, frankly,
one that they're already using.
00;53;30;16 - 00;53;33;19
so I think that that's going to happen,
it's going to happen in a lot of industries
00;53;33;19 - 00;53;37;28
where people are saying, oh, like,
do we still need consultants?
00;53;37;28 - 00;53;40;14
Do we still need
any professional services?
00;53;40;14 - 00;53;42;02
You know, AI is here.
00;53;42;02 - 00;53;44;24
I think that's a huge misrepresentation
of why people buy
00;53;44;24 - 00;53;46;16
these services to begin with.
00;53;46;16 - 00;53;49;21
I don't think it's just, oh, what's
what's the answer to this question.
00;53;49;21 - 00;53;54;16
Better bring in McKinsey to answer it
for me, I think it's how organizations
00;53;54;16 - 00;53;57;25
get things done is, you know,
working with some of these functions.
00;53;57;25 - 00;54;01;29
So we'll see the stock in some of these,
that's been beat up a little bit, rise.
00;54;02;05 - 00;54;05;10
But I mean, the Apple piece
I want to come back to because
00;54;05;13 - 00;54;09;00
there's a different concern at Apple,
which is you've got Tim Cook now,
00;54;09;04 - 00;54;12;14
you know, functionally announcing
that he's an outgoing CEO.
00;54;12;18 - 00;54;15;14
We've got some suggestions
about who may be up next.
00;54;15;14 - 00;54;18;29
But the real valuable question
is what's next for Apple.
00;54;18;29 - 00;54;19;15
Yeah.
00;54;19;15 - 00;54;24;11
And investors have decided
the answer is AI.
00;54;24;14 - 00;54;26;12
And I think that's dumb.
00;54;26;12 - 00;54;29;00
And I think maybe Apple thinks
that's dumb.
00;54;29;00 - 00;54;33;06
But without being able to tell them what
the real answer is, they're in trouble.
00;54;33;06 - 00;54;35;09
And that's
what we've seen depressing the stock.
00;54;35;09 - 00;54;37;02
So if they think the problem
they're solving is
00;54;37;02 - 00;54;38;27
how do we get AI into our products?
00;54;38;27 - 00;54;40;28
They're barking up the wrong tree.
00;54;40;28 - 00;54;44;14
The problem that they're trying to solve
is what's next for Apple that's going to,
00;54;44;17 - 00;54;45;22
you know, make us a pile of money.
00;54;45;22 - 00;54;47;23
And they've been really good
at being like a platform. Right.
00;54;47;23 - 00;54;48;19
Like the App Store.
00;54;48;19 - 00;54;50;00
One of the most innovative things,
00;54;50;00 - 00;54;53;27
I would argue, of the 21st century
so far certainly changed all of our lives.
00;54;53;27 - 00;54;56;28
Many of you are probably watching
this on an iPhone in an app
00;54;56;28 - 00;55;00;24
that was made by a third party that Apple,
you know, is collecting a vig on, right?
00;55;00;24 - 00;55;02;02
Like that's sort of their whole thing.
00;55;02;02 - 00;55;03;15
Search engine.
They could have built their own.
00;55;03;15 - 00;55;03;28
They didn't.
00;55;03;28 - 00;55;07;11
Google paid them a lot of money for it;
it was the center of an antitrust lawsuit.
00;55;07;11 - 00;55;07;28
Right.
00;55;07;28 - 00;55;11;07
So the reason I asked about Apple
is because there's
00;55;11;10 - 00;55;13;21
I think they're one of the most
compelling companies in this space.
00;55;13;21 - 00;55;16;01
So the 800 pound gorilla
that really has not stepped
00;55;16;01 - 00;55;18;24
into the ring and succession
is going to be very interesting.
00;55;18;24 - 00;55;21;28
Like if you look at Amazon
who succeeded at Amazon? Andy Jassy.
00;55;21;28 - 00;55;25;07
What was his job? Amazon Web Services,
not a retail guy.
00;55;25;09 - 00;55;26;11
Yeah. Cloud guy.
00;55;26;11 - 00;55;29;27
Microsoft, who took over from Steve
Ballmer, of "nobody will want an iPhone
00;55;29;27 - 00;55;33;05
because it doesn't have a keyboard" fame?
It was Satya Nadella,
00;55;33;08 - 00;55;35;06
Cloud business that division. Right.
00;55;35;06 - 00;55;39;09
So where Apple goes by the way,
both those companies massive,
00;55;39;09 - 00;55;41;14
massive success in their cloud businesses
00;55;41;14 - 00;55;45;02
after those folks took over,
in Jassy's case, before he took over.
00;55;45;07 - 00;55;45;25
So with Apple, it
00;55;45;25 - 00;55;50;02
is going to be very compelling to see who
they pick, as a successor to Tim Cook,
00;55;50;02 - 00;55;53;02
who is a supply chain guy
and really sorted out their supply chain.
00;55;53;05 - 00;55;56;02
So, yeah, I think that has so much to do
with the ultimate
00;55;56;02 - 00;55;59;02
success of Apple Intelligence or whatever
the final iteration looks like.
00;55;59;04 - 00;56;04;12
So the way I'm seeing it, Geoff, right now
is so web 1.0, then web 2.0.
00;56;04;12 - 00;56;05;27
There was a big shaking out.
00;56;05;27 - 00;56;08;29
All the Pets.coms just went bankrupt,
but we got a bunch of really
00;56;08;29 - 00;56;12;15
solid web 2.0 companies like Google
and Amazon who are able to sort of
Net and Amazon who are able to sort of
ride through that storm
00;56;15;26 - 00;56;20;01
in the case of Google and Amazon
and emerge from it in the case of,
00;56;20;04 - 00;56;23;18
meta on the AI side, you're predicting
00;56;23;21 - 00;56;26;12
sort of an AI 1.0 to 2.0? Yes.
00;56;26;12 - 00;56;29;00
right? What about web 3.0?
00;56;29;00 - 00;56;31;09
You may recall a couple of years ago
00;56;31;09 - 00;56;33;20
everything was going to be
on the blockchain,
00;56;33;20 - 00;56;36;12
and sometimes the blockchain
was going to be mixed with other stuff.
00;56;36;12 - 00;56;38;01
So I'm going to ask you a loaded question.
00;56;38;01 - 00;56;41;06
And that is
can we finally close the book on this?
00;56;41;09 - 00;56;44;24
Can we finally just say that blockchain
isn't a real enterprise technology and
00;56;44;24 - 00;56;46;16
just move along?
00;56;46;19 - 00;56;49;26
I, I certainly agree with that.
00;56;49;26 - 00;56;51;05
At the broadest scale,
00;56;51;05 - 00;56;53;09
there are industries,
and there are specific functions
00;56;53;09 - 00;56;56;20
where you can use a decentralized ledger
and there's some value to it.
00;56;56;23 - 00;56;58;13
But I think it's you, can.
00;56;58;13 - 00;57;00;12
You name one?
00;57;00;12 - 00;57;03;19
I've heard examples
in sort of finance and insurance
00;57;03;19 - 00;57;07;04
internally
where they keep records like this.
00;57;07;07 - 00;57;10;04
I'm always skeptical
when I hear that story, and I am,
00;57;10;04 - 00;57;14;25
I am too. And should it be
uttered in the same breath as AI?
00;57;14;25 - 00;57;16;03
No, I don't think it should.
00;57;16;03 - 00;57;17;07
And, you know, we can decide
00;57;17;07 - 00;57;20;24
how much we want to talk about crypto,
but I think we've seen.
00;57;20;27 - 00;57;22;00
Crypto as a whole different thing.
00;57;22;00 - 00;57;24;13
Crypto is a speculative thing.
I think it is too.
00;57;24;13 - 00;57;27;23
And I think we're
sort of seeing peak crypto now as well,
00;57;27;24 - 00;57;30;00
because this is another thing.
Well, we're over the hump.
00;57;30;00 - 00;57;32;26
I mean, again, the price may be different
depending on when you're watching this.
00;57;32;26 - 00;57;36;06
But in late 2025 crypto
crashed a fair amount.
00;57;36;11 - 00;57;36;21
Yeah.
00;57;36;21 - 00;57;40;05
and this is after by the way
all this regulation came off.
00;57;40;05 - 00;57;43;06
You've got a very pro crypto
administration in the US.
00;57;43;07 - 00;57;47;01
And to me like I've been a big crypto
skeptic for a long time.
00;57;47;01 - 00;57;49;23
And it's funny
because like some of the smartest people
00;57;49;23 - 00;57;52;18
and some of the dumbest people I know
are all in on crypto,
00;57;52;18 - 00;57;54;20
which is like fascinating.
Like there's no middle ground.
00;57;54;20 - 00;57;57;06
It's just like. I'm going to find out
which bucket I'm in. Yeah, exactly.
00;57;57;06 - 00;58;02;10
So it's like, you know, geniuses
and people who have never known
00;58;02;10 - 00;58;05;27
anything about finance or technology or
like putting their life savings in this.
00;58;05;27 - 00;58;09;05
There's a meme with a bell curve
with people. I'm sure there are.
00;58;09;08 - 00;58;12;06
And it's the same concern
we talked about with AI,
00;58;12;06 - 00;58;17;00
which is it's it's really great
as a speculative investment.
00;58;17;00 - 00;58;22;10
And like the future is crypto, but
who uses crypto that isn't a drug dealer?
00;58;22;11 - 00;58;27;12
You know, like it's just it's a technology
that the average person
00;58;27;15 - 00;58;30;06
I don't find has any use for.
00;58;30;06 - 00;58;35;11
And they can't even describe in layman's
terms why they should care about it.
00;58;35;13 - 00;58;38;19
Like if you want to con someone
00;58;38;22 - 00;58;41;24
out of their money,
crypto is incredible for it.
00;58;41;24 - 00;58;42;17
I get those calls.
00;58;42;17 - 00;58;43;26
Yeah, if you want.
00;58;43;26 - 00;58;48;09
Like, yeah, if you're in the illicit goods
or services business, you know,
00;58;48;09 - 00;58;52;10
you want to order a hit, like, Bitcoin,
have at 'er.
00;58;52;13 - 00;58;55;13
But but you know is that
00;58;55;15 - 00;58;58;05
what the valuation is based on like
come on.
00;58;58;05 - 00;59;01;09
And I'm sure there's, like, people
who are going to be, like, furiously, like, this
00;59;01;09 - 00;59;04;04
Geoff guy is an idiot.
He just doesn't understand its potential.
00;59;04;04 - 00;59;06;25
And there's a lot of money
riding on that narrative.
00;59;06;25 - 00;59;10;11
But I don't know, like that's
that's my crypto take.
00;59;10;11 - 00;59;14;27
So crypto I'm completely with you
on the enterprise side though, in terms
00;59;14;27 - 00;59;16;11
of like actual blockchain,
00;59;16;11 - 00;59;19;01
the purpose of which is to solve
some sort of a business problem.
00;59;19;01 - 00;59;21;00
I think I disagree with you
because I actually think
00;59;21;00 - 00;59;22;20
that if you peel back the layers
00;59;22;20 - 00;59;25;28
on any enterprise blockchain use case,
it becomes apparent
00;59;25;28 - 00;59;27;26
that there is a better way
to solve that problem.
00;59;27;26 - 00;59;31;15
I'm not saying that a decentralized ledger
can't be used to do stuff,
00;59;31;15 - 00;59;34;16
I'm just saying that it is very rarely
the highest and best use of your time
00;59;34;16 - 00;59;35;14
to actually build one.
00;59;35;14 - 00;59;37;27
So I'll give some examples
just because, like I came prepared.
00;59;37;27 - 00;59;42;15
Geoff, everybody watching this: Google
"Australian Securities Exchange blockchain."
00;59;42;22 - 00;59;44;22
This was a slow-motion train wreck.
00;59;44;22 - 00;59;45;18
I heard that they were
00;59;45;18 - 00;59;48;21
moving their trading platform
and I might get some of the details wrong,
00;59;48;21 - 00;59;49;16
but I heard they were moving
00;59;49;16 - 00;59;53;12
their trading platform
from a traditional system to a blockchain.
00;59;53;15 - 00;59;54;11
And I watch this.
00;59;54;11 - 00;59;56;02
I was like, there's no way
they're actually going to do that.
00;59;56;02 - 00;59;58;03
But they kept releasing announcements.
00;59;58;03 - 00;59;59;27
You know,
it was a big deal for their executives.
00;59;59;27 - 01;00;01;08
I was like,
this just isn't going to happen
01;00;01;08 - 01;00;04;01
because this isn't a real solution
to a real problem.
01;00;04;01 - 01;00;06;01
They kept announcing it, kept announcing
it. Boom.
01;00;06;01 - 01;00;07;24
Actually,
no, we're not doing that. Whoops.
01;00;07;24 - 01;00;09;25
And it's like, yeah,
obviously I wasn't going to happen.
01;00;09;25 - 01;00;12;24
Or a few years back IBM and Maersk, right.
01;00;12;24 - 01;00;16;00
They they partnered to use
some IBM technology
01;00;16;00 - 01;00;19;00
to build a platform called Trade Lens.
01;00;19;06 - 01;00;20;15
Right. And it sounded really good.
01;00;20;15 - 01;00;22;15
It was like shipping containers
sometimes get lost.
01;00;22;15 - 01;00;24;21
You know, it's great to have an immutable
ledger.
01;00;24;21 - 01;00;29;02
And, you know, people can, you know,
scan stuff in or whatever, and track it
01;00;29;05 - 01;00;32;14
around the whole world, you know,
biggest shipping company in the world,
01;00;32;17 - 01;00;36;09
a questionably relevant tech company,
but still quite large.
01;00;36;13 - 01;00;40;00
IBM is not anywhere near the size
of like an Apple or a Facebook or,
01;00;40;00 - 01;00;43;13
I guess, meta now, or a Microsoft,
but they're still a big player.
01;00;43;16 - 01;00;44;14
They shut that down.
01;00;44;14 - 01;00;46;08
It was it wasn't a real thing.
you know. right.
01;00;46;08 - 01;00;49;11
And every time this comes up,
it always comes back
01;00;49;18 - 01;00;52;12
to that would be neat if it worked.
01;00;52;12 - 01;00;54;21
But like it's just never the best way
to solve a problem.
01;00;54;21 - 01;00;55;17
So I agree with you.
01;00;55;17 - 01;00;58;15
Crypto, not great. Enterprise,
enterprise blockchain.
01;00;58;15 - 01;01;00;03
I wish we could just
close the door on this.
01;01;00;03 - 01;01;02;09
Like if I never have to hear about it
again.
01;01;02;09 - 01;01;03;10
I know talking about it on
01;01;03;10 - 01;01;06;10
a podcast is not the solution
for never hearing about it again.
01;01;06;11 - 01;01;09;00
But if I never have to hear about it
again, it'll be too soon.
01;01;09;00 - 01;01;12;17
But but there's there's an implication
here that really concerns me.
01;01;12;18 - 01;01;12;26
Okay.
01;01;12;26 - 01;01;16;21
Which is that in all these use cases
where you're like,
01;01;16;25 - 01;01;20;26
you're watching it from a safe distance
and being like, why would you do that?
01;01;20;26 - 01;01;23;14
It's obviously not the right call.
01;01;23;14 - 01;01;27;16
The answer to that question
is that somebody, somewhere
01;01;27;19 - 01;01;32;04
with a lot of authority said, how
can we get blockchain into our business?
01;01;32;04 - 01;01;32;28
What's the case
01;01;32;28 - 01;01;36;21
for blockchain like this is a company
that should be using blockchain, damn it.
01;01;36;21 - 01;01;39;19
We're backing into it. Yeah. Well.
Well exactly.
01;01;39;19 - 01;01;43;04
And it's like, well,
you know sir, it's always a sir.
01;01;43;07 - 01;01;45;24
You know, we've got our top minds on it.
01;01;45;24 - 01;01;48;24
And, you know, this is the example
we came up with.
01;01;48;25 - 01;01;53;12
And like, if you squint really hard, it's
not that stupid, right?
01;01;53;12 - 01;01;55;02
If you squint really hard, it's not
that stupid. It's not.
01;01;55;02 - 01;01;58;00
That's like, that's kind of the slogan.
Like, stamp that on the bitcoins. Yeah.
01;01;58;00 - 01;01;58;22
Yeah.
01;01;58;22 - 01;02;00;24
There's,
there's a lot of enterprise projects
01;02;00;24 - 01;02;04;09
that that could
be the slogan for, in fairness.
01;02;04;12 - 01;02;07;17
That, that
like is starting to sound familiar to me.
01;02;07;17 - 01;02;07;26
Right.
01;02;07;26 - 01;02;11;02
Because those are some of the concerns
we're having right now with AI.
01;02;11;05 - 01;02;16;00
And I think if the tide is going to go out
at all,
01;02;16;06 - 01;02;17;29
we're really going to like,
01;02;17;29 - 01;02;22;12
we're really going to see how many of
these projects have merit versus
01;02;22;15 - 01;02;27;21
how many are, like, the AI-ification of,
you know, what's effectively blockchain.
01;02;27;21 - 01;02;29;21
That's
I will say, and I'm a little biased here.
01;02;29;21 - 01;02;32;24
I do personally use
AI for things a fair amount. Sure.
01;02;32;26 - 01;02;33;17
One of my favorite things
01;02;33;17 - 01;02;36;19
to do is asking if certain subway stations
have bathrooms, for example.
01;02;36;19 - 01;02;37;12
It actually does.
01;02;37;12 - 01;02;39;24
Now it's
very smart, faster than googling it.
01;02;39;24 - 01;02;41;27
Are the bathrooms at subway stations
around here any good?
01;02;41;27 - 01;02;43;07
And, and then some of them.
01;02;43;07 - 01;02;47;09
Really I'd be like very concerned at the
like the cleanliness
01;02;47;09 - 01;02;51;03
level based on based on some of the things
I've seen in the subway station.
01;02;51;03 - 01;02;53;03
Definitely
the direction that I wanted to take
01;02;53;03 - 01;02;56;28
this, this conversation.
I have yet to find a valid use case
01;02;56;28 - 01;02;57;27
personally for blockchain.
01;02;57;27 - 01;03;02;01
And then I'm also going to give AI the "it's
early" excuse, because it sort of is.
01;03;02;01 - 01;03;02;27
I mean, AI has been
01;03;02;27 - 01;03;06;24
around in various forms since the 1950s,
Marvin Minsky's lab or whatever,
01;03;06;24 - 01;03;10;13
but like the modern iteration of it,
we're in an AI summer.
01;03;10;13 - 01;03;12;09
You know, we were in an AI winter for a long time.
01;03;12;09 - 01;03;17;02
And so the past three years have been sort
of AI 6.0 or whatever we're on right now.
01;03;17;08 - 01;03;19;04
And that's been going quite well.
01;03;19;04 - 01;03;21;20
But blockchain, I mean, 16 years.
01;03;21;20 - 01;03;22;09
Oh, yeah.
01;03;22;09 - 01;03;27;05
And we've and we have yet to actually
come up with a viable use case for it.
01;03;27;05 - 01;03;31;00
And so I was hoping in this podcast
we could just collectively shout it down.
01;03;31;00 - 01;03;32;00
Oh yeah. Like if you're.
01;03;32;00 - 01;03;36;23
Yeah like there is nothing between me
and shutting the book on blockchain.
01;03;36;23 - 01;03;40;08
Like like
I don't think I mean, I'm curious.
01;03;40;08 - 01;03;42;22
I've read a handful of, like,
2026 tech trends.
01;03;42;22 - 01;03;44;28
I don't think any of them
are like blockchain.
01;03;44;28 - 01;03;48;09
Like this year
we're going all in on blockchain, like.
01;03;48;10 - 01;03;50;25
NFTs are back, baby. Oh my god.
01;03;50;25 - 01;03;52;29
Yeah it's hot. Speaking of scams.
01;03;52;29 - 01;03;53;12
so what.
01;03;53;12 - 01;03;56;10
Do you do with your
massive portfolio of NFTs?
01;03;56;13 - 01;03;58;08
Conned them off on another sucker
back in the day.
01;03;58;08 - 01;03;59;04
Greater fool theory.
01;03;59;04 - 01;04;01;03
That's, that's so obviously so.
01;04;01;03 - 01;04;02;13
Yeah. Shut the book. Yeah. right.
01;04;02;13 - 01;04;03;16
The book is closed.
01;04;03;16 - 01;04;06;15
Our editors will put a book on the screen,
and we'll close it.
01;04;06;15 - 01;04;09;15
I love that, I wanted to talk about
a couple of other things that are sort of
01;04;09;15 - 01;04;13;08
like blockchain in the sense
that they've been, much hyped
01;04;13;08 - 01;04;19;16
and are maybe related to or, adjacent
in some way to artificial intelligence.
01;04;19;17 - 01;04;22;17
so one is, self-driving cars,
which is a gigantic
01;04;22;17 - 01;04;26;14
AI use case right now. The
AI is an agent who's acting as a driver
01;04;26;14 - 01;04;29;02
On behalf of you,
the passenger, you're the principal.
01;04;29;02 - 01;04;31;02
They've been around for a long time.
01;04;31;02 - 01;04;34;21
You know, self-driving in various forms,
depending on, you know,
01;04;34;23 - 01;04;37;23
what level you would
you would describe that.
01;04;37;27 - 01;04;40;09
It's been around
maybe in its modern iteration for,
01;04;40;09 - 01;04;43;13
let's say, a decade, Uber was involved
and they got sued.
01;04;43;13 - 01;04;44;29
Google's got Waymo.
01;04;44;29 - 01;04;48;21
Teslas have this thing called full
self-driving, which comes with a warning
01;04;48;21 - 01;04;51;23
that says this is not actually
a fully autonomous driving system.
01;04;51;24 - 01;04;54;03
Yeah, you must be present
and we'll shake the wheel
01;04;54;03 - 01;04;55;26
so that, you know, we know you're
you're still there.
01;04;55;26 - 01;04;58;09
What are your takes on that for 2026?
01;04;58;09 - 01;05;03;03
Are we going to just ditch
the car and have robotaxis?
01;05;03;06 - 01;05;05;01
That's a that's a really interesting one.
01;05;05;01 - 01;05;09;14
I was a self-driving car skeptic
for a very long time.
01;05;09;14 - 01;05;11;20
Like, to me,
I put it in, like, the cold fusion bucket,
01;05;11;20 - 01;05;13;22
where it's like, it's just two years away.
01;05;13;22 - 01;05;15;12
Like, like, whatever, five years away.
01;05;15;12 - 01;05;18;24
No matter when you asked people and like,
Elon was, like, the ultimate
01;05;18;24 - 01;05;21;06
cheerleader. There's a slide
where, I just clicked, for every year
01;05;21;06 - 01;05;24;03
he said it was one year away. Now, if you can.
01;05;24;03 - 01;05;26;11
It was it was a running joke
for a really long time.
01;05;26;11 - 01;05;28;19
And I've come around on that in 2025.
01;05;28;19 - 01;05;28;29
Okay.
01;05;28;29 - 01;05;34;03
I, I'm more bullish and like,
I'll go as far as to say
01;05;34;03 - 01;05;37;27
like self-driving
cars are an inevitability at this point.
01;05;37;27 - 01;05;38;06
Okay.
01;05;38;06 - 01;05;41;07
Like the proliferation of them
is inevitable.
01;05;41;14 - 01;05;45;09
And I think Waymo
is probably the first, company
01;05;45;09 - 01;05;47;14
that's really proven that out at scale.
01;05;47;14 - 01;05;50;24
And you're starting
to just hear more stories of people,
01;05;50;27 - 01;05;53;09
you know, on the West Coast
or visiting the West Coast or now,
01;05;53;09 - 01;05;55;06
you know,
now they're in a few more cities.
01;05;55;06 - 01;05;58;24
Yeah. Austin, I think they're
they're in or coming to Phoenix.
01;05;58;27 - 01;06;00;27
Arizona's got good weather for it
though, right.
01;06;00;27 - 01;06;04;18
Well, it does. And to me,
that's a surmountable problem, though.
01;06;04;18 - 01;06;06;01
It's like, you know,
01;06;06;01 - 01;06;10;05
road conditions, weather, topography,
like all of that will come with time.
01;06;10;05 - 01;06;13;05
I don't think there's anything
inherently insurmountable about it.
01;06;13;07 - 01;06;17;01
But what's interesting
and what changed my mind
01;06;17;07 - 01;06;20;13
is the virality of the experience.
01;06;20;17 - 01;06;24;14
And what I mean by that is anytime I know
someone who's in a self-driving car,
01;06;24;20 - 01;06;27;20
I know they're in a self-driving car
because they send me a picture of it
01;06;27;23 - 01;06;30;18
right in there, like,
"I'm in one, there's no driver, look at this."
01;06;30;18 - 01;06;34;04
Like, people love to talk about it
and they're doing it.
01;06;34;07 - 01;06;36;29
And if you ask them,
would you do it again?
01;06;36;29 - 01;06;38;12
The answer is yes. Right.
01;06;38;12 - 01;06;43;04
Like those are signals
that this is not going away.
01;06;43;08 - 01;06;46;15
And so we've already started to hear about
01;06;46;15 - 01;06;49;21
organizations like Waymo talking about
bringing this to more and more cities.
01;06;49;21 - 01;06;53;00
Yeah, there's there's concerns
like weather, like topography.
01;06;53;03 - 01;06;55;13
There's this like no shortage of concerns.
01;06;55;13 - 01;06;58;22
But to me, those are like hurdles
that we will get over.
01;06;58;29 - 01;07;02;12
I don't know that we're going
to get over all of them in 2026,
01;07;02;18 - 01;07;06;08
but we're going to continue
to see a steady march forward here.
01;07;06;12 - 01;07;08;17
And as you can imagine, I wasn't around,
01;07;08;17 - 01;07;11;23
you know, when that first like,
Model T rolled off the line.
01;07;11;28 - 01;07;14;25
But to get from that to, you know,
01;07;14;25 - 01;07;18;21
the modern age of highways and,
you know, cars
01;07;18;22 - 01;07;21;25
being a part of everyday life
every day, you know, life in America
01;07;21;25 - 01;07;25;28
and around the world, you know,
that didn't happen in 18 months.
01;07;25;28 - 01;07;27;19
You know, it took decades.
01;07;27;19 - 01;07;30;00
And this,
I think will be faster than that.
01;07;30;00 - 01;07;34;14
But it's going to be years and years
and it's going to it's
01;07;34;14 - 01;07;38;11
going to be transformative for our society
01;07;38;11 - 01;07;41;11
and for I think in a lot of ways,
our physical infrastructure,
01;07;41;14 - 01;07;45;01
in ways that are really tough to predict
right now.
01;07;45;04 - 01;07;49;02
One of the pieces, though,
that I've been thinking about lately is
01;07;49;09 - 01;07;52;04
this is especially true
in the Western world,
01;07;52;04 - 01;07;55;27
is with a lot of new immigrants
01;07;56;00 - 01;07;59;00
to the US, Canada, Western Europe,
01;07;59;07 - 01;08;03;05
one of the lowest barriers to entry ways
you can start earning
01;08;03;05 - 01;08;09;12
an income is as an Uber driver
or as some sort of, you know, basically,
01;08;09;15 - 01;08;10;23
you know, just tap
01;08;10;23 - 01;08;15;17
food delivery, taxiing people or stuff
around, and parcels are the same, right?
01;08;15;23 - 01;08;19;08
Amazon has an awful lot of humans
that are serving as the last mile.
01;08;19;15 - 01;08;22;26
If you don't need those people anymore,
01;08;22;29 - 01;08;27;16
what are we doing with these people
who are functionally self-employed?
01;08;27;21 - 01;08;29;00
And this is their skill set?
01;08;29;00 - 01;08;32;26
Oh, and by the way, they're saddled
with this large asset Yeah,
01;08;32;26 - 01;08;35;25
that they had to purchase
to be able to do that, a vehicle of some sort.
01;08;35;25 - 01;08;37;19
yeah, so that's
01;08;37;19 - 01;08;41;07
that's a big societal question that I
think we're going to have to grapple with.
01;08;41;10 - 01;08;44;15
It's going to be asked more in 2026.
01;08;44;18 - 01;08;48;22
I don't know that there's going to be,
you know, a volcanic moment,
01;08;48;25 - 01;08;51;25
but we're marching
in an inevitable direction.
01;08;51;25 - 01;08;55;09
So the reason I brought this up
was because exactly that point, and I'm
01;08;55;09 - 01;08;59;16
glad you sort of came, came to it because
when I was recently in one of the cities,
01;08;59;16 - 01;09;03;16
Austin, that has a large fleet of Waymos,
there it's integrated with Uber
01;09;03;19 - 01;09;07;03
and you order an Uber
and a Waymo might show up or a person
01;09;07;03 - 01;09;09;08
might show up, right, depending on where
you're going. Availability.
01;09;09;08 - 01;09;11;24
I'm assuming there's an algorithmic
aspect to it.
01;09;11;24 - 01;09;13;10
In fact, there obviously is.
01;09;13;10 - 01;09;16;03
And I didn't
actually end up riding in one.
01;09;16;03 - 01;09;20;01
But I did ride in a regular Uber
01;09;20;04 - 01;09;23;24
for an hour for like $38 or something.
01;09;23;27 - 01;09;25;01
And I thought, oh,
01;09;25;01 - 01;09;30;03
I think the price of Uber's in Austin
might have taken a hit because,
01;09;30;06 - 01;09;30;23
you know,
01;09;30;23 - 01;09;33;21
the drivers, you know, want more money,
they're going, no problem.
01;09;33;21 - 01;09;36;27
I've got a fleet of autonomous vehicles
that don't get tired.
01;09;37;00 - 01;09;40;02
And that was why I brought that up.
01;09;40;04 - 01;09;42;07
I think it's in microcosm,
01;09;42;07 - 01;09;45;21
the scenario that we're talking about
with AI sort of writ large.
01;09;45;21 - 01;09;47;24
And this just happens
to be a particularly useful case.
01;09;47;24 - 01;09;49;05
It solves a problem.
01;09;49;05 - 01;09;52;10
It will displace workers
if it's done correctly.
01;09;52;10 - 01;09;55;27
Never listen to an AI CEO who says, oh,
the goal isn't displacement.
01;09;55;27 - 01;09;57;05
No, the problem that they're solving
01;09;57;05 - 01;10;00;29
is salaries,
Yes, wages, it's commissions, whatever.
01;10;01;02 - 01;10;04;10
And I think that self-driving is
is one such case.
01;10;04;13 - 01;10;07;20
I'm less bullish than you, though
I do think it's asymptotic.
01;10;07;20 - 01;10;11;27
I think the reason that we've seen it
in Austin and in Phoenix and in California
01;10;12;00 - 01;10;16;16
is because they can mostly solve
for those conditions.
01;10;16;16 - 01;10;19;14
I say mostly because there are still,
you know, relatively high profile
01;10;19;14 - 01;10;21;27
problems, although as far as I know,
nobody's been killed. Recently.
01;10;21;27 - 01;10;28;02
There was an Uber related death,
in Arizona, about eight years ago.
01;10;28;05 - 01;10;29;02
But as soon as
01;10;29;02 - 01;10;33;25
you complicate, like I think that driving
in Ohio in January is exponentially
01;10;33;25 - 01;10;37;04
more difficult than driving
in Chandler, Arizona, at any time of year.
01;10;37;07 - 01;10;40;27
I know this because we live in a northern
climate, and I had to drive to get
01;10;40;27 - 01;10;44;24
to a train to get here and like, people
don't know how to drive in the snow.
01;10;44;24 - 01;10;47;01
I don't know that a robot
will be able to do it
01;10;47;01 - 01;10;49;20
when it can't read the lines on the road,
and it's got to make a judgment
01;10;49;20 - 01;10;52;26
when it just pull over, throw the hazards
on, and wait for this all to blow over.
01;10;52;29 - 01;10;58;01
So I mean, that's a really interesting
nuance to it because you, you know,
01;10;58;01 - 01;11;02;13
I'm framing it as snow
is incrementally more difficult.
01;11;02;13 - 01;11;05;07
You're saying no,
it's like it's it's more than incremental.
01;11;05;07 - 01;11;07;10
It's a completely what while an end. Yeah.
01;11;07;10 - 01;11;09;05
Somebody probably smarter than us
01;11;09;05 - 01;11;13;03
or more knowledgeable, and not a CEO, because they'll probably just tell us,
01;11;13;03 - 01;11;15;05
Of course it's just around the corner.
01;11;15;05 - 01;11;17;04
Elon: water is going to stop me?
01;11;17;04 - 01;11;18;13
Get out of my way, snow.
01;11;18;13 - 01;11;19;18
Put a cowcatcher on the front.
01;11;19;18 - 01;11;21;24
Yeah, but, but, so let me,
01;11;21;24 - 01;11;24;05
let me frame my position a different way.
01;11;24;05 - 01;11;27;14
it's very hard for me to believe that the Waymos and Ubers of the world,
01;11;27;14 - 01;11;31;21
in 2026 or 2030 or whatever, are going to be like,
01;11;31;28 - 01;11;33;24
you know,
I guess we're just throwing in the towel.
01;11;33;24 - 01;11;35;26
We can't beat snow. Snow wins.
01;11;35;26 - 01;11;37;11
So that market is closed.
01;11;37;11 - 01;11;43;11
Like, I think we're going to just bang the hammer at this until
01;11;43;11 - 01;11;47;13
at some point we reach a breakthrough, and I don't know what year that will be.
01;11;47;13 - 01;11;50;17
My guess is sooner rather than later.
01;11;50;17 - 01;11;53;03
But, you know, if your guess is later, then here we are.
01;11;53;03 - 01;11;53;25
Yeah, it is later.
01;11;53;25 - 01;11;56;26
And I want to make one more point on this
before we transition
01;11;56;26 - 01;12;01;00
to our most exciting,
you know, predictions for 2026.
01;12;01;00 - 01;12;02;22
So the things that we're most excited
about going forward.
01;12;02;22 - 01;12;03;15
So think on that.
01;12;03;15 - 01;12;07;28
While I make this point: Apple looked into building a car, right?
01;12;08;00 - 01;12;10;04
The Apple car, they never released
anything publicly about it.
01;12;10;04 - 01;12;11;20
Apple is, like, notoriously tight-lipped.
01;12;11;20 - 01;12;14;03
Like whenever they acquire a company,
they release the same statement.
01;12;14;03 - 01;12;16;02
And they have for like decades.
01;12;16;02 - 01;12;18;24
And it's like 2 or 3 sentences that basically say, leave us alone,
01;12;18;24 - 01;12;19;18
journalists.
01;12;19;18 - 01;12;21;07
And so they didn't talk about it,
but like,
01;12;21;07 - 01;12;22;22
there's been a bunch of reporting on this.
01;12;22;22 - 01;12;24;03
If you Google the Apple Car,
01;12;24;03 - 01;12;26;10
you'll you'll know exactly
what we're talking about here.
01;12;26;10 - 01;12;27;18
And they just dropped it.
01;12;27;18 - 01;12;31;29
One of the most well-resourced companies
in the world entering a market that exists
01;12;32;05 - 01;12;35;19
that they did not have to reinvent,
literally the wheel.
01;12;35;22 - 01;12;36;08
Right?
01;12;36;08 - 01;12;37;19
Oh, maybe that's part of their problem.
01;12;37;19 - 01;12;39;09
Maybe they didn't know
it's going to be a triangle
01;12;39;09 - 01;12;41;04
and it's going to be revolutionary
and creative.
01;12;41;04 - 01;12;43;16
They just dropped it because they said,
you know what?
01;12;43;16 - 01;12;46;01
We actually can't do this. Billions in investment.
01;12;46;01 - 01;12;48;25
Thousands of employees were involved
in it, and they just dropped it.
01;12;48;25 - 01;12;51;26
So like Google saying, actually,
this Waymo thing isn't scalable
01;12;51;26 - 01;12;53;07
and we're not making any money doing it.
01;12;53;07 - 01;12;56;06
We're pulling the plug like
I don't think that's likely, but like,
01;12;56;07 - 01;12;58;18
you know, too big to fail isn't a thing in tech.
01;12;58;18 - 01;13;01;15
Everyone has their moment.
01;13;01;15 - 01;13;04;23
Well, and I'm skeptical of that for a couple of different reasons.
01;13;04;23 - 01;13;08;08
I mean, we're talking about, in
a lot of cases, hardware versus software.
01;13;08;10 - 01;13;08;18
Sure.
01;13;08;18 - 01;13;11;23
And the hardware investment for Apple was very,
01;13;11;23 - 01;13;14;28
very high there versus, you know... Hardware company? I
01;13;14;28 - 01;13;16;16
mean, they are a hardware company.
01;13;16;16 - 01;13;19;17
But I think, you know,
and it's funny that you talk about Apple
01;13;19;17 - 01;13;22;16
as being notoriously tight-lipped because,
01;13;22;19 - 01;13;26;06
the corollary there is like, well,
why did they shut that down?
01;13;26;09 - 01;13;29;01
And because they're tight lipped,
they haven't been like, well,
01;13;29;01 - 01;13;30;26
because of x, y and Z.
01;13;30;26 - 01;13;35;10
But if I had to guess at some point
somebody did a sober analysis
01;13;35;10 - 01;13;38;16
and said, like,
what's like an optimistic guess
01;13;38;16 - 01;13;42;12
for how many of these cars
we could sell in the next five years?
01;13;42;15 - 01;13;45;12
It's an ROI calculation, and I'm going to make that with Waymo
01;13;45;12 - 01;13;46;01
at some point,
01;13;46;01 - 01;13;47;16
or Alphabet's going to
they're going to look at it
01;13;47;16 - 01;13;48;19
and they're going to say,
we've been throwing
01;13;48;19 - 01;13;51;16
a bunch of money into this
and they're not profitable right now.
01;13;51;16 - 01;13;53;22
I would be shocked
if they were profitable.
01;13;53;22 - 01;13;54;18
So like at some point
01;13;54;18 - 01;13;58;18
some executive is going to say, like,
should we keep throwing money into this,
01;13;58;21 - 01;14;02;01
you know, 55-gallon drum that's already on fire, right?
01;14;02;04 - 01;14;02;18
Maybe.
01;14;02;18 - 01;14;05;28
But I think there's a more subtle pivot they can do,
01;14;05;28 - 01;14;09;03
which is, what is the most profitable piece of this?
01;14;09;03 - 01;14;09;12
Yeah.
01;14;09;12 - 01;14;11;17
And at some point,
while it's not buying or designing
01;14;11;17 - 01;14;15;04
cars, it's like a software
component of other cars.
01;14;15;11 - 01;14;17;07
And they're able to pivot into, like,
01;14;17;07 - 01;14;18;11
call it CarPlay.
01;14;18;11 - 01;14;19;06
Yeah. Yeah.
01;14;19;06 - 01;14;22;01
Well like exactly right.
01;14;22;01 - 01;14;24;19
And so that's where it can
potentially go.
01;14;24;19 - 01;14;27;08
but the other piece
I don't want to lose sight of is,
01;14;27;08 - 01;14;31;03
you know, if all this comes to pass
and you end up with all these
01;14;31;03 - 01;14;34;03
kind of unemployed, low skill workers,
you know
01;14;34;08 - 01;14;37;02
this to me is broadly
01;14;37;02 - 01;14;40;14
the risk being created by technology
today.
01;14;40;14 - 01;14;41;27
Right? Because that's one example.
01;14;41;27 - 01;14;42;23
Always been the risk.
01;14;42;23 - 01;14;43;12
Like so when
01;14;43;12 - 01;14;46;27
somebody invented the combine, I needed fewer people to harvest my field. Yeah.
01;14;47;00 - 01;14;48;13
Yeah.
And there are lots of reasons for that.
01;14;48;13 - 01;14;52;04
When somebody invented,
you know, the keyboard, the keyboard
01;14;52;04 - 01;14;55;16
attached to a PC, I didn't need a person
to take a letter anymore.
01;14;55;16 - 01;14;58;13
Right. Like, and this is just the history
of technology, inevitably.
01;14;58;13 - 01;15;02;20
Well, and it's also the history of, and I don't want to get, like, too
01;15;02;20 - 01;15;08;00
much into like political or social theory,
but it's just like it's always
01;15;08;04 - 01;15;12;05
the bottom of the socioeconomic ladder that gets chopped first.
01;15;12;05 - 01;15;12;11
Right.
01;15;12;11 - 01;15;16;01
Like, and I don't know, this was like one of
01;15;16;04 - 01;15;19;03
the many lessons for me
coming through Covid is it's just like,
01;15;19;03 - 01;15;23;01
I don't know, anytime anything happens
the rich get richer, right?
01;15;23;01 - 01;15;26;14
Like it's just become very difficult for me to imagine
01;15;26;14 - 01;15;30;01
a world where like something happens
and that's not the case because.
01;15;30;01 - 01;15;33;14
Really shuts down, the rich get richer; while it booms, the rich get richer. It's all.
01;15;33;17 - 01;15;33;23
Well.
01;15;33;23 - 01;15;38;06
Well, exactly, like, they're just so well insulated right now
01;15;38;10 - 01;15;41;18
in the, you know, political,
economic system that's been created.
01;15;41;22 - 01;15;44;00
And so your point is well taken.
01;15;44;00 - 01;15;46;17
That does not make this moment
in history unique,
01;15;46;17 - 01;15;51;14
but it makes it flammable, I guess I could say.
01;15;51;14 - 01;15;54;23
So my argument on this, and I have a chart I like to show.
01;15;54;23 - 01;15;58;21
And it's the adoption
of mechanized agriculture tools over time.
01;15;58;24 - 01;16;01;14
Compared to beasts of burden, essentially. Right. Yeah.
01;16;01;14 - 01;16;02;27
I used to have an ox to till my field.
01;16;02;27 - 01;16;04;21
Now I got a mechanical plow.
01;16;04;21 - 01;16;06;28
And like,
that took decades in the United States.
01;16;06;28 - 01;16;09;05
Right. And there's a point
at which the line crosses over.
01;16;09;05 - 01;16;09;28
It's all very exciting.
01;16;09;28 - 01;16;13;03
But like, people coexist and you got it
and you got an ox or a horse.
01;16;13;03 - 01;16;15;17
I've got a tractor
now. Everybody has tractors.
01;16;15;17 - 01;16;17;17
But it took decades. With AI.
01;16;17;17 - 01;16;21;17
It's like they invented that thing
like a year ago, you know, three years ago
01;16;21;17 - 01;16;22;19
in the case of ChatGPT.
01;16;22;19 - 01;16;25;20
And it's like, well,
now it can do the job of a translator.
01;16;25;20 - 01;16;28;09
And now we've got a bunch of unemployed
translators. Right.
01;16;28;09 - 01;16;32;01
And like, the technology can propagate
so quickly in the scheme of things
01;16;32;01 - 01;16;34;10
that I think you are left
in that transition period
01;16;34;10 - 01;16;36;23
with a lot of people
who are impacted all at once.
01;16;36;23 - 01;16;38;21
And, you know, again,
not to get too far into social
01;16;38;21 - 01;16;42;07
and political theory here, but
like when you have a lot of unemployed,
01;16;42;07 - 01;16;45;27
relatively young people,
like it's not good for your society.
01;16;46;00 - 01;16;48;27
They tend to,
you know, move fast and break things.
01;16;48;27 - 01;16;51;27
As far as I understand.
01;16;51;27 - 01;16;54;09
And so, I mean, the tech leaders
have been talking about UBI,
01;16;54;09 - 01;16;57;09
but that's very politically unpopular,
the universal basic income.
01;16;57;09 - 01;17;01;26
And so I can't see that happening
for political reasons.
01;17;01;26 - 01;17;05;22
And so, like, yeah, you are left with a disenfranchised class
01;17;05;22 - 01;17;10;25
whose work has been essentially subsumed
by artificial intelligence.
01;17;10;25 - 01;17;14;17
And so, yeah, I think that is a super compelling point
01;17;14;17 - 01;17;17;22
and not the note we want to end on. Well,
01;17;17;25 - 01;17;22;21
but just on the UBI point, I'm a bit of a UBI skeptic.
01;17;22;21 - 01;17;24;28
Not because I think it's necessarily
a bad idea.
01;17;24;28 - 01;17;27;03
Although, you know,
we'll leave that to the economists.
01;17;27;03 - 01;17;31;22
But because I think
it's extremely difficult to implement,
01;17;31;25 - 01;17;34;00
you know, pragmatically,
and every time we've done
01;17;34;00 - 01;17;36;07
some sort of experiment with it, it's
been shut down.
01;17;36;07 - 01;17;37;23
And, oh, we'll have to figure it out
next time.
01;17;37;23 - 01;17;40;08
Well, that in and of itself is extremely telling.
01;17;40;08 - 01;17;43;13
It's political reasons. But there's something else, which is,
01;17;43;16 - 01;17;49;21
I think, sort of a soft UBI, which is the rise of the public sector,
01;17;49;24 - 01;17;54;15
like literally as a device
to employ people to prevent unrest.
01;17;54;20 - 01;17;57;13
And by the way, like we used to have
this really big mechanism
01;17;57;13 - 01;17;59;26
in a lot of countries to do this,
which was called the military.
01;17;59;26 - 01;18;00;24
Yes. Right.
01;18;00;24 - 01;18;03;01
And like, oh, we've got all these troubled
young men.
01;18;03;01 - 01;18;05;22
What can we possibly do with them
to keep them away from crime?
01;18;05;22 - 01;18;06;07
Okay, well,
01;18;06;07 - 01;18;09;27
we can give them some discipline
and we can point them at the enemy right.
01;18;10;04 - 01;18;13;00
And I don't know, maybe that's
a controversial thing to say, but.
01;18;13;00 - 01;18;16;17
But it was an extremely effective mechanism
01;18;16;20 - 01;18;17;18
for doing that.
01;18;17;18 - 01;18;19;12
Many countries had a national service.
01;18;19;12 - 01;18;21;22
So you might not be in the military,
but you might be called
01;18;21;22 - 01;18;23;28
upon by the government
to do something like, you know, labor.
01;18;23;28 - 01;18;24;25
Sure.
01;18;24;25 - 01;18;27;22
Well, and we've seen, you know,
we're at an interesting,
01;18;27;22 - 01;18;31;09
you know, point in modern history
where we've seen just kind of the erosion
01;18;31;09 - 01;18;35;27
of military budgets
and, you know, military, you know, staff
01;18;35;27 - 01;18;36;27
size for so long.
01;18;36;27 - 01;18;39;15
And I think that, like, that curve
is starting to bend up again.
01;18;39;15 - 01;18;42;00
Civil service is also shrinking
in a lot of places as well,
01;18;42;00 - 01;18;44;24
because it's not politically popular
to have a large civil service. Right.
01;18;44;24 - 01;18;46;16
Well, and I agree with you.
01;18;46;16 - 01;18;48;27
And so these are going to be like two forces
01;18;48;27 - 01;18;52;03
that are probably colliding
because sure, it's not popular.
01;18;52;03 - 01;18;55;01
But if you have, you know, mass unemployment, what does that look like?
01;18;55;01 - 01;18;59;10
And it's, you know, it's somewhere
between funny and depressing because
01;18;59;13 - 01;19;02;25
you can talk to the leaders of the tech companies
01;19;02;25 - 01;19;06;17
making these products, and they're like,
yeah, it'll probably lead
01;19;06;17 - 01;19;07;29
to a lot of unemployment.
01;19;07;29 - 01;19;10;03
I don't know
what we're going to do about that.
01;19;10;03 - 01;19;11;26
And it's not my problem.
01;19;11;26 - 01;19;14;29
Well, but it's, you know,
it's a tragedy of the commons type problem
01;19;14;29 - 01;19;18;21
because if there are fewer people with money
01;19;18;22 - 01;19;22;28
for consumption, then, you know,
how does the market stay buoyant?
01;19;23;01 - 01;19;26;01
So that's a very real concern.
01;19;26;02 - 01;19;26;20
Well.
01;19;26;20 - 01;19;29;20
One point that I'll make on this is,
01;19;29;23 - 01;19;30;29
I think, for one of
01;19;30;29 - 01;19;33;22
the first times in history,
01;19;33;22 - 01;19;37;08
the technology being proposed
has the potential to impact actually
01;19;37;08 - 01;19;41;08
relatively high-wage, high-salary, high-status occupations.
01;19;41;08 - 01;19;41;15
Right.
01;19;41;15 - 01;19;46;10
Like if I make an AI
that can read an X-ray
01;19;46;13 - 01;19;51;28
with, you know, 99.99999% effectiveness
or whatever, like that, radiologists
01;19;51;28 - 01;19;52;21
who might have been pulling
01;19;52;21 - 01;19;56;18
in half a million or three-quarters of a million dollars a year, all of a sudden
01;19;56;20 - 01;19;59;28
their livelihood is impacted,
and those people actually tend
01;19;59;28 - 01;20;01;10
to have a lot more political power.
01;20;01;10 - 01;20;04;10
So we could see a situation
where they're able to leverage that.
01;20;04;10 - 01;20;05;00
I agree with you
01;20;05;00 - 01;20;08;04
that historically, technology... Well, so the radiology one is interesting.
01;20;08;04 - 01;20;09;28
And I you know,
I was talking to someone last week
01;20;09;28 - 01;20;11;02
who was saying that,
01;20;11;02 - 01;20;15;05
if you look at the studies
in the last handful of years, radiologists
01;20;15;05 - 01;20;18;08
are using AI more than ever to help,
you know, with imaging,
01;20;18;08 - 01;20;21;17
which is a really, really good thing for,
you know, everybody's health.
01;20;21;21 - 01;20;23;20
But it hasn't led to fewer radiologists.
01;20;23;20 - 01;20;28;28
It's actually led to more radiologists. And "led to" is doing... "led to" is, maybe it
01;20;28;28 - 01;20;32;11
has not stemmed the flood. Right, I don't want it flowing over;
01;20;32;11 - 01;20;34;26
I don't want to imply causality there.
01;20;34;26 - 01;20;36;27
But it has not decreased the number.
01;20;36;27 - 01;20;38;17
That explains why when I look left,
I look right.
01;20;38;17 - 01;20;40;24
Radiologists, radiologists everywhere.
01;20;40;24 - 01;20;43;17
Yeah, I moonlight as a radiologist.
I actually need a too.
01;20;43;17 - 01;20;45;02
That's amazing.
01;20;45;02 - 01;20;45;21
Yeah.
01;20;45;21 - 01;20;47;23
So I think that's such
a compelling example.
01;20;47;23 - 01;20;49;26
So it was the same with ATMs, right.
01;20;49;26 - 01;20;51;19
In the 1970s and 1980s.
01;20;51;19 - 01;20;53;29
There are more bank tellers
now than there were back then.
01;20;53;29 - 01;20;55;26
The nature of their role changed again.
01;20;55;26 - 01;20;57;01
It's that speed piece.
01;20;57;01 - 01;21;01;21
But I do think that the political power
that the moneyed classes have
01;21;01;21 - 01;21;03;27
and this podcast got really interesting,
01;21;03;27 - 01;21;06;27
the political power
that these moneyed classes have does
01;21;07;00 - 01;21;11;09
make it likely that there might be a society-wide solution,
01;21;11;09 - 01;21;12;22
or at least a serious discussion
01;21;12;22 - 01;21;15;22
of one, whereas historically,
maybe there wouldn't have been one.
01;21;15;27 - 01;21;16;19
You know, one of the other
01;21;16;19 - 01;21;19;21
interesting kind of corollaries of that is
people have started to talk either
01;21;19;21 - 01;21;24;08
seriously or facetiously about like, well,
are we ready for AI CEOs?
01;21;24;08 - 01;21;26;04
Then? Yes. Well, Sundar Pichai
01;21;26;04 - 01;21;28;01
came out and said, hey, AI could do my job.
01;21;28;01 - 01;21;30;10
And he was probably
just trying to sell his AI, but what?
01;21;30;10 - 01;21;34;02
Well, and it's really interesting
because I think to answer that question,
01;21;34;02 - 01;21;38;13
you have to like really soberly ask
yourself, like, what is a CEO good for?
01;21;38;16 - 01;21;44;10
And if you look at a lot of the day-to-day tasks of a CEO, you're like, yes, AI can.
01;21;44;13 - 01;21;45;11
You know, if you think their job
01;21;45;11 - 01;21;48;12
is to come up with a strategy
and communicate it or whatever.
01;21;48;18 - 01;21;51;21
Yeah, there's a lot there
that an LLM can do.
01;21;51;21 - 01;21;55;06
It can do a good impression of a CEO, similarly.
01;21;55;10 - 01;22;00;08
But, you know, to me, what is the core of a CEO
01;22;00;12 - 01;22;05;10
is inspiring confidence
and raising capital.
01;22;05;16 - 01;22;05;29
Right.
01;22;05;29 - 01;22;07;28
Like, getting fired when things aren't going well, and getting fired
01;22;07;28 - 01;22;10;01
when you're a throat to choke there. Right.
01;22;10;01 - 01;22;14;20
It's accountability and projecting confidence so that you can,
01;22;14;20 - 01;22;17;21
you know, get more investment
and raise the value of the organization.
01;22;17;21 - 01;22;20;09
It's not just strategic leadership.
01;22;20;09 - 01;22;24;00
And strategic leadership, by the way,
is not just a good idea, right?
01;22;24;03 - 01;22;27;06
There's just a lot more to it than that
in terms of getting it sold
01;22;27;06 - 01;22;28;06
and implemented.
01;22;28;06 - 01;22;30;22
And so I think while people do talk
seriously about it,
01;22;30;22 - 01;22;35;10
I think it's an injustice to the role
to suggest that an AI can do it.
01;22;35;15 - 01;22;38;16
And as we've talked
about before, you know,
01;22;38;19 - 01;22;40;27
the higher
up you are, the more insulated you are.
01;22;40;27 - 01;22;43;26
I think in a lot of ways
I see you smiling nervously.
01;22;43;26 - 01;22;47;22
I was not so much smiling nervously as saying, I think, oh, those poor CEOs.
01;22;47;22 - 01;22;49;24
Yeah, yeah, yeah, pour one out
01;22;49;24 - 01;22;52;26
for those poor CEOs. It's an injustice to the role.
01;22;53;01 - 01;22;56;04
Well, CEOs who are watching at home know that Geoff Nielson,
01;22;56;04 - 01;22;58;26
host of the Digital Disruption
podcast, has your back.
01;22;58;26 - 01;23;01;29
Yeah, you will. If there's anyone defending,
01;23;01;29 - 01;23;05;21
well, you know, underappreciated and underpaid CEOs.
01;23;05;28 - 01;23;07;02
It's us here at.
01;23;07;02 - 01;23;09;01
We rooted for the Tesla pay package.
01;23;09;01 - 01;23;11;25
We thought that oh my guy.
01;23;11;25 - 01;23;15;05
So let's talk about going forward
a couple of things that have us excited
01;23;15;05 - 01;23;17;02
because we've talked about
like a lot of frankly,
01;23;17;02 - 01;23;20;02
pretty depressing things, a lot of very
exciting things, but depressing things.
01;23;20;08 - 01;23;21;12
I'll go first.
01;23;21;12 - 01;23;25;12
The thing that I am most excited about for
2026 is not AI related.
01;23;25;15 - 01;23;26;28
It's not self-driving car related.
01;23;26;28 - 01;23;29;09
I tried Tesla Full Self-Driving one time
and it was cool.
01;23;29;09 - 01;23;30;00
It was kind of scary.
01;23;30;00 - 01;23;33;00
So I think I'm probably not going to buy that.
01;23;33;03 - 01;23;36;05
It's a foldable Apple phone. They're notoriously tight-lipped.
01;23;36;05 - 01;23;38;02
They haven't come out and said
we're going to make a folding phone.
01;23;38;02 - 01;23;39;18
But more companies are making these.
01;23;39;18 - 01;23;42;15
Samsung just released
a really thin folding phone. Very cool.
01;23;42;15 - 01;23;43;15
They're expensive.
01;23;43;15 - 01;23;48;13
But like when I can fold my iPhone open
and it's going to be a tablet,
01;23;48;15 - 01;23;50;08
that's the thing
that I'm most excited about.
01;23;50;08 - 01;23;52;16
And sometimes I have dreams about it.
01;23;52;16 - 01;23;55;14
And I come into the office
and I look at my tiny little iPhone mini,
01;23;55;14 - 01;23;57;07
and I think, if only this were bigger.
01;23;57;07 - 01;24;01;01
You know, you can get these right now
from companies that aren't Apple.
01;24;01;04 - 01;24;03;16
And we'll put a pin in that now; we'll come back to that discussion.
01;24;03;16 - 01;24;07;12
But I'm very excited
about the ecosystem play
01;24;07;15 - 01;24;09;17
that is iMessage.
01;24;09;17 - 01;24;12;09
And I've got all these books in my Apple
Books library.
01;24;12;09 - 01;24;13;21
You can't get that on Android.
01;24;13;21 - 01;24;15;15
the service life is incredible.
01;24;15;15 - 01;24;17;07
And usually they don't
bring something to market
01;24;17;07 - 01;24;18;23
until they worked out a lot of the kinks.
01;24;18;23 - 01;24;20;26
Not exclusively,
but usually they're pretty good.
01;24;20;26 - 01;24;24;09
I actually, I'm not scared
of a first gen Apple product in the way
01;24;24;09 - 01;24;27;06
that I am for other manufacturers, although they're on, like, gen nine.
01;24;27;06 - 01;24;28;25
So that's the thing that I'm most excited
about.
01;24;28;25 - 01;24;32;01
I'm anticipating this sort of prediction
slash excitement.
01;24;32;03 - 01;24;35;27
I'm optimistic that in September of 2026,
01;24;36;00 - 01;24;39;00
Apple will say, you know,
we think you're going to love it.
01;24;39;07 - 01;24;41;10
It's our fullest iPhone yet.
01;24;41;10 - 01;24;44;10
No headphone jack. I can't wait for that.
01;24;44;13 - 01;24;45;16
What are you excited about?
01;24;45;16 - 01;24;48;28
I'm skeptical that you're going to get your folding phone.
01;24;48;28 - 01;24;49;12
All right.
01;24;49;12 - 01;24;50;27
I'm really skeptical
that you're going to get it,
01;24;50;27 - 01;24;53;16
but I'll, I'll pray for you. Thoughts and prayers.
01;24;53;16 - 01;24;54;16
Like just
01;24;54;16 - 01;24;59;15
I hope you get your foldable iPhone. I'm, like, just Android to the bone.
01;24;59;15 - 01;25;03;18
So, like, I don't have a horse in your Apple race.
01;25;03;20 - 01;25;05;10
You thrive on the instability.
01;25;05;10 - 01;25;08;02
I thrive on the flexibility.
01;25;08;02 - 01;25;09;21
Well, the folding phone is going to be very flexible.
01;25;09;21 - 01;25;10;16
Well, there you go.
01;25;10;16 - 01;25;12;02
It's going to be. And you can get it now
01;25;12;02 - 01;25;14;18
through a series of Android phone manufacturers.
01;25;14;18 - 01;25;16;19
How many easy payments of...?
01;25;16;19 - 01;25;18;23
Yeah. No, it's all easy payments.
01;25;18;23 - 01;25;22;17
I don't know that I have a tech, not
a technology right off the bat that I'm.
01;25;22;20 - 01;25;24;11
Maybe not necessarily a No, no.
01;25;24;11 - 01;25;26;09
Well, is there something that you're excited about?
01;25;26;09 - 01;25;28;20
Are you excited about AI 2.0, for example?
01;25;28;20 - 01;25;29;08
Yeah.
01;25;29;08 - 01;25;33;07
I'm a guy, as you may know, who, like, is just, like, really
01;25;33;07 - 01;25;35;13
irritated at, like, bullshit.
01;25;35;13 - 01;25;38;29
And there's just been so much of it the past two years
01;25;38;29 - 01;25;42;11
around AI, and the tide is coming out. Yeah.
01;25;42;11 - 01;25;42;25
when we can
01;25;42;25 - 01;25;46;21
start to have real conversations
and people don't talk about AI like it's
01;25;46;21 - 01;25;50;26
the metaverse or like it's blockchain
when it's what can this actually do?
01;25;51;00 - 01;25;52;17
What can it not do?
01;25;52;17 - 01;25;54;15
What are we going to invest in?
01;25;54;15 - 01;26;00;03
Like when it's a real practical
conversation tied to the capabilities?
01;26;00;10 - 01;26;01;20
I'm really excited for that.
01;26;01;20 - 01;26;04;26
Like the hype machine
I think is running out of gas.
01;26;04;26 - 01;26;05;10
Yeah.
01;26;05;10 - 01;26;08;10
And yeah, when, like, companies are like,
01;26;08;10 - 01;26;12;04
oh, it turns out people as consumers don't just want to pay
01;26;12;04 - 01;26;16;07
for stuff with AI tacked onto it, like wanting an agentic version of Windows.
01;26;16;07 - 01;26;17;26
Yeah, exactly, exactly.
01;26;17;26 - 01;26;20;11
And one of the hate replies on Twitter
when they announced that.
01;26;20;11 - 01;26;22;05
I saw all of the hate replies on Twitter.
01;26;22;05 - 01;26;24;11
No, I just, it's like.
01;26;24;11 - 01;26;27;07
again, it comes back to like,
01;26;27;10 - 01;26;29;03
why? Like, what are
01;26;29;03 - 01;26;32;03
you doing for people
that they should be excited about
01;26;32;05 - 01;26;35;08
versus what are you doing for investors
that they should be excited about?
01;26;35;15 - 01;26;38;23
And the investors are going
to run out of steam at some point
01;26;38;26 - 01;26;42;01
if the money from consumers doesn't
follow.
01;26;42;08 - 01;26;46;14
And I have money in stocks. I don't want the bottom of the market
01;26;46;14 - 01;26;47;12
to fall out this year.
01;26;47;12 - 01;26;50;12
That's not
that's not good for me and my family.
01;26;50;14 - 01;26;53;14
And It's a discount opportunity.
01;26;53;16 - 01;26;55;11
Like we could unpack a lot of that,
01;26;55;11 - 01;26;59;04
but that's probably not the the road
we want to go down right now.
01;26;59;07 - 01;27;02;05
But being able to start
01;27;02;05 - 01;27;07;00
having real conversations and talking
about what this technology can do,
01;27;07;03 - 01;27;10;13
and I think that's going to be
a really good thing.
01;27;10;20 - 01;27;15;29
And regardless of
exactly how the market performs next year
01;27;16;02 - 01;27;18;26
and, you know, in the next year,
01;27;18;26 - 01;27;22;00
I think that's going to be a very good thing.
01;27;22;04 - 01;27;22;28
Fantastic.
01;27;22;28 - 01;27;24;28
Well, I'm
glad you forced us to end on that note,
01;27;24;28 - 01;27;29;00
which is positive
in a sort of skeptical way versus just,
01;27;29;00 - 01;27;32;29
you know, kind of doom outlook that we've
been talking about for a lot of this.
01;27;33;02 - 01;27;35;13
Thanks for moderating that.
I really appreciate this.
01;27;35;13 - 01;27;38;06
And for everybody else, stick with Digital Disruption for all things
01;27;38;06 - 01;27;41;07
future of work, future of tech, what's coming down the pipeline next?
01;27;41;12 - 01;27;44;03
And what's that other thing I was just going to say?
01;27;44;03 - 01;27;47;25
Oh, yeah. Don't
forget to like and subscribe.
01;27;47;28 - 01;27;48;20
If you work in
01;27;48;20 - 01;27;52;10
IT, Info-Tech Research Group is a name you need to know.
01;27;52;13 - 01;27;55;13
No matter what your needs are, Info-Tech has you covered.
01;27;55;18 - 01;27;56;25
AI strategy?
01;27;56;25 - 01;27;59;07
Covered. Disaster recovery?
01;27;59;07 - 01;28;00;09
Covered.
01;28;00;09 - 01;28;02;24
Vendor negotiation? Covered.
01;28;02;24 - 01;28;06;17
Info-Tech supports you with best-practice research and a team of analysts
01;28;06;17 - 01;28;10;10
standing by ready to help you
tackle your toughest challenges.
01;28;10;13 - 01;28;13;13
Check it out at the link below
and don't forget to like and subscribe!