Our Guest Ian Beacraft Discusses
AI Is Rewriting the Rules of Work: Futurist Ian Beacraft Explains Why Jobs Are Dead
Today on Digital Disruption, we’re joined by Ian Beacraft, Chief Futurist and Founder of Signal and Cipher.
One of the leading voices in AI and the future of work, Ian helps organizations become AI-ready through strategic workforce transformation, training, and innovation. Ian has advised top global brands including Samsung, Google, Microsoft, and Nike. A former agency executive, he now pioneers immersive presentations that bring AI and extended reality to life. Ian is also the co-owner of a production studio designing virtual worlds and the first person ever to host a news segment as a synthetic human, streaming to over 100 million devices around the world. A classically trained musician and passionate educator, he champions the responsible, creative use of emerging technologies – and remains an optimist about the future of humanity in a tech-driven world.
Ian sits down with Geoff Nielson to unpack the real impact of artificial intelligence on the workplace. They discuss why outdated leadership mindsets are more harmful than AI itself, how organizations must evolve beyond rigid roles and job descriptions, and why the future of work is less about replacing people and more about reshaping how we define value, productivity, and collaboration. Ian explains that it’s not about eliminating jobs, but about eliminating the artificial boundaries that confine people to specific roles within an organization.
1
00:00:00,667 --> 00:00:01,668
Hey everyone!
2
00:00:01,668 --> 00:00:05,171
Today I'm
super excited to be talking to Ian Beacraft.
3
00:00:05,505 --> 00:00:08,942
He's the founder and chief
futurist at Signal and Cipher,
4
00:00:09,175 --> 00:00:12,812
and he is an absolute thought leader
when it comes to the intersection
5
00:00:12,812 --> 00:00:16,750
of AI and enterprises
that we all work for.
6
00:00:17,317 --> 00:00:19,586
He has so many deep, amazing insights.
7
00:00:19,586 --> 00:00:22,689
I had the chance
to watch a number of his keynotes at South
8
00:00:22,689 --> 00:00:26,393
by Southwest recently,
and it just never ceases to amaze me
9
00:00:26,760 --> 00:00:30,330
how much new thought leadership
he brings to the table here.
10
00:00:30,330 --> 00:00:33,366
So I'm really excited to pick his brain
to understand
11
00:00:33,700 --> 00:00:37,837
how much of this new technology
is really being limited by us
12
00:00:38,038 --> 00:00:41,041
and our own boundaries,
and how we can break through that.
13
00:00:41,041 --> 00:00:44,010
It should be an amazing conversation.
14
00:00:45,779 --> 00:00:48,982
I've been a big fan of your keynotes.
15
00:00:48,982 --> 00:00:51,584
I've been bingeing them
in the last couple of days.
16
00:00:51,584 --> 00:00:53,787
One of the quotes that I wrote down,
I don't remember
17
00:00:53,787 --> 00:00:56,823
if it's from South by Southwest this year
or last year, but as I was bingeing,
18
00:00:56,823 --> 00:00:59,726
one of the quotes I wrote
down was poor leadership,
19
00:01:00,660 --> 00:01:03,063
adherence to old systems, and technology-
20
00:01:03,063 --> 00:01:07,767
first mindsets are a bigger risk
than AI to organizations.
21
00:01:09,035 --> 00:01:11,004
What is going on out there?
22
00:01:11,004 --> 00:01:13,473
And can you kind of dissect that quote?
23
00:01:13,473 --> 00:01:14,340
Yeah, absolutely.
24
00:01:14,340 --> 00:01:16,509
So, to me,
25
00:01:16,509 --> 00:01:20,280
when we go through times of change,
we need to galvanize behind something.
26
00:01:20,313 --> 00:01:23,083
And that happens
both productively and disruptively.
27
00:01:23,083 --> 00:01:27,520
We tend to find something
to create opposition towards.
28
00:01:27,787 --> 00:01:31,391
And for a lot of people, they see AI
as the main threat because it is the
29
00:01:31,391 --> 00:01:34,394
easiest thing to point to and say,
that's a threat to my job.
30
00:01:34,594 --> 00:01:38,531
I look at that and I see very clearly it's
automating pieces of what I do,
31
00:01:38,631 --> 00:01:39,766
and that becomes extractive.
32
00:01:39,766 --> 00:01:43,036
It's taking something
that basically I provided
33
00:01:43,036 --> 00:01:46,039
value through that thing before
and I no longer do.
34
00:01:46,339 --> 00:01:51,044
And because of that, I as an individual
am less valuable to that organization.
35
00:01:51,678 --> 00:01:55,381
Now, if I'm continuing as a leader
to just say
36
00:01:55,381 --> 00:01:59,919
my goal is to create efficiency and scale
within the system that we have today,
37
00:02:00,353 --> 00:02:04,157
the typical lever we're going to pull
is efficiency, which is code for layoffs.
38
00:02:04,691 --> 00:02:08,161
And that is essentially how our system
has operated for the last hundred
39
00:02:08,161 --> 00:02:08,995
and fifty years.
40
00:02:08,995 --> 00:02:10,897
And it's been able to grow.
41
00:02:10,897 --> 00:02:12,265
We've been able to create prosperity
42
00:02:12,265 --> 00:02:15,635
in a number of different ways,
but that system's changing now. The era
43
00:02:15,635 --> 00:02:18,638
of unending exponential growth,
44
00:02:18,638 --> 00:02:22,275
in existing paradigms,
is starting to fray at the seams.
45
00:02:22,475 --> 00:02:26,112
And when we think about the future
through the lens of the past,
46
00:02:26,613 --> 00:02:31,417
what we do is we apply old metrics,
old ways of thinking, old processes
47
00:02:31,718 --> 00:02:34,921
to new technologies, new ways
of working, and new challenges.
48
00:02:35,221 --> 00:02:37,557
And those things come together
and they don't work.
49
00:02:37,557 --> 00:02:40,527
But when leaders are so fixed in
50
00:02:40,527 --> 00:02:42,729
how they want to approach these things,
they're not thinking
51
00:02:42,729 --> 00:02:44,497
about how this is different
52
00:02:44,497 --> 00:02:47,200
and how they have to take a
different paradigm or a different approach
53
00:02:47,200 --> 00:02:50,203
to this new type of challenge
and new type of circumstances.
54
00:02:50,370 --> 00:02:51,771
And that's what leads to the demise
55
00:02:51,771 --> 00:02:56,042
of the organization,
not just that employee or that department.
56
00:02:57,210 --> 00:02:57,844
Right.
57
00:02:57,844 --> 00:03:00,180
So I want to zoom in on two different phrases
58
00:03:00,180 --> 00:03:02,248
you use there
that I think are really important.
59
00:03:02,248 --> 00:03:05,084
You talked about, shoot.
60
00:03:05,084 --> 00:03:07,520
Now I'm going to screw it up,
but you talked about,
61
00:03:07,520 --> 00:03:11,591
you know, growth and the ability for us
to have these, you know, productivity
62
00:03:11,591 --> 00:03:14,594
improvements and, and be thinking about
what we're doing differently.
63
00:03:14,627 --> 00:03:18,731
And you also said efficiency
and efficiency has become a really,
64
00:03:19,499 --> 00:03:21,935
popular word these days.
65
00:03:21,935 --> 00:03:22,735
You know,
66
00:03:22,735 --> 00:03:24,304
efficiency is something,
67
00:03:24,304 --> 00:03:27,307
you know, across the public sector,
across the commercial sector.
68
00:03:27,407 --> 00:03:32,278
You know, a very hot word.
To what degree is efficiency
69
00:03:32,278 --> 00:03:35,582
the right thing to be looking at right now
or a distraction from what's
70
00:03:35,582 --> 00:03:36,616
actually going to help us?
71
00:03:36,616 --> 00:03:38,751
Yeah, I think there's a balance here.
72
00:03:38,751 --> 00:03:44,824
There's a recognition that organizations
have a duty to their stakeholders
73
00:03:44,824 --> 00:03:48,094
and their stockholders, and that means
you have to look for efficiencies.
74
00:03:48,094 --> 00:03:51,097
And if you're not,
that's a dereliction of that obligation.
75
00:03:51,097 --> 00:03:52,966
Understandable. Right.
76
00:03:52,966 --> 00:03:55,134
So they should be looking at efficiency
and they should be looking
77
00:03:55,134 --> 00:03:56,803
at increasing productivity.
78
00:03:56,803 --> 00:04:01,441
But to do so with the same fervor
we have over the past several decades
79
00:04:01,441 --> 00:04:04,177
since the beginning
of the digital revolution,
80
00:04:04,177 --> 00:04:08,114
I think is absolutely, incredibly
shortsighted, because what this does
81
00:04:08,114 --> 00:04:11,484
is it doesn't just scale an individual's
efficiency and effectiveness.
82
00:04:11,918 --> 00:04:16,322
This changes the fundamental boundaries
of what jobs and tasks are.
83
00:04:16,689 --> 00:04:19,892
We're really, we're reengineering
84
00:04:19,892 --> 00:04:23,496
or completely changing
what the atomic unit of work looks like.
85
00:04:23,730 --> 00:04:28,034
So, for example, we take a look
at organizations as built from people,
86
00:04:28,034 --> 00:04:32,372
which are defined for jobs, very specific
slotted roles that are well defined.
87
00:04:32,739 --> 00:04:35,808
And if I look at an org chart
of any organization, I have these mental
88
00:04:35,808 --> 00:04:38,911
shortcuts that I can use to understand
who does what, where and how.
89
00:04:39,612 --> 00:04:42,682
All of that is starting to change, though,
because AI makes it
90
00:04:42,682 --> 00:04:45,285
so that I don't actually have to stick
within the boundaries
91
00:04:45,285 --> 00:04:47,220
of a specific role and say,
that's all you do.
92
00:04:47,220 --> 00:04:51,457
If you're accountant number two,
you know, in the finance department or
93
00:04:51,691 --> 00:04:52,292
whatever that,
94
00:04:53,526 --> 00:04:56,229
designation
might mean to a specific organization,
95
00:04:56,229 --> 00:04:59,699
you have a very specific set of roles,
responsibilities, KPIs, and remit
96
00:04:59,732 --> 00:05:04,771
that you are responsible for.
What happens with AI, though, is it makes it
97
00:05:04,771 --> 00:05:09,242
so that the skill sets
that might sit adjacent to
98
00:05:09,242 --> 00:05:12,445
my existing skill sets or responsibilities
are now accessible to me.
99
00:05:12,979 --> 00:05:16,482
So it starts to put pressure
on these boundaries
100
00:05:16,482 --> 00:05:18,151
that we keep people in with their roles.
101
00:05:18,151 --> 00:05:21,354
So if you're just a copywriter
or just an ad,
102
00:05:22,088 --> 00:05:26,192
if you're staying within those boundaries,
now with the access to AI and generative
103
00:05:26,192 --> 00:05:30,697
AI and other toolsets, it's
almost an abdication
104
00:05:30,697 --> 00:05:33,333
of responsibility to say,
I'm just going to stay in my lane,
105
00:05:33,333 --> 00:05:36,936
and all of a sudden we have this chaos
that comes with people saying,
106
00:05:36,936 --> 00:05:41,407
I have access to new skill sets,
I have access to capabilities,
107
00:05:41,641 --> 00:05:46,813
but the system around me has not adapted
to really make that possible,
108
00:05:47,213 --> 00:05:50,883
fluid and part of the system
where I'm not stepping on other people's
109
00:05:50,883 --> 00:05:52,752
toes, I'm not doing things
without permission.
110
00:05:52,752 --> 00:05:56,489
I'm not doing things without support
and an apparatus or feedback loop.
111
00:05:56,823 --> 00:05:59,892
So what's happening is
we have this new technology that allows
112
00:06:00,093 --> 00:06:02,929
all sorts of new behaviors
within the organization, right?
113
00:06:02,929 --> 00:06:05,965
But the organization has no idea
how to pull those
114
00:06:06,032 --> 00:06:09,035
behaviors and structures and processes
together.
115
00:06:09,102 --> 00:06:09,369
Right?
116
00:06:09,369 --> 00:06:14,741
It's too rigid to actually take advantage
of what this technology could unlock.
117
00:06:14,741 --> 00:06:15,408
Absolutely.
118
00:06:16,442 --> 00:06:19,445
So what do we do with that?
119
00:06:19,445 --> 00:06:22,548
Like as leaders, where do we start?
120
00:06:22,582 --> 00:06:26,619
How can we start to rethink these systems
and processes in a way
121
00:06:26,619 --> 00:06:30,356
that's more dynamic, or at least
can help us harness these possibilities?
122
00:06:30,390 --> 00:06:31,124
Absolutely.
123
00:06:31,124 --> 00:06:33,092
Well, what I would say
is, the first place you should
124
00:06:33,092 --> 00:06:36,062
look is not just about
how do I do more with less?
125
00:06:36,496 --> 00:06:37,029
You can.
126
00:06:37,029 --> 00:06:40,466
I'm not saying don't think about that,
but that shouldn't be the primary goal,
127
00:06:40,566 --> 00:06:43,770
the value you're trying to get out of AI
because that's a race to the bottom.
128
00:06:43,770 --> 00:06:45,872
Like we're all going to get that benefit.
129
00:06:45,872 --> 00:06:50,109
And if that's your focus, then you're
playing a Walmart game for, you know,
130
00:06:50,109 --> 00:06:53,613
a premium enterprise type of environment
that's not going to help anybody.
131
00:06:53,613 --> 00:06:55,948
It might give short term impact.
132
00:06:55,948 --> 00:06:57,016
So in quarter-to-quarter
133
00:06:57,016 --> 00:07:00,019
thinking, a Western or American-style
way of doing business,
134
00:07:00,186 --> 00:07:01,554
you'll see immediate impact.
135
00:07:02,555 --> 00:07:04,690
And I think that when you take a look
at what's happening
136
00:07:04,690 --> 00:07:08,060
in the balance sheets at Meta and several
other organizations, people see oh,
137
00:07:08,461 --> 00:07:12,131
less staff, higher margins,
more productivity, that's the way of it.
138
00:07:12,331 --> 00:07:15,234
They're missing a lot of what's
actually going on behind the scenes.
139
00:07:15,234 --> 00:07:19,272
So a lot of these companies that are
in the space of building the models
140
00:07:19,272 --> 00:07:23,309
and kind of changing the way they work
have seen this coming around the bend,
141
00:07:23,509 --> 00:07:26,345
and they're already restructuring
the way that they operate.
142
00:07:26,345 --> 00:07:29,282
For leaders that are trying to figure out
what this means for them,
143
00:07:29,282 --> 00:07:32,852
I would say, first of all, you need
some sort of experiential learning,
144
00:07:33,085 --> 00:07:36,088
like just understanding this stuff
theoretically
145
00:07:36,722 --> 00:07:38,524
might have worked for the past 25 years
146
00:07:38,524 --> 00:07:42,195
because you have a lot of understanding
of how the digital paradigm works.
147
00:07:42,862 --> 00:07:46,499
This is not new digital, you know,
connected networks, all that stuff.
148
00:07:46,499 --> 00:07:48,634
We've known that since the 80s and 90s.
149
00:07:48,634 --> 00:07:51,571
This is a fundamentally different
paradigm, different way of working,
150
00:07:51,571 --> 00:07:54,574
a different way of thinking
about growth, different way of connecting,
151
00:07:55,074 --> 00:07:56,642
software and systems.
152
00:07:56,642 --> 00:07:57,743
I mean, we're literally working
153
00:07:57,743 --> 00:08:01,781
with quote unquote software
that now replicates and imitates
154
00:08:02,315 --> 00:08:05,818
cognitive processes, completely different
paradigm for people. So
155
00:08:06,886 --> 00:08:09,422
having
some sort of education or experience
156
00:08:09,422 --> 00:08:12,391
that gets you into that headspace
where you can start to grapple
157
00:08:12,492 --> 00:08:15,761
with what those changes are,
is absolutely necessary.
158
00:08:15,962 --> 00:08:17,430
Just reading articles
159
00:08:17,430 --> 00:08:20,466
and doing a couple of things
in ChatGPT is not going to be enough,
160
00:08:21,334 --> 00:08:25,171
because if you're going
to competently lead an AI transformation,
161
00:08:25,171 --> 00:08:29,308
you as a leader also need to have spent
that time immersed in that space.
162
00:08:29,308 --> 00:08:32,512
I'm not saying,
you know, hundreds of hours, but
163
00:08:32,745 --> 00:08:36,349
at least dozens in that space
to understand it with the proper guidance,
164
00:08:37,016 --> 00:08:39,752
what it's going to do for your team,
your business.
165
00:08:39,752 --> 00:08:42,121
How does it help
you answer the questions about
166
00:08:42,121 --> 00:08:44,490
what kind of value
we're providing as a company?
167
00:08:44,490 --> 00:08:46,025
How do we structure our teams?
168
00:08:46,025 --> 00:08:48,995
How do we grow effectively in this market
where, you know,
169
00:08:48,995 --> 00:08:51,631
everyone else is starting to go
into these adjacent spaces as well?
170
00:08:51,631 --> 00:08:54,367
So there's a lot of new questions
that I have to answer from that.
171
00:08:55,501 --> 00:08:55,902
Yeah.
172
00:08:55,902 --> 00:08:59,205
Well, when you talk about that
experiential learning, Ian,
173
00:09:00,172 --> 00:09:03,309
do you have a sense of
like what leaders are doing?
174
00:09:03,743 --> 00:09:05,344
Like, what does that mean?
175
00:09:05,344 --> 00:09:08,347
Like what tools are they using?
176
00:09:08,514 --> 00:09:09,916
What does that look like? Yeah.
177
00:09:09,916 --> 00:09:12,919
So the most effective programs
that we've seen,
178
00:09:13,586 --> 00:09:16,989
are ones that are built to be used
with the same types of tools
179
00:09:17,023 --> 00:09:18,424
they are already using in their environment.
180
00:09:18,424 --> 00:09:23,029
So everyone uses Microsoft Teams,
Slack, has access to ChatGPT or something
181
00:09:23,029 --> 00:09:26,799
like that, 90% of what you need to do
can be done within those environments,
182
00:09:27,033 --> 00:09:30,002
but just using the vanilla version of it
is not going to cut it.
183
00:09:30,770 --> 00:09:33,105
Being able to learn with these tools
184
00:09:33,105 --> 00:09:37,577
by also building the documentation,
the vision,
185
00:09:37,577 --> 00:09:41,581
the information you need to move forward
is really where we've seen the most value.
186
00:09:41,581 --> 00:09:44,850
So to give you an example of a module
that we run: we'll work with leaders
187
00:09:44,850 --> 00:09:50,523
in an environment where they're working
with the AI to build their vision for what
188
00:09:50,523 --> 00:09:54,126
AI looks like in the organization,
to create a maturity assessment.
189
00:09:54,126 --> 00:09:57,730
So where do we stand and where's
the alignment amongst the CXOs?
190
00:09:58,230 --> 00:10:01,434
And it's not just about the education
in AI.
191
00:10:01,434 --> 00:10:02,969
It's about alignment.
192
00:10:02,969 --> 00:10:05,972
And there's a strong difference
between agreement and alignment.
193
00:10:06,238 --> 00:10:09,775
What happens oftentimes at a leadership
level is we agree AI is important.
194
00:10:10,042 --> 00:10:13,245
We agree that we're all implementing it,
but they're not aligned
195
00:10:13,245 --> 00:10:17,283
as to how that happens or where they even
are on a maturity index.
196
00:10:17,617 --> 00:10:21,120
So having them come together
to do that together
197
00:10:21,320 --> 00:10:25,458
while using the tools to facilitate
it brings a couple of those objectives
198
00:10:25,458 --> 00:10:26,792
together, and all of a sudden
199
00:10:26,792 --> 00:10:30,463
they start to see, okay, here, here's
how the tools can facilitate this work.
200
00:10:30,596 --> 00:10:32,131
It doesn't need to take six weeks.
201
00:10:32,131 --> 00:10:34,133
It can literally take an afternoon.
202
00:10:34,133 --> 00:10:36,836
So you've taken something
that might have been a six month
203
00:10:36,836 --> 00:10:40,473
consulting engagement
and said, we're walking out in 2.5 hours
204
00:10:40,773 --> 00:10:43,676
with a much clearer understanding
of what we're doing, how we're doing it,
205
00:10:43,676 --> 00:10:46,679
who's responsible
and what that roadmap looks like.
206
00:10:46,746 --> 00:10:49,081
So that's one of the big
paradigm shifts we're seeing.
207
00:10:50,850 --> 00:10:52,284
So it's more,
208
00:10:52,284 --> 00:10:55,321
if I understand you correctly,
there's there's more value in like
209
00:10:55,454 --> 00:10:58,591
one condensed facilitated session of ask
210
00:10:58,658 --> 00:11:01,861
having the right people in the room
and asking the right questions.
211
00:11:02,061 --> 00:11:06,465
than like a protracted engagement on, like,
here's a bunch of recommendations
212
00:11:06,465 --> 00:11:09,402
of like potential use cases
and what you could be doing.
213
00:11:09,402 --> 00:11:10,236
Is that fair?
214
00:11:10,236 --> 00:11:13,773
I think the potential use case
model is kind of outdated at this point.
215
00:11:13,773 --> 00:11:17,343
So I do think that the idea of the
thousand-page decks
216
00:11:17,343 --> 00:11:20,446
and the long consulting engagements
need to change.
217
00:11:21,414 --> 00:11:25,518
I won't proclaim that consulting companies
are dead on arrival.
218
00:11:25,518 --> 00:11:28,120
I think
there's a little hyperbole there.
219
00:11:28,120 --> 00:11:31,657
They're like advertising agencies,
like they survive this kind of stuff.
220
00:11:32,725 --> 00:11:36,062
The consulting company of the future
looks wildly different than it does today.
221
00:11:36,295 --> 00:11:38,397
But one of the things that they're often
222
00:11:38,397 --> 00:11:41,000
kind of admonished for
is this idea of a thousand page deck.
223
00:11:41,000 --> 00:11:44,003
And I do think that the idea of learning
224
00:11:44,270 --> 00:11:48,140
being separate from doing, so having
a thousand page deck, a bunch of seminars,
225
00:11:48,140 --> 00:11:51,877
and then finally being responsible
to do it on your own, is outdated.
226
00:11:51,877 --> 00:11:56,515
We now have the tools,
we have the apparatus to learn and do at
227
00:11:56,515 --> 00:12:01,120
the same time, while also building
some of the most essential infrastructure,
228
00:12:01,287 --> 00:12:05,191
as well as documentation
and strategy amongst executives.
229
00:12:05,191 --> 00:12:07,993
So when you walk out of a session,
you should have a clear vision.
230
00:12:07,993 --> 00:12:10,563
You should have an understanding
of how this impacts you,
231
00:12:10,563 --> 00:12:13,733
what your maturity assessment looks like
and what your roadmap looks like.
232
00:12:13,999 --> 00:12:17,903
So what we've come to see is that we can
take things that should take six weeks
233
00:12:18,137 --> 00:12:22,141
or even three months and condense that
into one afternoon or a full day session.
234
00:12:23,242 --> 00:12:24,276
Right.
235
00:12:24,276 --> 00:12:26,512
Which is really exciting.
236
00:12:26,512 --> 00:12:30,182
And, you know,
I think helps us get a lot more
237
00:12:30,282 --> 00:12:33,452
like just accelerate our time
to value, our time to results.
238
00:12:34,053 --> 00:12:36,422
And there's a, there's a phrase
I want to talk about,
239
00:12:36,422 --> 00:12:38,824
that you've said quite a lot,
and I want to
240
00:12:38,824 --> 00:12:42,228
I want to put some parameters around it,
which is the tools.
241
00:12:42,228 --> 00:12:44,997
Right. Talking about AI and the tools.
242
00:12:44,997 --> 00:12:48,667
Because, you know, in my mind,
when we talk about AI
243
00:12:48,667 --> 00:12:50,536
and when we talk about the tools, it's
244
00:12:50,536 --> 00:12:53,139
everything from,
you know, just go to ChatGPT
245
00:12:53,139 --> 00:12:57,076
or Google and write in your question
to, you know, this world I'm finding
246
00:12:57,076 --> 00:13:02,047
we live in increasingly where like
every software vendor and their brother
247
00:13:02,047 --> 00:13:05,985
is promising you that, oh, there's
AI in this, now it's an AI PC, that like,
248
00:13:06,152 --> 00:13:11,624
like any crevice we can hide
AI in, we're telling you there's AI in it.
249
00:13:12,358 --> 00:13:14,760
What tools do
people need to be thinking about?
250
00:13:14,760 --> 00:13:18,631
How should we, tool-wise, be bringing
AI into our organizations?
251
00:13:18,631 --> 00:13:23,269
Yeah, we're at the very beginning of the
development of a very robust ecosystem.
252
00:13:23,502 --> 00:13:27,673
So most people are seeing things
like ChatGPT, Claude, Copilot, etc.
253
00:13:27,673 --> 00:13:30,776
and that's kind of when people say tools,
they think of that and that's fine.
254
00:13:32,011 --> 00:13:33,813
But what we're seeing at the enterprise
level
255
00:13:33,813 --> 00:13:37,783
is a weaving of that into the basic
infrastructure across the board.
256
00:13:37,783 --> 00:13:41,187
So for a lot of people,
they're pulling in Copilot, others
257
00:13:41,187 --> 00:13:44,356
they're pulling in, you know,
OpenAI's API into the work that they do.
258
00:13:44,757 --> 00:13:47,326
And stage one is just to get people
exposed to tools.
259
00:13:47,326 --> 00:13:49,461
So that's access to the chat bots.
260
00:13:49,461 --> 00:13:51,463
It's a 1 to 1 relationship.
261
00:13:51,463 --> 00:13:55,701
You put in an input,
you get an output. That is barely
262
00:13:55,768 --> 00:13:58,771
even an alpha product for an enterprise
at this point in time, like you're
263
00:13:59,071 --> 00:14:03,008
just getting your socks on before
you put your shoes on to get out the door.
264
00:14:03,475 --> 00:14:08,581
And what we're seeing with organizations
that are more successful is they're leading
265
00:14:08,581 --> 00:14:11,183
with use cases
that everyone can understand,
266
00:14:11,183 --> 00:14:14,286
and then they're building that into
the infrastructure of their organization,
267
00:14:14,587 --> 00:14:17,590
not just saying,
can you go learn how to use ChatGPT?
268
00:14:17,823 --> 00:14:21,260
That's basic and necessary,
because people can't start
269
00:14:21,260 --> 00:14:22,962
to identify
what the use cases are going to be
270
00:14:22,962 --> 00:14:25,164
unless they have experienced
the technology.
271
00:14:25,164 --> 00:14:28,000
And I'm a big fan of pushing
that down and out.
272
00:14:28,000 --> 00:14:29,935
You can have people at the edges
273
00:14:29,935 --> 00:14:32,905
coming up with the use cases,
not just the people at the top,
274
00:14:32,905 --> 00:14:33,739
because the people
275
00:14:33,739 --> 00:14:37,743
who are dealing with the challenges,
who are the ones actually doing the work,
276
00:14:38,010 --> 00:14:41,013
have a much more nuanced understanding
of how it should be done
277
00:14:41,080 --> 00:14:44,083
and what success looks like
for those types of tasks and activities.
278
00:14:44,483 --> 00:14:47,820
To get there, it has to go way
beyond just access to ChatGPT.
279
00:14:48,754 --> 00:14:49,221
What we
280
00:14:49,221 --> 00:14:53,559
found for so many organizations is
they'll often tell us, like, hey, we invested
281
00:14:53,559 --> 00:14:57,196
half a million dollars in ChatGPT licenses,
and people loved it at the beginning.
282
00:14:57,196 --> 00:15:00,199
And then like,
nobody uses it, it goes like this.
283
00:15:00,532 --> 00:15:02,001
It crests and then it crashes.
284
00:15:02,001 --> 00:15:03,168
As far as usage.
285
00:15:03,168 --> 00:15:05,604
And the big part of it is
you just gave them a new tool.
286
00:15:05,604 --> 00:15:09,108
You gave them barely
any training, barely any context.
287
00:15:09,174 --> 00:15:11,176
And you said amongst
all the things you're doing,
288
00:15:11,176 --> 00:15:14,513
you're overstretched, under-resourced,
your expectations are just getting higher.
289
00:15:14,513 --> 00:15:17,182
Now you have to go
learn a new way of doing things.
290
00:15:17,182 --> 00:15:19,618
There's no surprise
people are not using the tools, right?
291
00:15:19,618 --> 00:15:23,589
So it has to be very clear from
the beginning how this applies to them,
292
00:15:24,123 --> 00:15:28,627
why it's relevant, why it's going to
change their life, not the organization.
293
00:15:29,261 --> 00:15:32,264
Like really put it into terms
they can understand and grapple with.
294
00:15:33,065 --> 00:15:36,235
And then over time,
as the knowledge starts to diffuse across
295
00:15:36,235 --> 00:15:39,238
the organization, the infrastructure
also becomes more robust.
296
00:15:39,371 --> 00:15:42,608
It's almost the opposite
in many ways of previous transformation.
297
00:15:42,608 --> 00:15:45,611
So let's take an SAP ERP implementation.
298
00:15:45,811 --> 00:15:47,880
That's a 3 to 5 year process.
299
00:15:47,880 --> 00:15:49,415
IT command and control.
300
00:15:49,415 --> 00:15:50,582
We install it.
301
00:15:50,582 --> 00:15:52,851
Everyone else has to adapt to it.
302
00:15:52,851 --> 00:15:55,287
What's happening with
AI is we're kind of reversing that
303
00:15:55,287 --> 00:15:58,290
in a way.
Yes, it is provisioning and licensing.
304
00:15:58,524 --> 00:16:00,392
But this is not just an IT issue anymore.
305
00:16:00,392 --> 00:16:01,460
This is an HR issue.
306
00:16:01,460 --> 00:16:02,728
This is a strategy issue.
307
00:16:02,728 --> 00:16:04,763
This is a finance issue.
308
00:16:04,763 --> 00:16:10,803
And if you don't have your CIO, CFO,
your CRO and your CEO all in lockstep
309
00:16:10,803 --> 00:16:13,472
on how this is being distributed,
you're not going to come up
310
00:16:13,472 --> 00:16:16,175
with an effective way
of distributing the technology,
311
00:16:16,175 --> 00:16:19,178
the knowledge and the application
across your organization
312
00:16:19,411 --> 00:16:22,748
So that changes
the dynamic of the tool enormously.
313
00:16:23,182 --> 00:16:26,318
And over time, it then becomes
part of the fabric of the organization,
314
00:16:26,318 --> 00:16:30,622
just like we have with all of our
other productivity tools. So
315
00:16:31,890 --> 00:16:34,593
Lots to process there.
316
00:16:34,593 --> 00:16:36,862
One of the,
317
00:16:36,862 --> 00:16:40,966
one of the things you've said, Ian, is
we talked a little bit about it already,
318
00:16:40,966 --> 00:16:44,269
but there's this fear of people
losing their jobs.
319
00:16:44,269 --> 00:16:48,574
And, like, disintermediating jobs from
tasks and augmenting what we're doing.
320
00:16:49,308 --> 00:16:51,543
You had said previously,
321
00:16:51,543 --> 00:16:52,678
we're not going to lose jobs.
322
00:16:52,678 --> 00:16:54,980
We're going to lose job descriptions.
323
00:16:54,980 --> 00:16:56,815
Well, what does that mean?
324
00:16:56,815 --> 00:17:00,986
And what does that look like
with these, with AI and with these modern
325
00:17:00,986 --> 00:17:02,421
tools and approaches? Absolutely.
326
00:17:02,421 --> 00:17:07,993
So, I said that probably 2 or 3 years ago
in one of my South by Southwest speeches,
327
00:17:07,993 --> 00:17:10,996
I said, we won't lose our jobs,
we'll lose our job descriptions.
328
00:17:11,230 --> 00:17:14,166
And I've had several people say,
well, that aged horribly.
329
00:17:14,166 --> 00:17:16,101
And I'm like, actually, yes and no.
330
00:17:16,101 --> 00:17:18,971
Like, I will willingly say
there are jobs that are going to be lost.
331
00:17:18,971 --> 00:17:20,806
There's no question about that.
332
00:17:20,806 --> 00:17:24,710
But the image that conjures
for a lot of people is like,
333
00:17:24,710 --> 00:17:26,678
well, you're going to be on a breadline
for the rest of your life.
334
00:17:26,678 --> 00:17:29,681
You'll never be allowed to work again
because you're completely irrelevant.
335
00:17:30,249 --> 00:17:32,117
That's not how this works.
336
00:17:32,117 --> 00:17:35,254
There will be some fracturing
of the system that we work within,
337
00:17:35,254 --> 00:17:36,588
and that's not going to be easy.
338
00:17:36,588 --> 00:17:39,992
And it's going to cause some pain for
some organizations and a lot of people.
339
00:17:40,592 --> 00:17:44,730
But what does happen is it also changes
the nature of the jobs we hold.
340
00:17:45,064 --> 00:17:47,933
So what I see happening,
and this is related to the concept
341
00:17:47,933 --> 00:17:48,867
of the creative generalist
342
00:17:48,867 --> 00:17:51,670
that I've been talking about for a while
now, and I can define that as well.
343
00:17:51,670 --> 00:17:55,607
But, a creative generalist is, essentially,
I'll pull this back,
344
00:17:55,607 --> 00:17:58,610
because it's related to the way we're
educated and the way we build our careers.
345
00:17:58,610 --> 00:18:01,613
So we grew up in a system
that said, go to school,
346
00:18:01,880 --> 00:18:05,851
get a job, build expertise within that job
that becomes defensible.
347
00:18:05,851 --> 00:18:07,686
And that's
what gets you promoted over time.
348
00:18:07,686 --> 00:18:10,689
Over time, you start to manage
the functions that you're an expert in.
349
00:18:11,123 --> 00:18:14,026
You become management and leadership
and up you go.
350
00:18:14,026 --> 00:18:18,430
That vertical way of working
351
00:18:18,430 --> 00:18:21,800
has been the way we've promoted people
and told people to go after their careers.
352
00:18:21,800 --> 00:18:23,402
Right? For decades.
353
00:18:23,402 --> 00:18:27,139
What AI does, as I mentioned earlier,
is essentially abstract
354
00:18:27,139 --> 00:18:32,010
the years and decades of expertise,
influence, opportunity, exposure
355
00:18:32,244 --> 00:18:36,849
that you need to build expertise
in a specific subject.
356
00:18:37,316 --> 00:18:40,953
And it allows you to perform proficiently
in skill sets that are adjacent
357
00:18:40,953 --> 00:18:43,989
to your own, or even some
that you never had access to before.
358
00:18:44,423 --> 00:18:47,459
I said proficiently,
I did not say at an elite level by any means,
359
00:18:47,993 --> 00:18:51,630
but what we often have to confront
in our organizations
360
00:18:51,630 --> 00:18:55,000
is, in many cases,
good enough is good enough.
361
00:18:55,501 --> 00:18:59,938
You don't need somebody with 25 years
of experience to do junior level work.
362
00:19:00,339 --> 00:19:03,775
And if I am an outsider adjacent
to that role
363
00:19:03,775 --> 00:19:08,347
and I can get that junior level work done,
why do I have to wait? Or,
364
00:19:08,447 --> 00:19:12,151
you know, rely on specialized expertise
and resources to get that work done?
365
00:19:12,618 --> 00:19:14,520
So that changes this dynamic.
366
00:19:14,520 --> 00:19:18,857
And, the nature
of how I relate to my peers, their roles,
367
00:19:18,857 --> 00:19:22,261
my roles, and it expands the capacity
368
00:19:22,261 --> 00:19:25,264
and responsibility
in the remit of individual roles.
369
00:19:25,264 --> 00:19:28,567
We're moving away from role-based
relationships to jobs,
370
00:19:29,001 --> 00:19:32,070
to skill-based
and task-based relationships to jobs.
371
00:19:32,838 --> 00:19:36,608
And that's where I feel like
the idea that jobs are not going away.
372
00:19:36,608 --> 00:19:39,611
I even say in that same speech,
jobs are dead.
373
00:19:39,678 --> 00:19:41,580
Long live work.
374
00:19:41,580 --> 00:19:46,218
And one thing that I think we're so stuck
on is saying we need good jobs. We need,
375
00:19:46,351 --> 00:19:46,785
you know,
376
00:19:46,785 --> 00:19:49,788
you'll hear every politician say, we need
to bring good jobs back to America.
377
00:19:50,155 --> 00:19:54,026
And it's not the jobs,
it's the work that we need.
378
00:19:54,393 --> 00:19:58,463
If we're so focused on jobs, we're already
narrowly defining ourselves and oftentimes
379
00:19:58,463 --> 00:20:01,500
attaching ourselves
to things that are not coming back,
380
00:20:01,900 --> 00:20:05,270
like we're not really going to be
bringing back the coal sector that way.
381
00:20:05,270 --> 00:20:07,039
Everybody is talking about it
in many ways.
382
00:20:07,039 --> 00:20:10,042
That's kind of a train
that's moved along no matter what we do,
383
00:20:10,576 --> 00:20:13,078
and other jobs are the same way.
384
00:20:13,078 --> 00:20:16,915
But we talk about the work
as it relates to the tasks and roles
385
00:20:16,915 --> 00:20:20,452
and things like that that need to be
put together for the future of work.
386
00:20:20,852 --> 00:20:23,188
That's
where we can actually make some traction.
387
00:20:23,188 --> 00:20:26,191
And that's why I do believe, like,
we're not getting rid of jobs.
388
00:20:26,425 --> 00:20:30,229
We're getting rid of the definition
of the artificial boundaries
389
00:20:30,229 --> 00:20:32,864
that keep you in a specific space
in your organization.
390
00:20:32,864 --> 00:20:36,902
And that's how I see organizations evolving
significantly over the next decade.
391
00:20:38,570 --> 00:20:40,172
So if that
392
00:20:40,172 --> 00:20:43,842
is the case,
there's like tons of wild implications.
393
00:20:43,942 --> 00:20:46,478
And, by the way,
I think that probably is the case.
394
00:20:46,478 --> 00:20:50,015
But there's like tons of wild implications
from the education system
395
00:20:50,015 --> 00:20:53,018
to our careers to organizations.
396
00:20:53,485 --> 00:20:56,488
But in terms of impacts in,
397
00:20:58,056 --> 00:21:00,359
and maybe feel free to tell me
398
00:21:00,359 --> 00:21:03,495
this entire paradigm is wrong,
but where is there more risk?
399
00:21:03,495 --> 00:21:07,766
Is there more risk for,
you know, junior employees right now
400
00:21:07,966 --> 00:21:11,603
who you know, their general skill set
401
00:21:11,603 --> 00:21:14,873
and their baseline
level of knowledge can be replaced by AI?
402
00:21:15,107 --> 00:21:19,144
Or is there more risk for people
who are 20 years into their career
403
00:21:19,411 --> 00:21:22,347
and they have a deep skill set
404
00:21:22,347 --> 00:21:25,350
that maybe now is almost commoditized?
405
00:21:25,417 --> 00:21:28,420
Because you can ask, you know, AI
406
00:21:28,520 --> 00:21:31,523
how to, you know,
I'll pick on data scientists, you know,
407
00:21:31,690 --> 00:21:34,593
just for example, because, you know,
these are, you know, historically
408
00:21:34,593 --> 00:21:38,130
high paid roles
that have a lot of schooling behind them.
409
00:21:38,130 --> 00:21:39,164
And if you can start to get
410
00:21:39,164 --> 00:21:42,768
some of these answers,
some of this validation information
411
00:21:42,968 --> 00:21:46,972
commoditized by AI, what does that mean
for people in these careers?
412
00:21:46,972 --> 00:21:50,008
So where is the risk greater,
413
00:21:50,008 --> 00:21:53,111
or is the entire paradigm
just, you know, misplaced?
414
00:21:53,145 --> 00:21:56,782
So they're both under
an enormous amount of strain right now.
415
00:21:56,948 --> 00:22:00,786
The one that's most present
that we're seeing happen
416
00:22:01,753 --> 00:22:04,656
with more frequency is those
who are at the beginning of their careers
417
00:22:04,656 --> 00:22:08,493
are at highest risk to this exposure,
because you can replace a lot
418
00:22:08,493 --> 00:22:09,995
of the things that they would do.
419
00:22:09,995 --> 00:22:13,899
Now, there was kind of this unspoken
contract that when you leave school,
420
00:22:13,899 --> 00:22:15,434
you would continue your training
421
00:22:15,434 --> 00:22:18,437
almost like a vocation in whatever place
you would go in and learn.
422
00:22:18,704 --> 00:22:20,205
And there's
an enormous amount of teaching,
423
00:22:20,205 --> 00:22:22,674
and mentoring that goes on with that.
424
00:22:22,674 --> 00:22:27,012
And when I can just consult ChatGPT
and get it done and not have to mentor it,
425
00:22:27,145 --> 00:22:28,213
I'm going to do that.
426
00:22:28,213 --> 00:22:31,049
There's just no question
that that's going to happen in my case.
427
00:22:31,049 --> 00:22:34,052
So junior roles are already starting
to disappear.
428
00:22:34,686 --> 00:22:38,590
The capacity and capability that juniors
have are starting to disappear.
429
00:22:38,857 --> 00:22:41,560
And the challenge
is that just adding the technology
430
00:22:41,560 --> 00:22:44,696
and giving juniors access
to it doesn't help a lot
431
00:22:44,696 --> 00:22:48,734
because they don't have the experience
and the frameworks to understand.
432
00:22:48,967 --> 00:22:51,136
You know, when I'm
doing some exploratory research on this,
433
00:22:51,136 --> 00:22:54,473
what matters, what are the things
I should be looking and honing in on?
434
00:22:54,673 --> 00:22:55,474
And they just
435
00:22:55,474 --> 00:22:58,910
get overwhelmed by the sheer
volume of things they should be looking at
436
00:22:59,144 --> 00:23:03,482
without being able to find the right
signals to hone in on.
437
00:23:03,882 --> 00:23:07,386
Now, the other part
that's true is that middle management
438
00:23:07,552 --> 00:23:10,455
is also getting
hit really hard with this stuff.
439
00:23:10,455 --> 00:23:14,793
Because if your role is focused
more on creating alignment,
440
00:23:15,026 --> 00:23:18,663
checking in on organizations,
or checking in on your employees,
441
00:23:18,663 --> 00:23:20,766
doing a little bit of mentoring here
and there, but more
442
00:23:20,766 --> 00:23:23,769
so the things that are around
productivity and efficiency of a team.
443
00:23:24,102 --> 00:23:26,805
So not the leadership level,
not the visioning.
444
00:23:26,805 --> 00:23:27,139
Right.
445
00:23:27,139 --> 00:23:31,410
But just like the operations
of the company that is directly
446
00:23:31,410 --> 00:23:35,013
in the line of fire of
AI. And what that changes is that
447
00:23:35,013 --> 00:23:38,850
the stuff of being a boss
is also starting to go away.
448
00:23:39,284 --> 00:23:43,188
What that makes space for is for people
to step into actual roles of leadership.
449
00:23:43,655 --> 00:23:46,158
So we're seeing this layer
of middle management
450
00:23:46,158 --> 00:23:50,862
that is directly in the line of fire,
but also collapses the organization a bit,
451
00:23:50,862 --> 00:23:54,399
where the line between junior
or more junior people and leadership
452
00:23:54,633 --> 00:23:57,569
is also starting
to become thinner and thinner and thinner.
453
00:23:57,569 --> 00:23:59,604
But it means more direct contact
with people
454
00:23:59,604 --> 00:24:02,674
who can spend time in the space of
being leaders versus bosses.
455
00:24:03,975 --> 00:24:04,609
Can
456
00:24:04,609 --> 00:24:07,612
you, just unpack that a little bit for me?
457
00:24:07,879 --> 00:24:10,148
What is it? I'm.
458
00:24:10,148 --> 00:24:13,051
I'm actually a little bit surprised
to hear that some of the team
459
00:24:13,051 --> 00:24:16,021
management
piece is something that AI can do.
460
00:24:16,354 --> 00:24:19,357
What are the tasks
that you see as being,
461
00:24:20,492 --> 00:24:22,627
you know, now doable by AI.
462
00:24:22,627 --> 00:24:24,596
What does that look like? Absolutely.
463
00:24:24,596 --> 00:24:29,701
So a lot of what happens at the management
level is facilitation and alignment
464
00:24:30,068 --> 00:24:33,472
and things like facilitation
are easily done by AI.
465
00:24:34,172 --> 00:24:38,109
When we actually work within Signal and
Cipher, the projects that we're working on
466
00:24:38,410 --> 00:24:42,180
are known not only to us, but also to our AI
and our project management system.
467
00:24:42,180 --> 00:24:43,782
So I don't have to check in
with my co-founder.
468
00:24:43,782 --> 00:24:45,617
and be like, where are you on this?
469
00:24:45,617 --> 00:24:46,251
It knows.
470
00:24:46,251 --> 00:24:49,821
Therefore I know, so
the amount of meetings that I'm having
471
00:24:49,921 --> 00:24:54,392
around alignment
and approvals has dropped like 70%.
472
00:24:54,826 --> 00:24:58,096
So these big parts,
what we call corporate waste,
473
00:24:58,797 --> 00:24:59,865
these items where you're
474
00:24:59,865 --> 00:25:03,301
waiting on specialized resources
to become available to do the work
475
00:25:03,568 --> 00:25:07,806
or to have a moment of someone's
time to say, do you approve of this?
476
00:25:08,073 --> 00:25:11,676
Or having those CYA moments
of does legal approve of this?
477
00:25:12,210 --> 00:25:14,312
Those are actually starting
to disappear as well
478
00:25:14,312 --> 00:25:16,348
because the systems
can check on these things.
479
00:25:16,348 --> 00:25:19,351
Now, none of these things are bulletproof
or faultless yet.
480
00:25:19,484 --> 00:25:23,522
So in the beginning they actually create
more work and more apparatus.
481
00:25:23,522 --> 00:25:26,525
So there's more work for bosses
and middle management to work on.
482
00:25:26,525 --> 00:25:28,226
There's more work for implementation.
483
00:25:28,226 --> 00:25:32,130
And we see that in almost every
transformation: productivity actually
484
00:25:32,130 --> 00:25:36,601
dips and effort and investment
go up during that transitory phase.
485
00:25:36,968 --> 00:25:39,704
But when you get past that
and start to see some of those
486
00:25:39,704 --> 00:25:42,707
benefits,
it changes dramatically how you operate.
487
00:25:42,941 --> 00:25:44,943
So we'll go into organizations.
488
00:25:44,943 --> 00:25:48,079
We'll see, you know, 25 people sitting
at a table for an hour long meeting.
489
00:25:48,079 --> 00:25:49,347
And the first thing I think of is like,
490
00:25:50,882 --> 00:25:51,650
three of you need to
491
00:25:51,650 --> 00:25:54,786
be here, and this is so expensive,
like you're wasting so much money.
492
00:25:55,420 --> 00:25:58,790
And for us, it's gotten to a point
where we say that the small team
493
00:25:58,790 --> 00:25:59,691
is the ultimate flex.
494
00:25:59,691 --> 00:26:03,461
It means that if you've got a small team
that can really run effectively
495
00:26:03,762 --> 00:26:06,765
and you're using your tools
and your infrastructure appropriately,
496
00:26:06,831 --> 00:26:10,402
then you can move so fast,
you can make decisions confidently
497
00:26:10,502 --> 00:26:12,203
without having to consult everybody.
498
00:26:12,203 --> 00:26:15,740
And most of those meetings
are really about CYA.
499
00:26:15,740 --> 00:26:16,575
It's about,
500
00:26:16,575 --> 00:26:20,712
can I distribute the liability
of this decision across a group of people?
501
00:26:20,712 --> 00:26:22,948
So it's not just my fault,
502
00:26:22,948 --> 00:26:24,983
right.
503
00:26:24,983 --> 00:26:27,953
So with that in mind,
you've talked before about this notion
504
00:26:27,953 --> 00:26:31,122
of augmented teams and being able to use
some of this technology
505
00:26:31,122 --> 00:26:34,392
to get more out of an existing team
or even restructure an existing team.
506
00:26:34,893 --> 00:26:36,928
What does that look like in practice?
Yeah.
507
00:26:36,928 --> 00:26:40,565
So there's a couple things that we lean
into to help augment teams.
508
00:26:40,565 --> 00:26:42,701
The first level is just learning
to use the tools.
509
00:26:42,701 --> 00:26:46,037
Like just by doing
that, you're already moving 10 to 20%
510
00:26:46,037 --> 00:26:49,541
faster, more effectively, with
better thought partnership with AI.
511
00:26:50,175 --> 00:26:53,912
But the next level is starting
to actually encode your own knowledge.
512
00:26:54,112 --> 00:26:56,681
So that can happen at a team level
or an individual level.
513
00:26:56,681 --> 00:26:59,784
What I mean by encoding
it is taking things that you've written,
514
00:26:59,784 --> 00:27:04,723
whether that's briefs, emails, content,
and starting to turn that
515
00:27:04,723 --> 00:27:08,193
into something that the large language
model can work with and understand.
516
00:27:08,493 --> 00:27:11,162
It's a bit like training a Laura,
517
00:27:11,162 --> 00:27:14,833
on your own assets,
and that becomes a bit of a digital twin.
518
00:27:15,300 --> 00:27:16,735
And we do this at the team level.
519
00:27:16,735 --> 00:27:19,304
The individual level
and the organizational level.
520
00:27:19,304 --> 00:27:21,973
What most people aren't seeing right
now, we're seeing a lot of that happen
521
00:27:21,973 --> 00:27:23,475
at the organizational level.
522
00:27:23,475 --> 00:27:27,646
But when we augment an individual and say,
okay, I've taken everything
523
00:27:27,646 --> 00:27:29,247
you've written, not everything,
524
00:27:29,247 --> 00:27:32,350
but the highest-signal,
highest quality stuff that you've written,
525
00:27:33,685 --> 00:27:36,688
content
about who you are, what you've done, etc.
526
00:27:37,022 --> 00:27:40,158
and turn that into a document that sits on
top of the large language model,
527
00:27:40,458 --> 00:27:43,261
and that becomes the filter through
which you prompt.
528
00:27:43,261 --> 00:27:46,231
It acts as a filter
both going in and going out.
529
00:27:46,231 --> 00:27:49,067
So it's going to augment
what you're actually prompting against.
530
00:27:49,067 --> 00:27:52,203
It's also going to filter
the responses that the LLM gives you.
531
00:27:52,370 --> 00:27:56,307
So everything you do is going to be
in your style, in your tone of voice,
532
00:27:56,574 --> 00:27:59,577
with your strategic
understanding of the business.
533
00:27:59,744 --> 00:28:03,548
And that can really expand
your capabilities in a number of different
534
00:28:03,548 --> 00:28:04,683
directions.
535
00:28:04,683 --> 00:28:07,352
The same thing happens
with the organization, which helps
536
00:28:07,352 --> 00:28:11,222
enormously with things like brain drain
or bringing people up to speed.
537
00:28:11,389 --> 00:28:15,727
One of our goals is as an organization,
if I bring someone in off the street,
538
00:28:15,727 --> 00:28:18,930
I want to be able to get them
to the same level as everybody else within
539
00:28:18,930 --> 00:28:22,600
less than a month and on day one,
they should be able to write an email.
540
00:28:22,667 --> 00:28:24,869
in the tone of voice of the company.
They should be able to manage
541
00:28:24,869 --> 00:28:26,905
our social media presence,
all that kind of stuff,
542
00:28:26,905 --> 00:28:28,373
because we've built a layer
543
00:28:28,373 --> 00:28:31,710
on top of the large language model
that already has all of that encoded,
544
00:28:32,410 --> 00:28:35,647
rather than having to teach someone
to do that from day one.
545
00:28:36,381 --> 00:28:37,916
So it's doing two things.
546
00:28:37,916 --> 00:28:41,319
It's augmenting the individual's
capability and capacity.
547
00:28:41,720 --> 00:28:45,590
And it's also removing all the friction
of having to get up to speed
548
00:28:45,857 --> 00:28:47,625
on how your organization works.
549
00:28:49,060 --> 00:28:49,828
Right?
550
00:28:49,828 --> 00:28:51,196
So I mean, I
551
00:28:51,196 --> 00:28:54,199
it's such an interesting approach
and I can, you know, immediately
552
00:28:54,199 --> 00:28:58,737
see like huge transformational
organizational benefits, you know,
553
00:28:58,737 --> 00:29:02,073
from an efficiency and effectiveness
perspective by doing something like that,
554
00:29:02,574 --> 00:29:07,245
are you finding that there's resistance
at an individual level to that?
555
00:29:07,445 --> 00:29:10,315
Like my concern would be that
people say like, oh, you're
556
00:29:10,315 --> 00:29:13,685
trying to like download my brain into AI
and then get rid of me.
557
00:29:13,685 --> 00:29:18,156
Like,
I can imagine a world where there's angst.
558
00:29:18,690 --> 00:29:20,992
Is that the case?
And if so, how do you overcome that?
559
00:29:20,992 --> 00:29:23,995
That's the very first response
most people give is like, hey,
560
00:29:23,995 --> 00:29:25,463
this is my value.
561
00:29:25,463 --> 00:29:27,932
Like, this is why you hired me to do this.
562
00:29:27,932 --> 00:29:30,635
And that becomes a conversation
between the organization,
563
00:29:30,635 --> 00:29:33,738
the individual, and a contract of
this is yours, not ours.
564
00:29:34,339 --> 00:29:37,909
So the goal with the data
is that organization-level
565
00:29:37,909 --> 00:29:40,779
and team-level data
is owned by the organization.
566
00:29:40,779 --> 00:29:44,349
The individual level data
must stay with the individual.
567
00:29:44,883 --> 00:29:46,951
And this is
a personal philosophy of mine.
568
00:29:46,951 --> 00:29:50,255
I honestly don't believe
that we can move towards a future paradigm
569
00:29:50,255 --> 00:29:53,057
where this is a part of the way
we do our work.
570
00:29:53,057 --> 00:29:55,794
If that agreement isn't in place.
571
00:29:55,794 --> 00:29:56,761
So it's like when you move
572
00:29:56,761 --> 00:30:00,532
from organization to organization,
you take your experience with you,
573
00:30:00,532 --> 00:30:03,835
but you're not taking the files
you worked on at the office.
574
00:30:04,435 --> 00:30:07,639
If I'm going to encode
you and your thoughts, it's
575
00:30:07,639 --> 00:30:11,409
just like saying, as an actor
or a voice actor, I've encoded your voice
576
00:30:11,409 --> 00:30:15,180
and no longer have to give you credit
for what I extracted from you.
577
00:30:15,814 --> 00:30:19,083
That, by definition, creates
an antagonistic relationship
578
00:30:19,083 --> 00:30:21,152
between the organization
and the individual.
579
00:30:21,152 --> 00:30:24,122
I firmly believe
we should own our data, so
580
00:30:24,122 --> 00:30:27,091
that's what we have encouraged
and facilitated.
581
00:30:27,826 --> 00:30:30,628
But yes, that is typically the way people
look at it when they first start.
582
00:30:30,628 --> 00:30:33,631
It's like, okay, the company's
gonna own everything about me.
583
00:30:33,765 --> 00:30:36,601
And, right, sometimes the companies are like,
oh yeah, that's right.
584
00:30:36,601 --> 00:30:38,937
We could really do a lot more with a lot
fewer people.
585
00:30:38,937 --> 00:30:42,841
Like, no, that will absolutely, one,
destroy morale
586
00:30:43,174 --> 00:30:44,843
and, two, no one will want to work with you.
587
00:30:45,977 --> 00:30:47,011
Right.
588
00:30:47,011 --> 00:30:49,614
So are we.
589
00:30:49,614 --> 00:30:51,282
It's such an interesting idea.
590
00:30:51,282 --> 00:30:52,650
And I and I really.
591
00:30:52,650 --> 00:30:54,919
I hadn't heard that before, to be honest.
592
00:30:54,919 --> 00:30:57,922
And I, I talk to people in this space
all day, every day.
593
00:30:58,056 --> 00:31:03,027
So the idea of having this, like,
this individual,
594
00:31:03,761 --> 00:31:06,764
you know, digital twin or AI-ified,
595
00:31:06,764 --> 00:31:10,368
you know, likeness or,
you know, data model of you
596
00:31:11,069 --> 00:31:14,973
right now,
I have to imagine if most organizations
597
00:31:14,973 --> 00:31:17,876
propose that to a staff member,
that'll be like the first
598
00:31:17,876 --> 00:31:20,879
they've heard of it and they're like,
whoa, you know, what is this?
599
00:31:21,079 --> 00:31:23,882
Are we heading toward a world
in the next few years
600
00:31:23,882 --> 00:31:27,185
where this is going to become
the norm and, you know,
601
00:31:27,185 --> 00:31:31,155
everyone will start to be literate around
this and,
602
00:31:31,389 --> 00:31:34,392
you know, expecting this conversation
almost.
603
00:31:34,492 --> 00:31:37,161
I don't know, and the reason I say
that is not
604
00:31:37,161 --> 00:31:40,164
because the technology's not ready for it,
not because it's not possible.
605
00:31:40,598 --> 00:31:43,902
It's that the the limitations
of how our organizations grow
606
00:31:44,102 --> 00:31:45,737
are not just technological.
607
00:31:45,737 --> 00:31:48,673
There's so many different
other constraints as to why organizations
608
00:31:48,673 --> 00:31:52,310
change the way they do
and why technological adoption is so slow.
609
00:31:53,144 --> 00:31:55,647
There's a concept that I love called
Martec's Law,
610
00:31:55,647 --> 00:31:58,950
and it's about the difference
between technology
611
00:31:58,950 --> 00:32:02,720
moving exponentially and people
and organizations developing logarithmically.
612
00:32:02,720 --> 00:32:07,392
And what this does is create this ever
increasing gap between what is possible
613
00:32:07,392 --> 00:32:11,229
with the technology and what the companies
and individuals are actually capable of.
614
00:32:11,462 --> 00:32:14,599
So we're looking at potential versus
practical reality.
615
00:32:14,799 --> 00:32:19,971
And the thing that's pushing this curve
so much lower is infrastructure,
616
00:32:19,971 --> 00:32:23,741
technology, culture, decision
debt, technological debt.
617
00:32:23,741 --> 00:32:27,745
There are all these constraints
in an organization that dictate how high
618
00:32:27,745 --> 00:32:31,783
that logarithmic curve can go,
and how far you can push that up
619
00:32:32,150 --> 00:32:36,254
so the technology can move
as fast as you possibly could imagine.
620
00:32:36,688 --> 00:32:40,892
We are not going to be able to integrate
and adapt it as fast as it changes.
621
00:32:40,892 --> 00:32:43,861
We're already seeing that right
now, right?
622
00:32:43,861 --> 00:32:46,965
You'll see some people who are just,
you know, they've 100x'd themselves,
623
00:32:47,699 --> 00:32:50,735
with what they can do, what they're
capable of doing,
624
00:32:50,735 --> 00:32:53,304
and then everyone else is looking at them
like they're an alien.
625
00:32:53,304 --> 00:32:56,741
And that's because they, as an individual,
have leaned into it
626
00:32:56,741 --> 00:32:58,242
and have already adapted.
627
00:32:58,242 --> 00:33:01,245
A lot of the stuff, they've already
had the experience to help them do that.
628
00:33:01,479 --> 00:33:02,246
Usually, you know,
629
00:33:02,246 --> 00:33:05,583
really good engineers and developers
can lean in and do that.
630
00:33:06,017 --> 00:33:09,721
But if you're an account person
who doesn't have that expertise,
631
00:33:09,721 --> 00:33:11,489
you look at that and say,
there's all these limitations
632
00:33:11,489 --> 00:33:12,857
preventing me from doing that.
633
00:33:12,857 --> 00:33:14,926
Organizations work the same way.
634
00:33:14,926 --> 00:33:17,328
Some organizations like ours are small.
635
00:33:17,328 --> 00:33:19,497
We're built for this. We're AI-native,
636
00:33:19,497 --> 00:33:22,533
whereas a lot of organizations
we're working with that are Fortune 500,
637
00:33:22,934 --> 00:33:24,869
don't have any of those things,
any of those qualities
638
00:33:24,869 --> 00:33:27,372
that will allow them to say,
we're going to be AI native tomorrow.
639
00:33:28,773 --> 00:33:30,141
Right?
640
00:33:30,141 --> 00:33:32,977
So, you know, this brings me
to another central question
641
00:33:32,977 --> 00:33:35,313
that I was excited to ask you about, Ian,
which is
642
00:33:35,313 --> 00:33:38,282
who do you see as being the winners
and losers?
643
00:33:38,616 --> 00:33:39,684
of this disruption?
644
00:33:39,684 --> 00:33:42,887
And I'm deliberately asking that
in kind of the broadest possible sense.
645
00:33:43,287 --> 00:33:46,824
Yeah, it's an interesting question for
which I'm still forming an answer myself,
646
00:33:47,191 --> 00:33:51,996
because the initial thinking is like,
okay, if we don't need big teams,
647
00:33:51,996 --> 00:33:57,001
then we obviously don't need organizations
that are 200, 300, and 400,000 people.
648
00:33:57,301 --> 00:34:01,172
And, right, all these startups
are going to come and take their lunch.
649
00:34:01,172 --> 00:34:04,142
And it's a lot more complicated than that.
650
00:34:05,209 --> 00:34:08,312
There are other structures
besides just size
651
00:34:08,312 --> 00:34:12,650
keeping the, the current winners
entrenched in their space.
652
00:34:12,917 --> 00:34:17,321
So let's take like a chemical manufacturer
for instance,
653
00:34:17,522 --> 00:34:18,756
like there's a lot of the corporate work
654
00:34:18,756 --> 00:34:21,759
that can be taken away
by AI and made more efficient.
655
00:34:22,026 --> 00:34:23,194
And you can use machines for that.
656
00:34:23,194 --> 00:34:26,431
But there's physical apparatus,
there's mechanical apparatus,
657
00:34:26,431 --> 00:34:28,699
all that needs to be done,
this distribution,
658
00:34:28,699 --> 00:34:31,803
there's geopolitical elements
to how these companies grow.
659
00:34:32,336 --> 00:34:35,573
And again, that gets back
to the practical limitations
660
00:34:35,840 --> 00:34:40,111
that shape the growth and change
of these organizations in new paradigms.
661
00:34:40,478 --> 00:34:44,582
So I don't think it's just as simple
as, okay, you need smaller teams,
662
00:34:44,582 --> 00:34:48,519
fewer people, companies will shrink,
and startups will come in and eat the lunch
663
00:34:48,519 --> 00:34:52,223
of those who don't move fast enough.
Speed is one variable.
664
00:34:52,223 --> 00:34:54,992
It's a very important variable,
but it's just one.
665
00:34:54,992 --> 00:34:58,496
What I do think,
though, is companies will get smaller.
666
00:34:59,831 --> 00:35:00,765
But I also think there will
667
00:35:00,765 --> 00:35:04,001
be more startups and more businesses
formed than ever before.
668
00:35:04,402 --> 00:35:07,805
If we just look at the trajectory
of the statistics, even since Covid,
669
00:35:08,005 --> 00:35:11,576
we've had a massive increase in the number
of S corps and LLCs formed
670
00:35:11,576 --> 00:35:14,579
more than any time in history,
and that's likely
671
00:35:14,579 --> 00:35:17,582
to get even easier,
672
00:35:17,582 --> 00:35:21,152
as time goes on, because the ability
to form a company again
673
00:35:21,719 --> 00:35:25,022
gets easier with AI, the ability to form
a team gets easier with AI.
674
00:35:25,056 --> 00:35:27,558
So I think freelancing is going to explode
even more than it already is. Yes.
675
00:35:27,558 --> 00:35:29,760
So an acceleration in the existing trend,
676
00:35:29,760 --> 00:35:32,797
the ability to open businesses,
the things that keep people away
677
00:35:33,064 --> 00:35:36,067
from opening
businesses are going to almost evaporate.
678
00:35:36,234 --> 00:35:40,071
And I think that the opportunity
to start creating these entities
679
00:35:40,071 --> 00:35:41,372
for even short
680
00:35:41,372 --> 00:35:44,942
periods of time for more specialized use
cases is going to become a thing, too.
681
00:35:45,243 --> 00:35:49,313
So I could easily see the number
of businesses built in the next ten years.
682
00:35:49,313 --> 00:35:50,715
100x-ing,
683
00:35:50,715 --> 00:35:54,418
not just multiplying on that, because
we're also using agents for that too.
684
00:35:54,752 --> 00:35:58,322
We're not using agents for employees, but
we're using agents to build the business.
685
00:35:58,856 --> 00:36:01,459
So if you think about how you can scale
that,
686
00:36:01,459 --> 00:36:04,662
that's I'm still wrapping my head around
what that looks like.
687
00:36:04,662 --> 00:36:07,832
As far as what does the economy look like
and, you know,
688
00:36:07,899 --> 00:36:10,368
how do we align that
in terms of geopolitics
689
00:36:10,368 --> 00:36:13,437
and how that becomes
the way organizations shape as well?
690
00:36:13,437 --> 00:36:16,841
It's a lot of things that, you know,
still haven't taken shape yet.
691
00:36:17,208 --> 00:36:20,344
So I don't think it's
just smaller organizations, more businesses.
692
00:36:20,545 --> 00:36:22,580
But that's the closest
thing I could find so far.
693
00:36:23,714 --> 00:36:24,682
Right.
694
00:36:24,682 --> 00:36:29,520
And I'm glad you had that level of clarity
because it's easy to just end up
695
00:36:29,520 --> 00:36:32,890
in the mindset of, you know, smaller
organizations eating your lunch.
696
00:36:32,890 --> 00:36:35,893
You're done, you know, good luck.
697
00:36:36,961 --> 00:36:39,463
When I think about the implications
698
00:36:39,463 --> 00:36:42,667
of what I think we broadly agree upon,
which is it's going to be way
699
00:36:42,667 --> 00:36:43,834
easier to start a business.
700
00:36:43,834 --> 00:36:45,970
There's already more businesses happening.
701
00:36:45,970 --> 00:36:50,074
That's really good for consumers,
I would imagine.
702
00:36:50,508 --> 00:36:54,645
And it does mean more competition
for incumbents.
703
00:36:54,645 --> 00:36:57,715
And I really like your point about,
well, it's
704
00:36:57,715 --> 00:36:58,983
not just they're going to eat your lunch
705
00:36:58,983 --> 00:37:03,421
because there's more to it than, you know,
just just the speed or the efficiency.
706
00:37:03,421 --> 00:37:05,223
They're
707
00:37:05,223 --> 00:37:08,492
you've talked before
about the need for transformational change
708
00:37:09,327 --> 00:37:13,264
versus just strictly optimizing
what incumbents are doing,
709
00:37:13,965 --> 00:37:16,968
how transformational do
we need to be thinking,
710
00:37:16,968 --> 00:37:20,304
and what's the best way
to get into that mindset?
711
00:37:20,304 --> 00:37:24,008
Is it creating like an innovation
incubator in your organization?
712
00:37:24,008 --> 00:37:27,912
Is it trying to start your own,
you know, kind of funded startups?
713
00:37:27,912 --> 00:37:30,615
Are there any kind of tactics
you recommend to organizations?
714
00:37:30,615 --> 00:37:31,382
Absolutely.
715
00:37:31,382 --> 00:37:34,719
So I think I would encourage organizations
to be radical
716
00:37:34,719 --> 00:37:37,822
with their thinking and practical
with their approach.
717
00:37:38,689 --> 00:37:41,626
So there aren't too many people
718
00:37:41,626 --> 00:37:45,196
who say you need to burn it all
to the ground and start fresh.
719
00:37:45,830 --> 00:37:48,499
There's no enterprise
that says we're profitable.
720
00:37:48,499 --> 00:37:49,800
We're doing just fine.
721
00:37:49,800 --> 00:37:52,637
We want to disrupt that. Nobody says that.
722
00:37:53,871 --> 00:37:57,108
But what I do think is,
unless you are radical with your thinking,
723
00:37:57,108 --> 00:38:00,111
you will not be ready for the disruptions
that are going to come.
724
00:38:00,444 --> 00:38:04,148
So these technological transformations
happen at the GPT level.
725
00:38:04,148 --> 00:38:07,318
So general purpose technologies
start at the infrastructure level.
726
00:38:07,752 --> 00:38:10,821
So we've seen disruption with technology
and the technology that we use.
727
00:38:10,855 --> 00:38:12,923
So electricity did the same thing.
728
00:38:12,923 --> 00:38:15,092
And OpenAI did the same thing with GPT.
729
00:38:15,092 --> 00:38:16,927
So we know now we're all using it.
730
00:38:16,927 --> 00:38:19,697
But over time those disruptions move up
731
00:38:19,697 --> 00:38:22,833
a level from infrastructure
to application to industry.
732
00:38:23,567 --> 00:38:25,936
So if you're not
733
00:38:25,936 --> 00:38:28,939
okay, I guess it is explosive.
734
00:38:29,807 --> 00:38:32,209
But if you're not thinking radically
735
00:38:32,209 --> 00:38:35,279
about the transformation that can happen
at each one of those levels,
736
00:38:35,279 --> 00:38:38,482
and also the transformation
that can happen to your industry,
737
00:38:38,883 --> 00:38:41,852
and you're just focused on the data
you have now,
738
00:38:42,153 --> 00:38:44,822
you're missing one of the critical shifts
of transformation in the business.
739
00:38:44,822 --> 00:38:45,723
And there's a,
740
00:38:47,258 --> 00:38:48,092
theme that's becoming
741
00:38:48,092 --> 00:38:51,095
more popular right now,
which is moving from insight to foresight.
742
00:38:51,862 --> 00:38:56,634
And when everything is changing around
you, insight is valuable; it's
743
00:38:56,634 --> 00:38:59,770
how you create structure around a business
that you can take to market.
744
00:39:00,271 --> 00:39:03,274
Foresight
is about how you avoid getting disrupted.
745
00:39:03,507 --> 00:39:06,677
If we're not looking forward
and we're still letting yesterday's
746
00:39:06,677 --> 00:39:10,414
mental models collide with tomorrow's
technologies, that is how we lose.
747
00:39:10,981 --> 00:39:14,985
But if we are radical
with the way we think, with the ability
748
00:39:14,985 --> 00:39:16,754
to test different business models,
749
00:39:16,754 --> 00:39:20,257
put things to market faster
than we might have previously,
750
00:39:20,524 --> 00:39:23,427
get that data and that feedback loop
as fast as possible,
751
00:39:23,427 --> 00:39:26,731
we're going to learn more
about that unexplored terrain way faster.
752
00:39:27,164 --> 00:39:32,069
So I wouldn't say go and disrupt
your $1 billion, you know, revenue line,
753
00:39:32,336 --> 00:39:35,272
but you absolutely
should be incubating things that will
754
00:39:35,272 --> 00:39:36,941
because there are hundreds
755
00:39:36,941 --> 00:39:39,944
and eventually thousands of other startups
that are doing exactly that.
756
00:39:40,144 --> 00:39:42,847
And you have no defense against that
if you're not thinking in that way.
757
00:39:42,847 --> 00:39:45,816
So think radically approach practically.
758
00:39:45,816 --> 00:39:47,618
So the next step is, okay:
759
00:39:47,618 --> 00:39:49,487
So what do we do to implement this?
760
00:39:49,487 --> 00:39:52,223
Is it tiger teams? Is it small skunkworks?
761
00:39:52,223 --> 00:39:53,491
All of those are viable.
762
00:39:53,491 --> 00:39:54,959
I do believe that,
763
00:39:55,893 --> 00:39:59,196
in this transformation,
you need to find people who are leaning in
764
00:39:59,196 --> 00:39:59,730
and are
765
00:39:59,730 --> 00:40:03,300
self-selecting as the people who are like,
I'm all about this, I want to do this.
766
00:40:03,667 --> 00:40:07,338
Don't try and convince a bunch of people
who might not be invested
767
00:40:07,338 --> 00:40:09,740
in this
to be the first ones through the door.
768
00:40:09,740 --> 00:40:11,308
They will be unenthusiastic about it.
769
00:40:11,308 --> 00:40:13,611
They don't have the willpower
to get through the challenges.
770
00:40:13,611 --> 00:40:14,879
It's going to be hard,
771
00:40:14,879 --> 00:40:17,882
and they're going to fail a million times
before they get it right.
772
00:40:17,882 --> 00:40:20,818
If they're not already passionate
about this,
773
00:40:20,818 --> 00:40:23,287
they're going to stop
at the first sign of trouble.
774
00:40:23,287 --> 00:40:26,524
Those people can be followers
of the people who lead the way.
775
00:40:26,524 --> 00:40:27,925
It's not that they're irrelevant.
776
00:40:27,925 --> 00:40:29,326
You need to find the people who are like,
777
00:40:29,326 --> 00:40:31,262
I want to be
the person who kicks the door down.
778
00:40:31,262 --> 00:40:34,365
I want to be the first person in the room,
and those are the ones you want to build
779
00:40:34,365 --> 00:40:37,802
your teams around to think about
these things and build different ideas
780
00:40:38,469 --> 00:40:39,870
and find the tinkerers.
781
00:40:39,870 --> 00:40:42,006
Find the people who may not be
782
00:40:42,006 --> 00:40:45,342
the developers or the engineers,
but who are already tinkering with this stuff.
783
00:40:45,609 --> 00:40:48,612
There are so many people who are using AI
and building their own agents
784
00:40:48,612 --> 00:40:51,482
or creating,
you know, side businesses on the weekends
785
00:40:51,482 --> 00:40:52,983
who could also be resources for this.
786
00:40:52,983 --> 00:40:57,421
And that's the culture that will create
new opportunities, new business models.
787
00:40:57,688 --> 00:40:58,656
And they're going to learn
788
00:40:58,656 --> 00:41:02,493
what these new paradigms will look like
by doing the work in that space.
789
00:41:02,860 --> 00:41:05,729
That then can be diffused
across the organization.
790
00:41:05,729 --> 00:41:07,731
And that's the second most important part.
791
00:41:07,731 --> 00:41:10,968
Once you have the knowledge,
do you have the infrastructure
792
00:41:10,968 --> 00:41:14,104
set up to diffuse
that knowledge as fast as possible
793
00:41:14,305 --> 00:41:17,141
and as thoroughly as possible
across the organization?
794
00:41:17,141 --> 00:41:18,409
Otherwise, it just sits there.
795
00:41:18,409 --> 00:41:20,611
It gets compartmentalized, compartmentalized.
It dies on the vine.
796
00:41:22,012 --> 00:41:22,446
Right.
797
00:41:22,446 --> 00:41:22,980
And I'm glad.
798
00:41:22,980 --> 00:41:25,983
I'm glad, Ian, that you used the word culture.
799
00:41:26,450 --> 00:41:28,652
Because I'm curious.
We talked about Martec's Law.
800
00:41:28,652 --> 00:41:30,120
We talked about the need to.
801
00:41:30,120 --> 00:41:33,290
I don't know if we used these words,
but bend the curve upward
802
00:41:33,591 --> 00:41:36,660
to try to keep pace with technology,
to try to compete,
803
00:41:37,194 --> 00:41:39,296
you know, to what degree does
culture play a role there?
804
00:41:39,296 --> 00:41:42,333
Is it the most important thing?
Is it in the top, like?
805
00:41:42,433 --> 00:41:45,102
and if it's not the most important,
what is the most important?
806
00:41:45,102 --> 00:41:49,640
I do think it's the most important thing
because if you don't have a culture
807
00:41:49,640 --> 00:41:53,777
or can't create a culture
that is willing to lean in and say, hey,
808
00:41:54,178 --> 00:41:57,781
things are going to look so different
in the next couple of years
809
00:41:57,781 --> 00:41:59,383
that we won't even recognize it.
810
00:41:59,383 --> 00:42:01,719
It's up to us to make that change.
811
00:42:01,719 --> 00:42:04,922
You're not going to get there
if everyone is waiting for the vision
812
00:42:04,922 --> 00:42:08,225
to be given to them to take action,
it's already too late.
813
00:42:08,859 --> 00:42:11,562
And that's one of the biggest challenges,
in a lot of organizations.
814
00:42:11,562 --> 00:42:14,565
We built this expectation
that when the CEO
815
00:42:14,565 --> 00:42:17,835
gives the vision,
then people act. And people stand there waiting.
816
00:42:18,335 --> 00:42:23,073
If people aren't leaning in and saying,
I'm in R&D too, like, I'm actively
817
00:42:23,073 --> 00:42:26,310
in research and development of what
my own role looks like in my organization.
818
00:42:26,477 --> 00:42:28,746
What my own profession looks like.
819
00:42:28,746 --> 00:42:31,382
because you're going to encounter this
no matter what role
820
00:42:31,382 --> 00:42:34,051
you have or what company you work at,
if you go work somewhere else,
821
00:42:34,051 --> 00:42:35,586
It's still going to find you.
822
00:42:35,586 --> 00:42:38,589
So we as individuals
have to take ownership over this
823
00:42:38,689 --> 00:42:40,491
if we want to maintain relevance
in this space.
824
00:42:40,491 --> 00:42:42,493
This is not an us versus them,
up versus down,
825
00:42:42,493 --> 00:42:45,429
organization versus individual issue.
It's a collective one, right?
826
00:42:45,429 --> 00:42:49,800
So if that's recognized in a healthy way
within an organization,
827
00:42:49,800 --> 00:42:53,904
that creates a camaraderie, a collectivism
that can move an organization forward,
828
00:42:55,406 --> 00:42:55,806
right?
829
00:42:55,806 --> 00:42:58,042
If everybody's kind of,
830
00:42:58,042 --> 00:43:01,245
sorry, I'm going to use the silly like,
you know, the silly idiom
831
00:43:01,245 --> 00:43:02,613
about rowing in the same direction,
832
00:43:02,613 --> 00:43:05,215
but having that purpose
and everybody kind of banding
833
00:43:05,215 --> 00:43:06,417
together to move the organization.
834
00:43:06,417 --> 00:43:09,420
But it's so true and so, so spot on. Yeah.
835
00:43:09,987 --> 00:43:13,958
I want to, you know, with that,
I want to come back to, you know, another
836
00:43:14,325 --> 00:43:16,961
kind of quote you had which is moving
from insight to foresight, which
837
00:43:16,961 --> 00:43:21,999
I love, by the way. Where does foresight
come from, and can it come from AI?
838
00:43:22,299 --> 00:43:25,302
Because my sense is,
it's like a lot of these tools,
839
00:43:25,569 --> 00:43:27,871
they can summarize what's already known.
Right?
840
00:43:27,871 --> 00:43:32,042
Like they aren't necessarily taking
you forward; they're just telling you
841
00:43:32,176 --> 00:43:35,446
the sum of what we know up to this point.
So where does foresight come from?
842
00:43:35,446 --> 00:43:37,581
Right. So I would actually disagree
with that a little bit.
843
00:43:38,616 --> 00:43:39,049
Okay.
844
00:43:39,049 --> 00:43:44,288
My perspective is that large
language models are commoditizing. Like,
845
00:43:44,321 --> 00:43:46,824
if you just use the large language models,
846
00:43:46,824 --> 00:43:49,493
there's a period of probably
two more years where
847
00:43:49,493 --> 00:43:52,496
you can have an advantage over
many of your colleagues,
848
00:43:52,730 --> 00:43:57,201
but over time, it's just gonna be like,
I use email. Big whoop.
849
00:43:57,201 --> 00:43:58,502
So does everybody else.
850
00:43:58,502 --> 00:44:00,404
It's literally a commodity.
851
00:44:00,404 --> 00:44:03,140
And the difference is going to be
how do you use it?
852
00:44:03,140 --> 00:44:05,909
And then how have you encoded
your knowledge into doing that.
853
00:44:05,909 --> 00:44:09,179
But specifically on the foresight piece,
it's
854
00:44:09,179 --> 00:44:12,650
about searching for signals
and how you combine things as the user.
855
00:44:12,650 --> 00:44:15,653
This is a place where
we're still very much in control.
856
00:44:15,753 --> 00:44:18,422
I don't think this is the kind of thing
you want to automate.
857
00:44:18,422 --> 00:44:21,125
What you can't automate
is searching for signals of change.
858
00:44:21,125 --> 00:44:24,128
So foresight is really about finding
those,
859
00:44:24,762 --> 00:44:27,798
those data points that are outside
of the normal distribution that say,
860
00:44:28,232 --> 00:44:29,066
this is different.
861
00:44:29,066 --> 00:44:32,870
Like, you should pay attention
over here and validate whether or not
862
00:44:32,870 --> 00:44:35,906
this is something that you should be
investing your time in or concerned about
863
00:44:36,473 --> 00:44:37,808
and in foresight.
864
00:44:37,808 --> 00:44:39,143
If you talk about formal foresight,
865
00:44:39,143 --> 00:44:41,278
there's probable, plausible and possible
futures.
866
00:44:41,278 --> 00:44:42,346
There's a whole bunch of
867
00:44:42,346 --> 00:44:45,482
structure around it to create really good
thinking about what's to come.
868
00:44:46,083 --> 00:44:48,786
But I think that can overwhelm
and overcomplicate people.
869
00:44:48,786 --> 00:44:51,989
We all have a responsibility
to think about foresight.
870
00:44:52,156 --> 00:44:53,824
What does my role look like
871
00:44:53,824 --> 00:44:56,627
in a world where I actually don't
have to go to six meetings a day?
872
00:44:56,627 --> 00:44:57,928
Oh my gosh, sounds amazing.
873
00:44:57,928 --> 00:44:59,430
But what's my new responsibility?
874
00:44:59,430 --> 00:45:01,131
Because it's more on me now, right?
875
00:45:01,131 --> 00:45:05,035
And like not thinking that through puts
you on your back foot
876
00:45:05,536 --> 00:45:08,772
and it makes you subservient
to the vision of whatever else
877
00:45:08,772 --> 00:45:10,374
is happening around you.
878
00:45:10,374 --> 00:45:12,810
So when it comes to foresight,
we should be thinking about,
879
00:45:12,810 --> 00:45:15,179
well, what if this happened?
How would I react?
880
00:45:15,179 --> 00:45:18,382
And it's not about fortune
telling or predicting the future.
881
00:45:19,049 --> 00:45:22,219
It's about seeing the signals and patterns
that are starting to arrive
882
00:45:22,653 --> 00:45:26,223
and understanding the scenarios
of how might that affect me?
883
00:45:26,490 --> 00:45:27,691
How might I react
884
00:45:27,691 --> 00:45:30,661
so that when something does actually come
your way, you can say, oh, you know,
885
00:45:30,994 --> 00:45:34,965
I've seen something that looks like this
before, or this rhymes with something else
886
00:45:34,965 --> 00:45:38,702
we've already thought through,
and you're adapting versus reacting.
887
00:45:38,702 --> 00:45:41,071
You're proactive versus reactive.
888
00:45:41,071 --> 00:45:44,374
And I think the best organizations in
the world do an enormous amount of that.
889
00:45:44,875 --> 00:45:48,679
The ones that don't are the ones
that really do get caught by surprise.
890
00:45:48,679 --> 00:45:51,415
And we see a lot of enterprises
in that space right now.
891
00:45:51,415 --> 00:45:56,253
But I do think that
foresight is where we need to lean in,
892
00:45:56,487 --> 00:45:59,289
because it's also where we can have
a lot more of our human agency
893
00:45:59,289 --> 00:46:03,627
using the models and the AI
and the tools to bring the data into us,
894
00:46:03,627 --> 00:46:05,462
to help us identify
895
00:46:05,462 --> 00:46:08,198
what's different and say,
what does it actually mean to me?
896
00:46:08,198 --> 00:46:11,702
And using it as a thought partner
in getting to clarity.
897
00:46:12,903 --> 00:46:13,637
Right.
898
00:46:13,637 --> 00:46:16,874
So, you know, on the note
of what it means to me and,
899
00:46:17,141 --> 00:46:19,943
to be honest, you know, I'm surprised
900
00:46:19,943 --> 00:46:22,946
and very intrigued to hear you say that
more of the foresight
901
00:46:23,180 --> 00:46:26,617
can be done by these tools
than maybe we imagined before.
902
00:46:26,917 --> 00:46:31,688
And I'm curious, and this is sort of a
self-serving question, for for both of us.
903
00:46:31,688 --> 00:46:37,561
But what role does an organization
like Signal and Cipher play in this world?
904
00:46:37,561 --> 00:46:39,463
Right. Is this something
that all the tools can do?
905
00:46:39,463 --> 00:46:40,998
I can do it myself?
906
00:46:40,998 --> 00:46:43,333
You know, we don't need partners
to help us with this.
907
00:46:43,333 --> 00:46:46,336
Where does an organization like yours
come into play to help
908
00:46:46,603 --> 00:46:49,640
actually accelerate,
you know, traditional enterprises?
909
00:46:49,640 --> 00:46:50,340
Absolutely.
910
00:46:50,340 --> 00:46:53,177
So, to create a little more clarity
around that:
911
00:46:53,177 --> 00:46:56,180
I don't believe that the AI tools can
do this on their own.
912
00:46:56,613 --> 00:47:00,150
But I think that they can facilitate
our own work in coming to
913
00:47:00,150 --> 00:47:03,153
an understanding of what possible
and plausible futures could look like.
914
00:47:03,420 --> 00:47:07,291
So a lot of the research
that one would do for, you know,
915
00:47:07,357 --> 00:47:09,560
future scanning and signal scanning
916
00:47:09,560 --> 00:47:13,363
to find these opportunities
we should be looking into can be massively
917
00:47:13,363 --> 00:47:16,800
accelerated, scaled
and assisted using large language models.
918
00:47:16,800 --> 00:47:19,503
It's still up to us to say,
how does this fit with my strategy?
919
00:47:19,503 --> 00:47:23,040
How does this fit with the market dynamics
that I'm seeing play out?
920
00:47:24,007 --> 00:47:24,474
So it
921
00:47:24,474 --> 00:47:27,511
expands and augments
our capability to do it, and it makes it
922
00:47:27,511 --> 00:47:31,648
so people who are unfamiliar with
this can dive in even faster.
923
00:47:31,815 --> 00:47:34,818
So it's less intimidating
to get into that space.
924
00:47:34,952 --> 00:47:38,822
There will be more and more automation
of that as time goes on.
925
00:47:39,156 --> 00:47:43,627
And who knows, maybe in five to seven years
we could actually say, okay,
926
00:47:43,627 --> 00:47:45,562
just go run and create,
927
00:47:45,562 --> 00:47:47,397
you know, do a Monte
Carlo simulation times
928
00:47:47,397 --> 00:47:49,199
a thousand for me
and pull this all together
929
00:47:49,199 --> 00:47:52,202
and then give me that data
and tell me what my future looks like.
930
00:47:52,302 --> 00:47:54,304
I don't think it'll be that simple, but
931
00:47:54,304 --> 00:47:57,307
there could be a world that looks like that,
932
00:47:57,374 --> 00:48:00,244
but for the work that we're doing,
933
00:48:00,244 --> 00:48:04,848
our focus is on helping organizations
make that part of their culture.
934
00:48:04,848 --> 00:48:06,316
So it comes from training;
935
00:48:06,316 --> 00:48:08,919
it comes from building that data layer
that goes on top of
936
00:48:08,919 --> 00:48:11,321
a large language model,
encoding their knowledge
937
00:48:11,321 --> 00:48:14,057
so that they can understand
how the signals that come in from
938
00:48:14,057 --> 00:48:15,926
the outside
world are going to impact them.
939
00:48:15,926 --> 00:48:17,861
How might they respond to that?
940
00:48:17,861 --> 00:48:20,864
And also scaling
the internal workings of the organization
941
00:48:20,864 --> 00:48:23,767
so they can be more efficient
and effective.
942
00:48:23,767 --> 00:48:26,770
Things we talked about at the very
beginning, we're not eliminating that,
943
00:48:27,070 --> 00:48:30,440
but what we're doing
is expanding the capacity and ability
944
00:48:30,440 --> 00:48:33,443
for them to operate in spaces
they never could have before.
945
00:48:33,443 --> 00:48:38,215
So where that changes is, organizations
might say, well, okay, I can use this.
946
00:48:38,215 --> 00:48:41,385
And now I only need a three person
marketing team instead of a 25 person
947
00:48:41,385 --> 00:48:43,620
marketing team. Company A might do that.
948
00:48:43,620 --> 00:48:47,491
Company B might say, hey,
for the last three years we probably had
949
00:48:47,491 --> 00:48:52,296
if we go back and look at our backlog,
300 or 400 products or projects
950
00:48:52,462 --> 00:48:55,799
that we would love to pursue, tested
and gotten data on,
951
00:48:56,133 --> 00:48:59,703
but there's no way we'd
have the team of 500 that we would need.
952
00:48:59,970 --> 00:49:01,538
But you know what?
953
00:49:01,538 --> 00:49:06,143
Now we could do that with 25 and all of a
sudden running simulations, right?
954
00:49:06,176 --> 00:49:09,613
Putting products together, testing things,
getting that data becomes
955
00:49:09,613 --> 00:49:12,616
possible
at an enterprise scale from a small team.
956
00:49:12,916 --> 00:49:16,353
And you're exploring new opportunities
in unknown territories.
957
00:49:16,987 --> 00:49:20,290
So you have this expansive
mindset versus a contracting mindset.
958
00:49:20,490 --> 00:49:23,427
One works very well
with an industrialist, capitalist mindset.
959
00:49:23,427 --> 00:49:25,796
One works
very well in a time of transition
960
00:49:25,796 --> 00:49:28,832
where new markets, economies, and form
factors are starting to develop.
961
00:49:28,932 --> 00:49:30,567
But we don't know what they look like yet.
962
00:49:30,567 --> 00:49:34,104
So that's what we encourage companies
to do, is say, rather than saying,
963
00:49:34,104 --> 00:49:38,408
you're going to lay off 50% of your workforce
and do more with less, do more
964
00:49:38,408 --> 00:49:43,213
with the same expansive skill sets, right,
expand capacity, expand possibility.
965
00:49:43,613 --> 00:49:45,816
And that is really where
we see the most value and
966
00:49:46,883 --> 00:49:48,051
I want to talk.
967
00:49:48,051 --> 00:49:51,388
I love the dual approach there.
968
00:49:51,388 --> 00:49:53,323
And it makes complete sense
to me.
969
00:49:53,323 --> 00:49:57,361
I'm curious, with the second scenario
you talked about where it's 25 people
970
00:49:57,361 --> 00:50:01,064
or 500 new products or tests or what
have you.
971
00:50:02,065 --> 00:50:03,433
One of the things I've found,
972
00:50:03,433 --> 00:50:06,603
and I'm curious if you've seen it too,
or you've seen something different, is
973
00:50:07,537 --> 00:50:11,908
in this emerging world where technology
isn't the limiting factor anymore.
974
00:50:11,908 --> 00:50:16,013
And it's like, if you can dream it,
you can build it, you can test it
975
00:50:16,680 --> 00:50:21,752
at some point, the like,
the bottleneck becomes the market,
976
00:50:21,752 --> 00:50:26,023
or it becomes your staff,
or in some sense, it's people's ability
977
00:50:26,023 --> 00:50:29,026
to, like,
actually try and digest new things.
978
00:50:29,493 --> 00:50:35,198
And my sense is you have to still get back
to like prioritization in some capacity.
979
00:50:35,198 --> 00:50:38,535
Because even if the technology
can give you 500 new things,
980
00:50:38,735 --> 00:50:40,670
you're going to be limited somewhere.
981
00:50:40,670 --> 00:50:44,641
Do you buy that
and what are the implications?
982
00:50:44,641 --> 00:50:46,076
I absolutely do.
983
00:50:46,076 --> 00:50:49,479
I think that what that does is
it shows another weakness
984
00:50:49,479 --> 00:50:53,116
in the current paradigm for the future
that we're trying to create.
985
00:50:53,483 --> 00:50:57,954
We go through these phases of oftentimes
150, 200,
986
00:50:57,954 --> 00:51:01,224
300 years,
where the economic paradigm also shifts.
987
00:51:01,458 --> 00:51:05,829
So the one that we're currently in
is identical to the one where
988
00:51:05,829 --> 00:51:09,766
we built the steam engine and connected
geographically disparate places
989
00:51:10,467 --> 00:51:13,637
The metrics that we use are
still the same ones that we used, with
990
00:51:13,637 --> 00:51:17,107
some modifications when the steam engine
was a cutting edge technology.
991
00:51:17,474 --> 00:51:18,708
So what that does,
992
00:51:18,708 --> 00:51:21,711
it shows that the paradigm that we're in
is kind of the ultimate bias.
993
00:51:22,279 --> 00:51:24,047
A paradigm shows you what's important.
994
00:51:24,047 --> 00:51:26,383
What do you measure,
what questions are worth asking.
995
00:51:26,383 --> 00:51:28,285
And all of those are still very much
996
00:51:28,285 --> 00:51:32,055
directed towards the capitalist system
that we have now.
997
00:51:32,055 --> 00:51:33,690
And I'm not making this a capitalism
versus
998
00:51:33,690 --> 00:51:35,525
socialism versus Marxism type of argument.
999
00:51:35,525 --> 00:51:38,528
It's, what does capitalism 8.0 look like
1000
00:51:38,562 --> 00:51:42,065
in order
to start to expand its environment,
1001
00:51:42,065 --> 00:51:44,668
So these new types of businesses
could become possible.
1002
00:51:44,668 --> 00:51:47,070
So there will be fully autonomous
organizations
1003
00:51:47,070 --> 00:51:48,872
that have zero humans involved.
1004
00:51:48,872 --> 00:51:50,941
And what does that look like?
1005
00:51:50,941 --> 00:51:53,443
It's not a full replacement for humans.
1006
00:51:53,443 --> 00:51:56,446
That would be like saying
the digital office replaced paper.
1007
00:51:57,180 --> 00:51:59,249
It obviously did not.
1008
00:51:59,249 --> 00:52:02,953
Or that digital killed analog.
Analog is absolutely decreasing.
1009
00:52:02,953 --> 00:52:05,789
But it's not zero.
And it won't ever be zero in my opinion.
1010
00:52:05,789 --> 00:52:10,527
But it creates this fragmentation of what
we saw as like the dominant paradigm.
1011
00:52:10,861 --> 00:52:13,897
And it creates space for coexistence
of all these new
1012
00:52:14,131 --> 00:52:17,167
models: mental models,
operational models, and economic models.
1013
00:52:17,601 --> 00:52:19,469
We don't know what those look like
just yet
1014
00:52:19,469 --> 00:52:21,872
because we haven't seen many of them
succeed.
1015
00:52:21,872 --> 00:52:25,175
We're seeing some signals
when we look at companies that are,
1016
00:52:26,042 --> 00:52:28,612
let's say, on the lean AI leaderboard,
which is a reference I love,
1017
00:52:29,579 --> 00:52:29,913
you know, the
1018
00:52:29,913 --> 00:52:33,350
average of 3.3 million in revenue
per employee.
1019
00:52:34,284 --> 00:52:36,553
You know, time to scale
is absolutely insane.
1020
00:52:36,553 --> 00:52:39,422
And we look at these organizations
that are AI native,
1021
00:52:39,422 --> 00:52:42,259
they're starting to show what
some of those paradigms could look like.
1022
00:52:42,259 --> 00:52:46,630
If you extrapolate that from, you know,
ten people, 50 million ARR, and say,
1023
00:52:46,630 --> 00:52:49,633
what would it look like with one person,
150 million ARR?
1024
00:52:50,167 --> 00:52:53,003
what apparatus and infrastructure
would you need?
1025
00:52:53,003 --> 00:52:56,006
What would it look like to operate
as that individual?
1026
00:52:56,039 --> 00:52:57,641
And that can give you
some of those signals.
1027
00:52:57,641 --> 00:52:58,975
I was talking about earlier
1028
00:52:58,975 --> 00:53:01,978
of what possible and
plausible futures could look like.
1029
00:53:03,380 --> 00:53:05,482
But I do think that
1030
00:53:05,482 --> 00:53:09,052
how we look at capitalism
today is going to change dramatically.
1031
00:53:09,052 --> 00:53:12,689
And it's not just a technological
question, it's a social question.
1032
00:53:12,756 --> 00:53:15,192
It's an economic question,
geopolitical question.
1033
00:53:15,192 --> 00:53:17,761
And AI is all of those as well.
1034
00:53:17,761 --> 00:53:20,697
So these things are all coming together
at the same time.
1035
00:53:20,697 --> 00:53:22,065
And that's another reason
people kind of feel
1036
00:53:22,065 --> 00:53:25,802
like they're being thrown off balance
in every direction, because everything is
1037
00:53:25,802 --> 00:53:26,803
changing all at once.
1038
00:53:28,171 --> 00:53:29,105
Right?
1039
00:53:29,105 --> 00:53:32,342
So throughout this conversation,
I feel like I've started to be able
1040
00:53:32,342 --> 00:53:36,446
to put together a mosaic
of like your view of the future through,
1041
00:53:36,446 --> 00:53:39,849
you know, a series of different lenses
and also some spaces where you say,
1042
00:53:39,849 --> 00:53:42,852
you know what, there's still too much
uncertainty here.
1043
00:53:43,720 --> 00:53:45,055
Is there anything you can tell me
1044
00:53:45,055 --> 00:53:48,258
about, like your predictions
for the next 5 to 10 years
1045
00:53:48,258 --> 00:53:52,529
that we haven't covered that, like, you're
pretty confident we're going to see?
1046
00:53:52,562 --> 00:53:54,064
Yeah.
1047
00:53:54,064 --> 00:53:55,966
I would say that the paradigm around
1048
00:53:55,966 --> 00:53:59,002
training, skill sets and education
is going to change dramatically.
1049
00:53:59,369 --> 00:54:00,870
And that has profound implications
1050
00:54:00,870 --> 00:54:03,840
for the work that we do
and how we go about that.
1051
00:54:03,907 --> 00:54:04,808
One of the things I talk about
1052
00:54:04,808 --> 00:54:08,678
is skill flux, and it's this concept
that we go from this paradigm of,
1053
00:54:08,712 --> 00:54:11,248
you know, 30 years ago,
you could have a skill set
1054
00:54:11,248 --> 00:54:14,251
that lasted you 30 years before its shelf
life was up and it was obsolete.
1055
00:54:14,818 --> 00:54:19,089
Now, you know, someone like me, I had
a skill set that was, you know, ten years.
1056
00:54:19,089 --> 00:54:20,357
It was valuable.
1057
00:54:20,357 --> 00:54:24,294
I started off as a mobile strategist for
an agency.
1058
00:54:24,661 --> 00:54:26,263
You don't hire those anymore.
1059
00:54:26,263 --> 00:54:28,298
It just doesn't happen. Right?
1060
00:54:28,298 --> 00:54:29,432
You might at an enterprise level,
1061
00:54:29,432 --> 00:54:32,802
if you're like Cisco and doing software,
but not in that environment.
1062
00:54:33,136 --> 00:54:37,607
And the skill sets that are valuable
are shortening on their shelf life.
1063
00:54:37,841 --> 00:54:39,843
And for more technical skill sets,
1064
00:54:39,843 --> 00:54:42,412
those are rising
and disappearing faster than ever.
1065
00:54:42,412 --> 00:54:45,415
So now we're at like two and a half years
for technical skill set
1066
00:54:45,615 --> 00:54:48,451
and I could see that shrinking
more and more, to the point
1067
00:54:48,451 --> 00:54:52,222
where many of them rise very quickly
and are gone within six months.
1068
00:54:52,989 --> 00:54:55,458
To give you an example, I think coding
1069
00:54:55,458 --> 00:54:58,395
and prompt engineering
are two versions of that.
1070
00:54:58,395 --> 00:55:01,398
So prompt engineering became something
that was relevant about two years ago.
1071
00:55:01,731 --> 00:55:03,600
I would give that maybe a five year
1072
00:55:03,600 --> 00:55:07,437
shelf life
max before it's no longer relevant at all.
1073
00:55:07,437 --> 00:55:10,440
And we're already seeing agents
being able to take on a lot of that work.
1074
00:55:12,042 --> 00:55:14,944
But there will also be a new skill set
that you'll have to learn in order
1075
00:55:14,944 --> 00:55:18,415
to operate in a new paradigm
with new technology and new objectives.
1076
00:55:18,815 --> 00:55:22,452
So we're going to see this exponential
increase in importance and value.
1077
00:55:22,852 --> 00:55:25,989
And then a ChatGPT moment comes in and
says, that's not valuable anymore.
1078
00:55:25,989 --> 00:55:27,157
That's gone, right?
1079
00:55:27,157 --> 00:55:30,960
And that's going to have this almost like
whiplash effect for us as we go along.
1080
00:55:30,960 --> 00:55:35,098
That changes how we educate ourselves:
if we're frontloading education
1081
00:55:35,098 --> 00:55:37,434
for the first quarter of our lives,
we're out of date.
1082
00:55:37,434 --> 00:55:39,035
By the time we walk out of university.
1083
00:55:39,035 --> 00:55:42,972
And this is not a new discussion at all,
but it becomes exacerbated by that.
1084
00:55:43,206 --> 00:55:44,841
So the idea of lifelong learning,
1085
00:55:44,841 --> 00:55:47,510
you know, very cliche,
but micro credentialing,
1086
00:55:47,510 --> 00:55:50,480
we call it surge skilling, where it's like
you're actually having to get very deep
1087
00:55:50,480 --> 00:55:53,516
into something very, very quickly
to create competitive advantage.
1088
00:55:53,917 --> 00:55:57,320
And then you just know
that this is going to be less valuable
1089
00:55:57,620 --> 00:55:58,922
in a certain period of time.
1090
00:55:58,922 --> 00:56:02,392
But what is valuable
is being that first mover and creating
1091
00:56:02,392 --> 00:56:05,462
value with it as fast as possible
before it becomes obsolete.
1092
00:56:05,595 --> 00:56:08,598
So that's where I see education changing,
1093
00:56:08,698 --> 00:56:11,801
where I see people shifting their focus
for competitive advantage.
1094
00:56:12,202 --> 00:56:15,972
And this is the culture of an organization
changing too, because you're going
1095
00:56:15,972 --> 00:56:17,440
to have to keep learning on the job.
1096
00:56:17,440 --> 00:56:20,710
And the tools and the
AI that you're using will have to teach you
1097
00:56:21,077 --> 00:56:23,380
how to work with it as they change.
1098
00:56:25,382 --> 00:56:27,050
So I'm really glad, Ian,
1099
00:56:27,050 --> 00:56:28,985
that that was your answer,
because that was on my list of things
1100
00:56:28,985 --> 00:56:31,988
that I wanted to talk to you about that
we hadn't gotten to yet.
1101
00:56:32,922 --> 00:56:36,960
With that in mind, the
shortening time horizon of skills
1102
00:56:36,960 --> 00:56:42,098
you mentioned, you know, it's going
to have massive implications on education.
1103
00:56:42,232 --> 00:56:43,733
What do you see as being the risks
1104
00:56:43,733 --> 00:56:47,370
and the opportunities
for the traditional education system?
1105
00:56:48,037 --> 00:56:52,308
And also, what does it mean for like
the hiring process of organizations?
1106
00:56:52,475 --> 00:56:53,343
Absolutely.
1107
00:56:53,343 --> 00:56:56,379
So it completely disrupts
the one to many broadcast model,
1108
00:56:56,379 --> 00:56:59,382
like the idea of a teacher
standing in front of a room and speaking
1109
00:56:59,382 --> 00:57:02,385
for an hour
and a half to three hours is gone.
1110
00:57:03,420 --> 00:57:05,088
Which is great for people like me.
1111
00:57:05,088 --> 00:57:07,590
I was a terrible student,
super neurodivergent.
1112
00:57:07,590 --> 00:57:09,793
I can't sit in a class and listen
for more than five minutes.
1113
00:57:09,793 --> 00:57:11,127
I have to be engaged.
1114
00:57:11,127 --> 00:57:14,130
So what this is going to do,
it will disrupt the current model,
1115
00:57:14,164 --> 00:57:17,000
but it will make it amenable
to a much larger
1116
00:57:17,000 --> 00:57:21,438
group of people who are not built
for the more industrial,
1117
00:57:22,539 --> 00:57:25,141
manufacturing-like education model.
1118
00:57:25,141 --> 00:57:27,610
The challenge, though,
and I don't say that with
1119
00:57:27,610 --> 00:57:31,214
any malice towards
teachers and educators.
1120
00:57:31,214 --> 00:57:34,451
They are some of the most under-resourced,
overtaxed, and over-
1121
00:57:34,484 --> 00:57:36,319
expected people in the world.
1122
00:57:36,319 --> 00:57:39,322
Then you take a look at the dynamic
in the US and how hostile it is.
1123
00:57:39,522 --> 00:57:42,325
I have so much empathy
for people who choose
1124
00:57:42,325 --> 00:57:45,328
to go into a life of service
for the next generation.
1125
00:57:45,462 --> 00:57:48,998
We need to be spending
a lot more time and money in that space.
1126
00:57:48,998 --> 00:57:51,501
And I make a comment in my keynote
where I say
1127
00:57:51,501 --> 00:57:54,504
the L&D budget should be as big
as your technology budget,
1128
00:57:55,338 --> 00:57:57,841
and that kind of makes people
look bug-eyed at me.
1129
00:57:57,841 --> 00:57:58,741
Like what?
1130
00:57:58,741 --> 00:57:59,843
Like, what are we spending?
1131
00:57:59,843 --> 00:58:01,811
So we're spending trillions of dollars
on technology.
1132
00:58:01,811 --> 00:58:04,080
I love that, I love that, yeah.
1133
00:58:04,080 --> 00:58:07,884
But if we don't, like, the technology's
moving faster than any other sector,
1134
00:58:07,884 --> 00:58:09,619
faster than the economy, faster than
1135
00:58:09,619 --> 00:58:11,955
society is moving,
faster than education's moving.
1136
00:58:11,955 --> 00:58:16,025
And if we truly want to understand
where humans play in that picture,
1137
00:58:16,626 --> 00:58:19,462
the fact that we're investing
everything we have in technology
1138
00:58:19,462 --> 00:58:22,832
has already indicated our preference
for technology over humans,
1139
00:58:23,800 --> 00:58:26,669
so that math has to balance out a bit.
1140
00:58:26,669 --> 00:58:31,174
We have to figure out how we invest
so much more into education,
1141
00:58:31,174 --> 00:58:33,243
not so much less.
1142
00:58:33,243 --> 00:58:36,179
And until we do that,
we are going to be behind the ball.
1143
00:58:36,179 --> 00:58:38,214
We are going to have a target on our back
in many ways,
1144
00:58:38,214 --> 00:58:41,751
because if the paradigms don't change,
the technology just gets better.
1145
00:58:42,018 --> 00:58:44,087
We're going to suffer the consequences.
1146
00:58:44,087 --> 00:58:47,524
But if we put ourselves
front and center of that equation,
1147
00:58:47,857 --> 00:58:50,860
we have the chance and the opportunity
to figure that out.
1148
00:58:51,594 --> 00:58:53,396
Right? It's wow.
1149
00:58:53,396 --> 00:58:57,467
Yeah, it's,
it's as you said, like, this is not
1150
00:58:58,535 --> 00:58:59,569
an incremental shift.
1151
00:58:59,569 --> 00:59:03,172
This is like a complete disruption
of the model from end to end,
1152
00:59:03,473 --> 00:59:04,440
without a doubt.
1153
00:59:04,440 --> 00:59:07,577
And even for people who live and breathe
it like it's overwhelming for me,
1154
00:59:07,977 --> 00:59:10,413
I do this 24/7.
I love it, I'm passionate about it.
1155
00:59:10,413 --> 00:59:12,148
I'm excited about where we're going
1156
00:59:12,148 --> 00:59:15,151
and net net,
I'm optimistic about the long term future,
1157
00:59:15,818 --> 00:59:19,022
but we are all pioneers right now,
whether we want to be or not.
1158
00:59:19,556 --> 00:59:23,026
And we've kind of bastardized
the term pioneer,
1159
00:59:23,026 --> 00:59:25,194
we've made it seem like,
oh, it's Richard Branson on the cover
1160
00:59:25,194 --> 00:59:28,197
of Entrepreneur magazine with his billions
of dollars of success, like
1161
00:59:28,231 --> 00:59:30,867
he was a pioneer at one point in time.
1162
00:59:30,867 --> 00:59:34,170
But yeah, pioneers go
through really hard shit
1163
00:59:34,170 --> 00:59:37,574
and they go to places
where there's no infrastructure.
1164
00:59:37,740 --> 00:59:41,477
They suffer the consequences of,
you know, decisions that they didn't know
1165
00:59:41,477 --> 00:59:42,211
they'd have to make.
1166
00:59:42,211 --> 00:59:45,181
They are attacked by the environment
that they're in.
1167
00:59:45,181 --> 00:59:47,584
Nature tries to kill them
in a number of different ways. Yeah.
1168
00:59:47,584 --> 00:59:51,321
And as a super resilient species,
we still make a way forward.
1169
00:59:51,554 --> 00:59:54,157
We construct the environment
after we figure it out.
1170
00:59:54,157 --> 00:59:55,658
You know, we might show up in Hawaii
1171
00:59:55,658 --> 00:59:59,362
with no shoes on and realize, oh, crap,
I'm not properly equipped for this.
1172
00:59:59,629 --> 01:00:02,332
And then we figure a way out. That pattern,
1173
01:00:02,332 --> 01:00:03,433
the time to go from
1174
01:00:03,433 --> 01:00:07,170
not knowing to knowing, can be really hard,
painful, and challenging.
1175
01:00:07,437 --> 01:00:11,541
But the way we thrive
once we do is absolutely amazing.
1176
01:00:11,874 --> 01:00:15,278
So I would say that we are going
to have amazing things happen,
1177
01:00:15,578 --> 01:00:19,082
but we're also going to have to encounter
some really tough growing pains
1178
01:00:19,349 --> 01:00:21,417
individually
and collectively to get there.
1179
01:00:21,417 --> 01:00:24,187
So if anyone is saying otherwise,
it's absolutely smoke and mirrors,
1180
01:00:25,421 --> 01:00:26,222
right?
1181
01:00:26,222 --> 01:00:29,192
Right. Wow. Exciting times ahead.
1182
01:00:29,292 --> 01:00:30,393
There's one more question
1183
01:00:30,393 --> 01:00:32,629
that I wanted to ask you
that I haven't had a chance to yet,
1184
01:00:32,629 --> 01:00:35,632
which is I wanted to ask you the
the inverse of what I just asked you,
1185
01:00:35,698 --> 01:00:36,633
which is, you know,
1186
01:00:36,633 --> 01:00:39,636
aside from, like, what is going to happen
and what's going to disrupt us,
1187
01:00:40,770 --> 01:00:43,373
is there anything you're hearing right
now, hype wise,
1188
01:00:43,373 --> 01:00:46,542
technology wise,
trend wise, that you're like, that's B.S.
1189
01:00:46,576 --> 01:00:48,845
like that's
not actually going to come to pass.
1190
01:00:48,845 --> 01:00:51,481
We're being sold a bill of goods.
1191
01:00:51,481 --> 01:00:54,984
Yeah, actually, I think the agents
conversation is way overhyped.
1192
01:00:56,152 --> 01:00:58,321
I think they are transformative.
1193
01:00:58,321 --> 01:01:01,324
I don't know a single organization
that is going to say, I'm
1194
01:01:01,324 --> 01:01:04,861
going to let an autonomous
series of agents run my enterprise
1195
01:01:04,861 --> 01:01:08,498
that I've spent decades building,
without the oversight necessary.
1196
01:01:08,498 --> 01:01:10,833
Like, we've been
working with agents for years,
1197
01:01:10,833 --> 01:01:14,404
and we've been building setups
where agents will work with other agents
1198
01:01:14,404 --> 01:01:17,407
and giving them autonomy and creating
virtual environments to see what happens.
1199
01:01:17,540 --> 01:01:18,975
And every time we let them run amok,
1200
01:01:20,043 --> 01:01:21,010
it's frightening.
1201
01:01:21,010 --> 01:01:24,580
Like, it is absolutely jaw-dropping.
1202
01:01:24,580 --> 01:01:26,349
Oh my gosh,
I can't believe that would have happened.
1203
01:01:26,349 --> 01:01:27,617
So glad I didn't give them
1204
01:01:27,617 --> 01:01:30,620
freedom to access real live data.
1205
01:01:30,687 --> 01:01:31,654
Yeah.
1206
01:01:31,654 --> 01:01:33,790
And that that infrastructure
needs to be built.
1207
01:01:33,790 --> 01:01:37,794
There are actions that agents can do
that are absolutely mind blowing.
1208
01:01:38,094 --> 01:01:39,529
But they're narrow.
1209
01:01:39,529 --> 01:01:41,831
They're specific. They're structured.
1210
01:01:41,831 --> 01:01:44,267
And they have strong guardrails.
1211
01:01:44,267 --> 01:01:47,303
The idea that we can kind of
have this almost,
1212
01:01:47,970 --> 01:01:50,573
reinforcement learning, you know,
give it a million different examples,
1213
01:01:50,573 --> 01:01:53,576
let it kind of, like, bang around and
figure its way through, approach
1214
01:01:53,710 --> 01:01:57,780
to unleashing it in
the organization does not work.
1215
01:01:58,781 --> 01:02:00,516
Because the infrastructure is
just not there yet.
1216
01:02:00,516 --> 01:02:03,319
It hasn't caught up
with the promise of the technology.
1217
01:02:03,319 --> 01:02:05,621
So I think we're very much at the top
of the hype cycle of agents.
1218
01:02:05,621 --> 01:02:08,458
We're going to have this crash
into the trough of disillusionment,
1219
01:02:08,458 --> 01:02:12,261
which in my opinion, is the best place
for a nascent technology to be.
1220
01:02:13,096 --> 01:02:15,264
A lot of people see that as bad,
but what it means is
1221
01:02:15,264 --> 01:02:18,134
the people who are making promises,
who don't know what they're talking about.
1222
01:02:18,134 --> 01:02:21,904
And let's face it, there's an enormous
amount of people who are rushing to find
1223
01:02:21,904 --> 01:02:25,842
the gold that have no business being here
and making promises.
1224
01:02:25,842 --> 01:02:27,944
They disappear
because now it's getting hard.
1225
01:02:27,944 --> 01:02:29,612
You actually have to deliver.
1226
01:02:29,612 --> 01:02:33,416
And in the trough of disillusionment,
it pulls all the pundits out.
1227
01:02:33,783 --> 01:02:36,319
And now the people who are committed
to doing the work,
1228
01:02:36,319 --> 01:02:38,521
who are there for the right reasons,
they get to work
1229
01:02:38,521 --> 01:02:40,890
and they build that infrastructure
that's necessary
1230
01:02:40,890 --> 01:02:43,893
to deliver on all those promises
we were making back here.
1231
01:02:43,893 --> 01:02:46,662
So it takes time,
and I'm just kind of waiting
1232
01:02:46,662 --> 01:02:49,665
for that to kind of implode on itself
1233
01:02:49,732 --> 01:02:52,335
and for people to see like,
oh yeah, these are very, very powerful.
1234
01:02:52,335 --> 01:02:54,637
This is absolutely
the paradigm of the future.
1235
01:02:54,637 --> 01:02:57,840
But the future is still the future,
not the present.
1236
01:02:57,907 --> 01:02:59,308
We need to get there first.
1237
01:03:00,676 --> 01:03:01,310
Right?
1238
01:03:01,310 --> 01:03:02,478
I love that answer.
1239
01:03:02,478 --> 01:03:07,717
And, I think it's so appropriate right now
given where we are in that hype cycle.
1240
01:03:07,717 --> 01:03:10,887
So, thank you for taking some of the air
out of that one.
1241
01:03:11,053 --> 01:03:11,854
That's awesome.
1242
01:03:11,854 --> 01:03:13,589
One of the things I can
maybe put a finer
1243
01:03:13,589 --> 01:03:17,093
point on is the metrics piece, and that is
the expansive versus constructive.
1244
01:03:17,460 --> 01:03:20,463
So what I was talking about earlier,
as I was mentioning,
1245
01:03:20,830 --> 01:03:23,766
a lot of teams are going to say
let's do more with less.
1246
01:03:23,766 --> 01:03:26,469
Let's pull back the number of resources
we have and get along
1247
01:03:26,469 --> 01:03:29,972
and have higher efficiencies, greater
margins and better stockholder returns.
1248
01:03:30,473 --> 01:03:34,277
And a challenge that we have
is we're moving into a paradigm
1249
01:03:34,277 --> 01:03:37,313
that is going to reshape
what matters,
1250
01:03:37,313 --> 01:03:40,316
and what's valued in the work
that we do is going to be different.
1251
01:03:40,516 --> 01:03:43,886
But when what matters changes,
and the metrics do
1252
01:03:43,886 --> 01:03:47,223
not and the incentives do not. Yeah, yeah.
1253
01:03:47,223 --> 01:03:50,726
That means you run right into a paradigm
that is going to push back on
1254
01:03:50,726 --> 01:03:54,096
you and potentially hurt you
as an organization.
1255
01:03:54,564 --> 01:03:57,700
So we encourage organizations
in time of change
1256
01:03:58,234 --> 01:04:01,103
to also understand
how is this going to change the incentives
1257
01:04:01,103 --> 01:04:04,440
and the metrics that are used to measure
that change as it's happening.
1258
01:04:04,707 --> 01:04:08,311
So we're thinking more about growth
metrics, metrics of innovation, metrics
1259
01:04:08,311 --> 01:04:12,949
that are about charting the unknown versus
optimizing the known.
1260
01:04:13,382 --> 01:04:16,686
We've come from a paradigm of optimizing
the known for the last 150 years.
1261
01:04:16,686 --> 01:04:18,187
We're really good at it.
1262
01:04:18,187 --> 01:04:21,290
The problem is: how much is known
about the next five years?
1263
01:04:21,791 --> 01:04:25,862
So if we're doing 95% of our metrics
on optimizing the known,
1264
01:04:25,862 --> 01:04:29,732
5% on exploring the unknown,
that means you're already out of date.
1265
01:04:30,399 --> 01:04:33,536
If we're starting to push more of that
towards exploring and charting
1266
01:04:33,536 --> 01:04:37,974
this unknown territory, this makes us more
prepared for what's going to be coming.
1267
01:04:37,974 --> 01:04:41,244
This gives us the opportunity
to think about innovation quotient
1268
01:04:41,611 --> 01:04:45,615
and knowledge diffusion across
the organization, building the structures
1269
01:04:45,781 --> 01:04:48,818
that will make you resilient
in this future paradigm.
1270
01:04:49,118 --> 01:04:52,121
Because right
now, optimization and scale,
1271
01:04:52,421 --> 01:04:55,558
by definition, require
some calcification of the organization.
1272
01:04:55,558 --> 01:04:59,395
It needs to be rigid in some ways
in order to be efficient.
1273
01:05:00,029 --> 01:05:03,933
And rigidness against
an oncoming wave is a recipe for disaster.
1274
01:05:04,200 --> 01:05:06,702
So that's one of the things
we encourage organizations to think about.
1275
01:05:06,702 --> 01:05:11,007
And we get very deep into
what is that metric, what matters for you?
1276
01:05:11,007 --> 01:05:13,209
How is it specific to your context?
1277
01:05:13,209 --> 01:05:14,610
What are the things that you measure?
1278
01:05:14,610 --> 01:05:16,646
How do you actually do that work?
1279
01:05:16,646 --> 01:05:20,082
And when that clarity is there, all of a
sudden it goes from "well, we don't know
1280
01:05:20,082 --> 01:05:23,586
what the future brings" to "at least
we know how to move in that direction."
1281
01:05:25,054 --> 01:05:25,888
Right.
1282
01:05:25,888 --> 01:05:28,891
And with respect to the fact that I'm sure
there's lots of different metrics
1283
01:05:28,891 --> 01:05:30,393
for different organizations,
1284
01:05:30,393 --> 01:05:34,830
is the answer to like,
just move to a new set of hard metrics
1285
01:05:34,997 --> 01:05:38,534
or get more comfortable with the notion
that we need to be flexible
1286
01:05:39,001 --> 01:05:42,271
and measure things with a little bit
more flexibility than we have in the past.
1287
01:05:42,271 --> 01:05:44,674
Absolutely.
That that's typically the first step.
1288
01:05:44,674 --> 01:05:48,811
You never want to shift entirely because
you kind of want to leave what's working,
1289
01:05:48,811 --> 01:05:49,645
working.
1290
01:05:49,645 --> 01:05:52,648
So we don't say you've measured this way,
don't do that anymore,
1291
01:05:52,848 --> 01:05:56,285
but at least a portion of the work
that's being done needs to be done
1292
01:05:56,285 --> 01:05:57,653
in this forward facing way.
1293
01:05:57,653 --> 01:06:00,656
And that type of work
needs to be measured differently.
1294
01:06:00,890 --> 01:06:04,293
Because if you were measuring,
for example, a lot of teams,
1295
01:06:04,627 --> 01:06:07,563
a lot of tiger teams, a lot of innovation
teams are measured by ROI
1296
01:06:07,563 --> 01:06:10,733
on their first run,
which is mind boggling to me.
1297
01:06:10,733 --> 01:06:11,434
Okay, right.
1298
01:06:11,434 --> 01:06:14,704
You're going to have impact on margin
the first time you touch ChatGPT?
1299
01:06:15,705 --> 01:06:18,040
No, that doesn't happen.
1300
01:06:18,040 --> 01:06:20,309
And yeah,
I think that's almost a bad example
1301
01:06:20,309 --> 01:06:22,244
because that's just ludicrous
on all levels.
1302
01:06:22,244 --> 01:06:24,246
But if you're thinking
about scale efficiency,
1303
01:06:24,246 --> 01:06:27,883
margin impacts on things
that are by definition going to require
1304
01:06:27,883 --> 01:06:32,021
investment and time,
you're already impeding the work
1305
01:06:32,021 --> 01:06:35,024
that is going to help you
explore unknown territory.
1306
01:06:35,291 --> 01:06:37,259
Yeah. No, it's super interesting.
1307
01:06:37,259 --> 01:06:40,730
And yeah, I'm sure we could talk
for another hour just on that.
1308
01:06:41,397 --> 01:06:42,631
With that in mind, though,
1309
01:06:42,631 --> 01:06:45,601
you know, I did want to say a big
thank you for joining me today.
1310
01:06:45,801 --> 01:06:47,370
This has been super, super interesting.
1311
01:06:47,370 --> 01:06:48,637
And it's honestly been a real treat.
1312
01:06:48,637 --> 01:06:52,441
I talk to a lot of people in this space,
and I'm just continuously blown away
1313
01:06:52,441 --> 01:06:55,411
by the breadth
and the depth of insights that you have.
1314
01:06:55,778 --> 01:06:57,613
in this space. So I wanted to say a big
thank you.
1315
01:06:57,613 --> 01:06:59,682
Thanks, Geoff. It's been an honor to join
you. I really enjoyed it.
you. I really enjoyed it.



The Next Industrial Revolution Is Already Here
Digital Disruption is where leaders and experts share their insights on using technology to build the organizations of the future. As intelligent technologies reshape our lives and our livelihoods, we speak with the thinkers and the doers who will help us predict and harness this disruption.