Our Guest Dr. Anne-Marie Imafidon Discusses
Is AI Eroding Identity? Future of Work Expert on How AI Is Taking More Than Jobs
What does the future of work really look like when AI, identity, and culture collide?
On this episode of Digital Disruption, we’re joined by Dr. Anne-Marie Imafidon, Chair of the Institute for the Future of Work.
Anne-Marie is a leading voice in the tech world, known for her work as a trustee at the Institute for the Future of Work and as the temporary Arithmetician on Channel 4’s Countdown. A former child prodigy who passed A-level computing at 11 and earned a Master’s in Maths and Computer Science from Oxford by 20, she has since spoken globally for companies including Facebook, Amazon, Google and Mastercard. She hosts the acclaimed Women Tech Charge podcast and is a sought-after presenter who has interviewed figures such as Jack Dorsey and Sir Lewis Hamilton. Anne-Marie has received multiple Honorary Doctorates, serves on several national boards, and continues to champion diversity and innovation in tech. Her latest book, She’s In CTRL, was published in 2022.
Dr. Anne-Marie joins Geoff to break down how AI, big data, quantum, and the wider “Fourth Industrial Revolution” are transforming jobs, workplaces, identity, culture, and society. From redefining long-held beliefs about “jobs for life,” to the cultural fractures emerging between companies, workers, and society, Dr. Anne-Marie goes deep on what’s changing, what still isn’t understood, and what leaders must do right now to avoid being left behind. This conversation dives into why most AI use cases are still limited to fraud detection and customer service, and the hidden cultural blockers preventing real transformation. She emphasizes the danger of hype cycles and how to stay focused on real value and build organizations that can experiment, learn, and make “high-quality mistakes.”
00;00;00;11 - 00;00;01;01
Hey everyone!
00;00;01;01 - 00;00;02;23
I'm super excited to be sitting down
00;00;02;23 - 00;00;06;20
with Anne-Marie Imafidon, chair
of the Institute for the Future of Work.
00;00;06;28 - 00;00;10;23
What's so cool about Anne-Marie
is that she does it all and sees it all.
00;00;11;01 - 00;00;14;07
Not only is she leading the charge
on the future of work, but she's a tech
00;00;14;07 - 00;00;18;02
CEO, computer scientist,
and former child prodigy in her own right.
00;00;18;13 - 00;00;22;25
I want to ask her how she sees the nature
of work changing, what role AI plays
00;00;22;28 - 00;00;25;28
and what each of us can do
so that we're not left behind.
00;00;26;00 - 00;00;29;00
Let's find out.
00;00;30;07 - 00;00;31;29
I'm here with Anne-Marie Imafidon.
00;00;31;29 - 00;00;34;07
Anne-Marie,
thanks so much for joining today.
00;00;34;07 - 00;00;36;08
Maybe just to get things started,
I wanted to talk a little bit
00;00;36;08 - 00;00;39;23
about some of the work you're doing
with the Institute for the Future of Work.
00;00;39;28 - 00;00;42;21
I know you're a trustee with that organization.
00;00;42;21 - 00;00;45;11
Maybe just to kick things off, from your perspective,
00;00;45;11 - 00;00;47;18
what are you seeing
in terms of the future of work?
00;00;47;18 - 00;00;50;11
What's your outlook, what's changing, and what do we need to know?
00;00;51;10 - 00;00;52;01
A lot.
00;00;52;01 - 00;00;54;28
Lots of things that we're seeing, lots
of things that we're still figuring out.
00;00;54;28 - 00;00;58;20
I think we could say as an institute,
we're working through the details.
00;00;59;01 - 00;01;02;29
As different things
come onto the horizon, as we start
00;01;02;29 - 00;01;07;18
to explore different uses of what we're
calling the kind of fourth industrial
00;01;07;19 - 00;01;11;28
revolution technologies coming on stream.
00;01;11;28 - 00;01;15;14
Not just AI, but of course, big data,
quantum, lots of other bits and pieces.
00;01;15;14 - 00;01;18;06
So in terms of what we're seeing,
I think there's
00;01;18;06 - 00;01;21;28
a great kind of figuring out of how we're making all of these
00;01;21;28 - 00;01;24;28
things work for the workforce.
00;01;24;28 - 00;01;28;20
But I think there are two big things that I like to explore
00;01;28;27 - 00;01;31;04
with folks kind of based off our work.
00;01;31;04 - 00;01;34;22
One is the levels
at which these impacts are being felt.
00;01;35;10 - 00;01;38;03
So a lot of the times
when we talk about the future of work,
00;01;38;03 - 00;01;41;18
we end up focusing on the worker
and on the employee.
00;01;42;00 - 00;01;44;23
Whereas actually,
00;01;44;23 - 00;01;47;22
one of the lenses that we have over the work that we do is that
00;01;47;23 - 00;01;49;04
this is not just about individuals,
00;01;49;04 - 00;01;51;25
but this is about companies,
and this is about society.
00;01;51;25 - 00;01;52;24
And so that's one of the
00;01;52;24 - 00;01;55;03
big things that we end up trying to work
00;01;55;03 - 00;01;57;21
through with folks: it's not just about what happens to an individual, but
00;01;57;21 - 00;01;58;24
what's happening to a group,
00;01;58;24 - 00;02;00;29
what's happening to a company,
or what's happening to a team,
00;02;00;29 - 00;02;03;03
what's happening to an industry, and therefore what's happening to
00;02;03;03 - 00;02;06;03
society. Those are the three levels that we look at.
00;02;06;13 - 00;02;09;21
But then the other big one, that's maybe
a little bit further off that I don't know
00;02;09;21 - 00;02;13;19
if you have got to explore
with folks on the podcast previously
00;02;13;29 - 00;02;16;29
is also this idea of identity and work,
00;02;17;18 - 00;02;19;23
and we've had this
since right at the beginning.
00;02;19;23 - 00;02;22;29
So I'm the chair now at the Institute
for the Future of Work.
00;02;23;08 - 00;02;25;10
I also run an organization called Stemettes,
00;02;25;10 - 00;02;28;12
and we work with young people,
00;02;28;12 - 00;02;31;12
young folks, young non-binary folks, young women.
00;02;31;19 - 00;02;33;22
Kind of people that have been historically marginalized
00;02;33;22 - 00;02;37;03
from the tech industry, in the STEM and STEAM industries.
00;02;37;10 - 00;02;40;25
And a couple of years ago,
we ended up working in a part of the UK
00;02;41;06 - 00;02;43;02
that historically had a lot of miners.
00;02;44;03 - 00;02;46;13
And this is like mines, like coal mines,
00;02;46;13 - 00;02;50;01
which in the UK is an industry that shut decades ago.
00;02;50;13 - 00;02;52;13
And we were working with teenagers.
00;02;52;13 - 00;02;56;03
And it kind of came back as this feedback that, yeah, these are miners' kids.
00;02;56;12 - 00;02;59;13
That's why they're having
this kind of response to the events.
00;03;00;02 - 00;03;02;15
And this is kind of early, this is kind of the 2010s.
00;03;02;15 - 00;03;05;07
And I remember
saying to a couple of people there,
00;03;05;07 - 00;03;06;10
what do you mean they're miners'
00;03;06;10 - 00;03;09;10
kids? Like, the mines shut decades ago.
00;03;10;05 - 00;03;12;07
These are teenagers, like, physically.
00;03;12;07 - 00;03;13;24
How is that...?
00;03;13;24 - 00;03;17;01
And of course, it was that they were kind of second generation.
00;03;17;06 - 00;03;22;21
Their grandparents had been miners as their entire job for life.
00;03;22;21 - 00;03;25;20
That was the entire identity that these people had.
00;03;25;20 - 00;03;28;09
And so even though these children had never seen a mine, and never in their
00;03;28;09 - 00;03;31;09
lives had they seen anyone in their lineage working in a mine.
00;03;31;18 - 00;03;35;13
That notion of work
for life was so embedded
00;03;35;23 - 00;03;39;09
that not only was it the identity
of their grandparents that were miners,
00;03;39;14 - 00;03;41;11
but now their grandchildren had it too.
00;03;41;11 - 00;03;44;27
And so it's something that when I graduated, we were told
00;03;44;27 - 00;03;48;06
that I'd have something like six roles in my career,
00;03;48;17 - 00;03;51;12
and now our relationship with work is one
that folks are kind of maybe
00;03;51;12 - 00;03;55;02
even doing multiple jobs
at the same time versus that one for life.
00;03;55;13 - 00;03;58;01
And so actually,
if we start to think of it that way,
00;03;58;01 - 00;04;00;16
what's the new identity
that people might have?
00;04;00;16 - 00;04;03;25
If work is not for life and it's not,
you know, and all these technologies
00;04;03;25 - 00;04;06;25
are enabling us
to have different relationships with work,
00;04;06;26 - 00;04;11;04
with, you know, income generation,
with what we spend our time doing.
00;04;11;13 - 00;04;14;06
And so that's the other
big overarching piece that I like
00;04;14;06 - 00;04;17;12
to kind of explore that runs away
a little bit from the technology.
00;04;18;09 - 00;04;21;04
But so much of this is about,
you know, the impact of the technology
00;04;21;04 - 00;04;24;09
rather than purely
about how it gets deployed and gets used.
00;04;25;16 - 00;04;28;04
And it's a super, super interesting piece.
00;04;28;04 - 00;04;32;04
And I had to laugh, I had a sort of sardonic thought, which is that
00;04;32;17 - 00;04;36;16
I don't think my grandchildren are going to say they're, like, podcasters' children,
00;04;36;16 - 00;04;40;00
you know. That said, if either of ours are, I hope they're not,
00;04;41;22 - 00;04;43;19
whether the miners' children thing is good or bad.
00;04;43;19 - 00;04;47;11
But, you know, Anne-Marie, what are some of the implications of that?
00;04;47;11 - 00;04;51;08
Like, what are we not seeing here
and what does it mean as a society
00;04;51;15 - 00;04;54;15
for us when we move from jobs for life?
00;04;54;15 - 00;04;58;24
Or, you know, "my work is my identity" to, I mean, it feels like the economy.
00;04;58;24 - 00;05;01;28
And, you know, tell me if you're seeing the same thing in the UK,
00;05;01;28 - 00;05;02;29
but it's not even that.
00;05;02;29 - 00;05;06;06
It's that it's shorter stints at organizations.
00;05;06;06 - 00;05;07;26
It feels like it's more part time.
00;05;07;26 - 00;05;09;14
It's more piecemeal work.
00;05;09;14 - 00;05;12;18
Like, what are you seeing
and how is that impacting people?
00;05;13;05 - 00;05;16;15
So what we're therefore seeing, what you're saying, is there's maybe
00;05;16;15 - 00;05;19;15
less, slightly
less of a sense of identity in work,
00;05;19;16 - 00;05;22;14
or there's a mismatch
where folks have been looking for that.
00;05;22;14 - 00;05;26;03
They've assumed that they have had that,
and it's no longer
00;05;26;03 - 00;05;29;11
being supported by the way
that the industry is being transformed.
00;05;29;11 - 00;05;32;24
So, you know, I don't know, as an example,
00;05;32;24 - 00;05;35;21
we can say if we look
at the legal practice, for example,
00;05;35;21 - 00;05;37;23
the way that folks kind of move
through the ranks,
00;05;37;23 - 00;05;42;26
a lot of those kind of entry
level roles are transforming and morphing.
00;05;42;26 - 00;05;45;08
And so therefore, the way
that we're evaluating folks to then
00;05;45;08 - 00;05;48;25
be future leaders, you know, the legal industry is having to rethink
00;05;49;12 - 00;05;53;15
elements of how that's done as we bring in the new forms of technology.
00;05;53;15 - 00;05;54;14
I think the other thing
00;05;54;14 - 00;05;58;21
that we are seeing, and that we're big proponents of, is kind of where we started,
00;05;58;21 - 00;06;02;02
actually, as an institute, which was our Good Work Charter.
00;06;02;16 - 00;06;07;08
We're definitely looking at how we ensure that folks are intentional
00;06;07;08 - 00;06;07;21
in the way that they're
00;06;07;21 - 00;06;11;14
deploying technology in the workplaces
to ensure that people still have good work
00;06;11;14 - 00;06;15;17
that does promote dignity, that does promote learning, that allows them to have
00;06;15;17 - 00;06;21;12
all these things that I think we might
have had as part and parcel of jobs.
00;06;21;12 - 00;06;24;12
And the way that work was built
in previous generations.
00;06;24;19 - 00;06;25;24
But yes, if we remove
00;06;25;24 - 00;06;28;24
the sense of identity, if it's removed the way that things are working now,
00;06;28;25 - 00;06;31;21
then actually you've got to be mindful of
00;06;31;21 - 00;06;34;22
deploying this technology in a way that allows folks to continue to learn.
00;06;35;27 - 00;06;36;08
And so
00;06;36;08 - 00;06;38;16
then that
becomes the way that we set work up
00;06;38;16 - 00;06;41;17
and that we ensure that we're still going to have good work for folks
00;06;41;17 - 00;06;44;17
to be doing, rather than them
maybe being at the mercy of
00;06;45;20 - 00;06;48;20
technical and kind of robotic overlords.
00;06;49;25 - 00;06;50;19
Right.
00;06;50;19 - 00;06;53;10
Well, and it seems like it seems like
we're kind of backing into something
00;06;53;10 - 00;06;56;10
you discussed earlier, which is that
there's the worker lens on this,
00;06;56;10 - 00;06;59;11
but there's also the company lens
and the societal lens.
00;06;59;11 - 00;07;01;13
And you described it
as sort of a sorting out period.
00;07;01;13 - 00;07;03;01
But it feels,
00;07;03;01 - 00;07;06;13
at least from where I'm sitting,
it feels like we're increasingly at odds
00;07;06;18 - 00;07;10;13
with some of these mandates
as we sort that out and that,
00;07;10;24 - 00;07;14;11
what's best for the company doesn't
necessarily feel like it's best
00;07;14;11 - 00;07;16;04
for the worker or best for society.
00;07;16;04 - 00;07;19;21
And there's this, you know,
kind of this triplicate tension there.
00;07;19;22 - 00;07;21;17
What does that look like from your lens?
00;07;21;17 - 00;07;23;05
And, you know, can you tell me a little bit more
00;07;23;05 - 00;07;26;05
about what the institute and you are exploring?
00;07;26;06 - 00;07;28;10
Yeah, I think it's a funny one.
00;07;28;10 - 00;07;29;21
You say it's at odds.
00;07;29;21 - 00;07;30;25
I think it's something that
00;07;30;25 - 00;07;33;24
because we've never had to explicitly say,
that's how we're developing work.
00;07;34;01 - 00;07;35;27
Well,
that's what we're creating work to do.
00;07;35;27 - 00;07;39;16
I think for folks, it ends up feeling like it jars a little bit to say,
00;07;39;25 - 00;07;40;22
what do you mean?
00;07;40;22 - 00;07;43;07
Good work
means that this person is learning,
00;07;43;07 - 00;07;45;08
or that we're ensuring
that there's dignity, right?
00;07;45;08 - 00;07;48;19
Or that they've got access to kind of
what it is that they need to do.
00;07;48;24 - 00;07;49;19
You know, good work
00;07;49;19 - 00;07;52;19
was just that that person got paid
well and that was good enough.
00;07;52;22 - 00;07;55;04
And so I think we're definitely seeing elements of that jarring.
00;07;55;04 - 00;07;59;06
But I think what we're seeing more though,
is that a lot of companies,
00;07;59;06 - 00;08;01;08
a lot of industries
are trying to make sense
00;08;01;08 - 00;08;03;19
of how they make the most of this fourth
industrial revolution.
00;08;03;19 - 00;08;07;09
What is the transformation? Everyone's saying it's going to be disruptive.
00;08;07;09 - 00;08;09;09
You know, we're here
talking about digital disruption, right?
00;08;09;09 - 00;08;11;06
You know,
what is that actually going to look like.
00;08;11;06 - 00;08;15;05
And what we're seeing is that
this is allowing folks to have
00;08;15;14 - 00;08;20;07
a better way
to approach this transformation
00;08;20;19 - 00;08;24;20
in a way that they can feel at peace,
maybe with the idea of legacy.
00;08;24;20 - 00;08;27;19
I talked to a lot of audiences, actually,
about this notion of legacy
00;08;27;20 - 00;08;32;00
that the decisions you're making today
aren't just about the next quarter
00;08;32;00 - 00;08;37;00
or the next annual round, or the next strategic plan for the 3 to 5 years.
00;08;37;06 - 00;08;40;06
But actually, a lot of these decisions we're making now
00;08;40;13 - 00;08;42;08
have this kind of impact 50 years down the line.
00;08;42;08 - 00;08;45;08
Think of the person sat in your seat,
00;08;45;08 - 00;08;48;08
and the decisions you're making today, the norms that you're codifying,
00;08;48;19 - 00;08;52;03
the intentions that you have for
the industry, for the job, for the role,
00;08;52;03 - 00;08;55;03
for the nature of the products
and services that you're developing.
00;08;55;08 - 00;08;58;09
When you think about it kind of from a long-term perspective,
00;08;58;11 - 00;09;00;14
then actually it helps for folks to say,
yeah,
00;09;00;14 - 00;09;03;04
what part of this allows people
to continue to learn?
00;09;03;04 - 00;09;05;28
What part of this allows
folks to have dignity?
00;09;05;28 - 00;09;11;14
And so we're finding that organizations
are thankful actually to have a framework
00;09;11;24 - 00;09;15;28
to which to think about this, that isn't
purely about profits and more money.
00;09;16;05 - 00;09;20;07
I think even more so there's so much
rhetoric that we have in this hype
00;09;20;07 - 00;09;24;00
cycle that ends up being quite dystopian
and a bit of Black Mirror
00;09;24;00 - 00;09;27;27
or WALL-E or, you know, in some other kind of dystopian tier of
00;09;28;29 - 00;09;30;26
things are going to be terrible
because we're just going to follow
00;09;30;26 - 00;09;32;13
along with the technology
and we're all going to end up
00;09;32;13 - 00;09;35;13
killing ourselves by accident
or whatever else it might be.
00;09;35;17 - 00;09;37;23
And I think some of these things end up
being checks and balances
00;09;37;23 - 00;09;40;14
that folks can
then feel a little bit better as,
00;09;40;14 - 00;09;42;10
yes, I've used this audit,
I've used this assessment,
00;09;42;10 - 00;09;46;09
I've used this tool from the Institute, I've read this paper, I've understood
00;09;46;09 - 00;09;50;09
some of the ideas that are floated here
and the warning signs that we have.
00;09;50;09 - 00;09;52;24
And so actually, any step that I make,
00;09;52;24 - 00;09;54;04
any step that I'm making in this direction,
00;09;54;04 - 00;09;57;03
is going to be in the right direction
and is going to, you know,
00;09;57;26 - 00;10;02;05
prove a good one in the long term. So that's what we're beginning to see
00;10;02;14 - 00;10;04;25
with people genuinely wanting to work through these details
00;10;04;25 - 00;10;07;21
and whether that's unions,
whether that's regulators,
00;10;07;21 - 00;10;10;13
whether that's academics,
whether that's, you know, we work
00;10;10;13 - 00;10;13;21
with so many different,
organizations and types of stakeholders
00;10;13;28 - 00;10;17;03
that we're able to have
this kind of holistic view
00;10;17;03 - 00;10;20;17
of what good looks like
in lots of different places and spaces.
00;10;22;11 - 00;10;23;03
If you work in
00;10;23;03 - 00;10;26;11
IT, Info-Tech Research Group is a name you need to know.
00;10;26;26 - 00;10;29;26
No matter what your needs are, Info-Tech has you covered.
00;10;30;01 - 00;10;31;08
AI strategy?
00;10;31;08 - 00;10;33;20
Covered. Disaster recovery?
00;10;33;20 - 00;10;34;20
Covered.
00;10;34;20 - 00;10;37;05
Vendor negotiation? Covered.
00;10;37;05 - 00;10;40;28
Info-Tech supports you with best-practice research and a team of analysts
00;10;40;28 - 00;10;44;14
standing by ready to help you
tackle your toughest challenges.
00;10;44;24 - 00;10;47;24
Check it out at the link below
and don't forget to like and subscribe!
00;10;49;26 - 00;10;51;20
I want to clarify a little bit, because
00;10;51;20 - 00;10;54;15
we're talking about this notion
of the fourth industrial revolution.
00;10;54;15 - 00;10;57;10
We're talking about all this change and the need to do it right.
00;10;57;10 - 00;11;00;19
And by the way I love the,
I love the framework around legacy for,
00;11;00;22 - 00;11;02;24
for a lot of reasons
that I want to get into in a minute.
00;11;02;24 - 00;11;06;16
But just before we go there, when we talk about this fourth industrial
00;11;06;16 - 00;11;10;12
revolution, like, is this 90% AI?
00;11;10;12 - 00;11;14;15
Is it 10% AI? Like, aside from the AI, like, what?
00;11;14;16 - 00;11;18;01
What are some of the other big drivers in your mind of what's shifting
00;11;18;05 - 00;11;19;07
for work?
00;11;19;07 - 00;11;21;10
It shifts, it shifts, and it differs.
00;11;21;10 - 00;11;24;01
So it's not 80%.
00;11;24;01 - 00;11;27;09
Not 90%. I'd say it's probably more 60%.
00;11;27;09 - 00;11;30;08
I think quantum is definitely on
the horizon.
00;11;30;08 - 00;11;32;22
We've had big data for quite a while.
00;11;32;22 - 00;11;36;05
We do
a lot of kind of physical manufacturing,
00;11;36;05 - 00;11;39;05
kind of pieces as well.
00;11;39;10 - 00;11;45;07
So for us, rather than focusing specifically on the technology, across
00;11;45;23 - 00;11;49;08
the kind of wider set of research
that we're doing, we've gone quite broad,
00;11;49;15 - 00;11;53;06
and I think we've had to do that
because of the longitudinal nature of what
00;11;53;06 - 00;11;54;27
we're trying to establish or set up.
00;11;54;27 - 00;12;00;16
I think if we got caught up in AI,
we'd end up restricting, really, you know,
00;12;00;16 - 00;12;06;02
the impact we're able to have, but also the quality of what we're doing.
00;12;06;02 - 00;12;08;05
So I think with AI, actually,
there's quite a lot of things.
00;12;08;05 - 00;12;09;28
I mean, when we started,
00;12;09;28 - 00;12;13;00
you know, there was also a notion of what is AI and what's not AI.
00;12;13;00 - 00;12;13;04
Right?
00;12;13;04 - 00;12;16;04
A lot of people were kind of taking statistical analysis
00;12;16;15 - 00;12;20;14
and wrapping it up, packaging it as AI, and kind of getting more funding
00;12;20;14 - 00;12;22;12
and getting more eyeballs on it
based on that.
00;12;22;12 - 00;12;25;12
So I think we kind of wanted to rise
a little bit above the hype,
00;12;25;16 - 00;12;28;21
and not be kind of AI specific.
00;12;29;00 - 00;12;32;01
Quantum is one that, it's something I was reading about yesterday.
00;12;32;02 - 00;12;34;16
I'm still kind of surprised. Maybe not.
00;12;34;16 - 00;12;36;29
I mean, I'm
I'm more surprised than I'm not.
00;12;36;29 - 00;12;39;29
That quantum isn't featuring
so much in conversations now,
00;12;39;29 - 00;12;43;18
as much as AI is, though,
this research is going on.
00;12;43;26 - 00;12;47;19
There's supposedly, updates and progress
that's being made on that front,
00;12;47;19 - 00;12;50;19
and it is going to be incredibly
transformative,
00;12;51;01 - 00;12;54;07
once this goes live
and once we really do have the kind of the
00;12;54;10 - 00;12;56;06
we've worked out
all the elements of what needs
00;12;56;06 - 00;12;59;08
to happen for it to be kind of stable and at large.
00;13;00;02 - 00;13;03;11
So I think for us, yeah, as a, as
an institute, we're wanting to be broader
00;13;03;16 - 00;13;07;09
and look at kind of, all manner of different technologies in the here and now.
00;13;08;17 - 00;13;09;10
Sure.
00;13;09;10 - 00;13;13;23
And you know, you mentioned AI hype,
which I think for anyone
00;13;13;23 - 00;13;16;27
even tangentially tied to this space,
it's just impossible
00;13;16;27 - 00;13;18;23
to get away from these days.
00;13;18;23 - 00;13;21;01
And so I'm curious,
from your perspective, Anne-Marie,
00;13;21;01 - 00;13;23;14
what are the things you're hearing
in this sort of hype sphere
00;13;23;14 - 00;13;26;04
that from your perspective
and what you've seen on the ground,
00;13;26;04 - 00;13;30;04
like, what's BS versus what are the pieces of it and the,
00;13;30;09 - 00;13;30;27
you know,
00;13;30;27 - 00;13;33;22
implications and implementations of it
that you really think
00;13;33;22 - 00;13;36;05
are going to be transformative
and disruptive.
00;13;36;05 - 00;13;39;05
What I think is really funny about this is,
00;13;39;08 - 00;13;41;12
So much of the piece is on the potential.
00;13;41;12 - 00;13;42;11
Right.
00;13;42;11 - 00;13;45;17
And I get to work at the institute
and in other work I do,
00;13;45;18 - 00;13;47;13
I get to work with all manner of AI.
00;13;47;13 - 00;13;50;12
I get to work with governments, right? I get to work with NGOs.
00;13;50;12 - 00;13;54;03
I get to work with industries
kind of various different ones.
00;13;54;12 - 00;13;58;10
And what I find funny, frustrating, is that
00;13;58;18 - 00;14;01;20
the use cases, the proven use cases
00;14;02;16 - 00;14;04;15
where we've said this is something
that folks are able to say,
00;14;04;15 - 00;14;06;24
we're going to set the AI to do it
and we're going to look away.
00;14;06;24 - 00;14;09;24
We're not still trying to figure it out, we're not still trying to make it happen.
00;14;10;01 - 00;14;14;19
We're not still trying to realize a benefit,
00;14;14;19 - 00;14;19;11
realize ROI. What's been realized is kind of fraud cases.
00;14;19;11 - 00;14;23;07
It's customer service
and then it's not much else.
00;14;25;12 - 00;14;28;10
So I think a lot of folks
are trying to use it for creative ends.
00;14;28;10 - 00;14;31;17
A lot of folks are trying to use it for,
00;14;31;17 - 00;14;34;17
tagging kind of en masse, but
maybe that's another one that I've seen.
00;14;34;17 - 00;14;38;03
But again, the tagging as a use case
kind of doesn't stand on its own.
00;14;38;03 - 00;14;41;22
It's kind of been part of a wider process,
maybe fraud,
00;14;41;29 - 00;14;43;28
but fraud is one, customer service is
00;14;43;28 - 00;14;46;28
the other, and then folks tend to go quiet.
00;14;47;09 - 00;14;51;28
That's what I've seen at all levels and elsewhere really getting the AI to do.
00;14;51;29 - 00;14;55;23
And so a lot of it, it's
not that it's completely BS, it's just
00;14;56;02 - 00;15;00;26
this is so it's still so theoretical
or it's still so early on that
00;15;00;26 - 00;15;04;22
no one's been able to genuinely realize that benefit, drive it into the ground
00;15;04;22 - 00;15;07;13
and say from that,
because we've done it faster,
00;15;07;13 - 00;15;10;23
because we've done it with less resource
or fewer resources,
00;15;10;26 - 00;15;16;05
we no longer pay for this or we, you know,
the speed at which we're able to serve
00;15;16;05 - 00;15;19;20
our customers is now this
and that's been realized on their end.
00;15;20;01 - 00;15;22;27
I think for me, it's quite frustrating actually,
00;15;22;27 - 00;15;26;07
that folks are struggling,
I guess, to identify that and stick to it
00;15;26;19 - 00;15;30;23
and have it repeat again, unless it's
customer service or fraud detection.
00;15;31;18 - 00;15;32;08
Really interesting.
00;15;32;08 - 00;15;35;10
And I'm, you know, I've seen something
similar with a lot of the organizations
00;15;35;10 - 00;15;36;10
we work with over here.
00;15;36;10 - 00;15;40;07
But I'm curious when you see that
and when you see that it's, you know, 1
00;15;40;07 - 00;15;42;03
to 2 things that
00;15;42;03 - 00;15;45;17
companies or organizations, NGOs,
governments are getting value out of.
00;15;45;17 - 00;15;47;05
What's typically your advice for them?
00;15;47;05 - 00;15;49;10
Is it, you know,
you're not looking hard enough.
00;15;49;10 - 00;15;51;03
Go back out there. Is it.
00;15;51;03 - 00;15;52;20
You know, go all in on AI.
00;15;52;20 - 00;15;54;17
Is it okay? This makes sense for now.
00;15;54;17 - 00;15;58;20
Let's wait until we find more promising
use of the technology.
00;15;58;23 - 00;16;00;22
Well, what do you typically tell them?
00;16;00;22 - 00;16;04;05
So my advice is, I mean, it's not as negative
00;16;04;05 - 00;16;05;26
as "you're not looking hard enough."
00;16;05;26 - 00;16;09;05
I think my advice is
there are a lot of exciting, sexy
00;16;09;05 - 00;16;13;02
things that folks, the big, big tech companies, think are the problems.
00;16;13;18 - 00;16;15;10
And so go look for the less exciting things.
00;16;15;10 - 00;16;18;10
One of my favorite ones
recently has been potholes.
00;16;18;17 - 00;16;19;29
The AI of potholes.
00;16;21;18 - 00;16;22;16
Who's thinking about that?
00;16;22;16 - 00;16;24;17
Who's talking about that? Not the big folks.
00;16;24;17 - 00;16;27;18
There's not a huge amount of money
supposedly in potholes,
00;16;27;26 - 00;16;30;26
but actually, if you were able to sit down
and think about it
00;16;30;27 - 00;16;34;05
and the folks working in that space really could get focused on
00;16;34;14 - 00;16;37;17
what is the data that we have,
what's the logic that we have around this?
00;16;37;17 - 00;16;39;12
What are the learnings that we have?
00;16;39;12 - 00;16;40;27
And we can look at the AI of potholes.
00;16;40;27 - 00;16;43;12
It's almost like that
kind of what's the unsexy stuff?
00;16;43;12 - 00;16;45;11
And then
how do we start to experiment with that?
00;16;45;11 - 00;16;48;23
And you have to do those experiments
and you have to make those mistakes
00;16;50;02 - 00;16;53;05
in order for you to then
have that kind of first mover advantage.
00;16;53;13 - 00;16;56;24
And so that's what I end up talking to
folks about, is there will be things
00;16;56;24 - 00;17;02;06
internally that are almost hidden problems
that you have to then go and work
00;17;02;06 - 00;17;05;21
and experiment on, and you want to make
quality mistakes, is the other notion.
00;17;05;21 - 00;17;10;27
I'm talking to audiences a lot about,
I give the example of there was a time
00;17;10;27 - 00;17;14;02
before cloud when we'd have to save files
as we went along.
00;17;14;27 - 00;17;15;06
Right?
00;17;15;06 - 00;17;19;07
And most folks who are older, most folks, you know, of a certain age, will remember
00;17;19;07 - 00;17;22;07
you'd have to Control-S or Command-S as you went,
00;17;22;08 - 00;17;25;21
and God forbid you forgot to do that
or your file was corrupted,
00;17;25;21 - 00;17;29;14
then you'd have to rewrite your document
entirely from scratch, right?
00;17;29;14 - 00;17;32;02
And you'd kind of do that
and you'd call it final,
00;17;32;02 - 00;17;35;12
and then, I don't know, you'd use the loo, the toilet, you'd come back, you'd realize
00;17;35;12 - 00;17;37;23
there was like a heading
at the end that you changed.
00;17;37;23 - 00;17;40;16
So you changed that
and you'd call it final. Final.
00;17;40;16 - 00;17;43;04
And then maybe your mum would call you
and you're on the phone to her.
00;17;43;04 - 00;17;45;10
For however long, and something she says triggers your mind.
00;17;45;10 - 00;17;47;18
And then you'd call it final, final, final.
00;17;47;18 - 00;17;49;03
And most of us
would then be like, brilliant.
00;17;49;03 - 00;17;50;12
You'd send final,
00;17;50;12 - 00;17;52;10
final off and then realize
00;17;52;10 - 00;17;55;10
that final, final, final is what you should have sent, right?
00;17;56;04 - 00;17;59;00
And I know
we're reaching back into our memory banks.
00;17;59;00 - 00;18;02;02
Then the next time you sit down to write a document, not only are you Control-S-ing
00;18;02;05 - 00;18;06;10
compulsively the whole way through,
but now you've called it final.
00;18;06;10 - 00;18;09;10
I don't know, 5th of January 2026,
00;18;09;14 - 00;18;12;11
maybe that's when you're listening
to this podcast, right?
00;18;12;11 - 00;18;13;27
And then you're like, okay, cool.
00;18;13;27 - 00;18;15;14
So now I know which final it was.
00;18;15;14 - 00;18;16;19
I'm going to go off the date.
00;18;16;19 - 00;18;20;05
Or if you're particularly sophisticated,
you start versioning
00;18;20;09 - 00;18;23;28
and you have versions and you don't even use the word final.
00;18;24;08 - 00;18;26;29
And I talk to folks, in this example, about the journey you've been on,
00;18;26;29 - 00;18;28;04
which means that now
00;18;29;07 - 00;18;30;16
you don't use the word final.
00;18;30;16 - 00;18;32;22
You've got a fairly sophisticated versioning
00;18;32;22 - 00;18;35;24
system that you're following along with,
00;18;36;15 - 00;18;37;20
but actually,
you wouldn't have gotten there
00;18;37;20 - 00;18;40;02
if the file wasn't
corrupted the first time.
00;18;40;02 - 00;18;43;14
And I talk about now, when we're using AI
in the same kind of way, there's
00;18;43;14 - 00;18;46;00
going to be so many different elements
that you want to try and use it on.
00;18;46;00 - 00;18;46;18
And then you remember.
00;18;46;18 - 00;18;49;01
Now, actually, AI doesn't understand humans.
00;18;49;01 - 00;18;51;04
So what's the non-human
element of what we're doing that
00;18;51;04 - 00;18;54;22
I can train this on for me to then
get the results that I want to have, or,
00;18;54;26 - 00;18;55;25
you know, all those kinds of things.
00;18;55;25 - 00;18;58;25
And so I think there are high-quality mistakes that folks need to be making.
00;18;59;07 - 00;19;02;11
But as you do that, you're going to have that quality
00;19;02;11 - 00;19;03;13
of learning that others don't have.
00;19;03;13 - 00;19;07;20
And that's going to be your competitive
advantage in making the best use of AI.
00;19;08;05 - 00;19;11;08
And so that's what I end up talking to folks about.
00;19;13;03 - 00;19;14;02
I love that approach.
00;19;14;02 - 00;19;18;20
And by the way, I'm chuckling as well because I'm, you know,
00;19;18;29 - 00;19;23;16
getting, like, PTSD about underscore final underscore one.
00;19;24;21 - 00;19;27;26
The one that really gave me, like, philosophical pause was,
00;19;27;26 - 00;19;31;13
I once had someone send me underscore final underscore
00;19;31;13 - 00;19;34;13
final underscore new
00;19;35;14 - 00;19;37;03
that naming convention
00;19;37;03 - 00;19;40;13
tells a story of, that's exactly it, we thought we were
00;19;40;23 - 00;19;45;11
done and we've gone back to the drawing board. But all of this, all of this.
00;19;45;11 - 00;19;47;23
And the reason I laugh is, you know, as I listen to you
00;19;47;23 - 00;19;50;23
talking about finding your own mistakes
and looking in the unsexy parts,
00;19;50;29 - 00;19;53;26
it feels like it's very cultural
within an organization.
00;19;53;26 - 00;19;57;02
The ability to do this successfully,
what does a culture look like
00;19;57;02 - 00;19;58;00
in an organization
00;19;58;00 - 00;20;01;14
that you find is going to be successful
with this versus unsuccessful?
00;20;02;02 - 00;20;04;28
It is, I mean, it's a big point.
00;20;04;28 - 00;20;08;16
And it's maybe something that, again,
00;20;08;26 - 00;20;12;08
isn't ever an overt kind of message of
00;20;12;08 - 00;20;14;03
you need the right culture
in order for these things to do well.
00;20;14;03 - 00;20;18;01
But I think it is those cultures
that are open to those mistakes
00;20;18;01 - 00;20;21;24
and are open to that iterative approach,
which I think if we're working on the
00;20;21;24 - 00;20;22;27
agile framework
00;20;22;27 - 00;20;25;09
and you're particularly technical,
you will understand what that means.
00;20;25;09 - 00;20;28;09
If you build in that, you're going to have
a retrospective, right?
00;20;28;10 - 00;20;29;04
As you're going.
00;20;29;04 - 00;20;31;19
That means that you're saying,
we are going to have that.
00;20;31;19 - 00;20;33;25
We're going to have to take
the opportunity to reflect back on,
00;20;33;25 - 00;20;35;07
because there will be things
for us to reflect on.
00;20;35;07 - 00;20;37;21
There will be things that have gone wrong,
things for us to learn on, or things
00;20;37;21 - 00;20;41;27
for us to measure how that has gone
and then have a higher quality iteration.
00;20;42;04 - 00;20;43;29
So I think organizations that do that are the ones that do well,
are the ones that do well,
and the other ones are the organizations
00;20;47;22 - 00;20;52;15
that culturally value difference
as they're going through these processes.
00;20;52;15 - 00;20;56;12
So the other example that I have, and anytime anyone gives me a mic at a tech
00;20;56;12 - 00;21;00;20
conference it's my favorite example to give, is that of periods: periods, menopause,
00;21;00;20 - 00;21;02;11
pregnancy. Right.
00;21;02;11 - 00;21;05;03
These things that we don't talk about
often enough, it's taboo.
00;21;05;03 - 00;21;06;29
Everyone's
afraid of them and runs away from them.
00;21;06;29 - 00;21;10;15
But so many of us have to deal with these
anyway on a regular basis.
00;21;10;22 - 00;21;12;19
Every couple of years, the health
00;21;12;19 - 00;21;16;18
tech industry discovers
the period for the very first time,
00;21;18;10 - 00;21;18;18
right?
00;21;18;18 - 00;21;21;07
So five years ago, and I won't name who it is because this is being recorded.
00;21;21;07 - 00;21;24;28
If you read my book, She's In CTRL, I have the example in there.
00;21;25;01 - 00;21;26;01
There was a big health
00;21;26;01 - 00;21;29;28
tech company that discovered the period for the very first time and allowed users
00;21;29;28 - 00;21;34;01
to track ten days of a period as a new feature on its fitness tracker,
00;21;35;11 - 00;21;38;11
which some folks listening will think,
oh my goodness, this is why
00;21;38;11 - 00;21;41;27
I'm, you know, switching off, switch it off, the woman's talking about periods again,
00;21;42;10 - 00;21;46;07
others are thinking ten days of a period,
what's the significance?
00;21;46;07 - 00;21;47;05
And then you have to ask yourself
00;21;47;05 - 00;21;50;04
the question, is it
that these health tech companies,
00;21;50;04 - 00;21;53;09
or this one in particular, that had
the ten day period that they'd never met
00;21;53;09 - 00;21;56;21
anyone in their team, who'd ever met
anyone who'd ever had a period,
00;21;57;16 - 00;21;59;08
or was it
00;21;59;08 - 00;22;01;24
they had a couple of people there
who might have known something about it,
00;22;01;24 - 00;22;03;09
but what they were contributing
00;22;03;09 - 00;22;06;06
and what they had to say
wasn't valued in those spaces.
00;22;06;06 - 00;22;09;22
And if you're literally working
on a period tracker in this company
00;22;10;02 - 00;22;12;04
and you can't hear that from them,
what else
00;22;12;04 - 00;22;15;11
are you literally working on that you've not been able to hear about from those people?
00;22;15;20 - 00;22;17;06
And so actually,
the right kind of cultures
00;22;17;06 - 00;22;21;07
allows those nuances
and those differences to come to the fore.
00;22;21;08 - 00;22;23;11
And it's not just about periods, it's
not just about gender.
00;22;23;11 - 00;22;24;13
It's all manner of things.
00;22;24;13 - 00;22;26;17
I mean, another one I love and love to
00;22;26;17 - 00;22;30;21
reference is Legally Blonde, the great 2001 film where, spoiler alert,
00;22;30;22 - 00;22;35;14
she does really well in a real law case
based on her in-depth knowledge of pants.
00;22;35;25 - 00;22;38;25
And again,
these things feel so tangential.
00;22;38;25 - 00;22;41;25
But actually, is your culture
allowing different people to show up
00;22;41;26 - 00;22;44;27
in the different ways and the different
experiences that they have.
00;22;45;07 - 00;22;48;05
Because, again,
a lot of these spaces, it's a very narrow
00;22;48;05 - 00;22;50;00
set of life experiences.
00;22;50;00 - 00;22;54;07
And so it's a very narrow set of solutions that we end up proposing to problems.
00;22;54;08 - 00;22;57;25
Whereas actually something a lot broader allows you to make the best use of the AI,
00;22;57;25 - 00;23;00;27
because you're not focusing
on the same doom, gloom,
00;23;01;04 - 00;23;04;09
and tiny set of use cases that
everyone else has already done to death.
00;23;05;09 - 00;23;07;25
I love the examples.
00;23;07;25 - 00;23;12;01
And who knows what listeners are like,
00;23;12;01 - 00;23;15;01
clothes, clothes, clothes
00;23;18;06 - 00;23;20;25
But really, I feel like it's a really important point,
00;23;20;25 - 00;23;26;04
because culturally, it's not just an impact within the four walls,
00;23;26;04 - 00;23;26;21
you know, physically
00;23;26;21 - 00;23;30;10
or metaphorically of your organization,
and it impacts your customers.
00;23;30;19 - 00;23;34;23
And, you know, unless your customers
all look exactly like your,
00;23;35;00 - 00;23;38;16
you know, your CEO
or whoever your, you know, founder is,
00;23;39;00 - 00;23;41;25
it's important to have those voices.
00;23;41;25 - 00;23;42;01
Right.
00;23;42;01 - 00;23;47;01
And how can you build that into the organization itself?
00;23;47;01 - 00;23;50;23
So is this something
you help organizations get better at?
00;23;50;23 - 00;23;52;20
I mean, one of the adages about culture
00;23;52;20 - 00;23;56;03
is that a culture
is notoriously difficult to change.
00;23;56;14 - 00;23;58;10
Do you work with organizations
on cultural change?
00;23;58;10 - 00;23;59;27
And what does that look like, typically?
00;23;59;27 - 00;24;02;25
In some ways I do with the Institute
for Future of Work.
00;24;02;25 - 00;24;06;02
It's definitely something
that we are doing
00;24;06;07 - 00;24;09;05
with, with the research
that we do as a lever.
00;24;09;05 - 00;24;11;28
I think for me, the organization I run is Stemettes.
00;24;11;28 - 00;24;14;23
So we work on the kind of attraction
of different types of folks
00;24;14;23 - 00;24;16;02
into the industry.
00;24;16;02 - 00;24;18;04
But then ad hoc,
there would be particular partners
00;24;18;04 - 00;24;19;26
and particular pieces of work that we put out, yeah,
00;24;19;26 - 00;24;24;00
to help folks think about what that culture change looks like.
00;24;24;00 - 00;24;24;15
And you're right.
00;24;24;15 - 00;24;27;04
You know, culture is the average
of everybody's actions.
00;24;27;04 - 00;24;31;10
And so it is about how do we empower everybody to evolve, I guess.
00;24;31;10 - 00;24;32;06
And again, like I said,
00;24;32;06 - 00;24;36;02
use a legacy lens, perhaps: actually, 50 years down the line, would I be
00;24;36;02 - 00;24;39;28
proud of this decision that I've made
and the impact that it will have had?
00;24;40;08 - 00;24;41;25
And so that's something I end up working on with folks.
00;24;41;25 - 00;24;44;01
But it's a really tough one.
00;24;44;01 - 00;24;47;17
It's a tough one to do, a tough one to transform, I think, even more so
00;24;47;25 - 00;24;49;14
as a classically trained
computer scientist,
00;24;49;14 - 00;24;51;21
and it's definitely something
that wasn't there
00;24;51;21 - 00;24;55;09
and hasn't been a core part
of what we think knowledge is.
00;24;55;09 - 00;24;58;00
and of what's valued across the technology scene.
00;24;58;00 - 00;24;59;24
So it's definitely also something we're looking at:
00;24;59;24 - 00;25;00;29
how do we change the systems
00;25;00;29 - 00;25;04;03
and the structures to make sure it's
part of the curriculum as part of,
00;25;04;03 - 00;25;06;11
you know, the way people learn about computer science
00;25;06;11 - 00;25;08;21
and learn about a lot of these things
that they understand.
00;25;08;21 - 00;25;11;25
also the implications, that it's not just about being deterministic,
00;25;11;25 - 00;25;12;18
not just about the numbers.
00;25;12;18 - 00;25;15;15
You know, life is so much more complicated
than the maths.
00;25;15;15 - 00;25;18;04
And so how do we reflect that
in the way that we build and that we do,
00;25;18;04 - 00;25;18;23
that we deploy?
00;25;20;02 - 00;25;20;25
Right.
00;25;20;25 - 00;25;24;07
And you know, you mentioned,
you know, being classically trained in
00;25;24;08 - 00;25;25;02
computer science.
00;25;25;02 - 00;25;26;09
And I know some of the work that you do
00;25;26;09 - 00;25;30;14
is, as you said, with Stemettes and beyond, with women and,
00;25;30;24 - 00;25;34;05
you know, traditionally
marginalized groups in computer science
00;25;34;05 - 00;25;37;05
and in some of these organizations, trying to get
00;25;37;17 - 00;25;38;24
more of them into the field.
00;25;38;24 - 00;25;42;29
So, I mean, again, this is probably too
big a question to bite off in one go, but,
00;25;43;17 - 00;25;45;29
what's your sense about
00;25;45;29 - 00;25;49;29
why women have been
historically underrepresented in tech?
00;25;50;05 - 00;25;53;02
And, I mean, does it matter?
00;25;53;02 - 00;25;54;28
And what do we do about it?
00;25;54;28 - 00;25;58;06
Oh, I mean, yeah, I don't know how much we have,
00;25;58;06 - 00;25;59;18
how much time we have on this, I think.
00;25;59;18 - 00;26;02;19
So, I mean, women have been historically marginalized
00;26;02;21 - 00;26;05;27
from the tech scene,
lots of different elements
00;26;05;27 - 00;26;09;14
of our kind of heritage in the tech scene
mean that we've taken some of this
00;26;09;28 - 00;26;12;21
unhelpful elitism, maybe from maths, you know,
00;26;12;21 - 00;26;15;21
because it came as a field out of mathematics, academic mathematics.
00;26;15;21 - 00;26;18;16
And you can kind of take a look
at what's going on over in that scene.
00;26;18;16 - 00;26;21;29
I think there's also this kind of myth
of the kind of the lone
00;26;21;29 - 00;26;25;00
genius and people
that are born with this knowledge.
00;26;25;00 - 00;26;26;22
And, you know,
if you're born with this knowledge
00;26;26;22 - 00;26;28;12
and born with this, these things,
00;26;28;12 - 00;26;31;12
then you look a certain way
because that's, you know, part of the
00;26;31;15 - 00;26;34;26
kind of the adjacency we have to you being
00;26;35;01 - 00;26;39;00
what I sometimes call dead and white
and male with a beard and, you know,
00;26;39;00 - 00;26;40;07
you must be all of those,
00;26;40;07 - 00;26;44;02
there's four things, dead included, for you to be someone that can be revered.
00;26;44;02 - 00;26;44;08
Right.
00;26;44;08 - 00;26;47;08
And so we're all aiming to be that,
that dead genius who,
00;26;47;20 - 00;26;50;28
you know, created something that's now
kind of transformed the whole world.
00;26;51;08 - 00;26;54;28
Which I think is a high bar, to say you must be dead in order for you
00;26;54;28 - 00;26;55;24
to kind of contribute technically.
00;26;55;24 - 00;27;00;04
I think that's really, I mean, we don't grapple with that often enough.
00;27;00;24 - 00;27;04;16
But I think there's lots of geopolitical notions.
00;27;04;16 - 00;27;07;04
Mar Hicks, right, has written a great book,
00;27;07;04 - 00;27;11;09
Programmed Inequality. There's kind of a lot, lots,
00;27;11;09 - 00;27;13;20
I think, to face and kind of read, to delve into why
00;27;13;20 - 00;27;15;16
this is something that happens again
and again.
00;27;15;16 - 00;27;19;01
And I think if I'm to be
completely honest, up until this point,
00;27;19;01 - 00;27;23;25
it was almost okay, because so much of what
we were doing technically was very niche.
00;27;24;18 - 00;27;28;17
And for a certain set of people,
you know, in a certain space.
00;27;28;17 - 00;27;30;02
And we kind of just made it
because it would be cool.
00;27;30;02 - 00;27;33;14
Wouldn't it be cool to have a website that rated whether people were hot or not?
00;27;33;21 - 00;27;34;22
We kind of just built that.
00;27;34;22 - 00;27;36;08
And I don't think any of us
00;27;36;08 - 00;27;39;01
really grappled with the fact that you build that website.
00;27;39;01 - 00;27;41;20
Then all of a sudden
you're moving elections
00;27;41;20 - 00;27;43;19
and you're transforming democracy
as we know it.
00;27;43;19 - 00;27;48;00
And so I think it matters because
we're building things for the world.
00;27;48;00 - 00;27;49;25
And women are part of that world.
00;27;49;25 - 00;27;52;03
And so why wouldn't
we have their experiences
00;27;52;03 - 00;27;56;04
reflected there, in what we're doing
so we can sell more if we really want to.
00;27;56;04 - 00;27;59;15
But I also, I feel that we end up
creating more problems than we're solving
00;27;59;28 - 00;28;02;24
in the way that we deploy technology
at the moment, because we don't have
00;28;02;24 - 00;28;07;11
different types of voices,
taken into account in what we're creating.
00;28;07;11 - 00;28;11;25
And ultimately, you know, I'm a technologist
00;28;11;25 - 00;28;14;28
because of the altruistic strain
I have in me.
00;28;15;13 - 00;28;17;20
You know, if I can create this thing,
if I understand how it works
00;28;17;20 - 00;28;18;08
logically,
00;28;18;08 - 00;28;20;05
then I can recreate it,
and then I can solve that problem
00;28;20;05 - 00;28;22;18
again and again
without me needing to be there.
00;28;22;18 - 00;28;26;03
And so I think for me, it's only fair
to the technology, right,
00;28;26;07 - 00;28;27;24
that we actually end up building things.
00;28;27;24 - 00;28;31;04
We make the right kinds of decisions about what we're prioritizing
00;28;31;04 - 00;28;33;14
and how we're building it
and what we're creating
00;28;33;14 - 00;28;35;11
and what life experiences we're reflecting in it.
00;28;35;11 - 00;28;37;23
And so that's why it's important
to have women,
00;28;37;23 - 00;28;41;06
be a part of the, of the puzzle,
but also other groups.
00;28;41;06 - 00;28;41;19
Right.
00;28;41;19 - 00;28;44;18
You've got gender, you've got races,
all manner of different folks.
00;28;44;18 - 00;28;45;13
That should be part of it.
00;28;45;13 - 00;28;49;00
Given this is now global tech
that we're building at any given point.
00;28;49;00 - 00;28;51;23
And there are so many problems
to be solved in the world
00;28;51;23 - 00;28;53;05
without us creating new ones.
00;28;55;10 - 00;28;55;21
So when,
00;28;55;21 - 00;28;59;27
when we talk about solving this problem, and how, right, like how we create
00;29;00;09 - 00;29;04;02
a more equitable or democratized
environment where everybody can contribute
00;29;04;02 - 00;29;06;10
through these technologies and with these technologies.
00;29;06;10 - 00;29;11;24
And you don't have to look a certain way or see yourself compared to a certain,
00;29;12;08 - 00;29;16;10
you know, reverential figure, how do we get there?
00;29;16;10 - 00;29;19;01
And what sort of work are you doing to
to get folks there?
00;29;19;01 - 00;29;22;09
I mean, I love the idea of,
you know, destroying
00;29;22;09 - 00;29;25;17
this narrative of, you know, the dead
white guy with the beard.
00;29;25;17 - 00;29;29;18
And, you know, if you don't look like
Albert Einstein or whoever else, like,
00;29;29;18 - 00;29;33;18
you can't possibly be successful in this. Is it as simple as, you know,
00;29;33;18 - 00;29;37;16
just having more role models that look,
00;29;37;17 - 00;29;40;17
you know, more diverse? Certainly that's important.
00;29;40;24 - 00;29;44;22
What else are kind of the key factors in your mind in terms of
00;29;45;22 - 00;29;48;08
making this a viable path for more young people
00;29;48;08 - 00;29;51;04
who wouldn't necessarily
have seen themselves in these roles.
00;29;51;04 - 00;29;54;26
So it's fun that you start with role models in your question.
00;29;54;26 - 00;29;57;03
I think it's, it's ensuring
that we have the right environment
00;29;57;03 - 00;29;58;27
for these role models to exist.
00;29;58;27 - 00;30;00;26
And so I think yes, yes,
we need the stories
00;30;00;26 - 00;30;02;28
to be told of the,
of the different types of people
00;30;02;28 - 00;30;05;15
that are thriving in the industry,
that are creating,
00;30;05;15 - 00;30;09;01
that have already made things,
you know, whether it's GG, right,
00;30;09;01 - 00;30;10;23
whether it's, you know, insert
00;30;10;23 - 00;30;13;23
the name of any of the things, actually, that women have created: Kevlar,
00;30;13;23 - 00;30;17;04
like there's kind of a long list,
actually, of women innovators
00;30;17;04 - 00;30;20;22
who have done things already who
we haven't eulogized in Hollywood movies
00;30;21;01 - 00;30;24;20
or whose stories we haven't told again and again, who have died off.
00;30;24;20 - 00;30;26;24
Right. And actually,
we've not given them their flowers before.
00;30;26;24 - 00;30;28;06
Before they've gone.
00;30;28;06 - 00;30;30;00
And so there are whole life stories, and,
00;30;30;00 - 00;30;32;00
you know, I implore Hollywood
or anyone that's listening
00;30;32;00 - 00;30;34;07
that has that ability
to tell their stories,
00;30;34;07 - 00;30;37;12
to go and look up their stories
and try and tell lots of them.
00;30;37;12 - 00;30;38;28
You know, there's quite a long list
that we have
00;30;38;28 - 00;30;41;06
whether it's Stephanie Shirley, whether it's Gladys West,
00;30;41;06 - 00;30;42;18
Katherine Johnson,
we've now actually seen.
00;30;42;18 - 00;30;43;26
But there's there's always more.
00;30;43;26 - 00;30;46;18
We've had so many Einsteins and so many of the others
00;30;46;18 - 00;30;48;19
that actually, let's kind of fill those in.
00;30;48;19 - 00;30;50;04
So I think one is to tell the story.
00;30;50;04 - 00;30;51;25
But I think the other thing
is to genuinely work on
00;30;51;25 - 00;30;55;04
ensuring that different types of people
can thrive in our spaces,
00;30;55;04 - 00;30;58;13
and it doesn't need to be, again,
that lone role model.
00;30;58;13 - 00;31;01;11
Although, beyond that kind of list of people, it can be just something that, hey,
00;31;01;11 - 00;31;03;19
name a female journalist, people can do that
00;31;03;19 - 00;31;05;24
the same way they can name a male journalist. Right.
00;31;05;24 - 00;31;08;28
And so I think we need to have the same sorts of spaces.
00;31;08;28 - 00;31;11;28
Then we also need to be able to hold folks
accountable for bad behavior,
00;31;12;02 - 00;31;14;09
which, you know,
when we see the use of NDAs
00;31;14;09 - 00;31;18;04
across our industry, when we see
the kind of the norms we have as to,
00;31;18;13 - 00;31;21;20
you know, how people get equity
in particular companies, how people are
00;31;21;26 - 00;31;23;09
held accountable for, like I said, bad behavior
like I said, bad behavior
kind of corporately versus others.
00;31;27;00 - 00;31;28;27
I think there's
a lot of kind of bad habits
00;31;28;27 - 00;31;32;23
and a certain type of person
that gets protected in these spaces,
00;31;32;23 - 00;31;36;25
and that's academically as well as in industry, and entrepreneurs as well,
00;31;36;25 - 00;31;41;06
who's allowed to fail fast and often,
who's not allowed to fail fast and often.
00;31;41;15 - 00;31;43;03
And so I think there are quite
a lot of tweaks, actually,
00;31;43;03 - 00;31;46;03
that we do need to make to ensure
that different types of folks can survive.
00;31;46;04 - 00;31;49;04
Another one would be, you know,
and you have this
00;31;49;06 - 00;31;53;07
to a lesser degree in the US, but it still does show up in the UK.
00;31;53;07 - 00;31;56;05
We have it that you make decisions
on particular subjects
00;31;56;05 - 00;32;00;06
you want to study at a high level at 13,
and those decisions follow
00;32;00;06 - 00;32;03;16
you the whole of your life,
which I mean, I don't know.
00;32;03;16 - 00;32;05;11
I don't know
if you're even eating the same food
00;32;05;11 - 00;32;09;17
that you ate as a 13 year old,
let alone your entire career.
00;32;09;17 - 00;32;12;22
And, you know, God forbid
you made the wrong decision at 13.
00;32;13;13 - 00;32;16;19
Hair color, what you're eating, clothing-wise, or whatever.
00;32;16;22 - 00;32;19;09
But still we have an entire industry that says no, no, no, no,
00;32;19;09 - 00;32;22;16
you cannot possibly be an engineer here
because you did not make that decision
00;32;22;16 - 00;32;23;20
at that time.
00;32;23;20 - 00;32;26;11
And so there's nothing of value that you have to contribute here
00;32;26;11 - 00;32;29;03
because you didn't enter onto that
pathway.
00;32;29;03 - 00;32;33;06
And again, it speaks to this idea of
you have to have been born that genius,
00;32;33;06 - 00;32;36;22
because then at 13
you would have locked in those ideas.
00;32;37;08 - 00;32;37;26
Whereas actually there are
00;32;37;26 - 00;32;38;12
a lot of folks
00;32;38;12 - 00;32;41;28
who have a lot of things to contribute who were doing other things at 13,
00;32;41;28 - 00;32;46;09
and we should value that actually
in what we're creating, what we're doing.
00;32;46;09 - 00;32;49;09
So I think it's updating the culture so that we can have more
00;32;49;09 - 00;32;50;11
of those role models,
00;32;50;11 - 00;32;53;11
as well as telling the story of the role
models who have somehow made it through,
00;32;53;22 - 00;32;55;23
despite how,
00;32;57;01 - 00;33;00;03
unfriendly the industry can be.
00;33;00;12 - 00;33;03;00
So, I mean, I'll tell you off the bat
00;33;03;00 - 00;33;06;14
that 13 year olds are not typically
the target demographic of this podcast,
00;33;08;07 - 00;33;10;20
But I think a lot of listeners
00;33;10;20 - 00;33;13;04
do potentially have their own 13-year-old or,
00;33;13;04 - 00;33;16;14
you know, their own teenager in their house. Which is, what do you tell,
00;33;17;01 - 00;33;19;23
you know, young people these days
about what the future looks like,
00;33;19;23 - 00;33;20;10
what they should
00;33;20;10 - 00;33;21;16
do, you know,
00;33;21;16 - 00;33;24;16
how they should be thinking about
their own future and their own legacy.
00;33;24;25 - 00;33;28;25
So at Stemettes, we ended up working with this notion of STEM to STEAM.
00;33;28;25 - 00;33;31;11
So STEAM rather than STEM,
including art and design in there.
00;33;31;11 - 00;33;35;11
And I think what you end up telling
the 13 year old is a lot of these things,
00;33;35;11 - 00;33;36;16
we haven't figured them out,
00;33;36;16 - 00;33;39;00
you know, as we said
at the top of this conversation,
00;33;39;00 - 00;33;41;22
we're at the beginning of this fourth
industrial revolution,
00;33;41;22 - 00;33;44;06
a lot of the answers
the adults don't have.
00;33;44;06 - 00;33;46;14
There's a lot of things
the adults do not know.
00;33;46;14 - 00;33;51;01
And so we end up talking to them about,
okay, you know, what would be your ideal?
00;33;51;01 - 00;33;53;27
What would you create?
What would you prioritize?
00;33;53;27 - 00;33;55;05
How would you see this?
00;33;55;05 - 00;33;58;14
And we end up relating it as well often
to what they would be interested in.
00;33;58;14 - 00;34;01;18
So you know, not AI for the sake of AI,
but how does that impact
00;34;01;18 - 00;34;05;03
the football team that they support,
or how does that impact on the,
00;34;05;05 - 00;34;08;21
clothing brand
that they love to wear?
00;34;08;21 - 00;34;11;02
Or how does that impact on the food
that they like to choose?
00;34;11;02 - 00;34;14;22
So we start with what they're interested
in, and then kind of relate the technology
00;34;14;22 - 00;34;15;06
to that.
00;34;15;06 - 00;34;18;25
But in quite a big way,
I think this generation
00;34;19;06 - 00;34;23;26
have grown up quite differently
to previous generations
00;34;23;26 - 00;34;28;15
in that they've got to see the problems
of the world up close in real time.
00;34;29;15 - 00;34;30;11
And so I think
00;34;30;11 - 00;34;34;01
the notion of what success
looks like is quite different to them.
00;34;34;25 - 00;34;38;09
And so we end up talking to them about
how does technology, how might technology
00;34;38;09 - 00;34;40;08
empower that version of success,
how might technology
00;34;40;08 - 00;34;42;08
be something that helps with climate change,
technology
00;34;42;08 - 00;34;45;16
be something that helps with fairness
and equity and social justice?
00;34;45;26 - 00;34;49;29
And so that's what we end up talking to
13 year olds about and leave the door
00;34;49;29 - 00;34;54;05
open to say, hey, like study history,
study languages, whatever it might be.
00;34;54;09 - 00;34;58;07
But understand that technology is a tool
that will have a role to play in
00;34;58;07 - 00;34;59;18
whatever you end up doing.
00;34;59;18 - 00;35;02;03
And so you can choose to study technology
that's completely fine,
00;35;02;03 - 00;35;05;24
or build your tech literacy
so that no matter what you do,
00;35;05;24 - 00;35;07;12
you're able to apply the technology
in a way
00;35;07;12 - 00;35;10;01
that makes sense to you
and solves problems that you have.
00;35;10;01 - 00;35;12;17
And so those are the kinds
of conversations, controversially or not,
00;35;12;17 - 00;35;13;12
that we're having with them.
00;35;13;12 - 00;35;15;22
It's not everybody learn to code,
00;35;15;22 - 00;35;18;25
but it is definitely learn
to have that critical thinking.
00;35;18;25 - 00;35;20;29
Learn to collaborate, learn to create,
00;35;20;29 - 00;35;22;14
because these are things
that you'll still need to do
00;35;22;14 - 00;35;24;29
no matter how many of our
jobs the robots take away.
00;35;26;15 - 00;35;28;11
I love that answer.
00;35;28;11 - 00;35;31;22
And it preempted the next thing
I wanted to talk about, which is you
00;35;31;22 - 00;35;35;06
you said, you know, critical thinking,
collaborate, create.
00;35;35;20 - 00;35;39;12
The question I wanted to ask you is, coming back
00;35;39;12 - 00;35;43;20
to the notion of the workers
who are facing,
00;35;44;00 - 00;35;48;20
you know, the reality of, you know, coexisting
00;35;48;20 - 00;35;53;00
with more and more technology that can do
things it could never do before.
00;35;53;00 - 00;35;58;04
What are the critical skills for,
you know, the roles of tomorrow
00;35;58;04 - 00;36;02;01
and to be kind
of a successful worker or a leader
00;36;02;01 - 00;36;05;01
tomorrow? Is it those same three
or how would you frame that out?
00;36;06;00 - 00;36;07;12
Yeah. So it's definitely something.
00;36;07;12 - 00;36;10;23
So we at the institute recently completed
something called the Pissarides Review.
00;36;10;23 - 00;36;14;13
So we did this with the Nobel
laureate Christopher Pissarides,
00;36;14;27 - 00;36;17;20
looking at the transformation
of work and jobs
00;36;17;20 - 00;36;19;28
and how they were all being transformed
at this time,
00;36;19;28 - 00;36;23;08
one of the things we did
was a huge analysis of job
00;36;23;27 - 00;36;28;02
roles, the kind of job openings,
across a big engine,
00;36;28;12 - 00;36;31;24
and looking, over time, at
how the skills were
00;36;31;24 - 00;36;34;24
transforming and changing across
this kind of big data set.
00;36;35;12 - 00;36;36;10
And what we saw was that, yeah,
00;36;36;10 - 00;36;38;09
we've got AI and maybe data skills
00;36;38;09 - 00;36;40;26
that are more prevalent
across the different job descriptions.
00;36;40;26 - 00;36;44;24
But also going up is this idea of communicating,
of collaborating, critical thinking,
00;36;45;03 - 00;36;49;29
of creativity, these sorts of skills
which some folks have called soft skills,
00;36;49;29 - 00;36;53;15
I don't like that term; core skills, I think, is
really what we're talking about.
00;36;53;23 - 00;36;55;09
If you want future-proof skills,
00;36;55;09 - 00;36;58;02
like those notions of synthesizing
the information
00;36;58;02 - 00;37;01;16
and connecting with it, and understanding
as well as communicating,
00;37;01;22 - 00;37;02;27
I think those end up being skills
00;37;02;27 - 00;37;06;28
that folks will need, you know,
no matter what the technology is doing.
00;37;06;28 - 00;37;08;02
And you can have them
00;37;08;02 - 00;37;11;02
augmented by the technology,
but those things can't be replaced
00;37;11;06 - 00;37;12;08
by the technology.
00;37;12;08 - 00;37;14;18
I mean, we've all seen what generative
AI is spewing out.
00;37;14;18 - 00;37;16;13
Now we're calling it slop, right?
00;37;16;13 - 00;37;18;23
It's kind of the overarching term.
00;37;18;23 - 00;37;20;00
And so as a human being, yeah,
00;37;20;00 - 00;37;21;18
the quality of slop
that you will have put out
00;37;21;18 - 00;37;24;07
manually is probably slightly different
to what the generated
00;37;24;07 - 00;37;26;22
slop is doing, is creating.
00;37;26;22 - 00;37;29;24
But critical thinking
I think is probably the biggest one.
00;37;29;24 - 00;37;30;04
Right.
00;37;30;04 - 00;37;32;21
And it's possible
that some of this is missing and that's
00;37;32;21 - 00;37;36;16
why a lot of those, for example,
AI use cases just aren't hitting,
00;37;36;25 - 00;37;39;22
because folks aren't able
to critically think and say,
00;37;39;22 - 00;37;43;13
if this technology is automating,
or if it's able to do better
00;37;43;13 - 00;37;47;22
data exploration, or if it's generative,
then critically thinking:
00;37;48;01 - 00;37;51;17
which part of the processes that I'm doing
at the moment or what part of the value
00;37;51;17 - 00;37;55;04
that I'm providing to my clients
or my customers at the moment, require
00;37;55;27 - 00;37;58;02
repetitive things
that then can be automated,
00;37;58;02 - 00;38;01;04
or it has a lot of data
that needs to be explored,
00;38;01;05 - 00;38;03;07
or is going to support me in making,
you know,
00;38;03;07 - 00;38;05;08
high quality decisions or whatever
it might be.
00;38;05;08 - 00;38;07;08
So I think critical thinking and saying,
okay,
00;38;07;08 - 00;38;09;03
just because it said
this, should I believe it?
00;38;10;28 - 00;38;13;02
critical thinking to say
when I give it this prompt, when I give it
00;38;13;02 - 00;38;16;17
that, it gave me this back; actually,
what really is going on here?
00;38;17;00 - 00;38;20;06
I think there's a lot of that that ends up
going missing that will stand folks
00;38;20;06 - 00;38;24;00
in good stead and allow them to explore
and make better use of the technology.
00;38;24;01 - 00;38;27;19
Because as much as we have AI as this hype
machine as this hype cycle, sorry.
00;38;27;20 - 00;38;28;17
And as much as you know,
00;38;28;17 - 00;38;31;23
we can talk that down, actually the
AI is good at particular things,
00;38;32;26 - 00;38;33;03
right?
00;38;33;03 - 00;38;36;03
We just have to spend the time to say,
where does that overlap with what
00;38;36;03 - 00;38;39;13
my specific use cases and work are?
00;38;40;16 - 00;38;44;00
When you talk to leaders
then who are looking at the technology,
00;38;44;00 - 00;38;45;23
they're ideally thinking critically.
00;38;45;23 - 00;38;49;24
They're looking at
the organization of the future
00;38;49;24 - 00;38;51;16
or the workforce of the future.
00;38;51;16 - 00;38;52;23
What's your best advice for them?
00;38;52;23 - 00;38;54;23
And are you finding that
they're typically,
00;38;54;23 - 00;38;58;06
you know, thinking too small
or thinking too abstractly?
00;38;58;15 - 00;39;01;17
What are some of the biggest mistakes
you're seeing?
00;39;01;25 - 00;39;03;21
I think that thinking too short term,
00;39;04;24 - 00;39;05;14
I think, is the
00;39;05;14 - 00;39;08;14
big mistake that I see
from businesses at the moment.
00;39;08;17 - 00;39;11;17
There's a lot of
this is the problem I have here and now,
00;39;12;11 - 00;39;14;18
and you're
kind of dealing with the symptom.
00;39;14;18 - 00;39;17;22
You're not dealing with the root cause, you're
not dealing with the wider set of things.
00;39;18;00 - 00;39;21;22
The kind of example
I almost always give folks is when bias
00;39;21;22 - 00;39;23;20
comes up
and folks always end up asking me,
00;39;23;20 - 00;39;26;21
oh, you know, what can we do to mitigate
the bias in the AI that we're building?
00;39;26;21 - 00;39;27;21
And I'm like, okay, cool.
00;39;27;21 - 00;39;29;26
Why don't we rewind?
00;39;29;26 - 00;39;32;26
We can build an AI
00;39;33;06 - 00;39;37;10
that helps predict, analyze, whatever
the best sandwiches for us to create
00;39;37;19 - 00;39;41;19
and to have on the menu at our downtown
Toronto lunch place.
00;39;41;19 - 00;39;42;13
Right.
00;39;42;13 - 00;39;46;10
And we can say we can build and fine-tune
the AI to the nth degree,
00;39;47;14 - 00;39;47;25
right.
00;39;47;25 - 00;39;48;23
And say, this is the best
00;39;48;23 - 00;39;52;28
AI that's ever been known for anyone
to build to choose sandwich flavors.
00;39;53;06 - 00;39;56;06
But there are a lot of people in Toronto
who don't eat sandwiches for lunch.
00;39;58;17 - 00;40;01;07
And so you have to think
a little bit about, do you know, like, you've
00;40;01;07 - 00;40;04;07
prescribed this, you've set this problem,
you've set this scope.
00;40;04;07 - 00;40;06;10
You're not thinking wider.
You're not thinking ten,
00;40;06;10 - 00;40;07;28
or, this is why I say, fifty years down the line.
00;40;07;28 - 00;40;11;02
When you do that, it allows folks to zoom out
a little bit and actually not focus
00;40;11;02 - 00;40;14;06
on the current pain point that they have,
or even the next pain point.
00;40;14;14 - 00;40;17;15
But to say, let's
go back to first principles
00;40;17;24 - 00;40;20;24
and really think about how we can do this
in a transformative way.
00;40;20;29 - 00;40;22;29
And I find that a lot of business
00;40;22;29 - 00;40;26;09
leaders are held by the neck,
by shareholders, stakeholders, you know,
00;40;26;10 - 00;40;30;26
whatever the shorter loop cycles
are that we're operating in.
00;40;31;08 - 00;40;34;03
And so actually the folks that do the best
are the ones that kind of can pull back
00;40;34;03 - 00;40;39;24
and say, we're going to take a long term
view on our very niche industry.
00;40;39;24 - 00;40;42;21
And again, this is not about tech.
This is about other places.
00;40;42;21 - 00;40;44;20
Okay, in education, it's taking a long-term view.
00;40;44;20 - 00;40;46;07
What is the point of education?
00;40;47;20 - 00;40;48;21
Therefore, are we going to use
00;40;48;21 - 00;40;53;13
AI to help teachers in the here and now
deal with admin in the current system?
00;40;53;13 - 00;40;56;22
Or are we going to use
AI to completely transform the picture
00;40;56;22 - 00;41;00;10
that we have of skills and knowledge
that our learners have?
00;41;00;28 - 00;41;03;28
And if you start to think about that,
then you work on a whole nother set of
00;41;04;19 - 00;41;05;02
problems.
00;41;05;02 - 00;41;07;04
You're working at a whole nother level,
00;41;07;04 - 00;41;09;07
and then you're able
to kind of fundamentally transform
00;41;09;07 - 00;41;12;07
the way we're looking at just the admin
that they're having to do today,
00;41;12;15 - 00;41;13;15
which is a tiny use case.
00;41;13;15 - 00;41;14;19
And, you know, it's ended up
00;41;14;19 - 00;41;18;16
not actually necessarily giving the right
kind of, payoff for folks.
00;41;18;16 - 00;41;20;24
But when you start to think about
assessment, we start to, you know,
00;41;20;24 - 00;41;23;24
and so I think it's that strategic thinking,
long term,
00;41;24;16 - 00;41;27;01
and then working your way back
then allows folks
00;41;27;01 - 00;41;30;01
to have the vision
to have a little bit more motivation,
00;41;30;02 - 00;41;33;08
allows them to maybe have a bit more room
for the kind of accountability
00;41;33;08 - 00;41;35;28
on learning from mistakes,
as we mentioned already.
00;41;35;28 - 00;41;38;19
And that's what we're seeing is working
for folks, when they're able
00;41;38;19 - 00;41;41;27
to look
beyond the problems of the here and now.
00;41;43;24 - 00;41;45;03
I love that approach.
00;41;45;03 - 00;41;47;20
And it makes
I mean, to me, it makes complete sense.
00;41;47;20 - 00;41;51;24
And, you know, you say it and
it's like, yes, I'm 100% bought in.
00;41;52;02 - 00;41;55;18
One of the things I've seen is
a lot of leaders
00;41;55;18 - 00;41;59;06
and a lot of people,
frankly, seem to have a convenient belief.
00;41;59;06 - 00;42;02;13
Right now
that we're in such a period of disruption
00;42;02;13 - 00;42;04;16
that the long term
is completely unknowable.
00;42;04;16 - 00;42;06;16
It's just like, who knows?
00;42;06;16 - 00;42;07;25
We can't possibly know it.
00;42;07;25 - 00;42;09;20
Therefore we can't plan for it.
00;42;09;20 - 00;42;13;15
Therefore, I don't have to look beyond
the next three months
00;42;13;15 - 00;42;15;07
because that's all that matters.
00;42;15;07 - 00;42;19;18
When you look out over the horizon,
do you give much credence to that,
00;42;19;18 - 00;42;24;27
or do you think that's just an excuse for,
you know, not having to do the
00;42;24;27 - 00;42;28;20
thoughtful work
required to plan?
00;42;29;10 - 00;42;30;18
I think it's fair.
00;42;30;18 - 00;42;32;04
I mean, we're human, right?
00;42;32;04 - 00;42;35;09
And so I think
the things you can't control, you fear.
00;42;35;09 - 00;42;38;01
I think my biggest response to that,
though, is that, you know,
00;42;38;01 - 00;42;38;25
there's less fear
00;42;38;25 - 00;42;41;24
if you are in the driving seat
and if you make some decisions,
00;42;41;24 - 00;42;43;29
and if you look at some of these things
now,
00;42;43;29 - 00;42;46;23
you do have the agency to do it,
because at the time that we're at
00;42;46;23 - 00;42;49;22
in this cycle, there will come a time
when maybe it is too late
00;42;49;22 - 00;42;51;24
and too many of these decisions
have been locked in,
00;42;51;24 - 00;42;53;15
and too many of these norms have been set.
00;42;53;15 - 00;42;55;08
But that time isn't now.
00;42;55;08 - 00;42;57;17
And so I do get folks excited.
00;42;57;17 - 00;43;00;06
You know, I do like to, you know,
there's so much you see in science
00;43;00;06 - 00;43;02;17
fiction, right,
that's ended up being real.
00;43;02;17 - 00;43;07;06
I mean, take Knight
Rider from the 1980s.
00;43;07;06 - 00;43;09;01
I don't remember much of the 1980s. Right.
00;43;09;01 - 00;43;11;07
But David Hasselhoff played
this guy called Michael.
00;43;11;07 - 00;43;13;02
He talked to his car, called Kitt,
00;43;13;02 - 00;43;16;10
and it was such a far off thing
in the 1980s for him to talk to this car,
00;43;16;10 - 00;43;19;24
and the car to hear him, understand him,
and then to successfully fight crime together.
00;43;19;24 - 00;43;22;24
And I always get folks
to think: we talk to our cars now,
00;43;22;29 - 00;43;26;12
in 2025, soon to be 2026, whenever
you're listening to this, there are people
00;43;26;19 - 00;43;27;28
for real talking to their cars.
00;43;27;28 - 00;43;29;19
It's not a science fiction thing.
00;43;29;19 - 00;43;30;29
They might not be fighting crime.
00;43;30;29 - 00;43;33;29
Maybe they're, you know, turning up
the temperature or changing the track.
00;43;34;07 - 00;43;36;09
Right. But the car is understanding them.
00;43;36;09 - 00;43;40;02
And so I get folks to think like,
this doesn't have to be a big, major,
00;43;40;08 - 00;43;43;27
huge shift where you invent the internet
in your organization.
00;43;44;05 - 00;43;47;00
But there will be small tweaks
based off desires, dreams,
00;43;47;00 - 00;43;49;26
imaginations that you have
and so you better imagine.
00;43;49;26 - 00;43;51;22
I mean, we're human beings.
All of us have desires.
00;43;51;22 - 00;43;54;08
You know, things that we wish
we could have instead.
00;43;54;08 - 00;43;59;00
And maybe not all of them are the most
worthy of desires that we want to have.
00;43;59;06 - 00;44;02;03
But actually, across an organization,
you'll have a lot of people,
00;44;02;03 - 00;44;03;24
they'll have a lot of ideas
on what could be next,
00;44;03;24 - 00;44;05;07
and you could definitely make that.
00;44;05;07 - 00;44;06;28
I mean, if we don't,
00;44;06;28 - 00;44;09;22
if we don't, you know, seem
that something we've got within object
00;44;09;22 - 00;44;10;19
and why do we bother
00;44;12;03 - 00;44;15;03
getting up
to work and getting out to create anyway.
00;44;15;03 - 00;44;18;03
It's a thing to be human, to dream
and to hope and to explore.
00;44;18;05 - 00;44;21;15
So yeah, I mean, that's
another thing I get folks to do, you know?
00;44;21;15 - 00;44;23;14
Right.
If you're not a sci-fi person, right?
00;44;23;14 - 00;44;25;04
Write your version of sci-fi.
00;44;25;04 - 00;44;30;22
So 200 years down the line, what would,
well, you in, you know, 2225
00;44;30;22 - 00;44;35;07
or 2226, whenever you're listening to this,
what would they be doing?
00;44;35;07 - 00;44;37;01
What do you imagine?
00;44;37;01 - 00;44;39;24
Because none of us know completely
what's going on, but all of us can dream
00;44;39;24 - 00;44;42;24
and can explore and can hope.
00;44;44;13 - 00;44;46;04
But I understand that it's very human
00;44;46;04 - 00;44;49;02
to be afraid of the unknown.
00;44;49;02 - 00;44;50;21
We've talked to them.
00;44;50;21 - 00;44;52;10
It makes sense.
00;44;52;10 - 00;44;56;06
And it's something that I think,
you know, is being profoundly felt
00;44;56;06 - 00;44;57;22
by a lot of people right now.
00;44;57;22 - 00;44;59;20
We've talked at length
00;44;59;20 - 00;45;02;21
already about kind of the worker piece,
about the organizational piece.
00;45;02;29 - 00;45;05;23
We haven't talked a lot yet
about kind of the society piece,
00;45;05;23 - 00;45;10;17
which you said is sort of the third leg
of this stool and in some ways is,
00;45;11;19 - 00;45;12;04
maybe the most
00;45;12;04 - 00;45;15;04
difficult to nail down,
but also the most powerful.
00;45;15;27 - 00;45;18;06
What are some of the big,
00;45;18;06 - 00;45;21;24
I guess, risks,
like change risks, to society,
00;45;21;24 - 00;45;25;10
as well as some of the benefits
you see kind of emerging that, you know,
00;45;25;20 - 00;45;29;04
maybe we're not taking seriously enough
or not planning for well enough?
00;45;29;14 - 00;45;31;23
So I look at this
from the social justice lens.
00;45;31;23 - 00;45;32;02
I think.
00;45;32;02 - 00;45;35;02
I think the biggest risks
that we have at societal level
00;45;35;12 - 00;45;40;28
is that we forget, or we ignore, or we are
00;45;42;00 - 00;45;44;09
dull to, or numb,
00;45;44;09 - 00;45;48;19
to the social inequalities
that we've already had thus far.
00;45;49;20 - 00;45;52;16
That is the frustration that I have
00;45;52;16 - 00;45;55;24
in quite a lot of the discussions
that I have with people
00;45;55;24 - 00;45;57;00
and with technologists in particular.
00;45;57;00 - 00;46;00;11
As I said just earlier, you know, the maths is the
less complicated part:
00;46;01;07 - 00;46;03;21
there are a lot of data sets
that we just don't have, a lot of things
00;46;03;21 - 00;46;06;21
that we just don't know about people
and have never known,
00;46;06;24 - 00;46;09;24
that those in power haven't
deemed important enough
00;46;09;28 - 00;46;13;01
for us to have.
A very UK-specific example:
00;46;13;02 - 00;46;16;12
Before Covid, when folks got married,
00;46;17;01 - 00;46;19;29
they entered their names onto the marriage
registry, you know, the marriage register.
00;46;19;29 - 00;46;23;22
So we have a central marriage
registry in the UK. Before Covid,
00;46;23;27 - 00;46;25;24
they asked you for your father's
occupation
00;46;25;24 - 00;46;27;04
only.
00;46;27;04 - 00;46;30;01
If you've got married since Covid,
you've entered your mother's
00;46;30;01 - 00;46;33;01
and your father's occupation
onto the marriage register.
00;46;33;11 - 00;46;36;04
So for the
whole of eternity pre-Covid,
00;46;37;07 - 00;46;37;18
that's just
00;46;37;18 - 00;46;40;18
information we do not have for people
that got married.
00;46;40;23 - 00;46;43;23
Whether there's something wrong with that
or not,
00;46;44;03 - 00;46;46;16
I'll let folks unpick.
00;46;46;16 - 00;46;49;07
But actually when we then build new models
and we take in data
00;46;49;07 - 00;46;52;13
sets from lots of different places
without asking the question
00;46;52;13 - 00;46;55;25
or being blind to gaps
that we've had thus far,
00;46;56;01 - 00;46;59;21
then we're only going to repeat the issues
and the problems that we've had already,
00;46;59;25 - 00;47;02;27
but we're just now going to do it at scale
millions of times a second.
00;47;02;27 - 00;47;03;27
Whereas before,
00;47;03;27 - 00;47;06;22
I don't know, in medieval England
that was just something
00;47;06;22 - 00;47;09;13
that when someone went back through
they didn't have that information.
00;47;09;13 - 00;47;12;24
And so I think there are
so many things that we have to sort out.
00;47;13;17 - 00;47;14;13
These are real people.
00;47;14;13 - 00;47;16;18
These are real
things happening in real spaces
00;47;16;18 - 00;47;18;10
that people should really be doing
something about.
00;47;18;10 - 00;47;22;20
Another example, Invisible Women, another
book that I always talk to folks about.
00;47;22;20 - 00;47;25;03
And if you haven't had Caroline
on, you should definitely have,
00;47;25;03 - 00;47;26;03
because it's fascinating.
00;47;26;03 - 00;47;27;17
Some of the things that she was able
to unpick
00;47;27;17 - 00;47;29;19
and kind of communicate in that book.
00;47;29;19 - 00;47;34;12
Another example that I would love to share
from that book is, single parents.
00;47;35;18 - 00;47;36;09
So again, this is
00;47;36;09 - 00;47;41;15
a facet of life that actually, you know,
single parents, people being widowed,
00;47;41;23 - 00;47;46;18
whatever reason it is that you end up
as a single parent is not a new notion.
00;47;46;18 - 00;47;47;22
But whether that's a concept
00;47;47;22 - 00;47;50;21
that we've had kind of reflected in
law and spaces, whatever it is.
00;47;50;25 - 00;47;51;26
No. Right.
00;47;51;26 - 00;47;55;02
Caroline gives the example of someone
who's a director of a company
00;47;55;23 - 00;47;58;02
and also a single parent.
So the people who are directors
00;47;58;02 - 00;48;01;04
at this company,
they're all invited to this dinner.
00;48;01;23 - 00;48;03;27
They have a look at where it is
they or their assistant
00;48;03;27 - 00;48;06;15
book themselves into the hotel next door.
00;48;06;15 - 00;48;11;19
Let's say it's $200 for the night for them
to attend this dinner, and they accept.
00;48;11;19 - 00;48;13;18
And that's good.
That's done. Booked on the company card.
00;48;15;00 - 00;48;17;09
The person who's a single parent
doesn't need a hotel room.
00;48;17;09 - 00;48;21;11
They need childcare for that night,
let's say quite expensive,
00;48;21;11 - 00;48;24;16
but whatever, $200
or less on the childcare for that night.
00;48;24;16 - 00;48;27;16
And that person ticks
accept on the invite,
00;48;27;25 - 00;48;29;08
and this will differ. I'm not sure.
00;48;29;08 - 00;48;31;01
I guess in the US,
I think it might be the same.
00;48;31;01 - 00;48;34;04
Actually in the UK,
the hotel, the $200 on the hotel
00;48;34;04 - 00;48;37;09
is an allowable
business expense in the UK.
00;48;37;21 - 00;48;41;01
The $200 or less on the childcare
is not an allowable business expense,
00;48;42;10 - 00;48;45;00
but it's the same amount of money
being spent by the same company for people
00;48;45;00 - 00;48;48;00
at the same rank to attend the same event.
Why does it matter,
00;48;48;07 - 00;48;49;07
and what have we not built?
00;48;49;07 - 00;48;51;29
And so I think this is the thing:
again, there are all
00;48;51;29 - 00;48;55;28
these little things to unpick of us
doing better for different types of folks.
00;48;56;07 - 00;48;59;10
And like I said, solving problems
at a faster rate than we're creating them.
00;48;59;10 - 00;49;02;16
And all the other ideas
that people are creating and products
00;49;02;16 - 00;49;04;13
that people are kind
of pushing out to the market.
00;49;04;13 - 00;49;07;14
And so I think societally, again,
there's an opportunity.
00;49;07;14 - 00;49;09;18
And with AI in particular,
I think this is something
00;49;09;18 - 00;49;12;04
I've been really excited about
is so many of these things.
00;49;12;04 - 00;49;13;02
People have had a hunch,
00;49;14;03 - 00;49;15;11
and we're
like, yeah, that doesn't seem right.
00;49;15;11 - 00;49;17;19
That doesn't seem fair, or that's a gap,
but it's not.
00;49;17;19 - 00;49;21;07
AI is helping to show
a lot of these biases
00;49;21;07 - 00;49;25;03
that are in real life, not necessarily
for us to mitigate in the AI,
00;49;25;03 - 00;49;28;23
but for us to mitigate in life, mitigate
in the way that we run our societies
00;49;28;26 - 00;49;33;20
mitigate in the way that we are building
our communities and our groups.
00;49;33;20 - 00;49;34;21
And so I think, you know,
00;49;34;21 - 00;49;36;12
it's an opportunity,
there's an opportunity
00;49;36;12 - 00;49;39;13
for all of us to do better for everybody
and for all of us to gain from it.
00;49;41;20 - 00;49;43;12
I like the examples that you use
00;49;43;12 - 00;49;46;12
precisely because
00;49;46;28 - 00;49;49;05
the scenario is so obviously dumb.
00;49;49;05 - 00;49;50;19
Right. Like the example isn't dumb.
00;49;50;19 - 00;49;54;14
The scenario is dumb that
how can we not just be better at this?
00;49;54;14 - 00;49;55;28
And I think for all of us, you know,
00;49;55;28 - 00;50;00;06
in some capacity we've encountered this
and we just say like, why?
00;50;00;11 - 00;50;02;11
Why is this fair?
Why does this make sense?
00;50;02;11 - 00;50;03;29
Why am I being penalized for this?
00;50;03;29 - 00;50;06;17
Or why is that person being penalized
for this?
00;50;06;17 - 00;50;08;15
There are
certainly a lot of intractable problems.
00;50;08;15 - 00;50;10;01
But I do think, to your point,
00;50;10;01 - 00;50;12;04
I think there's a lot of problems
that are not at all intractable.
00;50;12;04 - 00;50;15;02
We just have to say that
this does not make sense.
00;50;15;02 - 00;50;17;19
Let's figure it out. Yeah, let's sort it out.
Yeah.
00;50;17;19 - 00;50;20;19
Is that, you know, a mindset thing
because again, there's
00;50;21;00 - 00;50;24;10
yeah, I don't know, I feel like there's
so much talk about like, oh no, that's
00;50;24;10 - 00;50;29;01
like that's not my job
or, you know, that should be government,
00;50;29;01 - 00;50;32;06
that's not me, as a business leader,
or in the government being like,
00;50;32;06 - 00;50;34;18
that's the business leader.
That's not me or government.
00;50;34;18 - 00;50;38;06
Like, is this just a matter of all of us
kind of pulling up
00;50;38;06 - 00;50;41;10
our bootstraps and saying, let's
try to make the world a better place?
00;50;41;10 - 00;50;45;14
Or what's kind of your best guidance for
actually making some of this happen?
00;50;46;08 - 00;50;47;18
I think so.
00;50;47;18 - 00;50;49;22
Ideally, that would be it, right?
00;50;49;22 - 00;50;53;14
Let's all do better
for everybody and we all work together.
00;50;53;21 - 00;50;54;29
A lot of it,
00;50;54;29 - 00;50;58;11
I see, just ends up being norms.
00;50;58;22 - 00;51;01;16
So we have so many unintended
consequences from
00;51;01;16 - 00;51;05;03
I don't know, Uber existing has meant
that the way that different folks
00;51;05;03 - 00;51;08;11
like the laws around
gig work have transformed
00;51;08;11 - 00;51;11;11
in so many different countries, right,
as a result of Uber existing.
00;51;11;11 - 00;51;14;22
And so I feel like
we haven't kind of scratched that,
00;51;14;22 - 00;51;17;29
or used that lever as much as we can, of,
hey, these are norms
00;51;17;29 - 00;51;20;29
that we can build into the technology
that might just solve this problem.
00;51;21;18 - 00;51;24;14
That means that then actually,
you know, a woman bearing the title
00;51;24;14 - 00;51;27;29
doctor doesn't get locked out of her
gym locker every time she goes to the gym.
00;51;29;03 - 00;51;30;22
You know, there are so
00;51;30;22 - 00;51;34;23
many small bits, but
it's almost like some of the small bits.
00;51;34;23 - 00;51;38;01
And yeah, government and business
eventually can solve the big ones.
00;51;38;06 - 00;51;42;02
But the tiny, the small irritating pieces,
00;51;42;09 - 00;51;44;02
we can definitely just say that's
something
00;51;44;02 - 00;51;46;12
we're not going to do on this platform
or something
00;51;46;12 - 00;51;48;15
we're not going to have going forward
or something that in our company
00;51;48;15 - 00;51;51;25
we're going to set as a norm
that that means in our supply chain
00;51;51;29 - 00;51;54;08
and in our customer chain,
these are things that then
00;51;54;08 - 00;51;57;06
in our ecosystem,
our sphere of influence that we have,
00;51;57;06 - 00;52;00;07
that then becomes part of the norm
that we've upgraded and we've changed.
00;52;00;07 - 00;52;02;16
And so I think,
it's a combination of both.
00;52;02;16 - 00;52;06;17
But if everybody took it upon themselves
to understand their sphere of influence
00;52;06;17 - 00;52;08;20
and then to implement things
properly across there,
00;52;08;20 - 00;52;11;12
then actually there'd be
a lot of good things that would happen.
00;52;11;12 - 00;52;13;21
I think the other thing
would be potentially that,
00;52;13;21 - 00;52;16;27
you know, if we go back to our point,
if you have different folks valued
00;52;17;06 - 00;52;19;29
and different folks have access
to make these technical decisions,
00;52;19;29 - 00;52;23;18
then upstream that would mean that
we have better solutions for things,
00;52;24;00 - 00;52;25;13
which I think is the other thing that we
00;52;25;13 - 00;52;28;08
end up missing out on,
and we probably see it the most in health.
00;52;28;08 - 00;52;31;21
Examples of the different things
that come up from pharmaceuticals
00;52;31;29 - 00;52;35;04
because of who had the most money,
who they thought were the biggest market
00;52;35;11 - 00;52;39;05
share, rather than the actual solving
of a problem that we've had
00;52;39;05 - 00;52;42;07
for ages, that endometriosis
is not a brand new condition.
00;52;42;21 - 00;52;45;11
And so someone should just have that
as something they're going to work out
00;52;45;11 - 00;52;49;19
and they're just going to do, solving
in the small with the use of this AI.
00;52;49;19 - 00;52;52;19
And it's why I get excited about things
like AI for potholes.
00;52;52;26 - 00;52;56;06
There are so many problems that actually,
the more that we have this accessible,
00;52;56;06 - 00;52;57;05
the more that it's democratized,
00;52;57;05 - 00;53;00;07
the more that folks understand
they can just experiment.
00;53;00;18 - 00;53;05;09
The more of new types of ideas
and solutions we might get to see.
00;53;05;14 - 00;53;07;16
I mean, a couple of years
ago, I had someone come to me
00;53;08;25 - 00;53;09;27
who had
00;53;09;27 - 00;53;12;27
conceptualized a new social media platform
00;53;13;03 - 00;53;16;03
type that centered on the idea,
not the person.
00;53;17;02 - 00;53;20;02
And I remember thinking, yeah, like,
if you were in those rooms
00;53;20;13 - 00;53;22;13
at the beginning
when we were doing bulletin boards
00;53;22;13 - 00;53;24;24
and all these other forums,
you wouldn't have centered it on the user,
00;53;24;24 - 00;53;26;26
or it would have been another way
that we cut the information
00;53;26;26 - 00;53;30;11
that we center on the thought,
on the nugget of the idea
00;53;30;24 - 00;53;32;06
and how many lives could have been saved.
00;53;32;06 - 00;53;34;22
I mean, you know, all these things
that could have happened if we
00;53;34;22 - 00;53;37;08
just had something slightly different
and there's nothing to say.
00;53;37;08 - 00;53;39;20
this person wasn't there at the beginning.
00;53;39;20 - 00;53;42;09
That couldn't have been something,
you know, cyberbullying in the way
00;53;42;09 - 00;53;45;13
that we know it could have, actually,
maybe we would have just missed that,
00;53;46;05 - 00;53;48;29
for just having something
slightly different there at the beginning.
00;53;48;29 - 00;53;51;27
And so it excites me,
the capacity we have to solve problems
00;53;53;03 - 00;53;53;15
by taking a
00;53;53;15 - 00;53;56;15
slightly different approach
to the way that we engage with technology.
00;53;56;15 - 00;53;58;24
Well, it excites me that it excites you.
00;53;58;24 - 00;54;02;28
And that's something that is,
you know, really great
00;54;02;28 - 00;54;05;19
to see, because not everybody
I talk to is this excited by it.
00;54;05;19 - 00;54;09;16
Because I think, you know, it's so easy
to, you know, look out
00;54;09;16 - 00;54;14;11
over the horizon and see so much change
and so much uncertainty and, you know,
00;54;14;11 - 00;54;17;14
there's obviously no shortage of things
that aren't even going well today.
00;54;17;14 - 00;54;20;02
And there's, you know, things
that will be going better,
00;54;20;02 - 00;54;21;09
and things that will be going worse.
00;54;21;09 - 00;54;24;12
But it seems like, you know, net net you're,
00;54;24;25 - 00;54;27;27
you know, optimistic. For now, yes.
00;54;28;09 - 00;54;29;21
Well, so tell me more.
00;54;31;04 - 00;54;31;26
I'm optimistic.
00;54;31;26 - 00;54;36;04
For now, there will be a point, I'm
sure, of no return where we didn't act.
00;54;36;04 - 00;54;37;21
as we should have in the time.
00;54;37;21 - 00;54;41;13
And, you know, we're much further
along into this next revolution.
00;54;41;26 - 00;54;45;18
And, you know, the norms have been set
and the dice have been cast.
00;54;45;18 - 00;54;47;25
And, you know,
we've locked ourselves out.
00;54;47;25 - 00;54;49;02
We shot ourselves in the foot.
00;54;49;02 - 00;54;52;21
At which point then I will run to a bunker
or run somewhere, I don't know, East
00;54;52;21 - 00;54;53;10
coast of Kenya.
00;54;53;10 - 00;54;56;10
I just go, how do I do maths on the beach
00;54;56;10 - 00;54;57;19
forever? I'll retire.
00;54;57;19 - 00;55;00;28
So just solve
maths problems on the beach. But no,
00;55;00;28 - 00;55;03;28
I don't feel like
it's too late yet.
00;55;04;25 - 00;55;05;19
So now's
00;55;05;19 - 00;55;10;14
the time to set the norms to influence,
you know, some of these, you know,
00;55;10;14 - 00;55;14;07
cultural changes and decisions, and try
and build the legacy that we want.
00;55;14;19 - 00;55;15;06
Exactly.
00;55;15;06 - 00;55;18;19
Now's the time to try to genuinely do
better,
00;55;18;19 - 00;55;21;19
and be inclusive of folks
to help us do better.
00;55;21;26 - 00;55;24;19
I don't think it's too late for us to edit
00;55;24;19 - 00;55;27;19
things, change things,
and we see this even more.
00;55;27;21 - 00;55;30;20
I see this so much in the work
that we do at the institute,
00;55;30;20 - 00;55;32;18
that folks are still trying
to figure out these use cases,
00;55;32;18 - 00;55;34;09
they're still trying
to get these things out.
00;55;34;09 - 00;55;37;01
So it's not set in stone yet.
00;55;37;01 - 00;55;40;11
There's definitely a lot more disruption
on the way,
00;55;40;23 - 00;55;42;21
a lot of these
kind of white collar industries
00;55;42;21 - 00;55;45;21
that are kind of having to change
how folks kind of enter.
00;55;45;26 - 00;55;48;26
So there's a lot of disruption
that's happening. I think,
00;55;48;26 - 00;55;52;15
when the dust settles on that,
then we might be slightly closer to me
00;55;52;15 - 00;55;53;16
saying, maybe it's too late.
00;55;57;06 - 00;55;57;23
So,
00;55;57;23 - 00;56;00;22
I'll ask you the question
that everybody hates, which is
00;56;00;22 - 00;56;03;20
what's your kind of predicted timeline
for when the dust settles on that?
00;56;03;20 - 00;56;08;05
Is that 2027? Is it 2030? I don't think,
I don't think
00;56;08;05 - 00;56;12;15
it's a time thing.
I don't think it will be a time.
00;56;13;00 - 00;56;16;00
As we've seen with this hype cycle,
actually, it's not just about
00;56;16;01 - 00;56;19;00
and we know this kind of classic thing
about technology
00;56;19;00 - 00;56;19;28
anyway, right? It's
00;56;19;28 - 00;56;22;28
not just the presence of the technology,
it's the adoption of the technology.
00;56;24;10 - 00;56;26;24
And so you can never really tell,
I don't know.
00;56;26;24 - 00;56;28;06
I don't know how many of us
could have predicted
00;56;28;06 - 00;56;31;07
that we'd be this deep into the AI hype
cycle.
00;56;31;19 - 00;56;32;04
Still here.
00;56;32;04 - 00;56;35;16
Three years ago
it would have been blockchain and bitcoin.
00;56;36;09 - 00;56;40;07
I'm guessing maybe in five years
the quantum volume would have come up
00;56;40;12 - 00;56;42;04
ever so slightly.
00;56;42;04 - 00;56;44;07
But I don't know
that it's a particular time
00;56;44;07 - 00;56;46;06
at which the technology
is relevant.
00;56;46;06 - 00;56;49;21
I think now, again, as a classically
trained computer scientist,
00;56;49;24 - 00;56;53;06
it's less about the tech, it's
more about the adoption and
00;56;53;17 - 00;56;57;02
the kind of the socio economic scenario
in which it's being deployed
00;56;57;02 - 00;56;59;01
and it's being used.
00;56;59;01 - 00;57;00;18
I don't think it's a time thing.
00;57;02;12 - 00;57;03;03
Then there'll be
00;57;03;03 - 00;57;06;25
some particular markers,
particular norms
00;57;07;26 - 00;57;12;13
in which we see tech superpowers
using technology for folk.
00;57;12;13 - 00;57;14;20
To what end
00;57;14;20 - 00;57;16;07
That might sound so
00;57;16;07 - 00;57;19;07
mysterious, doesn't it?
00;57;19;18 - 00;57;20;10
Be an internal.
00;57;20;10 - 00;57;23;10
Failing.
00;57;24;01 - 00;57;24;27
She just saw a website.
00;57;24;27 - 00;57;27;18
Where am I? Am I on the beach? I'm not.
00;57;27;18 - 00;57;28;26
I'm not on the beach or I'm.
00;57;28;26 - 00;57;30;25
I'm in the bunker or I'm not.
00;57;30;25 - 00;57;32;29
And that'll be how we know
the time has come.
00;57;32;29 - 00;57;34;18
The day that I book the ticket.
00;57;34;18 - 00;57;34;29
The one way
00;57;34;29 - 00;57;38;06
ticket to get on the beach,
then, yeah, everyone can hunker down.
00;57;38;28 - 00;57;39;19
Yeah.
00;57;39;19 - 00;57;41;15
If you're on the beach,
we're in big trouble.
00;57;41;15 - 00;57;43;02
Exactly. Yeah, this is it.
00;57;43;02 - 00;57;44;21
So I'll just kind of sit on
the beach, which means that.
00;57;45;23 - 00;57;46;08
There you go.
00;57;46;08 - 00;57;49;08
It sounds, I can think of worse things.
00;57;56;26 - 00;57;57;21
Thanks for having me, Geoff.
00;57;57;21 - 00;58;00;14
Good luck everybody.
00;58;00;14 - 00;58;01;26
If you work in IT,
00;58;01;26 - 00;58;04;26
Info-Tech Research Group is a name
you need to know.
00;58;04;29 - 00;58;07;29
No matter what your needs are, Info-Tech
has you covered.
00;58;08;04 - 00;58;09;11
AI strategy?
00;58;09;11 - 00;58;11;23
Covered. Disaster recovery?
00;58;11;23 - 00;58;12;23
Covered.
00;58;12;23 - 00;58;15;08
Vendor negotiation? Covered.
00;58;15;08 - 00;58;19;01
Info-Tech supports you with best-practice
research and a team of analysts
00;58;19;01 - 00;58;22;17
standing by ready to help you
tackle your toughest challenges.
00;58;22;27 - 00;58;25;27
Check it out at the link below
and don't forget to like and subscribe!