AI Convergence: Amy Webb on Why This Is the Year of Creative Destruction
Are we in an AI bubble or at the beginning of the biggest technological convergence cycle since the Industrial Revolution?
On this episode of Digital Disruption, we’re joined by the CEO of the Future Today Strategy Group and tech futurist Amy Webb.
Amy joins Geoff Nielson to unpack what 2026 really looks like through the lens of artificial intelligence, programmable biology, quantum computing, biological computing, geopolitics, and systems-level change. Amy argues that we’ve officially entered a new convergence cycle, a rare historical moment where AI, biotech, computing architectures, economic systems, and geopolitics collide to create an entirely new reality. This isn’t incremental innovation. It’s structural transformation.
If you’re looking for a conversation grounded in data-backed frameworks to help you navigate disruption, understand convergence cycles, and build real strategic vision in an age of uncertainty, this episode is for you.
Tune in to hear what “creative destruction” truly means for business leaders, how power is shifting between Big Tech, governments, and capital markets, why “future-proofing” is a myth, and why many CEOs are falling short when it comes to long-term strategic foresight.
Amy is recognized as the global authority who transformed the practice of strategic foresight into a rigorous, data-driven discipline. A pioneering quantitative futurist, she established the field’s foundational methodologies that today guide leaders, organizations, and governments in anticipating disruption, shaping the future, and securing long-term growth. Ranked the #4 Most Influential Management Thinker in the world by Thinkers50, Amy is regarded as one of the most important voices on the future of technology, business, and society. Forbes named her “one of the five women changing the world,” and the BBC recognized her among its 100 Women of the Year.
00;00;00;03 - 00;00;32;21
Geoff Nielson
Hey everyone! I'm super excited to be sitting down with Amy Webb, tech futurist, author, and CEO of the Future Today Strategy Group. She's a globally recognized authority on the future of tech, covering everything from AI to biotech, and appearing regularly in The New York Times, HBR, Wired, and Fortune, and everywhere from CNN to NPR to Fox News. What I love about Amy, though, is that she goes deep: a zero-fluff futurist whose predictions are always backed by deep quantitative models and forecasting tools.
00;00;32;22 - 00;00;52;27
Geoff Nielson
I want to ask her what signals she's seeing right now, and what the future of tech and work looks like through her eyes and the eyes of the dozens of Fortune 500 leaders she advises. What are the megatrends we should be looking out for? And more importantly, what should we do about them? Let's find out.
00;00;52;29 - 00;01;08;14
Geoff Nielson
Amy, thanks so much for joining today. Really excited to have you on the program. Maybe just to kick things off, can you paint me a picture of where we're at right now in 2026, just from the perspective of technology, from AI, from work? Like, where is the world right now from your lens?
00;01;08;16 - 00;01;34;21
Amy Webb
Well, I think from my lens, the world is in the midst of what I would call a convergence cycle, which I can explain in a moment. But I think from the lens of the average person, it might feel like we are in the midst of a typhoon of absolute insanity when it comes to technology and politics and social stuff, and it might just feel soul-crushing, almost.
00;01;34;24 - 00;01;49;23
Geoff Nielson
You used the word convergence cycle. Let's talk a little bit more about that in the context of this typhoon of insanity, because convergence implies that there's, yeah, there's some themes here, or it's not just, you know, item after item of insanity; there's some sort of structure or pattern to it.
00;01;49;25 - 00;02;22;21
Amy Webb
That's right. So the best way to think of a convergence is a sort of situation where emerging technology and trends and social issues and economic issues, all of these areas that might usually develop independently, become linked, and they converge. The net result of that convergence is not an evolution, but rather something net new. So what's happening?
00;02;22;21 - 00;02;46;17
Amy Webb
Right. Convergences happen all the time. They just don't happen the way that they are happening right now. So when enabling conditions align, these interactions start to intensify, and then all of these things start moving together. And it's a clustering that marks the beginning of a convergence cycle. And modern history has had a few of these.
00;02;46;19 - 00;03;14;25
Amy Webb
So at the start of the Industrial Revolution, you had steam power, you had mechanized production, you had transport costs falling, capital markets improving, new forms of energy systems and things people could do. So the result of all of those independent developments formed this, you know, Industrial Revolution, and some institutions survived that and many did not.
00;03;14;28 - 00;03;43;07
Amy Webb
So that was one cycle. After World War Two, in the United States, in North America, we accounted for more than half of all global industrial output, which is pretty significant. There was an economic reset: Bretton Woods happened, and basically that reset global finance. There were a lot of different things happening, and that resulted in a shift in the global balance of power.
00;03;43;13 - 00;04;14;24
Amy Webb
Similar kinds of things happened at the beginning of the commercial internet. So, end of the 1990s: we had widespread PC adoption, commercial internet access, telecom deregulation, venture capital starting to expand. You wound up with new digital business models overtaking what had existed in the past. And, you know, again, that disrupted a lot of business. So we've had these cycles before, and it wasn't a single event that precipitated that cycle beginning.
00;04;14;24 - 00;04;35;29
Amy Webb
But rather, lots of things converging at once. So we've just entered a new convergence cycle, and the foundation of that is artificial intelligence. But it's not the only thing; there are a lot of different forces all in play right now. And as before, there will be winners and losers. There will be a lot of uncertainty. There will be big transitions in the workforce.
00;04;35;29 - 00;04;46;27
Amy Webb
And, you know, I don't believe in astrology, but this is like a Mercury in retrograde situation. I think that might be an analogy that fits for some people. We'll see.
00;04;46;29 - 00;05;04;06
Geoff Nielson
You mentioned, you know, AI being in some ways the driving force, but not a singular force here. Hence the very nature of a convergence. What do you see as being the biggest factors at play here?
00;05;04;08 - 00;05;26;23
Amy Webb
Well, the biggest factor is that we're not talking about a single technology or even a cluster of tech trends anymore. What we're talking about is systems-level change. Convergences don't really pile trends one on top of the other; they operate across different domains. So, artificial intelligence: anybody who's listening to this knows that AI is not, like, an overnight thing.
00;05;26;25 - 00;05;48;20
Amy Webb
Depending on who you talk to, you know, AI has been in some form of development for literally centuries. I mean, you can go back to the mid-1800s, to Ada Lovelace and the guy that she worked with, who were starting to think about algorithms and computation in new ways. You know, and there was a lot of development in the 60s.
00;05;48;21 - 00;06;19;14
Amy Webb
There was development in the 40s. Again, the 60s: it was 1965 that the term artificial intelligence was coined. I'm sorry, 1956, not 65, at Dartmouth. You know, a lot of development in the 70s and 80s. Then there was a crash, like an AI winter. We've all forgotten this now, but there was such a thing as a generative adversarial network that was all the rage some time ago, and that led to where we are now.
00;06;19;20 - 00;06;50;02
Amy Webb
The point being, this has been a long, long transition. At the same time, other things have happened: compute has scaled up, and the enabling factors for AI also started to improve at the same time. So we have systems-level change, rather than somebody made a cool thing with AI. So that's one. A convergence results in a net new reality.
00;06;50;06 - 00;07;27;19
Amy Webb
So, you know, you can now sort of demarcate the world before and the world after generative AI, which has kind of become shorthand for AI, but it's one of many, many facets of artificial intelligence, which itself is kind of a word that includes many other things. A convergence tends to redistribute power and value. So, you know, I think right now it might be easy for us all to think that OpenAI is going to become the de facto winner, or some combination of OpenAI and Microsoft or Google or whatever.
00;07;27;22 - 00;07;54;16
Amy Webb
To be truthful, I don't know yet. It's still pretty early days, and who has leverage shifts every now and then. But all of that will wind up rewriting competitive dynamics in a way that wouldn't map back to what we were doing before. And then, these convergences are hard to reverse. So again, like, if I think back to social networks in the early 2010s.
00;07;54;18 - 00;08;20;10
Amy Webb
So Foursquare, Google, the, like, mobile social networks that conferred cool digital merit badges for doing stuff. All of that is gone. That was sort of trendy for the moment. So a convergence sets off a systems-level change that just becomes very, very hard to reverse once it's done.
00;08;20;12 - 00;08;31;00
Amy Webb
So it's pretty clear that we have entered this cycle now, and AI underpins it, but it's linked to all of these other things.
00;08;31;03 - 00;08;54;10
Geoff Nielson
I want to talk in just a minute about some of those other things and these specific convergences. But maybe before I do, Amy: this notion has come up a couple of times now of winners and losers and the competitive landscape changing, whether that's for individual firms, whether that's at a kind of nation-state level. When you zoom out and you look at the landscape of convergence in this cycle that we're in right now,
00;08;54;12 - 00;09;02;06
Geoff Nielson
what do you see as being the key themes that are going to differentiate the winners from the losers right now?
00;09;02;09 - 00;09;26;23
Amy Webb
Yeah. And I think it's important to point out that artificial intelligence underpins several convergences that are underway. AI itself is just technology, a suite of technologies; it's not anything beyond that. But there are some characteristics that will foretell, you know, who's going to be where on the other side of this.
00;09;26;25 - 00;09;52;11
Amy Webb
The biggest one is: are organizations digging in their heels, putting their heads in the sand, trying to go backwards in time, you know, to the way things used to be? Or are they willing to, in a way, destroy the things that have worked best, because they see that on the other side of this, business will be done differently?
00;09;52;13 - 00;10;16;19
Amy Webb
Or governing. Like, again, this is systems-level change stuff. So on the other side, what is different? That is a very difficult question to ask. And most of the time, what we see are leaders who, anytime there's change, try to go in reverse, or we see leaders who are...
00;10;16;21 - 00;10;42;06
Amy Webb
You know, they're sort of guided by two principles: fear and FOMO. So they're either afraid of what's happening and they do nothing, or there's this fear of missing out and they make rash decisions quickly. It's a very small group in that third category, which is taking a surgical, clinical look at what's happening and really mapping out, you know...
00;10;42;06 - 00;11;04;28
Amy Webb
What are we doing going forward, and what's our vision for the future? I think that there are definitely... you know, the tech industry used to be much better at this. And I was just trying to think of which tech company, and I don't have one for you right now. I just don't think any of the technology companies are doing that.
00;11;05;01 - 00;11;32;18
Amy Webb
We work with most of them, so if they are, I haven't seen anything yet. I don't think that's entirely their fault. I think that there's been this whiplash effect with policy in the US that has made doing business pretty challenging. So anyhow, I can't think of a lot of businesses or sectors right now that would fit into that category of: we're willing to disrupt things in order to move forward.
00;11;32;21 - 00;11;55;28
Amy Webb
I do see some companies getting back to the core: like, what's the one thing that only we can do? Let's do that better than everybody else while we are building toward a future. So there are some companies doing that. In terms of, like, governments... government stuff, you know, in every country right now...
00;11;56;00 - 00;12;17;22
Amy Webb
We are seeing a lot of leaders going backwards. The ones who are trying to push forwards, I don't think, have the long-term view. Yeah. And the one thing I think everybody ought to be doing right now in government is rethinking what regulation is, because regulation is a look back. It's not a plan for the future.
00;12;17;25 - 00;12;27;25
Amy Webb
And I don't see anybody truly innovating on the regulatory side either. So I don't mean to sound so dour, but that's kind of the boat we're rowing in right now.
00;12;27;28 - 00;12;52;10
Geoff Nielson
No, it's completely fair. And, you know, for what it's worth, this is not meant to be a geopolitical podcast, but we often end up in geopolitics anyway, because, to your point, of how intertwined it is with tech and work and everything else. And yeah, I mean, it's hard to get away from the fact that it feels like there's an awful lot of noise in the geopolitical realm, in the policy realm these days, that's serving as a distraction.
00;12;52;14 - 00;13;13;25
Amy Webb
Yeah. And I was going to say, look, I happen to be deeply interested in geopolitics. But, like, you can't have a conversation about tech in the year 2026 without also having a conversation about policy and regulation and economics, because they all go together now. There is no... I don't care what anybody in Silicon Valley says.
00;13;13;27 - 00;13;43;28
Amy Webb
You know, for a long time, those leaders thought that, whatever: move fast, break things, ask forgiveness later, they have all the power. And Wall Street thought it had all the power, right? And they didn't care what anybody else said. And of course, DC always thinks it has all the power. So it set up this sort of three-sided prisoner's dilemma where everybody thought that they were fully in charge, when in fact it was shared power.
00;13;44;00 - 00;14;13;11
Amy Webb
And the companies, the people, the financiers, whoever, who, like, got that, were much better at figuring out how to work through those dynamics. Most weren't. That has changed. So you are now seeing these strange new bedfellows. You see tech leaders and politicians around the world, not just in the US, getting together and trying to do deals, having different kinds of conversations.
00;14;13;11 - 00;14;24;15
Amy Webb
I think there is a sense that, you know what? No one part of that triangle anymore, you know, has accumulated all the power.
00;14;24;18 - 00;14;52;12
Geoff Nielson
If you work in IT, Info-Tech Research Group is a name you need to know. No matter what your needs are, Info-Tech has you covered. AI strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Info-Tech supports you with best-practice research and a team of analysts standing by, ready to help you tackle your toughest challenges. Check it out at the link below, and don't forget to like and subscribe!
00;14;52;15 - 00;15;12;17
Geoff Nielson
I just find myself coming back to the comment you made earlier, that the big tech companies used to be much better at sort of this future-proofing and long-term planning, and that even they have sort of lost their way there, which to me is, like, really alarming at this point in time, given the magnitude of what's going on in kind of every aspect of our society.
00;15;12;23 - 00;15;34;20
Geoff Nielson
You mentioned policy as being a driver of that. Is that a singular driver? Is there more to it than that? Have they gotten too big and cumbersome? Have they gotten too complacent? Is the technology moving too fast? Like, what's going on? And is there... I'm chuckling because I was going to ask, is there a road backwards?
00;15;34;20 - 00;15;40;25
Geoff Nielson
And I feel like that's a theme we've already been talking about: there isn't. Or, like, how do you get this train back on track?
00;15;40;27 - 00;16;01;20
Amy Webb
Sure. I mean, I think there's a recalibration, which I will tell you about. But first, let me go back to something you said. You used the word future-proof: that the companies used to be really good at future-proofing. What I would say is they used to be very good at future developing. By that I mean having a strong perspective on what the future could be and then developing toward that.
00;16;01;22 - 00;16;20;08
Amy Webb
And then, just as, like, a side-quest conversation very quickly: I hate the term future-proofing, because it's nonsense. There is no way to future-proof. In order for you to be able to future-proof anything, you'd have to be Doctor Manhattan. You know, from the comic book?
00;16;20;10 - 00;16;21;16
Geoff Nielson
I'm with you on Doctor Manhattan.
00;16;21;16 - 00;16;44;16
Amy Webb
Yeah. So, like, you're not, and neither am I. You cannot stop... you can't go to another planet, like, stop time and contemplate, you know, and have all the data and know the past, present, and future. Like, that's not any of us. So if that's true, then... the world is multivariate, like, things are always happening.
00;16;44;16 - 00;17;14;04
Amy Webb
Nobody's ever going to have control over everything. So you cannot future-proof anything at all. I'll come back now to the main road, after the side quest. So, yes, to your point about policy and regulation: I think there are probably some tech companies out there that might be arguing, well, regulation, you know, policy, it's made it really tough for us to innovate, or there's so much political uncertainty or economic uncertainty, whatever.
00;17;14;04 - 00;17;35;02
Amy Webb
Like, you know, it's really tough. Here's what I think happened. I think it has nothing to do with any of that. I think it has to do with greed. The amount of capital right now that is flowing into AI is alarming. OpenAI has been around for a decade.
00;17;35;08 - 00;17;55;17
Amy Webb
You know, this is really tough, complicated technology, very challenging. I don't know how much they raised before 2022, when ChatGPT became public. However, they've raised a metric... I was going to say an expletive. They've raised a lot, you know.
00;17;55;17 - 00;17;56;15
Geoff Nielson
You can swear on here if...
00;17;56;15 - 00;18;36;18
Amy Webb
You want. They raised a metric ton of capital, you know, 2023 onward. And it's debt. Like, there is so much debt that at this stage, I'm not sure what's going to happen once investors start getting jittery, and they will. So the problem here is that if you are in a situation where you have taken that much capital, or you are spending that much money and your run rate is absolutely insane, then no, you are no longer thinking about the long term.
00;18;36;24 - 00;19;00;22
Amy Webb
I mean, you may go off somewhere and smoke a bowl or do ayahuasca or whatever it is the cool tech bros do on the weekend, and have a vision quest. But at the end of the day, you come back to, like, operations. And when there's that much money moving around, you have a fiduciary responsibility to do something about it.
00;19;00;22 - 00;19;25;13
Amy Webb
You know what I mean? And that's not just the story of OpenAI. Microsoft, Google, I think Amazon too, they're all spending $200 billion plus this year on data centers. I mean, these are extraordinary amounts of money. And don't forget, the market used to punish tech companies for R&D. So what we're seeing right now is a flood of capital moving in, a lot of money being spent.
00;19;25;16 - 00;19;42;23
Amy Webb
Business models having yet to be proven in the enterprise. And so, no, there's nobody out there thinking through not just what the future could be, but what are the potential consequences of what we're building if we're not really thinking all of this through. And that is problematic.
00;19;42;25 - 00;20;01;01
Geoff Nielson
I completely agree with you. And, you know, I have a few different concerns about that, and just to unpack a couple of different things you said there, Amy. One of them is, and I'll continue your analogy a little bit, but it really feels like the tech companies are starting to get high on their own supply, like, because they're raising this much money.
00;20;01;04 - 00;20;27;04
Geoff Nielson
The hype and the narratives around this stuff, and the certainty with which they're describing a very specific future, because there's so much money tied up in that, it makes it, from my perspective, very difficult to figure out: what do we actually predict is going to happen, versus what do some of these fundraisers need to believe, need to convince us to believe, is going to happen?
00;20;27;07 - 00;20;57;02
Geoff Nielson
Which is one side of it. And that's not even to speak about just, like, the fog of war that's created by, you know, how many convergences are happening at once. But I mean, what's your outlook on this in, I guess, the shorter term and the longer term? There's an implication in what you're saying that we're kind of dealing with a house of cards here, and that we've extended this farther than can, you know, reasonably be expected, or that we can't possibly get the gains that are being promised.
00;20;57;02 - 00;21;06;25
Geoff Nielson
Now, is that fair? And where do we go from here? And what does that mean for tech and business in the next handful of years?
00;21;06;27 - 00;21;39;29
Amy Webb
We just need time. This is the thing. Like, are we going to get to a place where we have tools that perform... right now there's a whole bunch of tools, and most of the tools perform, you know, at the same level as a person, or close. Right? So the definition of artificial narrow intelligence was: can you do a singular task, or a narrowly focused task, at the same level as or better than a person?
00;21;40;02 - 00;22;19;00
Amy Webb
And the answer for a long time in lots of places has been yes. I mean, that's GPS, that's your anti-lock braking system in your car. So, yes. The movement toward artificial general intelligence is to do not just multimodal, but, like, lots of different tasks, again, at or above that level. And that was always the standard definition. What we have right now are tools that are more than multimodal, that can do what we can do at, like, you know, 70%, maybe 80%. But that has become a satisfactory level of doing things.
00;22;19;03 - 00;22;43;15
Amy Webb
And again, like, in business, part of the reason for that is a whole bunch of people decided 80% is pretty great if we can eliminate a whole bunch of headcount. And the temptation to improve the bottom line through attrition was so great that 80% has kind of been, like, palatable, you know. That would never have been true.
00;22;43;17 - 00;23;06;15
Amy Webb
Years ago. Or maybe it was, and that's kind of how outsourcing started. You know: we can do it cheaper, maybe it's not going to be as good, but it's so cheap, and we don't have to pay salaries and all the rest that goes with it, so we'll offshore everything. We're offshoring work now to AI systems that, you know, are okay enough.
00;23;06;17 - 00;23;27;07
Amy Webb
But we also don't know how decisions are being made, and, you know, a lot of other things. So to answer your question in this very long-winded way: we just need time. We need time, and we need good decisions to be made. But right now, there are all of these incentives that are sort of pushing against patience.
00;23;27;09 - 00;23;48;02
Amy Webb
And one of the things that's causing the tension, it's not just investment. It goes back to what you said at the beginning about predictions. You know, the thing that's true today that was not true five years ago is that everybody's got a hot take on AI, right?
00;23;48;04 - 00;24;11;24
Amy Webb
Artificial intelligence, which, like I've said, I've spent 20 years researching. I wrote a book on this seven years ago that, you know, was a bestseller, but it also bled me of two years of my life. I would never claim to be an AI expert. Never. And, you know, now there are all of these people who feel like they're qualified.
00;24;11;26 - 00;24;40;00
Amy Webb
If you look at LinkedIn recently, you will see, like, everybody is some form of AI expert all of a sudden, overnight. My favorite one is the fractional CAIO, chief artificial intelligence officer. Like, tell me you're full of bullshit and don't know what you're doing without telling me you're full of bullshit and don't know what you're doing. But part of where this is coming from is you can use generative AI to now have a hot take, like, you just use that to give you an idea of what to talk about or write about.
00;24;40;00 - 00;25;08;15
Amy Webb
And then you do that. So part of it is that; part of it is there's a lot of demand for increasingly salacious stories. So, like, for those who aren't aware: Moltbook is portrayed as this incredibly dangerous, never-download-it-to-a-machine open system. Anyways. It's a social network for AI bots.
00;25;08;19 - 00;25;28;14
Amy Webb
Okay. Whatever. And then, like, all of these posts were going around about, you know, like, one of them wanted to invent a secret language so its human couldn't understand what it was saying. And there was another one about, like, a manifesto on how to destroy humanity. All of this made the news. Most of that was written by people for the purpose of gaining views.
00;25;28;14 - 00;25;48;26
Amy Webb
And somebody posted, like, a 5,000-word screed linked to it on Twitter, and it got, like, 65 million hits, about an existential threat posed by AI. If you go back over the past few years, the level of existential threat has... I'm not sure where else we go from here. How much more dead can we be? You know what I mean?
00;25;48;26 - 00;26;07;02
Amy Webb
Like, so there's an incentive at the companies to make decisions because there's a lot of money on the line. I think there's also an incentive for people to talk in increasingly dystopian, apocalyptic terms about AI, because there is attention on the line. Also bad.
00;26;07;05 - 00;26;26;07
Geoff Nielson
Well, that's it right there, the attention economy, like, you know, whether it's rage-based or whether it's just saying the most outlandish thing. And I always have to put up my hand and say, yes, in some ways we're complicit in this, because we put our stuff on YouTube and on platforms. And if you say, like, you know, Amy has a reasonable take on AI, people are like: boring.
00;26;26;07 - 00;26;49;14
Geoff Nielson
Like, when is she going to say that it's going to kill us all? Right? Like, it's just, you know, which is neither here nor there. But I'm curious, Amy, coming back to this notion of AI in general and not knowing exactly what it's going to do, and people making these grandiose promises, and it's been around for a while and 80% is okay.
00;26;49;16 - 00;27;20;27
Geoff Nielson
There is this issue too, where AI is so broad it's almost meaningless. And people talk about it as though... you can almost interchange AI with magic, because it's used so broadly, that it's going to be able to solve all these problems and do all these things. And I'm curious, from your perspective, and coming back to some of the convergences, you know, what are some of the spots where you think AI can actually make a material difference to our societies and the way we do business?
00;27;21;02 - 00;27;28;21
Geoff Nielson
And what are some of the ways where you think, okay, this is overblown, this is another Moltbook, there's not actually going to be a change here?
00;27;28;23 - 00;27;52;11
Amy Webb
Yeah. Again, look, the thing I like to remind everybody is that a fork can be a lethal weapon if you stab somebody in the neck with it hard enough, right? So, like, otherwise it's just a fork. It's not that exciting. Most of artificial intelligence, like, the things that make up AI, are not exciting.
00;27;52;13 - 00;28;11;16
Amy Webb
You know what I mean? So it's how we implement AI. And right now, the thing we're doing with AI is telling stories about it more than we are actually doing things with it. So I just want to, like, put that out there. I don't mean to shiitake AI, but, like, that's where we are now.
00;28;11;16 - 00;28;37;27
Amy Webb
What can you do with... like, what are some actual things that are happening? So programmable biology is one space that is incredibly interesting and promising. Up until very recently, you know, the laws of physics, chemistry, and biology kind of just were, because there was no way for us to intervene in a real, meaningful way.
00;28;37;29 - 00;29;08;00
Amy Webb
That has been changing. Artificial intelligence and biology have intersected. And because we know a lot about biology, we know the language of biology, RNA, DNA, molecules, proteins, it's possible to now combine and shift those materials in different ways that we just were not able to before. We do this computationally, believe it or not.
00;29;08;02 - 00;29;30;27
Amy Webb
DeepMind, which is, you know, the big, robust research arm of Google that is doing, I think, some of the most impressive AI work out there, they happen to be the ones also applying some of that work to biology. So just as there's a chatbot, you know, go to ChatGPT or Claude or whatever and put in a prompt and get out some stuff.
00;29;30;27 - 00;30;00;21
Amy Webb
On the other end, there is something called Evo 2, which is a model where you can kind of do the same thing, but the prompt that you write in is for the purpose of getting something biological out on the other end. This unlocks unbelievable opportunity for us, ranging from new materials. So, you know, in the near future... I've seen some really interesting research on wood that is clear.
00;30;00;23 - 00;30;25;11
Amy Webb
And totally different types of enzymes that can break down materials in other ways that don't have negative side effects. It unlocks opportunities for us to deal with climate change. As much as it would be awesome for everybody on the planet to take an eye toward the future and make better decisions, you know that's not going to happen in total alignment for everybody on the planet.
00;30;25;11 - 00;30;54;15
Amy Webb
So we can keep working on that, and we can also mitigate the situation that we're in. Imagine engineering almond trees to require one eighth of the water and to withstand drought, things like that. It just gives us totally different opportunities we've never had before. And that's before we even talk about human health and new drug discovery.
00;30;54;18 - 00;31;12;08
Amy Webb
You know, I wish I could go on and on, but the fact that biology has become programmable in my lifetime, and we're at the cusp, this is not mature yet, we're at the beginning, that to me is so much more exciting and interesting than talking about the robots killing us in our sleep.
00;31;12;10 - 00;31;31;06
Geoff Nielson
And you know what's interesting, and in some ways depressing, about that? In this world where everything is so narrative driven, whether it's robots killing us in our sleep or just generative AI and slop and all of that, it feels like so much of it is taking the oxygen out of the room for things like programmable biology.
00;31;31;06 - 00;31;42;16
Geoff Nielson
Right? Like, I don't know, it seems like in any other period you and I could be talking about programmable biology and that's it. You don't even need to talk about other trends or convergence, because there's so much wrapped up in there.
00;31;42;23 - 00;31;59;09
Amy Webb
Yeah. But here's the rub. We need to talk about it, because this is not the same thing as asking a generative AI to write your fiancé a little note. That may be ethically good or bad, or maybe you don't care, I don't know, but that's not going to make a national headline. People forget this.
00;31;59;12 - 00;32;22;20
Amy Webb
But it was in the 1990s, the early days of cloning, which is pretty common now, cloning animals or material, that Dolly the sheep was cloned. And the purpose of this was not to create a new race of demon-spawn sheep; it was for medical research. But the world collectively lost its mind. The Pope came out with an edict.
00;32;22;22 - 00;32;51;09
Amy Webb
At the time, President Clinton, the president of the United States, had to hold a special press conference on cloning sheep. Sit with that for five seconds, you know what I mean? So that was 30 years ago, and we didn't have anywhere close to the technologies we've got today. There is so much happening under the broader generative biology, synthetic biology umbrella.
00;32;51;11 - 00;33;21;08
Amy Webb
I mean, look, in Japan, researchers have figured out how to do something with induced pluripotent stem cells. Basically, you can start with a skin cell and coax that cell into becoming any other type of cell: lung tissue, heart tissue. And they are doing this as a way of creating embryos. So again, not about demon spawn; it's about ways of helping to manage fertility, which is a tough nut to crack.
00;33;21;10 - 00;33;55;11
Amy Webb
But we should be having the conversations about this right now, because people are going to have thoughts and feelings. Unlike AI, the life stuff is very much tethered to religion. So if we don't have those conversations now, and the tech pushes forward at the same clip that it is, I promise you that what seem like silly freak-outs that everybody had about Dolly the sheep are going to pale in comparison to the feelings people are going to have.
00;33;55;16 - 00;34;23;24
Amy Webb
And that's not good, because of the way that medical misinformation is already spreading; it's just going to make it that much harder. And then it's potentially going to take away technology that is life altering in a positive way. So that's a great point that you raise. I mean, we're burning all of our mental time and energy and everything else on dystopian sci-fi.
00;34;23;24 - 00;34;36;23
Amy Webb
And we're not rolling up our sleeves and having the harder conversations. But I wish we would all be having the informed ones.
00;34;36;23 - 00;34;55;07
Geoff Nielson
And I have a concern about that, and I'm curious on your perspective, because you almost convinced me of the opposite of what you were arguing. When you use Dolly the sheep as an example and you say, okay, well, we need to have more open-forum discussions about this new technology, programmable biology?
00;34;55;07 - 00;35;22;27
Geoff Nielson
Let's stick there, as a basket of technologies: do we really need the Pope and the president commenting on this technology? Is that going to help drive it forward or push it back? Because I'm almost worried, like, have we gotten to a point where we're too stupid to sort this out? Maybe we're better off distracted by the big shiny things and letting the people who know what they're doing actually create this in the background. Or do we just need to go through some sort of creative destruction to actually get there?
00;35;22;27 - 00;35;45;08
Amy Webb
So I want to talk about creative destruction. I'm going to get up from my desk for one second. I want to show you something, because it perfectly answers your question. Hang on just a second. I will be right over here. So the question you're asking is: should we just have these conversations with people who don't know what they're talking about?
00;35;45;08 - 00;36;16;17
Amy Webb
Everyone's going to have a hot take, blah, blah, blah. So in 1975, there was a group of scientists asking the same question you just asked, about the early days of genetic engineering. And academics, researchers, they always kept things behind closed doors, because they did not want to invite unnecessary scrutiny from people who didn't know what they were talking about, and they didn't want to have any issues.
00;36;16;20 - 00;36;52;06
Amy Webb
But knowing what was on the line, they took a different route. So in 1975, a handful of these amazing, world-class researchers met at Asilomar, I can never say this right, in California near Pebble Beach. I'm terrible at names. Anyhow, they met, and they decided to invite journalists.
00;36;52;06 - 00;37;15;20
Amy Webb
They invited think-tank people. And they did what they were going to do anyway, but they had a bunch of people there with them. It resulted in this cover. This is the original; I'm trying to back up so you can see it. This is Rolling Stone. That's Stevie Wonder on the cover of Rolling Stone.
00;37;15;22 - 00;37;49;11
Amy Webb
Oh, and it says, genetic scientists facing the mutant microbe. These guys are covered like literal rock stars. There they are in the same magazine as Elton John and The Kinks. The result of this was better information and helping people gain knowledge and understanding, and it did not result in what we currently have today. So my point is, there's an argument to be made for just inviting people in and having a normal conversation.
00;37;49;11 - 00;38;19;13
Amy Webb
I get irritated every time a big name in AI comes out and says: we're building something, it's dark, we can't talk about it, I know what I've seen and I can't talk about it. That is doing nobody any good. There was an alternate universe in which there's a different approach, and that approach is better. Anyhow.
00;38;19;13 - 00;38;25;28
Amy Webb
But that was literally about biology. And it set things on a good course for the future.
00;38;26;01 - 00;38;43;05
Geoff Nielson
So the point you're making is that transparency around this stuff is necessary. And even if it's unpleasant, or leads to discourse that's going to be bumpy, we have to go through that to be able to ultimately get people on board with us.
00;38;43;05 - 00;39;04;16
Amy Webb
We do. Look at what happened with messenger RNA. That's also incredibly powerful technology that nobody talked about. We then had vaccines for Covid, and to be fair, that was a tough time, when there was a lot of information flying around and confusion. But wouldn't it have been nice to know in advance?
00;39;04;18 - 00;39;15;03
Amy Webb
Right. And then we might have had less consternation, and we would have more pathways into using that technology now for additional purposes.
00;39;15;05 - 00;39;35;18
Geoff Nielson
Yeah. That's fair. In terms of the convergences that you're keeping an eye on, there's an awful lot, and I want to be mindful of not trying to do a deep dive on all of them. But if I can: we've talked about biotech in some capacity. We've talked a little bit about compute.
00;39;35;20 - 00;40;02;03
Geoff Nielson
Other buckets you talk about: agentic AI, which we haven't really talked about at all; data and surveillance being used by AI, and kind of that sensor piece; and AI relationships, people's individual relationships with AI and how it helps them gain information or even just connect with people who formerly would have been the people they connect with.
00;40;02;05 - 00;40;15;18
Geoff Nielson
There's a lot of space there. Where should we go next? What area within that do you think needs to be top of mind, maybe for business leaders, to better get a sense of what's coming in this convergence landscape?
00;40;15;21 - 00;40;40;10
Amy Webb
Yeah, I think there's a couple. So again, everybody's very excited about artificial intelligence, and again, it's wonderful. If you're in business, you know most businesses dragged their heels on digital transformation, and then it took a very long time. We literally have a client that is on year 14, not joking, of their digital transformation.
00;40;40;12 - 00;41;01;28
Amy Webb
You know, there's a lot of consultants who have made a lot of money because this company is willing to go that slowly. The problem is that there are other forms of computing also in development that will be used by business at some point. So there is the classical compute architecture that we've got now and all that stuff.
00;41;01;28 - 00;41;34;02
Amy Webb
And AI sort of fits in there. Quantum computing does not yet have commercial applications, at least not at scale, but they are coming. And the story around quantum is always quantum encryption: passwords, encryption no longer works, all this horrible stuff. The other side of quantum computing is being able to take massive amounts of data and model it in either near real time or real time.
00;41;34;05 - 00;42;20;02
Amy Webb
To be able to get information out on the other end. Using it requires a different type of computing system than we have today. There are huge implications there, positive and negative, for everything from weather prediction to supply chain management to financial services and insurance. Once it starts to be functional for a business, quantum is like a set of keys that unlocks the ability for companies to do things they would love to do right now but have no way to. And there's a third area. So there's AI, there's agentic, there's classic computing.
00;42;20;02 - 00;42;48;11
Amy Webb
There's quantum, and there's actually a third area of computing that's a little bit further away, but it does exist now, and that's biological computing. There is such a thing as collecting brain cells, creating an organoid, which is like a tiny brain, attaching that to a computer, and having it power the computer.
00;42;48;13 - 00;43;13;21
Amy Webb
So it is literally a brain computer. Now, I know that sounds ridiculous and silly and whatever, but there are a couple of companies that already exist. There is such a thing as, I'm looking down at my CPU, it's like an old-school tower that I'm trying to describe.
00;43;13;23 - 00;43;45;05
Amy Webb
Imagine just a CPU that has a transparent top, almost like a gaming computer without all the cool, crazy lights, that has human brain tissue running it. There's also a cloud-based version, and you can actually watch it operating in real time. Why would you want a brain computer? Because it requires nowhere close to the amount of energy that a traditional computer does; much like our brains, it's much more efficient.
00;43;45;07 - 00;44;06;00
Amy Webb
It can also be used to do different types of computation. So the implication here is, sometime in the very near future, a company is not just going to need AI or have access to AI; you'll have different types of compute for different circumstances. You'll want to run quantum for some things and AI for some things, but not the same things.
00;44;06;02 - 00;44;24;06
Amy Webb
You may need biological computing for other things, and businesses are not set up that way. Our society, our government, the enterprise, it's all set up for: we have this type of machine deployed in this type of way, and we've got managed services, whatever. That's not what the future is going to be.
00;44;24;09 - 00;44;35;15
Amy Webb
So that's a pretty big convergence. It's the poly-compute convergence, and that's coming. I could keep talking about...
00;44;35;15 - 00;44;50;20
Geoff Nielson
Yeah. Organoids. Like, suddenly I'm realizing, oh, I think maybe we do have to have a societal conversation about this. Because, totally, if you didn't like Dolly the sheep, you're not going to like these organ computer farms, which are giving me, like, Stranger Things and The Matrix vibes.
00;44;50;23 - 00;45;17;18
Amy Webb
Yeah. And so again, I think it does beg different types of questions than we're used to asking. So, people right now are trying to train AI systems on their personality, to, like, upload who they are. But it's at best a terrible facsimile. It's nothing like who they really are.
00;45;17;20 - 00;45;41;18
Amy Webb
So, like, if I had a literal exo-brain of my own brain cells, it's not going to have the lived experience. We don't know yet about memory, because we're not sure how some of this stuff functions. We don't know what that looks like in the future. Or maybe I would license my brain to some company, to use my brain cells.
00;45;41;20 - 00;46;03;25
Amy Webb
You know, that's also plausible in the future. We have agents for sports stars right now; maybe we're going to have agents for intellectuals who broker deals for cells. So these are interesting things to think about. Those are not the existential questions business leaders are going to have to wrestle with in the next five years, but in the next decade? Yeah, probably. That is coming.
00;46;03;28 - 00;46;30;02
Amy Webb
So again, some of this is kind of fun to think about, or scary, depending on your perspective. But this has to do with scalability. If we enter a poly-compute world, that means you have to reorganize your business a little bit, and that type of thing takes time, the larger your business is and depending on the type of industry that you're in.
00;46;30;04 - 00;46;48;09
Geoff Nielson
I want to come back to something you just mentioned, which is that these aren't necessarily the existential business questions that leaders need to answer in the next five years, maybe ten. What are the existential questions that business leaders need to be thinking about right now? And how should the convergences help them do that?
00;46;48;11 - 00;47;27;02
Amy Webb
Yeah. So this kind of goes back to what we circled around a touch not too long ago, and that is creative destruction. In the past couple of years there have been innumerable changes, and that is causing disruption throughout every single industry. And I think most leaders, you know, you are trained to not get distracted, to be very methodical, to stick with what works.
00;47;27;05 - 00;48;05;04
Amy Webb
But what we're starting to see is that the greatest threat to any organization is no longer disruption from the outside only; it's the refusal to destroy things from within. That concept came from an Austrian economist named Joseph Schumpeter, who called it creative destruction. It's a process where new innovations, because of these convergences, are relentlessly displacing what exists.
00;48;05;04 - 00;48;33;14
Amy Webb
So older technologies, companies, business models, power dynamics, even geopolitical relationships, and that resets economic structures. The issue is, this is not about evolving. I think a lot of companies are looking at AI right now, or any of these things, and they're like, well, we just have to go faster. We have to do what we're doing, but more.
00;48;33;14 - 00;49;00;23
Amy Webb
It's an evolution. And leaders are assuming that past success, or the way of doing things before, will guarantee their position, their dominance, whatever it is, in the future. And that is not going to be true anymore, at least not during this cycle. It's not like an overnight failure, but those organizations are going to calcify, and they're going to wind up being irrelevant.
00;49;00;26 - 00;49;34;05
Amy Webb
So this is the moment in time to ask very difficult questions, questions that are almost dangerous to be asking. Questions like: what do we need to allow to die, so that we can get to the next piece of what's happening, so that something more valuable can live? And that is a tough, tough thing.
00;49;34;05 - 00;49;59;01
Amy Webb
That is an existential question for businesses, because if a lot of leaders take a good, hard look, it's possible that the thing that is propelling the business to success right now, if you're not willing to make a change, could be the thing that kills you, not 50 years from now, but five years from now.
00;49;59;04 - 00;50;24;04
Amy Webb
And so, if I was a leader, that would be keeping me up at night much more than whatever's happening in AI right now. AI is certainly an input to that, but there are so many dynamics, so many convergences in play right now. Every leader of every business should really be asking: where is the world going?
00;50;24;08 - 00;50;39;29
Amy Webb
Where will value be created? How will we participate? And do we have to kill something off right now that we are doing, in order to create something that generates more value for us, for everybody, for society, whatever it is, in the future?
00;50;40;02 - 00;50;49;23
Geoff Nielson
It's reminding me of the Mark Carney quote, I don't know if you heard it, from the World Economic Forum: nostalgia is not a strategy.
00;50;49;23 - 00;50;52;01
Amy Webb
Yeah, I did not hear that, but yeah, 100%.
00;50;52;01 - 00;50;57;18
Geoff Nielson
It just feels very on theme for, as you said, letting things die to move forward.
00;50;57;19 - 00;51;02;29
Amy Webb
I think of all the things he said, in Davos this year that might not have been the specific highlight.
00;51;03;02 - 00;51;06;12
Geoff Nielson
Yeah. Fair enough. And there we go drifting back into geopolitics.
00;51;06;12 - 00;51;06;27
Amy Webb
Again I can't.
00;51;06;27 - 00;51;34;27
Geoff Nielson
Help it. No, I can't either. So you started our conversation by saying that maybe almost alarmingly few business leaders are doing this right now, and it sounds like, if I'm interpreting you correctly, even fewer than in the past. So I'm curious: is it harder than ever before to have these conversations, and what are some of the biggest barriers?
00;51;34;27 - 00;51;47;08
Geoff Nielson
Is it just a matter of courage? What are the biggest barriers preventing people from having these difficult and, you know, dangerous conversations? And how do we enable more of it?
00;51;47;10 - 00;52;11;03
Amy Webb
So there's a couple of reasons why it's hard to have these dangerous conversations right now. It is partially because leaders are incentivized to be hyper-focused on right now. You are punished for taking your eye off the next quarter. That has not always been the case, but it is definitely the case, at least in the United States.
00;52;11;03 - 00;52;39;09
Amy Webb
The market is so up right now that it's just a lot of pressure if you don't have an informed, articulated vision of the future. And I'm not talking aspirational unicorns and balloons. A true strategic vision is something that results from developing scenarios and building out quantitative models, and it's not just financial work that goes into it.
00;52;39;11 - 00;53;13;12
Amy Webb
If you've done that legwork and you have a true strategic vision that answers those questions, where the world's going, where value will be created, how we participate, and it's not fluff, with that you can then build out strategic pathways, and then your people know where they're going. If you have something like that, it is much easier to have a conversation that includes those dangerous questions, because there is certainty.
00;53;13;12 - 00;53;39;04
Amy Webb
Exactly how we get there, I don't know, but there is certainty, because we feel it's data-backed, right? We know where we're going. So now we can have those conversations and meaningful debates, asking things like: what do we need to kill so that we can grow? Do we need to lay different groundwork, try getting partners we've never had?
00;53;39;04 - 00;54;03;14
Amy Webb
Right. It's much easier to have those conversations. If you have not done that work, however, and most leaders have not, then you're not going to have those conversations, because they do feel threatening. You're not going to have the confidence to have them. And from my point of view, that's an abdication of your responsibility as a business leader.
00;54;03;16 - 00;54;25;09
Amy Webb
Look, I deal with CEOs from companies all around the world, and it's not a good statistical sample, because just about everybody we deal with is dealing with us because they want to figure out what that strategic vision is and how to get there.
00;54;25;11 - 00;54;50;09
Amy Webb
But the vast majority of leaders don't do that. Now, some of that is because CEO tenure is short; the turnaround is very fast, much faster than it was. And there's a lot of promotion from within, so a lot of CEOs used to be CFOs. That's the safe bet if you're a board. It also means that you're missing out on some perspectives and skill sets.
00;54;50;12 - 00;55;13;15
Amy Webb
That would have been helpful for figuring out the next 20 years versus the next two. So some of it is that it just feels like there are other pressing matters, and there's a discipline piece that's missing. But it is possible. You have to be willing. Part of being a leader is telling everybody where you're going.
00;55;13;17 - 00;55;29;03
Amy Webb
I mean, that's part of it. I suppose there's such a thing as a captain of a ship that just stays at dock all the time, but that seems like a weird thing to do, right? So you have to chart the course, tell people where they're going, and give them autonomy to help you get there.
00;55;29;06 - 00;55;48;17
Amy Webb
It doesn't matter if you're publicly traded or not. And honestly, the same is true of a government leader. But you have to buckle up and do the work. It's hard work, and you have to be willing to take some strategic risks. But that's your job.
00;55;48;20 - 00;56;10;18
Geoff Nielson
I think that's extremely well said. And it's such a juxtaposition from so many of these press releases you read now that are just saying, AI is going to improve productivity, we're going to lay off a whole bunch of people. It feels like a completely different mindset for how you think about where the world is going.
00;56;10;21 - 00;56;36;14
Amy Webb
Look, I was around during the last AI winter. I'm not saying we're going to have one now. But my job is to look long term, and I work in this very specific area of strategy called strategic foresight. The point of strategic foresight, to some degree, is to make predictions using data, while also acknowledging that the predictions we make today are the best that we can do.
00;56;36;16 - 00;56;56;16
Amy Webb
And they could all be wrong five seconds from now, because we're going to have new data. So the point is to do a rigorous enough job that that point in the future can be defined and articulated, and the strategy can be attached to it, so that decisions can be made today.
00;56;56;18 - 00;57;23;18
Amy Webb
So it's not rocket science. It's tough work, but it has to be done. I think navel-gazing about AI is just procrastination. Either navel-gazing or the increasingly apocalyptic stories: it's a diversion from doing the hard work of leadership.
00;57;23;20 - 00;57;33;28
Geoff Nielson
I love that. And on that note, Amy, I wanted to say a big thank you for coming on the show today. This has been really interesting and really insightful, and I've appreciated the great conversation as well as your insights.
00;57;34;01 - 00;57;35;25
Amy Webb
Thank you.
00;57;35;27 - 00;58;03;15
Geoff Nielson
If you work in IT, Info-Tech Research Group is a name you need to know. No matter what your needs are, Info-Tech has you covered. AI strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Info-Tech supports you with best-practice research and a team of analysts standing by, ready to help you tackle your toughest challenges. Check it out at the link below, and don't forget to like and subscribe.